tf.contrib.constrained_optimization.ConstrainedMinimizationProblem
Abstract class representing a ConstrainedMinimizationProblem.
A ConstrainedMinimizationProblem consists of an objective function to
minimize, and a set of constraint functions that are constrained to be
nonpositive.
In addition to the constraint functions, there may (optionally) be proxy
constraint functions: a ConstrainedOptimizer will attempt to penalize these
proxy constraint functions so as to satisfy the (non-proxy) constraints. Proxy
constraints could be used if the constraint functions are difficult or
impossible to optimize (e.g. if they're piecewise constant), in which case the
proxy constraints should be some approximation of the original constraints
that is well-enough behaved to permit successful optimization.
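To make the interface concrete, here is a minimal, hypothetical sketch of a subclass that minimizes x^2 subject to x >= 1 (written as the constraint 1 - x <= 0). It is illustrative only: it overrides the objective and constraints properties, and the exact set of abstract properties a subclass must implement should be verified against the TF 1.15 source.

```python
import tensorflow as tf

tfco = tf.contrib.constrained_optimization


class ToyProblem(tfco.ConstrainedMinimizationProblem):
  """Hypothetical problem: minimize x^2 subject to x >= 1 (i.e. 1 - x <= 0)."""

  def __init__(self, x):
    # `x` is a tf.Variable (or tensor) over which we optimize.
    self._x = x

  @property
  def objective(self):
    # The objective function to minimize.
    return tf.square(self._x)

  @property
  def constraints(self):
    # Constraints are expressed as g_i <= 0; here g_1 = 1 - x, so g_1 <= 0
    # enforces x >= 1.
    return tf.stack([1.0 - self._x])
```

A problem like this would then be handed to one of the module's ConstrainedOptimizer implementations (for example, AdditiveExternalRegretOptimizer) to produce a train_op.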
Attributes

constraints
  Returns the vector of constraint functions.
  Letting g_i be the ith element of the constraints vector, the ith constraint will be g_i <= 0.

num_constraints
  Returns the number of constraints.

objective
  Returns the objective function.

pre_train_ops
  Returns a list of Operation objects to run before the train_op.
  When a ConstrainedOptimizer creates a train_op (in minimize, minimize_unconstrained, or minimize_constrained), it will include these ops before the main training step.

proxy_constraints
  Returns the optional vector of proxy constraint functions.
  The difference between constraints and proxy_constraints is that, when proxy constraints are present, the constraints are merely EVALUATED during optimization, whereas the proxy_constraints are DIFFERENTIATED. If there are no proxy constraints, then the constraints are both evaluated and differentiated.
  For example, if we want to impose constraints on step functions, then we could use these functions for constraints. However, because a step function has zero gradient almost everywhere, we can't differentiate these functions, so we would take proxy_constraints to be some differentiable approximation of constraints (see the sketch after this table).
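As an illustration of that proxy pattern (a sketch, not from the original page; the names logits and max_rate are hypothetical), a step-function rate constraint might be paired with a hinge-based differentiable surrogate:

```python
import tensorflow as tf

# Hypothetical setup: `logits` are model scores, and we want the fraction of
# positive predictions to be at most `max_rate`.
logits = tf.placeholder(tf.float32, shape=[None])
max_rate = 0.5

# Exact constraint: a step-function rate, which has zero gradient almost
# everywhere and so cannot usefully be differentiated.
positive_rate = tf.reduce_mean(tf.cast(logits > 0, tf.float32))
constraints = tf.stack([positive_rate - max_rate])

# Differentiable proxy: a hinge upper bound on the step function.
hinge_rate = tf.reduce_mean(tf.nn.relu(1.0 + logits))
proxy_constraints = tf.stack([hinge_rate - max_rate])
```

In a ConstrainedMinimizationProblem subclass, the first tensor would be returned from the constraints property and the second from the proxy_constraints property.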