Gradient of a nonlinear Pyomo constraint at a given point

I (repeatedly) need numeric gradient information of a nonlinear Pyomo constraint con at a given point (i.e. the variables of the corresponding Pyomo model are all set to specific values). I have read this post and decided that the (slightly modified) lines
from pyomo.core.base.symbolic import differentiate
var_list = list(model.component_objects(Var, active=True))
grad_num = [value(partial) for partial in differentiate(con.body, wrt_list=var_list)]
should serve my purpose.
However, the example below already fails, presumably due to the appearance of the exponential function:
from pyomo.environ import *
model = ConcreteModel()
model.x_1 = Var()
model.x_2 = Var()
model.constr = Constraint(expr = 2*(model.x_1)**4+exp(model.x_2)<=3)
model.x_1.set_value(1)
model.x_2.set_value(1)
varList = list(model.component_objects(Var, active=True))
grad = [value(partial) for partial in differentiate(model.constr.body, wrt_list=varList)]
DeveloperError: Internal Pyomo implementation error:
"sympy expression type 'exp' not found in the operator map for expression >exp(x1)"
Please report this to the Pyomo Developers.
So, my question is: Can Pyomo generally differentiate expressions like the exponential function, square root, etc., and is my example just an unfortunate coincidence that can easily be fixed? I will be dealing with various models from the MINLPLib, and some tool for differentiating the expressions that appear is crucial.

This error existed through Pyomo 5.2 and was resolved in Pyomo 5.3. Upgrading to 5.3 fixes the problem, and your example works fine (after adding from pyomo.core.base.symbolic import differentiate).
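For reference, a minimal sketch of the working example under Pyomo 5.3 with that import added (the expected gradient values at the end are my own check, not from the original post):
from pyomo.environ import *
from pyomo.core.base.symbolic import differentiate

model = ConcreteModel()
model.x_1 = Var()
model.x_2 = Var()
model.constr = Constraint(expr=2*(model.x_1)**4 + exp(model.x_2) <= 3)
model.x_1.set_value(1)
model.x_2.set_value(1)

# differentiate the constraint body with respect to every active variable
varList = list(model.component_objects(Var, active=True))
grad = [value(partial) for partial in differentiate(model.constr.body, wrt_list=varList)]
print(grad)  # roughly [8.0, 2.718]: d/dx_1 = 8*x_1**3 and d/dx_2 = exp(x_2), both evaluated at 1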

Using logarithms with PuLP?

I am using the package PuLP to solve several linear programming problems. Note: I am a beginner in LP.
In order to express one of the constraints for my LP problem as a linear expression, I used a logarithmic transformation. In detail, my constraint has the following form:
log(variable 1) + log(variable 2) <= log(1.2) + log(variable 3)
Is it possible to write this into PuLP?
I tried the following as a test:
model += np.log ( pulp.lpSum( [x[i]*label_values[i]*female[i] for i in students ] ) ) <= np.log(1.2)
and received the following error:
TypeError: loop of ufunc does not support argument 0 of type LpAffineExpression which has no callable log method
It looks like the numpy log method isn't interfacing well with pulp.lpSum.
Any idea on whether or not this is possible?
Thanks!
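For what it's worth, here is a hedged sketch (my own illustration, not from the original thread) of the usual way to keep such a constraint linear: make the logarithms themselves the LP decision variables, so only the constant log(1.2) ever needs an actual log call and np.log is never applied to an LpAffineExpression. The names u1, u2, u3 are hypothetical stand-ins for log(variable 1), log(variable 2), log(variable 3).
import math
import pulp

model = pulp.LpProblem("log_constraint_demo", pulp.LpMinimize)
u1 = pulp.LpVariable("u1")  # stands for log(variable 1)
u2 = pulp.LpVariable("u2")  # stands for log(variable 2)
u3 = pulp.LpVariable("u3")  # stands for log(variable 3)

# log(variable 1) + log(variable 2) <= log(1.2) + log(variable 3), linear in the u's
model += u1 + u2 <= math.log(1.2) + u3
This only works if the rest of the model can also be written in terms of the logged quantities; it does not directly handle a log of a sum such as np.log(pulp.lpSum(...)).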

How to make statsmodels GLM.fit_constrained result picklable/store-and-reloadable

A GLS (or thus also OLS) regression with constraints on parameters can readily be run using statsmodels GLM.fit_constrained() method, as with the code below (or here).
How can I make the GLMResults object resulting from such a statsmodels GLM.fit_constrained() regression picklable, so that the estimation result can be stored for re-use for prediction in a new session anytime later?
The GLMResults object obtained from fit_constrained(), which contains the relevant estimation result, has a .save() method that would normally pickle the object into a file.
This .save() works for the result of a standard (unconstrained) GLM regression, sm.GLM(...).fit(). However, it does not work for the result of fit_constrained(). Instead, it throws a pickling error, seemingly because the patsy DesignMatrixBuilder is not picklable, which links to the never-resolved issue here. This is the case at least for my Python 3.6.3 (running on Windows).
An example:
import statsmodels
import statsmodels.api as sm
import pandas as pd
# Define example data & constraints:
import numpy as np
df = pd.DataFrame(np.random.randint(0,100,size=(100, 5)), columns=list('ABCDF'))
y = df['A']
X = df[['B','C','D','F']]
constraints = ['B + C + D', 'C - F'] # Add two linear constraints on parameters: B+C+D = 0 & C-F = 0
statsmodels.genmod.families.links.identity()  # instantiates the identity link; not actually used below
OLS_from_GLM = sm.GLM(y, X)
# Unconstrained regression:
result_u = OLS_from_GLM.fit()
result_u.save('myfile_u.pickle') # This works
# Constrained regression - save() fails
result_c = OLS_from_GLM.fit_constrained(constraints)
result_c.save('myfile_c.pickle') # This fails with pickling error (tested in Python 3.6.3 on Windows): "NotImplementedError: Sorry, pickling not yet supported. See https://github.com/pydata/patsy/issues/26 if you want to help."
Is there a way to readily make the result from fit_constrained() picklable, i.e. storable and reloadable?
Below I suggest a first workaround answer; it is trivial and has worked well for me so far. I do not know, however, whether it is truly advisable, whether its risks are large, or whether a preferable alternative solution exists.
I got this to work by simply removing (commenting out) the line
res._results.constraints = lc
in the function definition of fit_constrained() within statsmodels' active generalized_linear_model.py script (in my case in the virtualenv folder \env\Lib\site-packages\statsmodels\genmod\generalized_linear_model.py).
Idling this line seems to have created no problem for my work; I can now readily save and reload the pickled file and use it to make correct predictions based on the stored estimation; the imposed parameter constraints remain respected and predictions made using .predict() remain unchanged after reloading.
I wonder, though, whether there is any major risk attached to this procedure. I am not familiar with the inner workings of the statsmodels library, or with its glm.fit_constrained() method in particular, and I reckon it is inadvisable to change anything in a pre-existing module one does not understand. However, it is the only way I have found to conveniently impose various constraints on my GLM parameters while still being able to save the regression results and readily re-use them for prediction in a later session.
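A less invasive variant of the same idea, offered only as an untested sketch (it assumes the offending attribute is named constraints on result_c._results, exactly as in the fit_constrained() source quoted above), is to strip that attribute from the one results object before saving, instead of editing the installed statsmodels code:
# drop the unpicklable patsy constraint metadata from this result only
result_c = OLS_from_GLM.fit_constrained(constraints)
if hasattr(result_c._results, 'constraints'):
    del result_c._results.constraints
result_c.save('myfile_c.pickle')

# in a later session
reloaded = sm.load('myfile_c.pickle')
predictions = reloaded.predict(X)
As with the in-source edit above, the constraint metadata is lost from the saved object, but the fitted parameters already respect the constraints, so predictions should be unchanged.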

Is there any difference between a Pyomo Set and a Python set?

First time pyomo user here.
Is there any difference between declaring sets in an optimization model as
model.A = Set(initialize = list_values)
and using a built in python set like this one?
A = set(list_values)
Are there any advantages of one over the other?
First, you can refer to the model's variables, sets, and parameters inside constraint rules, e.g.:
def some_constraint_rule(model):
    return model.some_set
You cannot do this with a plain Python set, because the model object passed into the constraint rule does not carry your Python set.
Second, you cannot index into a Python set (as far as I know; you would need a dict for that), whereas with Pyomo Sets you can do much more, e.g.:
m.t = pyomo.Set(
    within=m.index,
    initialize=something,
    ordered=True,
    doc='some documentation for your set?')
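A minimal sketch of the difference (the names list_values, model.A, and cardinality_rule are illustrative, not from the original question):
from pyomo.environ import ConcreteModel, Set, Var, Constraint

list_values = [1, 2, 3]
plain_set = set(list_values)           # plain Python set: not attached to any model

model = ConcreteModel()
model.A = Set(initialize=list_values)  # Pyomo Set: a component of the model
model.x = Var(model.A)                 # can be used directly to index variables

# inside a rule, model.A is available because it belongs to the model;
# plain_set would only be visible through the enclosing Python scope
def cardinality_rule(m):
    return sum(m.x[i] for i in m.A) <= len(m.A)
model.c = Constraint(rule=cardinality_rule)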

Pyomo: How to use the final data point in an abstract model's objective?

I have a Pyomo model which has the form:
from pyomo.environ import *
from pyomo.dae import *
m = AbstractModel()
m.t = ContinuousSet(bounds=(0,120))
m.T = Param(default=120)
m.S = Var(m.t, bounds=(0,None))
m.Sdot = DerivativeVar(m.S)
m.obj = Objective(expr=m.S[m.T], sense=maximize)
Note that the objective m.obj relies on the parameter m.T. Attempting to run this gives the error:
TypeError: unhashable type: 'SimpleParam'
Using a literal value instead, such as expr=m.S[120], gives the error:
ValueError: Error retrieving component S[120]: The component has not been constructed.
In both cases, my goal is the same: to optimize for the largest possible value of S at the horizon.
How can I create an abstract model which expresses this?
You are hitting on two somewhat separate issues:
TypeError: unhashable type: 'SimpleParam'
This is due to a bug in Pyomo 4.3: you cannot directly use simple Params as indexes into other components. That said, the fix for this particular problem will not fix your example model.
The trick to fixing your Objective declaration is to encapsulate the Objective expression within a rule:
def obj_rule(m):
    return m.S[120]
    # or better yet:
    # return m.S[m.T]
    # or
    # return m.S[m.t.last()]
m.obj = Objective(rule=obj_rule, sense=maximize)
The problem is that when you are writing an Abstract model, each component is only being declared, but not defined. So, the Var S is declared to exist, but has not been defined (it is an empty shell with no members). This causes a problem because Python (not Pyomo) attempts to resolve the m.S[120] to a specific variable immediately before calling the Objective constructor. The use of rules (functions) in Abstract models allows you to defer the resolution of the expression until Pyomo is actually constructing the model instance. Pyomo constructs the instance components in the same order that you declared them on the Abstract model, so when it fires the obj_rule, the previous components (S, T, and t) are all constructed and S has valid members at the known points of the ContinuousSet (in this case, the bounds).
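Putting the pieces together, a sketch of the full abstract model with the rule-based objective (the create_instance() call at the end is only there to illustrate when construction, and hence the rule, actually fires):
from pyomo.environ import *
from pyomo.dae import *

m = AbstractModel()
m.t = ContinuousSet(bounds=(0, 120))
m.T = Param(default=120)
m.S = Var(m.t, bounds=(0, None))
m.Sdot = DerivativeVar(m.S)

# evaluated only at construction time, when S already has members at the
# known points of the ContinuousSet (its bounds)
def obj_rule(m):
    return m.S[m.t.last()]
m.obj = Objective(rule=obj_rule, sense=maximize)

instance = m.create_instance()  # components are built in declaration order, then obj_rule runs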

Converting `AMPL` or `GAMS` files to `Python`

I would like to run some experiments in Python with constraint satisfaction problems from this database: all of the examples there are given in both the AMPL and GAMS file formats. Is there a tool to convert these equations into simple Python functions that look like
def quadratic(X):
    return (X[0]**2 - X[1]**2, 2*X[0]*X[1])
and similar?
I have started reading this manual but am not sure whether that is the right direction. (I am not very strong at programming.) I would be grateful for any hints.
I've recently written a parser for a subset of AMPL in Python which is available here. It is incomplete but can already handle many AMPL expressions and can be easily extended.
Here's an example of converting an objective function from AMPL into Python:
import ampl, sys

class AMPLToPythonConverter:
    def __init__(self, stream):
        self.stream = stream

    def visit_reference(self, expr):
        self.stream.write(expr.name)

    def visit_binary(self, expr):
        expr.lhs.accept(self)
        self.stream.write(' {} '.format(expr.op))
        expr.rhs.accept(self)

    def visit_decl(self, decl):
        if decl.kind == 'minimize':
            self.stream.write("def {}(x):\n".format(decl.name))
            self.stream.write("  return ")
            decl.body.accept(self)
            self.stream.write('\n')

compound_stmt = ampl.parse(
    """
    var x;
    minimize f: x * x;
    """, "input")

for node in compound_stmt.nodes:
    node.accept(AMPLToPythonConverter(sys.stdout))
Running this code prints:
def f(x):
  return x * x
To keep things simple the example hardcodes argument name x, but it is possible to derive it from the AST.
The following connection between GAMS and Pyomo might be of use:
Pyomo is a Python-based open-source software package that supports a diverse set of optimization capabilities for formulating, solving, and analyzing optimization models.
The convert function provided by GAMS allows you to generate a Pyomo Concrete scalar model from a GAMS model.
Hence, you should be able to convert the GAMS model to a Pyomo model and then access its functions via the capabilities provided by Pyomo.
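As a rough, hedged sketch of that workflow (the exact option names are an assumption on my part; please check the CONVERT documentation shipped with your GAMS version): select CONVERT in place of the real solver and point it at an option file that requests Pyomo output, roughly along these lines:
* convert.opt (CONVERT option file; 'pyomo' is the assumed option keyword)
pyomo mymodel.py

* command line: use CONVERT as the (e.g. NLP) solver and enable the option file
* gams mymodel.gms nlp=convert optfile=1
CONVERT should then write mymodel.py containing a Pyomo Concrete scalar model, which you can run and inspect from Python like any other Pyomo script.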
