I am trying to add a constraint to a very complicated minimization problem I have but I am not sure how to implement it, even after reading the docs.
I have a simple example that, if answered, will help me with my original problem. Here is the code:
from iminuit import Minuit

def f(x, y, z):
    return (x - 1.)**2 + (y - 2.*x)**2 + (z - 3.*x)**2 - 1.

# note: z also needs a starting value, otherwise Minuit warns and defaults it to 0
m = Minuit(f, x=0.5, error_x=0.2, limit_x=(0., 1.),
           y=0., limit_y=(0., 1.), z=0., print_level=1)
m.migrad()
I would love to add a constraint, say x+y=1.
Thanks
Answering my own question: don't bother using minuit for this. Use scipy.optimize with method SLSQP; it has equality and inequality constraints built in.
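A minimal sketch of that approach, applied to the toy function above; the bounds mirror limit_x and limit_y, and the equality constraint encodes x + y = 1:

import numpy as np
from scipy.optimize import minimize

def f(v):
    x, y, z = v
    return (x - 1.)**2 + (y - 2.*x)**2 + (z - 3.*x)**2 - 1.

# SLSQP expects equality constraints in the form fun(v) == 0
cons = {"type": "eq", "fun": lambda v: v[0] + v[1] - 1.}
bounds = [(0., 1.), (0., 1.), (None, None)]  # limit_x, limit_y, z unbounded

res = minimize(f, x0=[0.5, 0., 0.], method="SLSQP",
               bounds=bounds, constraints=cons)
print(res.x, res.fun)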
I have a function of two variables, R(t,r), that has been constructed from a list of values for R, t, and r. This function cannot be written down explicitly; the values come from solving a differential equation (dR(t,r)/dt). I need to take derivatives of the function, in particular dR(t,r)/dr and d^2R(t,r)/drdt. I have tried using this answer to do this, but I cannot seem to get an answer that makes sense. (Note that all derivatives should be partials.) Any help would be appreciated.
Edit:
Below is my current code. I understand that getting anything to work without the 'Rdata.txt' file is impossible, but the file itself is just a 160x1001 array; any made-up data of that shape will do. Z_t does not return values that look like the derivative of my original function based on what I know, so it is not differentiating my function as I'd expect.
If there are numerical routines that work directly on the array of data, I don't mind using those; I simply need some way of computing the derivatives.
import numpy as np
from scipy import interpolate

data = np.loadtxt('Rdata.txt')      # shape (160, 1001)
rvals = np.linspace(1, 160, 160)
tvals = np.linspace(0, 1000, 1001)

# kind='cubic' so that first and mixed partial derivatives are meaningful
# (the default kind='linear' makes second derivatives identically zero)
f = interpolate.interp2d(tvals, rvals, data, kind='cubic')

# derivative orders dx, dy must be non-negative integers, not fractions;
# dx differentiates along t, dy along r
Z_t = interpolate.bisplev(tvals, rvals, f.tck, dx=1, dy=0)
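An alternative that avoids reaching into interp2d internals: since the data live on a regular grid, RectBivariateSpline exposes partial derivatives directly through its dx/dy arguments. A sketch, assuming data has shape (len(rvals), len(tvals)):

from scipy.interpolate import RectBivariateSpline

# first axis of data is r, second is t
spline = RectBivariateSpline(rvals, tvals, data)

dR_dr = spline(rvals, tvals, dx=1, dy=0)     # partial dR/dr on the grid
d2R_drdt = spline(rvals, tvals, dx=1, dy=1)  # mixed partial d^2R/(dr dt)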
I'm looking to set up a constraint check in Python using PULP. Suppose I have variables X1,...,Xn and a constraint (an AffineExpression) A1*X1 + ... + An*Xn <= B, where A1,...,An and B are all constants.
Given an assignment for X (e.g. X1=1, X2=4,...Xn=2), how can I check if the constraints are satisfied? I know how to do this with matrices using Numpy, but wondering if it's possible to do using PULP to let the library handle the work.
My hope here is that I can check specific variable assignments. I do not want to run an optimization algorithm on the problem (e.g. prob.solve()).
Can PULP do this? Is there a different Python library that would be better? I've thought about Google's OR-Tools but have found the documentation is a little bit harder to parse through than PULP's.
It looks like this is possible by doing the following:
1. Define PULP variables and constraints and add them to an LpProblem
2. Make a dictionary of your assignments in the form {'variable name': value}
3. Use LpProblem.assignVarsVals(your_assignment_dict) to assign those values
4. Run LpProblem.valid() to check that your assignment meets all constraints and variable restrictions
A minimal sketch of these steps appears after the note below.
Note that this will almost certainly be slower than using numpy and Ax <= b. Formulating the problem might be easier, but performance will suffer due to how PULP runs these checks.
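Here is that sketch; the variable names, coefficients, and the "capacity" constraint are made up for illustration:

from pulp import LpProblem, LpVariable, LpMinimize

# 1. define variables and constraints on an LpProblem
prob = LpProblem("check_only", LpMinimize)
x1 = LpVariable("x1")
x2 = LpVariable("x2")
prob += 2 * x1 + 3 * x2 <= 10, "capacity"

# 2. dictionary of assignments keyed by variable name
assignment = {"x1": 1.0, "x2": 4.0}

# 3. assign the values and 4. validate, without ever calling prob.solve()
prob.assignVarsVals(assignment)
print(prob.valid())  # False here: 2*1 + 3*4 = 14 > 10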
You can stay in numpy and accomplish this. Take a single row of A as a vector, form the elementwise products with x, and check the sum against the corresponding entry of B. For example:
a = A[0, :]            # first row of the constraint matrix
row_sum = a * x        # elementwise products a_i * x_i
sum(row_sum) <= B[0]   # True if the first constraint is satisfied
The last line will return just True or False. Then if you want to change a single index you could update your row_sum array by using
row_sum[3] = a[3]*new_val
and run your analysis again.
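The whole check can also be vectorized over every constraint at once; the A, b, and x values below are made up for illustration:

import numpy as np

A = np.array([[2., 3., 0., 1.],
              [1., 0., 4., 2.],
              [0., 1., 1., 1.]])
b = np.array([10., 8., 6.])
x = np.array([1., 4., 0., 2.])

satisfied = A @ x <= b   # boolean vector, one entry per constraint
print(satisfied)         # [False  True  True]
print(satisfied.all())   # True only if every constraint holds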
I want to solve my other question here, so I need sympy to return an error whenever there is no analytical/symbolic solution for an integral.
For example, if I try:
from sympy import *
init_printing(use_unicode=False, wrap_line=False, no_global=True)
x = Symbol('x')
integrate(1/cos(x**2), x)
It just pretty-prints the integral itself, without solving it and without giving an error about not being able to solve it!
P.S. I have also asked this question here on Reddit.
A "symbolic" solution always exists: I just invented a new function intcos(x), which by definition is the antiderivative of 1/cos(x**2). Now this integral has a symbolic solution!
For the question to be rigorously answerable, one has to restrict the class of functions allowed in the answer. Typically one considers elementary functions. As the SymPy integral reference explains, the Risch algorithm it employs can prove that some functions do not have elementary antiderivatives. Use the option risch=True and check whether the return value is an instance of sympy.integrals.risch.NonElementaryIntegral:
from sympy.integrals.risch import NonElementaryIntegral
isinstance(integrate(1/exp(x**2), x, risch=True), NonElementaryIntegral) # True
However, since the Risch algorithm implementation is incomplete, in many cases (like 1/cos(x**2)) it returns an ordinary Integral object. This means it could neither find an elementary antiderivative nor prove that one does not exist.
For this example, it helps to rewrite the trigonometric function in terms of exponential, with rewrite(cos, exp):
isinstance(integrate((1/cos(x**2)).rewrite(cos, exp), x, risch=True), NonElementaryIntegral)
returns True, so we know the integral is nonelementary.
Non-elementary antiderivatives
But often we don't really need an elementary function; something like Gamma, erf, or Bessel functions may be okay, as long as it's some "known" function (which of course is a fuzzy term). The question becomes: how to tell whether SymPy was able to integrate a specific expression or not? Use the .has(Integral) check for that:
integrate(2/cos(x**2), x).has(Integral) # True
(Not isinstance(result, Integral), because the return value can be, as here, 2*Integral(1/cos(x**2), x).) This proves nothing beyond SymPy's failure to find the antiderivative; it may well be a known function, even an elementary one.
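Both checks can be bundled into a small helper; classify_antiderivative is a hypothetical name, and the logic just combines the risch=True and .has(Integral) tests described above:

from sympy import Integral, Symbol, cos, exp, integrate
from sympy.integrals.risch import NonElementaryIntegral

x = Symbol('x')

def classify_antiderivative(expr):
    # hypothetical helper: report what SymPy can say about integrate(expr, x)
    risch_result = integrate(expr, x, risch=True)
    if isinstance(risch_result, NonElementaryIntegral):
        return "proven non-elementary"
    if integrate(expr, x).has(Integral):
        return "SymPy failed; no proof either way"
    return "antiderivative found"

print(classify_antiderivative(1/exp(x**2)))                      # proven non-elementary
print(classify_antiderivative((1/cos(x**2)).rewrite(cos, exp)))  # proven non-elementary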
Abstract problem to be solved:
we have n d-dimensional design variables, say {k_0, k_1, ..., k_n}
maximize the minimum of [f(k_0), f(k_1), ..., f(k_n)], where f() is a nonlinear function, i.e. a maximin problem
constraint: mean([k_0, k_1, ..., k_n]) == m, with m a known constant
Can someone provide an example of how this can be solved (maximin, d-dim variables) via pyOpt?
EDIT: I tried this:

import scipy as sp
from pyOpt.pyOpt_optimization import Optimization
from pyOpt.pyALPSO.pyALPSO import ALPSO

def __objfunc(x, **kwargs):
    # objective: the smaller of the two sums (note pyOpt minimizes by
    # default, so maximizing this minimum would require negating it)
    f = min([x[0] + x[1], x[2] + x[3]])
    # constraint: mean of the two sums equals 5
    g = [0.0]
    g[0] = (((x[0] + x[1]) + (x[2] + x[3])) / 2.0) - 5
    fail = 0
    return f, g, fail

if __name__ == '__main__':
    op = Optimization('test', __objfunc)
    op.addVarGroup('p0', 4, type='c')
    op.addObj('f')
    op.addCon('ineq', 'i')
    o = ALPSO()
    o(op)
    print(op._solutions[0])
(Here I supposed 2-dimensional design variables, so d=2 and n=2.)
Is there any better way?
I probably would reformulate this as:

maximize z
subject to: z <= f(k_i) for i = 0, ..., n
            mean([k_0, k_1, ..., k_n]) == m

The min() function you used is non-differentiable (and thus dangerous); the extra variable z removes it from the formulation. Also, the mean() function can be replaced by a linear constraint (which is easier).
I am not familiar with the ALPSO solver, but this reformulation is usually helpful for more traditional solvers like SNOPT, NLPQL and FSQP.
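For concreteness, here is a sketch of that reformulation using scipy's SLSQP (not pyOpt; scipy was mentioned earlier in this thread), with f(k) = k[0] + k[1] and the mean-of-sums constraint taken from the attempt above:

import numpy as np
from scipy.optimize import minimize

m = 5.0

def f(k):
    return k[0] + k[1]

# decision vector v = [k0_0, k0_1, k1_0, k1_1, z]
def neg_z(v):
    return -v[-1]            # minimizing -z maximizes z

constraints = [
    # z <= f(k_i)  rewritten as  f(k_i) - z >= 0
    {"type": "ineq", "fun": lambda v: f(v[0:2]) - v[-1]},
    {"type": "ineq", "fun": lambda v: f(v[2:4]) - v[-1]},
    # mean of the two sums equals m, as in g[0] above
    {"type": "eq", "fun": lambda v: (f(v[0:2]) + f(v[2:4])) / 2.0 - m},
]

res = minimize(neg_z, x0=np.zeros(5), method="SLSQP", constraints=constraints)
print(res.x, -res.fun)       # the maximized minimum should come out as 5.0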
The following example is stated just for the purpose of precise definition of the query. Consider a recursive equation x[k+1] = a*x[k] where a is some constant. Now, is there an easier way or an existing method within sympy/numpy that does the following (i.e., gives an expression over a horizon for a given recursive equation):
from sympy import Symbol

def get_expr(init, num):
    a = Symbol('a')
    expr = init
    for i in range(num):
        expr = a * expr
    return expr

x0 = Symbol('x0')
get_expr(x0, 3)  # a**3*x0
The horizon above is 3.
I was going to suggest using SymPy's rsolve to try to find a closed form solution to your equation, but it seems that at least for this specific one, there is a bug that prevents it from working. See http://code.google.com/p/sympy/issues/detail?id=2943. Maybe if you really want to know for a more complicated expression you could try that. For this one, the closed form solution is just a**n*x0.
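For reference, the rsolve call the answer has in mind would look like the sketch below; whether it hits the bug mentioned depends on your SymPy version:

from sympy import Function, Symbol, rsolve

k = Symbol('k', integer=True)
a = Symbol('a')
x0 = Symbol('x0')
x = Function('x')

# closed form for x[k+1] = a*x[k] with x[0] = x0; expected result: x0*a**k
print(rsolve(x(k + 1) - a * x(k), x(k), {x(0): x0}))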
Aside from that, SymPy doesn't have any functions that would do this evaluation directly, but it does have some things that can help. There are some memoization decorators in sympy.utilities.memoization that are made for internal use, but should work just fine for external uses. They can help make your evaluation more efficient by caching the result of previous evaluations. You'll need to write the get_expr recursively for it to work effectively. Or you could just write your own cacher. It's not that complicated.
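If you'd rather not reach into sympy.utilities.memoization, a plain functools.lru_cache works as the "own cacher" mentioned above; a minimal sketch for this recurrence:

from functools import lru_cache
from sympy import Symbol

a = Symbol('a')
x0 = Symbol('x0')

@lru_cache(maxsize=None)
def x_k(k):
    # symbolic expression for x[k] under x[k+1] = a*x[k], x[0] = x0;
    # results are cached, so each horizon is built only once
    if k == 0:
        return x0
    return a * x_k(k - 1)

print(x_k(3))  # a**3*x0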