Difference between scipy.optimize.fmin and scipy.optimize.minimize - python

I'm learning Python to build models these days. I read the documentation of scipy.optimize.fmin; it also recommends scipy.optimize.minimize. It seems that scipy.optimize.minimize is a more advanced method. I really wonder what the difference between these two is.

scipy.optimize.minimize is a high-level interface that lets you choose from a broad range of solvers, one of which is Nelder–Mead. scipy.optimize.fmin is a specialized function that uses only the Nelder–Mead algorithm. For the documentation of minimize with the Nelder–Mead method, see here.
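As a small illustration (not from the original answer), both of the following calls run the same Nelder–Mead simplex search on a toy function; minimize simply exposes it through the generic interface alongside other solvers:

import numpy as np
from scipy.optimize import fmin, minimize

# A simple test function (the 2D Rosenbrock function).
def rosen(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

x0 = np.array([0.0, 0.0])

# fmin always uses the Nelder-Mead simplex algorithm.
x_fmin = fmin(rosen, x0, disp=False)

# minimize exposes the same algorithm via method='Nelder-Mead',
# next to many other solvers (BFGS, L-BFGS-B, SLSQP, ...).
res = minimize(rosen, x0, method='Nelder-Mead')

print(x_fmin, res.x)  # both should approach [1, 1]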

Related

Multi-objective optimization in pyomo

Could you please share a well-structured tutorial or guide for implementing multi-objective optimization in Pyomo? I have only found some Q&A posts that were a bit vague.
As far as I know, Pyomo does not have special facilities to handle multi-objective models, so there is nothing special to learn and you can use any Pyomo text or tutorial.
You will need to formulate your problem in terms of single-objective models. Arguably, the most popular schemes are a weighted objective and a lexicographic method; the latter requires you to solve a number of models. Although the concepts may require a bit of thought, these methods are not difficult to implement.
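As a rough sketch of the weighted-objective scheme (the variables, weights, and expressions below are made up for illustration, not taken from any particular tutorial):

import pyomo.environ as pyo

m = pyo.ConcreteModel()
m.x = pyo.Var(bounds=(0, 10))
m.y = pyo.Var(bounds=(0, 10))

# Two competing objectives, blended with user-chosen weights.
w1, w2 = 0.7, 0.3
cost = m.x + 2 * m.y
neg_profit = -(3 * m.x + m.y)
m.obj = pyo.Objective(expr=w1 * cost + w2 * neg_profit, sense=pyo.minimize)

m.budget = pyo.Constraint(expr=m.x + m.y <= 8)

# Any LP solver installed on your system will do, e.g. glpk or cbc.
pyo.SolverFactory('glpk').solve(m)
print(pyo.value(m.x), pyo.value(m.y))

For the lexicographic method you would instead solve one model per objective, each time fixing or bounding the previously optimized objectives via additional constraints.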

Can I include continuous variable when optimising with CPMpy?

I need to run a model where I optimise a diet within a set of constraints and enumerate all integer solutions at the end. I have found a diet example matching almost what I need here: hakank.org. However, in my case my variables take continuous values (in the example this would be all the nutritional values and the cost), while only x takes integer values. It seems I can only define 'intvar' or 'boolvar' variables with this model. Is there a way to overcome this? Otherwise, are there other more suitable models with examples that I can read online?
I'm new to constraint programming, so any help would be appreciated!
Thanks.
Most Constraint Programming tools and solvers only work with integers. That is where their strength is. If you have a mixture of continuous and discrete variables, it is a good idea to have a look at Mixed Integer Programming. MIP tools and solvers are widely available.
The diet model is a classic example of an LP (Linear Programming) Model. When adding integer restrictions, you end up with a MIP model.
To answer your question: CPMpy does not support float variables (and I'm not sure that it's in the pipeline for future extensions).
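To make the MIP route concrete, here is a minimal sketch of a diet-style model in PuLP, one of many freely available MIP modelling tools (the food data below is invented): integer serving counts, continuous cost data, and a nutrient lower bound.

from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus, value

foods = ["bread", "milk", "eggs"]
cost = {"bread": 2.0, "milk": 3.5, "eggs": 4.25}      # continuous data
protein = {"bread": 4, "milk": 8, "eggs": 13}

prob = LpProblem("diet", LpMinimize)

# Integer number of servings of each food (the discrete part of the model).
x = {f: LpVariable(f"x_{f}", lowBound=0, cat="Integer") for f in foods}

prob += lpSum(cost[f] * x[f] for f in foods)           # objective: total cost
prob += lpSum(protein[f] * x[f] for f in foods) >= 50  # nutrient requirement

prob.solve()
print(LpStatus[prob.status], {f: value(x[f]) for f in foods}, value(prob.objective))

Note that, unlike CP solvers, a MIP solver returns a single optimal solution; enumerating all solutions requires extra work, e.g. adding no-good cuts and re-solving.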
Another take - other than using MIP solvers as Erwin suggests - would be to write a MiniZinc (https://www.minizinc.org/) model of the problem and use one of its solvers. See my MiniZinc version of the diet problem: http://hakank.org/minizinc/diet1.mzn. And see the MiniZinc version of Stigler's diet problem, though it uses float vars only: http://hakank.org/minizinc/stigler.mzn.
There are some MiniZinc CP solvers that also support float variables, e.g. Gecode, JaCoP, and OptiMathSAT. However, depending on the exact constraints - such as the relations between the float vars and the integer vars - they might struggle to find solutions quickly. In contrast to some MIP solvers, generating all solutions is one of the general features of CP solvers.
Perhaps all these diverse suggestions confuse more than help you. Sorry about that. It might help if you give some more details about your problem.

how do we calculate runtime of Z3 sat solver

I am using z3py to solve a set of equations. How would I calculate the runtime order of it?
It has bitvector (BitVec) variables that need to satisfy a set of linear equations. The documentation and the guide do not give a way to calculate the runtime.
Are you asking for the (worst-case) time complexity of the solvers used? If so, I don't think you'll get a good answer: it depends on the (combination of) logic(s) your problem falls into, e.g. QF_BV or UFNIA, and then on the (semi-)decision procedures that the solver implements for that (combination of) logic(s).
Have a look at papers from the Z3 authors (https://github.com/Z3Prover/z3/wiki/Publications) - they might provide some details.
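If what you actually need is the observed runtime rather than a theoretical bound, you can simply time the check() call and inspect the solver statistics. A small sketch (the equations here are placeholders):

import time
from z3 import BitVec, Solver, sat

x = BitVec('x', 32)
y = BitVec('y', 32)

s = Solver()
s.add(x + y == 10, x - y == 4)

start = time.perf_counter()
result = s.check()
elapsed = time.perf_counter() - start

print(result, elapsed, "seconds")
if result == sat:
    print(s.model())
print(s.statistics())  # solver-reported counters (conflicts, decisions, memory, ...)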

parameter within an interval while optimizing

Usually I use Mathematica, but now I am trying to shift to Python, so this question might be a trivial one; I am sorry about that.
Anyway, is there any built-in function in Python similar to Mathematica's Interval[{min,max}]? The link is: http://reference.wolfram.com/language/ref/Interval.html
What I am trying to do is minimize a function, but it is a constrained minimization: the parameters of the function are only allowed within a particular interval.
For a very simple example, let's say f(x) is a function with parameter x and I am looking for the value of x which minimizes the function, but x is constrained to an interval (min, max). [Obviously the actual problem is not one-dimensional but a multi-dimensional optimization, so different parameters may have different intervals.]
Since it is an optimization problem, of course I do not want to pick the parameter randomly from an interval.
Any help will be highly appreciated, thanks!
If it's a highly non-linear problem, you'll need to use an algorithm such as the Generalized Reduced Gradient (GRG) Method.
The idea of the generalized reduced gradient algorithm (GRG) is to solve a sequence of subproblems, each of which uses a linear approximation of the constraints. (Ref)
You'll need to ensure that certain conditions known as the KKT conditions are met, etc. but for most continuous problems with reasonable constraints, you'll be able to apply this algorithm.
This is a good reference for such problems with a few examples provided. Ref. pg. 104.
Regarding implementation:
While I am not familiar with Python, I have built solver libraries in C++ using templates as well as function pointers, so you can pass functions (for the objective as well as the constraints) as arguments to the solver and get your result - hopefully in polynomial time for convex problems or in cases where the initial values are reasonable.
If an ability to do that exists in Python, it shouldn't be difficult to build a generalized GRG solver.
The Python Solution:
Edit: Here is the python solution to your problem: Python constrained non-linear optimization
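For reference, a minimal sketch of box-constrained minimization with scipy.optimize.minimize, which is one common way to express per-parameter intervals in Python (the objective and the intervals below are placeholders):

import numpy as np
from scipy.optimize import minimize

def f(x):
    # Example multi-dimensional objective.
    return (x[0] - 2) ** 2 + (x[1] + 1) ** 2 + np.sin(x[0] * x[1])

x0 = np.array([0.5, 0.5])            # starting point inside the box
bounds = [(0.0, 1.0), (-2.0, 2.0)]   # one (min, max) pair per parameter

res = minimize(f, x0, method='L-BFGS-B', bounds=bounds)
print(res.x, res.fun)

Methods such as L-BFGS-B, TNC, SLSQP, and trust-constr all accept a bounds argument with one (min, max) pair per parameter.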

Constrained least-squares estimation in Python

I'm trying to perform a constrained least-squares estimation using Scipy such that all of the coefficients are in the range (0,1) and sum to 1 (this functionality is implemented in Matlab's LSQLIN function).
Does anybody have tips for setting up this calculation using Python/SciPy? I believe I should be using scipy.optimize.fmin_slsqp(), but I am not entirely sure what parameters I should be passing to it.[1]
Many thanks for the help,
Nick
[1] The one example in the documentation for fmin_slsqp is a bit difficult for me to parse without the referenced text -- and I'm new to using Scipy.
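For concreteness, here is a rough sketch of what such an fmin_slsqp call might look like, with bounds of (0, 1) on each coefficient and an equality constraint forcing them to sum to 1 (the design matrix A and observations b below are made-up placeholders):

import numpy as np
from scipy.optimize import fmin_slsqp

# Placeholder least-squares problem: A @ x should approximate b.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 4))
b = rng.normal(size=20)

def objective(x):
    r = A @ x - b
    return r @ r                      # sum of squared residuals

def sum_to_one(x):
    return np.sum(x) - 1.0            # equality constraint: zero when satisfied

x0 = np.full(A.shape[1], 1.0 / A.shape[1])   # feasible starting point
coeffs = fmin_slsqp(objective, x0,
                    eqcons=[sum_to_one],
                    bounds=[(0.0, 1.0)] * A.shape[1])
print(coeffs, coeffs.sum())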
scipy-optimize-leastsq-with-bound-constraints on SO gives leastsq_bounds, which is leastsq with bound constraints such as 0 <= x_i <= 1. The constraint that they sum to 1 can be added in the same way.
(I've found leastsq_bounds / MINPACK to be good on synthetic test functions in 5d, 10d, 20d; how many variables do you have?)
Have a look at this tutorial; it seems pretty clear.
Since MATLAB's lsqlin is a bounded linear least squares solver, you would want to check out scipy.optimize.lsq_linear.
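For example (a sketch, not from the linked documentation), lsq_linear handles the box constraints directly, although the sum-to-one condition would still have to be handled separately, e.g. via the SLSQP approach sketched above:

import numpy as np
from scipy.optimize import lsq_linear

# Placeholder data: minimize ||A @ x - b|| subject to 0 <= x_i <= 1.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 4))
b = rng.normal(size=20)

res = lsq_linear(A, b, bounds=(0.0, 1.0))
print(res.x)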
Non-negative least-squares optimization using scipy.optimize.nnls is a robust way of doing it. Note that if the coefficients are constrained to be non-negative and to sum to unity, they are automatically limited to the interval [0,1]; that is, one need not additionally constrain them from above.
scipy.optimize.nnls enforces non-negativity using the Lawson-Hanson algorithm, whereas the sum constraint can be taken care of as discussed in this thread and this one.
SciPy's nnls uses an old Fortran backend, which is apparently widely used in equivalent nnls implementations in other software.
