Multi Threading for PuLP library in Python

I want to solve an optimisation problem using the PuLP library in Python. My optimisation problem has >10000 variables and a lot of constraints, and PuLP takes a very long time to solve problems of this size. Is there any way to implement multithreading and gain speed?
Is there any other solution/library for such big optimisation problems?

Linear programming has not been very amenable to parallelisation, so your best bet for making the problem faster is either to use a different solver or to reformulate your problem.
You can get a feel for the speed at which other solvers handle your problem by generating an MPS file (using the writeMPS() method on your problem variable) and submitting it to NEOS.
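For example, exporting the model to MPS from PuLP looks roughly like this (a minimal sketch with a toy problem; the variable and file names are placeholders, the real model would have your >10000 variables):

    import pulp

    # Toy problem standing in for the real model
    prob = pulp.LpProblem("example", pulp.LpMinimize)
    x = pulp.LpVariable("x", lowBound=0)
    y = pulp.LpVariable("y", lowBound=0)
    prob += 2 * x + 3 * y            # objective
    prob += x + y >= 10              # a constraint

    # Write the model in MPS format; this file can be uploaded to NEOS
    prob.writeMPS("model.mps")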

Related

Scaling MILP using PuLP and CPLEX

I have a MILP with ~3000 binaries, ~300000 continuous variables and ~1MM constraints. I am trying to solve this on a VM. How long could it potentially take on a 16-core, 128 GB machine? Also, what are the general limits of problems created with PuLP that the CPLEX solver can handle on such a machine? Any insights would be appreciated.
The solution time is not just a function of the number of variables and equations. Basically, you just have to try it out. No one can predict how much time is needed to solve your problem.
It is impossible to answer either question sensibly. There are some problems with only a few thousand variables that are still unsolved 'hard' problems and others with millions of variables that can be solved quite easily. Solution time depends hugely on the structure and numerical details of your problem and many other non-trivial factors.
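In practice "trying it out" usually means running with a time limit and an acceptable MIP gap, so the run always terminates with a bound you can judge. A rough sketch, assuming a recent PuLP (2.x) where the solver wrappers accept timeLimit/gapRel/threads keywords (check the API of your installed version); a tiny toy model stands in for the real one:

    import pulp

    # Tiny stand-in model; the real one has ~3000 binaries, ~300k continuous
    # variables and ~1MM constraints
    prob = pulp.LpProblem("trial", pulp.LpMaximize)
    x = pulp.LpVariable("x", lowBound=0, upBound=10)
    b = pulp.LpVariable("b", cat="Binary")
    prob += x + 5 * b
    prob += x <= 8 + 2 * b

    # Run with a wall-clock limit and an acceptable MIP gap, then inspect status
    solver = pulp.CPLEX_CMD(timeLimit=3600, gapRel=0.01, threads=16, msg=True)
    status = prob.solve(solver)
    print(pulp.LpStatus[status], pulp.value(prob.objective))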

Fastest free LP library in Python, with copy function

I have written a linear optimization problem in PuLP and it works great, but the solving time is huge. I tried another optimization library, Google OR-Tools, which is about 10x faster, but it has a big issue: it does not allow deep-copying the problem (while PuLP does). This is very important for me because my optimization problem is a search procedure, so I have to change some values in the constraints and solve the problem again and again (hundreds of times). Is there any Python library faster than PuLP that allows copying the model? Or is there a way to copy an LP problem in Google OR-Tools?
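For reference, the copy-and-resolve pattern the question relies on looks roughly like this in PuLP (a minimal sketch with a toy model; in practice the base model and the modifications come from the search procedure):

    import copy
    import pulp

    # Base model built once
    base = pulp.LpProblem("base", pulp.LpMinimize)
    x = pulp.LpVariable("x", lowBound=0)
    y = pulp.LpVariable("y", lowBound=0)
    base += 2 * x + 3 * y            # objective
    base += x + y >= 10, "demand"    # named constraint

    # Each search step works on an independent deep copy of the base model,
    # modifies it, and re-solves; the base model stays untouched
    for rhs in (12, 15, 20):
        trial = copy.deepcopy(base)
        # Look up the copied variables by name so we touch the copy, not the base
        v = {var.name: var for var in trial.variables()}
        trial += v["x"] + 2 * v["y"] >= rhs   # extra constraint for this trial
        trial.solve(pulp.PULP_CBC_CMD(msg=False))
        print(rhs, pulp.value(trial.objective))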

How to interpret cProfile results of PuLP

I fear I am a bit in over my head. I have profiled a Mixed Integer Problem (MIP) with cProfile and gprof2dot. The MIP is implemented via the pulp library. The MIP formulation is solvable, which I verified on smaller problems. I profiled the MIP on a larger problem, for which it could not find a solution.
In the following picture of the cProfile output, it can be seen that 88.83% of the time is spent in the function WaitForSingleObject. I am not familiar enough with the pulp source code to know what times are appropriate for this part of ActualSolve. Intuitively, I expected that most of the time needs to be spent in ActualSolve. However, it seems that within this function most of the time is spent waiting. Is it possible to reduce this waiting time?
Thanks in advance and kind regards.
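For context, a profile like the one described is typically captured along these lines (a minimal sketch; the actual model from the question is not shown, so a toy PuLP problem stands in for it). The large share of time in WaitForSingleObject is consistent with PULP_CBC_CMD launching the CBC binary as a separate process that the Python side simply waits on.

    import cProfile
    import pstats
    import pulp

    # Toy MIP standing in for the real model from the question
    prob = pulp.LpProblem("toy", pulp.LpMaximize)
    x = pulp.LpVariable("x", lowBound=0, upBound=10, cat="Integer")
    y = pulp.LpVariable("y", lowBound=0, upBound=10, cat="Integer")
    prob += 3 * x + 2 * y
    prob += 2 * x + y <= 14

    # Profile the solve call and dump the stats for pstats / gprof2dot
    cProfile.run("prob.solve(pulp.PULP_CBC_CMD(msg=False))", "solve.prof")
    pstats.Stats("solve.prof").sort_stats("cumulative").print_stats(10)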

Particle swarm optimization with both continuous and discrete variables

So I want to try to solve my optimization problem using a particle swarm optimization algorithm. As I am comfortable with Python, I was looking into the PySwarms toolkit. The issue is that I am not really experienced in this field and don't really know how to account for the integrality constraints of my problem. I am looking for advice on approaches to dealing with integer variables in PSO, and maybe some examples with PySwarms or any good alternative packages?
You can try the pymoo module, which is an excellent multi-objective optimization tool. It can also solve mixed-variable problems. Although pymoo is first of all designed to solve such problems using genetic algorithms, there is an implementation of PSO (single-objective, with continuous variables). Maybe you'll find it useful to try to solve your mixed-variable problem using a genetic algorithm or one of its modifications (e.g. NSGA-II).
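One common heuristic for integrality in plain PSO (not part of the answer above, and not a hard guarantee) is to let the swarm move in continuous space and round the integer dimensions inside the objective. A minimal sketch, assuming PySwarms' GlobalBestPSO API and a made-up 4-dimensional objective where the first two variables are integer:

    import numpy as np
    import pyswarms as ps

    # PySwarms objectives receive the whole swarm: x has shape (n_particles, dimensions).
    # The first two dimensions are treated as integers by rounding them before evaluation.
    def objective(x):
        x = x.copy()                      # don't mutate the swarm's positions
        x[:, :2] = np.round(x[:, :2])
        return np.sum((x - 3.0) ** 2, axis=1)

    bounds = (np.zeros(4), 10.0 * np.ones(4))
    options = {"c1": 0.5, "c2": 0.3, "w": 0.9}
    optimizer = ps.single.GlobalBestPSO(n_particles=30, dimensions=4,
                                        options=options, bounds=bounds)
    cost, pos = optimizer.optimize(objective, iters=200)
    pos[:2] = np.round(pos[:2])           # report the integer dimensions as integers
    print(cost, pos)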

Algorithm behind standard PuLP solver

I'm currently working on an LP optimization problem and looked into PuLP.
I know that PuLP's default solver is PULP_CBC_CMD. I solved a test problem with it and I'm wondering what kind of algorithm this solver actually uses... it doesn't seem to be plain simplex, as my problem got handled completely differently from what a simplex interpretation would look like.
Also: every other solver for PuLP has to be added to PuLP manually, right?
Also: what solvers are you guys working with in Python?
Thanks in advance!
CBC is based on simplex, yes. But, like most solvers, it combines simplex with many other algorithms such as branch-and-bound and cut-generation.
In particular, to solve linear programs it uses Clp: https://github.com/coin-or/Clp
More information on the CBC solver can be found on its site: https://github.com/coin-or/Cbc
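On the "other solvers" question: recent PuLP (2.x) can list which solver back-ends are actually available on your machine and pick one by name; back-ends that are not listed need their own installation and configuration first. A small sketch, assuming the PuLP 2.x listSolvers/getSolver helpers:

    import pulp

    # List the solver interfaces this PuLP installation can actually run
    print(pulp.listSolvers(onlyAvailable=True))

    # Tiny LP to watch CBC at work; with msg=True the CBC log shows the stages
    # (presolve, the Clp LP relaxation, cuts, branch and bound for MIPs)
    prob = pulp.LpProblem("demo", pulp.LpMaximize)
    x = pulp.LpVariable("x", lowBound=0)
    y = pulp.LpVariable("y", lowBound=0)
    prob += 4 * x + 3 * y
    prob += 2 * x + y <= 10
    prob += x + 3 * y <= 15

    prob.solve(pulp.getSolver("PULP_CBC_CMD", msg=True))
    print(pulp.LpStatus[prob.status], pulp.value(x), pulp.value(y))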
