Solving a differential equation with complex variables in Python

Folks,
Is it possible to solve an ODE with complex variables in Python? The equation I have has the following form
dx/dt = -a x - i y(t)
where y(t) is a known function, a is a known number, and i is the square root of -1.
I tried to use odeint() but it gives many error messages.
I am guessing odeint() does not work with complex variables. So one way out would be to separate the real and imaginary parts of x and treat the original ODE as two coupled ODEs.
But I am also wondering if there is a more convenient way to do this. Solving ODEs/PDEs with complex variables is a general problem, and it would be quite a hassle to make this complex -> real conversion by hand every time.
Thanks very much.

I'd suggest using scipy.integrate.complex_ode instead of scipy.integrate.odeint; complex_ode performs the complex-to-real conversion automatically.
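For illustration, here is a minimal sketch of complex_ode applied to an equation of this form; the value of a, the driving function y(t), the initial condition, and the integration horizon are made-up placeholders:

import numpy as np
from scipy.integrate import complex_ode

a = 0.5                          # placeholder for the known constant a

def y(t):
    return np.exp(1j * t)        # placeholder for the known driving function y(t)

def rhs(t, x):
    # x is a length-1 complex array; dx/dt = -a*x - i*y(t)
    return [-a * x[0] - 1j * y(t)]

solver = complex_ode(rhs)
solver.set_integrator("dopri5")
solver.set_initial_value([1.0 + 0.0j], 0.0)

ts, xs = [0.0], [1.0 + 0.0j]
dt = 0.01
while solver.successful() and solver.t < 5.0:
    solver.integrate(solver.t + dt)
    ts.append(solver.t)
    xs.append(solver.y[0])       # solver.y is returned as a complex array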

Related

Linear programming with Python

I'm trying to solve the following problem.
In fact, this is a Least Absolute Deviation (LAD) regression problem. I want to know how to solve it with Python. I know that scipy has linprog, which solves linear programs with linear inequality constraints. But here there are two variables in the inequality constraints, t and x. So I want to know how to apply linprog, or whether there is another library that can solve this problem. Thanks
You have to write your problem in standard form, splitting the second inequality constraint and concatenating the two optimization variables. Then, you can feed it to linprog.
It is more of a math problem than an implementation one.
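As a minimal sketch of that reformulation (the data X, y and all names are made up for illustration), the decision vector stacks the regression coefficients beta with auxiliary variables t that bound the absolute residuals, and the absolute-value constraints are split into two sets of linear inequalities:

import numpy as np
from scipy.optimize import linprog

# Made-up data: fit y ~ X @ beta in the least-absolute-deviation sense
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])   # intercept + one feature
y = X @ np.array([1.0, 2.0]) + rng.laplace(scale=0.5, size=50)

n, p = X.shape
# Decision vector z = [beta (p entries), t (n entries)]; minimize sum(t)
c = np.concatenate([np.zeros(p), np.ones(n)])
# |y - X beta| <= t split into:  X beta - t <= y   and   -X beta - t <= -y
A_ub = np.block([[ X, -np.eye(n)],
                 [-X, -np.eye(n)]])
b_ub = np.concatenate([y, -y])
bounds = [(None, None)] * p + [(0, None)] * n   # beta free, t >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
beta_lad = res.x[:p]
print(beta_lad)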

Coupled non-linear equations in FiPy

I'm trying to set up a system for solving these 5 coupled PDEs in FiPy to study the dynamics of electrons and holes in semiconductors.
[Image: the system of coupled PDEs]
I'm struggling with defining the terms highlighted in blue, as they're products of one variable with the gradient of another. For example, I'm able to define the third equation like this without error messages:
eq3 = ImplicitSourceTerm(coeff=1, var=J_n) == ImplicitSourceTerm(coeff=e*mu_n*PowerLawConvectionTerm(var=phi), var=n) + PowerLawConvectionTerm(coeff=mu_n*k*T, var=n)
But I'm not sure if this is a good way. Is there a better way to define this non-linear term, please?
Also, if I wanted to define a term that would be product of two variables (say p and n), would it be just:
ImplicitSourceTerm(p, var=n)
Or is there a different way?
I am amazed that you don't get an error from passing a PowerLawConvectionTerm as a coefficient of an ImplicitSourceTerm. It's certainly not intended to work. I suspect you would get an error if you attempted to solve().
You should substitute your flux equations into your continuity equations so that you end up with three second-order PDEs for electron drift-diffusion, hole drift-diffusion, and Poisson's equation. It will hopefully then be a bit clearer how to use FiPy Terms to represent the different elements of those equations.
That said, these equations are challenging. Please see this issue and this notebook for some pointers on how to set up and solve these equations, but realize that we provide no examples in our documentation because we haven't been able to come up with anything robust enough. Solving for pseudo-Fermi levels has worked a bit better for me than solving for electron and hole concentrations.
ImplicitSourceTerm(p, var=n) is a reasonable way to represent the n*p recombination term.
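As a rough illustration of that substitution (the mesh, constants, signs, and scaling below are placeholders, not a working semiconductor model), the electron continuity equation might be assembled along these lines, with drift written as a convection term whose coefficient is proportional to the gradient of the potential and recombination written as ImplicitSourceTerm(coeff=p, var=n):

from fipy import (Grid1D, CellVariable, TransientTerm, DiffusionTerm,
                  PowerLawConvectionTerm, ImplicitSourceTerm)

mesh = Grid1D(nx=100, dx=1e-2)
phi = CellVariable(name="potential", mesh=mesh)
n = CellVariable(name="electrons", mesh=mesh, hasOld=True)
p = CellVariable(name="holes", mesh=mesh, hasOld=True)

mu_n, k, T = 1.0, 1.0, 1.0        # placeholder constants

# Flux substituted into the continuity equation:
#   drift      -> convection with coefficient mu_n * grad(phi)
#   diffusion  -> DiffusionTerm with coefficient mu_n * k * T
#   n*p recombination -> ImplicitSourceTerm(coeff=p, var=n)
eq_n = (TransientTerm(var=n)
        == DiffusionTerm(coeff=mu_n * k * T, var=n)
        + PowerLawConvectionTerm(coeff=mu_n * phi.faceGrad, var=n)
        - ImplicitSourceTerm(coeff=p, var=n))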

How to use complex variables in a Gurobi problem

I currently solve optimization problems with complex variables using CVX + Mosek, on MATLAB. I'm now considering switching to Gurobi + Python for some applications.
Is there a way to declare complex values (both inside constraints and as optimization variables) directly into Gurobi's Python interface?
If not, what are good modeling languages with a Python interface that automate the reduction of the problem to real variables before calling the solver?
I know, for instance, that YALMIP does this reduction (though it has no Python interface), and newer versions of CVXPY do as well (but I haven't used it extensively, and don't know whether it is already performant, stable, and reasonably complete). Any thoughts on these issues and recommendations of other interfaces are welcome.
The only possible variable types in Gurobi are:
Integer;
Binary;
Continuous;
Semi-Continuous; and
Semi-Integer.
Also, I don't know the problem you're trying to solve, but complex numbers are quite unusual in linear optimization.
The complex numbers are not an ordered field, so it is not possible to say that a given complex number z1 > z2.
You'll probably have to model your problem in such way that you can decompose the constraints with real and imaginary parts, so that you can work only with real numbers.
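As a hypothetical sketch of that decomposition (the data a and b, the variable names, and the objective are all made up for illustration), a single complex equality a*z == b becomes two real constraints in gurobipy:

import gurobipy as gp
from gurobipy import GRB

a = 2 + 3j            # made-up complex coefficient
b = 1 - 1j            # made-up complex right-hand side

m = gp.Model("complex_split")
z_re = m.addVar(lb=-GRB.INFINITY, name="z_re")   # real part of z
z_im = m.addVar(lb=-GRB.INFINITY, name="z_im")   # imaginary part of z

# a*z = (a.real*z_re - a.imag*z_im) + i*(a.real*z_im + a.imag*z_re)
m.addConstr(a.real * z_re - a.imag * z_im == b.real, name="real_part")
m.addConstr(a.real * z_im + a.imag * z_re == b.imag, name="imag_part")

# Example objective: minimize |z|^2 as a real quadratic expression
m.setObjective(z_re * z_re + z_im * z_im, GRB.MINIMIZE)
m.optimize()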

PDE solver that handles constraints

I am trying to solve a system of partial differential equations of the general form
F(f(x,y), f'(x,y), f''(x,y), g(x,y), g'(x,y), g''(x,y)) = 0
where the derivatives may be taken with respect to both x and y, and f(x,y) and g(x,y) are subject to some constraint
G(f(x,y),g(x,y)) = 0
I wonder if there exists any (preferably Python-based) solver (not a method, as I know the methods) that can deal with a problem of this kind? I would appreciate any help, and apologise if my question seems too general.
Such a problem will require initial conditions and boundary conditions to be satisfied to obtain a unique solution. Also, you will need to provide a domain (geometry) to the solver. I think you should look at finite element solvers in Python.
A quick Google search turned up a few finite element solvers in Python; however, I have not tested any. So I guess that would be a good starting point.
If you are looking for a finite element solver, FEniCS has Python bindings.
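To give a flavour of those bindings, here is a minimal legacy FEniCS (dolfin) sketch for a single Poisson problem on a made-up domain; it only illustrates the Python interface and does not handle the coupled system or the algebraic constraint G = 0 from the question:

# Solve -Laplace(u) = 1 on the unit square with u = 0 on the boundary
from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction, TestFunction,
                    DirichletBC, Constant, Function, dot, grad, dx, solve)

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "P", 1)

u, v = TrialFunction(V), TestFunction(V)
bc = DirichletBC(V, Constant(0.0), "on_boundary")

a = dot(grad(u), grad(v)) * dx   # bilinear form
L = Constant(1.0) * v * dx       # linear form (source term)

u_sol = Function(V)
solve(a == L, u_sol, bc)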

Parameter within an interval while optimizing

Usually I use Mathematica, but I am now trying to shift to Python, so this question might be a trivial one; I am sorry about that.
Anyway, is there any built-in function in Python similar to Mathematica's Interval[{min, max}]? The link is: http://reference.wolfram.com/language/ref/Interval.html
What I am trying to do is, I have a function and I am trying to minimize it, but it is a constrained minimization, by that I mean, the parameters of the function are only allowed within some particular interval.
For a very simple example, let's say f(x) is a function with parameter x and I am looking for the value of x which minimizes the function, but x is constrained within an interval (min, max). [Obviously the actual problem is not one-dimensional but a multi-dimensional optimization, so different parameters may have different intervals.]
Since it is an optimization problem, of course I do not want to pick the parameter randomly from an interval.
Any help will be highly appreciated, thanks!
If it's a highly non-linear problem, you'll need to use an algorithm such as the Generalized Reduced Gradient (GRG) Method.
The idea of the generalized reduced gradient algorithm (GRG) is to solve a sequence of subproblems, each of which uses a linear approximation of the constraints. (Ref)
You'll need to ensure that certain conditions known as the KKT conditions are met, etc. but for most continuous problems with reasonable constraints, you'll be able to apply this algorithm.
This is a good reference for such problems with a few examples provided. Ref. pg. 104.
Regarding implementation:
While I am not familiar with Python, I have built solver libraries in C++ using templates as well as function pointers, so you can pass functions (for the objective as well as the constraints) as arguments to the solver and get your result, hopefully in polynomial time for convex problems or in cases where the initial values are reasonable.
If an ability to do that exists in Python, it shouldn't be difficult to build a generalized GRG solver.
The Python Solution:
Edit: Here is the Python solution to your problem: Python constrained non-linear optimization
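For the bound-constrained case specifically, a minimal sketch with scipy (the objective function and the intervals below are made up for illustration) would be:

import numpy as np
from scipy.optimize import minimize

def f(x):
    # Made-up two-dimensional objective (Rosenbrock-like)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

bounds = [(-2.0, 2.0), (-1.0, 3.0)]   # one (min, max) interval per parameter
x0 = np.array([0.0, 0.0])             # starting point inside the intervals

res = minimize(f, x0, method="L-BFGS-B", bounds=bounds)
print(res.x, res.fun)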
