Python modeling language for ordinary differential equations?

I am working with ordinary differential equations that I want to solve numerically in Python. Some of these ODEs come from chemical reactions, e.g. A + B → C with stoichiometric coefficient sigma. This leads to the differential equation du/dt = sigma * a * b.
The classical approach is to use a Runge-Kutta solver, such as the one in SciPy's scipy.integrate.ode.
Instead, I would like to use a modeling language analogous to what cvxpy is for convex optimization: one that lets me formally define the ODE and then solve it automatically. It would be similar, in Python, to what OpenModelica does.
Is there such a tool?
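For reference, a minimal sketch of the classical approach the question mentions, using scipy.integrate.solve_ivp (the rate constant and initial concentrations below are illustrative, not from the question; the mass-action kinetics da/dt = db/dt = -sigma*a*b, dc/dt = sigma*a*b follow from A + B → C):

```python
import numpy as np
from scipy.integrate import solve_ivp

sigma = 1.0  # assumed rate constant


def rhs(t, y):
    # y = [a, b, c]; the reaction A + B -> C proceeds at rate sigma * a * b
    a, b, c = y
    rate = sigma * a * b
    return [-rate, -rate, rate]


# Illustrative initial concentrations a=1.0, b=0.5, c=0.0
sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.5, 0.0], rtol=1e-8, atol=1e-10)
final = sol.y[:, -1]  # concentrations at t = 10
```

A modeling language would generate `rhs` automatically from the reaction description; here it is written by hand.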

Related

Matlab or Python tool for solving continuous replicator equation

I am trying to numerically solve a continuous replicator partial differential equation in order to model evolution in a competitive system. The equation takes the form of a second-order nonlinear integral PDE, with the specific PDE in the attached image.
Do you know of any tools in MATLAB or Python that can help in solving these forms of equations? I've looked at MATLAB's built-in solver and PDE toolbox, but neither can do second-order integral PDEs.

Jacobian of linear and nonlinear systems using numpy

I am writing Python classes for both linear and nonlinear MIMO systems. These two classes inherit from a parent class called Model, which contains the ODE of the model. For the linear system, the function looks like
import numpy as np

def disc_linear_fn(x, u):
    A = np.array([[], [], [], []])  # placeholder; fill in the system matrices
    B = np.array([[]])
    x_dot = A @ x + B @ u  # '@' is numpy matrix multiplication
    return x_dot
whereas the function of the nonlinear system returns a nonlinear ODE.
I want to acquire the Jacobian for both nonlinear and linear systems.
Question 1: How can I recover the matrices A and B from the linear system's ODE function without returning them explicitly?
Question 2: Which package can be used to compute the Jacobian of a nonlinear system in numpy? I know how to implement a finite-difference approximation myself, or use packages such as casadi, jax, or sympy, but I'd prefer not to describe the system in another package. Since the next step is to solve an optimization problem in cvxpy, I want to keep everything in numpy for consistency.
Any recommendations would be appreciated! Many thanks in advance.
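For Question 2, a central-difference Jacobian is short enough in plain numpy that no extra package is needed; a minimal sketch (the test matrix and evaluation point are illustrative). Note that for a linear ODE this also touches Question 1, since differentiating x_dot = A @ x recovers A:

```python
import numpy as np


def numerical_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f: R^n -> R^m, in plain numpy."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        # central difference along coordinate i
        J[:, i] = (np.asarray(f(x + step)) - np.asarray(f(x - step))) / (2 * eps)
    return J


# For a linear ODE x_dot = A @ x, the Jacobian recovers A (up to rounding)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
J = numerical_jacobian(lambda x: A @ x, np.array([0.5, -1.0]))
```

The central difference is exact for linear functions up to floating-point error, which is why `J` matches `A` here; for nonlinear systems the error is O(eps^2).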

Solution to nonlinear differential equation with non-constant mass matrix

If I have a system of nonlinear ordinary differential equations, M(t,y) y' = F(t,y), what is the best method of solution when my mass matrix M is sometimes singular?
I'm working with the following system of equations:
If t=0, this reduces to a differential-algebraic equation. However, even if we restrict to t>0, the system becomes a differential-algebraic equation whenever y_4=0, which I cannot rule out with a domain restriction (it is an integral part of the system I am trying to model). My only previous exposure to DAEs is the case where an entire row of M is zero -- but here my mass matrix is not always singular.
What is the best way to implement this numerically?
So far, I've tried a workaround in Python: adding a small number (0.0001) to the main diagonal of M and inverting it, then solving y' = M^{-1}(t,y) F(t,y). However, this seems prone to instabilities, and I'm unsure whether it is an appropriate means of regularization.
Python doesn't have any built-in functions to deal with mass matrices, so I've also tried coding this in Julia. However, DifferentialEquations.jl states explicitly that "Non-constant mass matrices are not directly supported: users are advised to transform their problem through substitution to a DAE with constant mass matrices."
I'm at a loss on how to accomplish this. Any insights on how to do this substitution or a better way to solve this type of problem would be greatly appreciated.
The following transformation leads to a constant mass matrix:
You need to handle the case of y_4 = 0 separately.
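Away from the singular set, an alternative to regularizing and inverting M is to solve the linear system M(t,y) y' = F(t,y) for y' at each step. A sketch with hypothetical placeholder M and F (the real system, and the y_4 = 0 case, still need the separate handling described above):

```python
import numpy as np
from scipy.integrate import solve_ivp


# Hypothetical mass matrix and right-hand side, standing in for the real system;
# this M is nonsingular everywhere, unlike the system in the question
def M(t, y):
    return np.array([[1.0, 0.0],
                     [0.0, 1.0 + y[0] ** 2]])


def F(t, y):
    return np.array([y[1], -y[0]])


def rhs(t, y):
    # Solve M(t, y) y' = F(t, y) for y' directly, instead of
    # perturbing the diagonal and forming M^{-1} explicitly
    return np.linalg.solve(M(t, y), F(t, y))


sol = solve_ivp(rhs, (0.0, 1.0), [1.0, 0.0], rtol=1e-8)
```

This avoids the explicit inverse and the artificial 0.0001 perturbation, but it still breaks down exactly where M becomes singular, so it complements rather than replaces the transformation to a constant-mass-matrix DAE.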

How to numerically solve this system of arbitrary number of differential equations?

How can I solve a system of k differential equations in which derivatives appear in every equation? I am trying to use SciPy's solve_ivp.
All the equations are of the following form:
equations
How can this system of equations be solved numerically with any solver? With solve_ivp, it seems each equation must be written independently of the others, which does not appear possible here once there are more than two equations.
If you set C[i]=B[i,i] then you can transform the equations to the linear system B*z'=A. This can be solved as
zdot = numpy.linalg.solve(B,A)
so that the derivative is the constant solution of a constant linear system, and the resulting solution for z is linear: z(t) = z(0) + zdot*t.
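Under the answer's assumptions (constant B and A), the whole solution can be sketched as follows; the particular B and A below are illustrative, not taken from the question:

```python
import numpy as np

# Illustrative constant coefficient matrix (with C[i] = B[i, i]) and
# constant right-hand side, standing in for the system in the question
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A = np.array([1.0, 5.0])

# Solve B * zdot = A once; the derivative is constant in time
zdot = np.linalg.solve(B, A)


def z(t, z0=np.zeros(2)):
    # The solution is therefore linear in t: z(t) = z(0) + zdot * t
    return z0 + zdot * t
```

No ODE integrator is needed in this special case, since the derivative does not depend on t or z.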

Deterministic and stochastic part of an equation

I'm on the lookout for a numerical method that can solve both deterministic and stochastic equations. In the deterministic case, I know that a fourth-order RK method is very effective. Unfortunately, it has not been applied to stochastic equations successfully (at least as far as I know).
What I want to know is whether a numerical method exists that can solve both kinds of equations (roughly, I mean, in comparison to the analytic solutions) and, if so, what it would be. An analytically solvable stochastic equation would be the Black-Scholes one, for instance.
There are methods for solving these kinds of equations in DifferentialEquations.jl. Stochastic differential equations are a form of mixed deterministic and stochastic equation, and solving them is shown in the SDE tutorial. Mixing discrete stochasticity with deterministic equations is shown in the jump equation tutorial. While written natively in Julia, it is accessible from Python via the package diffeqpy; notice that its README has some example stochastic differential equations.
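One standard scheme that degrades gracefully to the deterministic case is Euler-Maruyama: with zero noise it reduces to the explicit Euler method, so a single implementation covers both. A minimal sketch for the Black-Scholes SDE dS = mu*S dt + sigma*S dW (the parameter values and seeded generator are illustrative assumptions, not from the question):

```python
import numpy as np


def euler_maruyama(S0, mu, sigma, T, n_steps, rng=None):
    """Euler-Maruyama for dS = mu*S dt + sigma*S dW; sigma=0 gives plain Euler."""
    rng = rng or np.random.default_rng(0)
    dt = T / n_steps
    S = S0
    for _ in range(n_steps):
        # Brownian increment dW ~ N(0, dt); exactly zero in the deterministic case
        dW = rng.normal(0.0, np.sqrt(dt)) if sigma else 0.0
        S += mu * S * dt + sigma * S * dW
    return S


# Deterministic limit: sigma = 0 should approach S0 * exp(mu * T)
S_det = euler_maruyama(1.0, 0.05, 0.0, 1.0, 10_000)
```

The trade-off is accuracy: Euler-Maruyama has strong order 0.5 for SDEs and order 1 deterministically, far below RK4's order 4, which is why libraries like DifferentialEquations.jl provide dedicated higher-order SDE methods instead.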
