from sympy import *
var('x y')
eqn=x**2+y
print(eqn.diff(x))
I am getting 2*x, but I need 2*x + y'.
You have to declare y as a function of x rather than as a Symbol:
from sympy import *
var('x')
y = Function('y')(x)
eqn = x**2 + y
result = eqn.diff(x)
print(result)
# 2*x + Derivative(y(x), x)
Source: user Jashan
As mentioned by smichr (https://stackoverflow.com/users/1089161/smichr), to get dy/dx from an equation or expression that equals 0, use idiff:
from sympy import *
var('x y')
eqn = x**2 + y
dydx = idiff(eqn, y, x)
print(dydx)
# -2*x
Related
My goal is to find u as a function of x and y using the SymPy module.
The equations are x = sin(u)*cosh(v) and y = cos(u)*sinh(v), set up in the code below.
The answer should express u in terms of x and y alone.
from sympy import cosh, sinh, symbols, sin, cos, Eq
u, v, x, y = symbols('u, v, x, y')
eqq1 = Eq(x, sin(u)*cosh(v))
eqq2 = Eq(y, cos(u)*sinh(v))
What's next?
I tried
result = solve((eqq1, eqq2), u, v)
Obviously it's not the right way.
Probably not the answer that you expect, but you can rework the problem by eliminating v (using cosh²v - sinh²v = 1), as follows:
(x cos u)² - (y sin u)² = cos²u sin²u
Then with t = sin²u, the equation is quadratic in t:
x² (1 - t) - y² t = (1 - t) t
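As a short sketch (my addition, not part of the original answer), this quadratic can be handed straight to SymPy's solve, and u then follows from t = sin²u:
from sympy import symbols, solve, asin, sqrt
x, y, t = symbols('x y t')
# the quadratic from the elimination above: x**2*(1 - t) - y**2*t = (1 - t)*t
quadratic = x**2*(1 - t) - y**2*t - (1 - t)*t
t_roots = solve(quadratic, t)                    # two branches for t = sin(u)**2
u_candidates = [asin(sqrt(r)) for r in t_roots]  # u = asin(sqrt(t)), up to quadrant/branch choices
print(t_roots)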
Alternatively, divide both sides of the x equation by cosh(v) and both sides of the y equation by sinh(v), then square everything. Replace sinh(v)**2 with z and cosh(v)**2 with 1 + z. The sum of the left-hand sides of the two equations is 1 and their difference is cos(2*u), so you can solve for cos(2*u) and z and work out something close to what you are looking for (a back-substitution sketch follows the output below).
>>> from sympy.abc import x, y, z, v, u
>>> from sympy import cosh,sinh,cos,Tuple,solve
>>> e1 = x**2/cosh(v)**2 + y**2/sinh(v)**2-1
>>> e2 = y**2/sinh(v)**2-x**2/cosh(v)**2-cos(2*u)
>>> solve(Tuple(e1,e2).subs(cosh(v)**2,1+z).subs(sinh(v)**2,z), z, cos(2*u), dict=True)
[
{z: x**2/2 + y**2/2 - sqrt((x**2 - 2*x + y**2 + 1)*(x**2 + 2*x + y**2 + 1))/2 - 1/2,
cos(2*u): -x**2 - y**2 - sqrt((x**2 - 2*x + y**2 + 1)*(x**2 + 2*x + y**2 + 1))},
{z: x**2/2 + y**2/2 + sqrt((x**2 - 2*x + y**2 + 1)*(x**2 + 2*x + y**2 + 1))/2 - 1/2,
cos(2*u): -x**2 - y**2 + sqrt((x**2 - 2*x + y**2 + 1)*(x**2 + 2*x + y**2 + 1))}]
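As a follow-up sketch (my addition, not part of the original answer), one branch of that solution can be back-substituted, since z stands for sinh(v)**2 and the other key is cos(2*u); the branch is chosen only for illustration and the usual inverse-function branch caveats apply:
>>> from sympy import acos, asinh, sqrt
>>> sols = solve(Tuple(e1, e2).subs(cosh(v)**2, 1 + z).subs(sinh(v)**2, z), z, cos(2*u), dict=True)
>>> branch = sols[1]                    # one of the two branches, for illustration only
>>> v_expr = asinh(sqrt(branch[z]))     # from z = sinh(v)**2
>>> u_expr = acos(branch[cos(2*u)])/2   # from cos(2*u)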
I have the following equation:
y = 3x² + x
Then I want to differentiate both sides with respect to the variable t using SymPy. I tried to implement it with the following code in a Jupyter notebook:
>>> import sympy as sp
>>> x, y, t = sp.symbols('x y t', real=True)
>>> eq = sp.Eq(y, 3 * x **2 + x)
>>>
>>> expr1 = eq.lhs
>>> expr1
y
>>> expr1.diff(t)
0
>>>
>>> expr2 = eq.rhs
>>> expr2
3*x**2 + x
>>> expr2.diff(t)
0
As a result, SymPy treats the symbols x and y as constants. However, the result I want should match the manual derivation:
y = 3x² + x
d/dt(y) = d/dt(3x² + x)
dy/dt = 6x · dx/dt + 1 · dx/dt
dy/dt = (6x + 1) · dx/dt
How can I differentiate an expression with respect to a symbol that is not a free symbol of that expression?
You should declare x and y as functions rather than symbols, e.g.:
In [8]: x, y = symbols('x, y', cls=Function)
In [9]: t = symbols('t')
In [10]: eq = Eq(y(t), 3*x(t)**2 + x(t))
In [11]: eq
Out[11]: y(t) = 3⋅x(t)² + x(t)
In [12]: Eq(eq.lhs.diff(t), eq.rhs.diff(t))
Out[12]: d/dt(y(t)) = 6⋅x(t)⋅d/dt(x(t)) + d/dt(x(t))
https://docs.sympy.org/latest/modules/core.html#sympy.core.function.Function
Alternatively, the idiff function was made for this purpose; it works with expressions of the form f(x, y) = 0 and returns dy/dx. So first turn your Eq into such an expression and then compute the desired derivative:
>>> from sympy import idiff
>>> e = eq.rewrite(Add)
>>> dydx = idiff(e, y, x); dydx
6*x + 1
Note, too, that even in your equation (if you write it explicitly in terms of functions of t) you do not need to isolate y(t) -- you can differentiate and solve for it:
>>> from sympy.abc import t
>>> x,y=map(Function,'xy')
>>> eq = x(t)*(y(t)**2 - y(t) + 1)
>>> yp=y(t).diff(t); Eq(yp, solve(eq.diff(t),yp)[0])
Eq(Derivative(y(t), t), (-y(t)**2 + y(t) - 1)*Derivative(x(t), t)/((2*y(t) - 1)*x(t)))
Using Python 3.9.7 in VS Code. I'm doing some stress analysis in Python:
from math import sin, cos, tan, degrees, radians
from operator import eq
from sympy import *
x, y, z, a = symbols("x y z a")
vm_crit = ((x - y)**2 + (y - z)**2 + (z - x)**2)**(1/2) -(2**(1/2))*a
vm_crit_sub = vm_crit.subs(x, z)
vm_crit_solve = solveset(vm_crit_sub,y)
print(vm_crit_solve)
I get:
ConditionSet(y, Eq(-1.4142135623731*a + ((-y + z)**2 + (y - z)**2)**0.5, 0), Complexes)
The second argument is correct, but for some reason the function isn't solving for 'y'. If I get rid of the squares:
x, y, z, a = symbols("x y z a")
vm_crit = (x + y + z) - a
vm_crit_sub = vm_crit.subs(x, z)
vm_crit_solve = solveset(vm_crit_sub,y)
print(vm_crit_solve)
I get:
{a - 2*z}
This is correct.
In sympy do you have to expand before solving? If so, is there a way around it?
Thanks for any help.
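No answer is included here, but here is a minimal sketch of one common workaround, assuming the trouble comes from the Python float exponent 1/2 (i.e. 0.5), which keeps solveset from handling the radical exactly: build the expression with exact objects such as sqrt or Rational(1, 2), or, if only the critical value is needed, solve the squared criterion, which is polynomial in y:
from sympy import symbols, sqrt, solveset
x, y, z, a = symbols("x y z a")
# exact form of the criterion (sqrt instead of **(1/2)), shown for reference
vm_crit = sqrt((x - y)**2 + (y - z)**2 + (z - x)**2) - sqrt(2)*a
# the squared criterion avoids the radical entirely
vm_sq = (x - y)**2 + (y - z)**2 + (z - x)**2 - 2*a**2
print(solveset(vm_sq.subs(x, z), y))  # the two symbolic roots, z - a and z + a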
General:
I am using maximum entropy to find a distribution over positive integer vectors. I can estimate the mean and variance, and I have three equations; I am trying to find a and b.
The equations:
integral(exp(a*x^2 + b*x + c), x, 0, infinity) - 1
integral(x*exp(a*x^2 + b*x + c), x, 0, infinity) - mean
integral(x^2*exp(a*x^2 + b*x + c), x, 0, infinity) - mean^2 - var
(all integrals are over [0, ∞))
The problem:
I am trying to use a numerical solver, and I used fsolve from scipy.optimize, but I guess I am missing some knowledge.
My code:
import numpy as np
import sympy as sym
from scipy.optimize import *
def myFunction(x, *data):
    y = sym.symbols('y')
    m, v = data
    F = [0] * 3
    x[0] = -abs(x[0])
    print(x)
    F[0] = (sym.integrate(sym.exp(x[0] * y ** 2 + x[1] * y + x[2]), (y, 0, sym.oo)) - 1).evalf()
    F[1] = (sym.integrate(y * sym.exp(x[0] * y ** 2 + x[1] * y + x[2]), (y, 0, sym.oo)) - m).evalf()
    F[2] = (sym.integrate((y ** 2) * sym.exp(x[0] * y ** 2 + x[1] * y + x[2]), (y, 0, sym.oo)) - v - m).evalf()
    print(F)
    return F
data = (10,3.5) # mean and var for example
xGuess = [1, 1, 1]
z = fsolve(myFunction,xGuess,args = data)
print(z)
My results are not that accurate; is there a better way to solve it? The residuals at the returned solution are:
integral(exp(a*x^2 + b*x + c)) - 1 = 5.67659292676884
integral(x*exp(a*x^2 + b*x + c)) - mean = -1.32123173796713
integral(x^2*exp(a*x^2 + b*x + c)) - mean^2 - var = -2.20825624606312
Thanks
I have rewritten the problem, replacing sympy with numpy and lambdas (inline functions).
Also note that in your problem statement the third equation subtracts mean^2, but in your code you only subtract mean.
import numpy as np
from scipy.optimize import minimize
from scipy.integrate import quad
def myFunction(x, data):
    m, v = data
    F = np.zeros(3)  # use a numpy array
    # use scipy.integrate.quad to integrate the lambda functions;
    # quad returns (result, error), so we keep only the result via [0]
    F[0] = quad(lambda y: np.exp(x[0] * y ** 2 + x[1] * y + x[2]), 0, np.inf)[0] - 1
    F[1] = quad(lambda y: y * np.exp(x[0] * y ** 2 + x[1] * y + x[2]), 0, np.inf)[0] - m
    F[2] = quad(lambda y: (y ** 2) * np.exp(x[0] * y ** 2 + x[1] * y + x[2]), 0, np.inf)[0] - v - m**2
    # minimize the squared error
    return np.sum(F**2)
data = (10,3.5) # mean and var for example
xGuess = [-1, 1, 1]
z = minimize(lambda x: myFunction(x, data), x0=xGuess,
bounds=((None, 0), (None, None), (None, None))) # use bounds for negative first coefficient
print(z)
# x: array([-0.99899311, 2.18819689, 1.85313181])
Does this seem more reasonable?
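As a quick check (my addition, not part of the original answer), the residual at the optimum can be inspected via the OptimizeResult returned by minimize:
# assuming the definitions above; z is the OptimizeResult returned by minimize
print(z.fun)                   # remaining sum of squared residuals at the optimum
print(myFunction(z.x, data))   # same value, computed directly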
I'm trying to simplify an expression using sympy but the relational terms seem to disappear. A toy example is as follows:
import sympy
from sympy import *
x = Symbol('x')
y = Symbol('y')
z = Symbol('z')
If I run:
z * Eq(x, y)
Then the output is:
z*(x == y)
But if I try to simplify this using:
simplify(z * Eq(x, y))
Then the output is:
z
This is not what I would expect. Should I expect this behaviour, and if so, is there any way to prevent simplify from removing the relational term?
Thanks.
Relational (Boolean) objects and arithmetic expressions cannot simply be multiplied together; you have to decide whether you want an arithmetic or a logical combination.
Supposing:
from sympy import *
x, y, z = symbols('x y z')
f = symbols('f', cls=Function)
For an arithmetic combination:
xeqy = Piecewise((1,Eq(x,y)),(0,True)) # {1 for x = y, 0 otherwise}
f = z * xeqy # {z for x = y, 0 otherwise}
simplify(f)
For a logical combination:
f = And(z,Eq(x,y)) # z ∧ (x = y)
simplify(f)
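As a quick sanity check (my addition, not from the original answer), substituting values shows that the Piecewise form keeps the relational behaviour that plain multiplication lost:
# assuming the definitions above
g = z * Piecewise((1, Eq(x, y)), (0, True))
print(g.subs(x, y))      # z, because the condition x = y holds
print(g.subs(x, y + 1))  # 0, because the condition fails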