How to check if a SymPy expression has analytical integral - python

I want to solve my other question here, so I need SymPy to return an error whenever there is no analytical/symbolic solution for an integral.
For example if I try :
from sympy import *
init_printing(use_unicode=False, wrap_line=False, no_global=True)
x = Symbol('x')
integrate(1/cos(x**2), x)
It just pretty-prints the integral itself, without solving it and without giving an error about not being able to solve it!
P.S. I have also asked this question here on Reddit.

A "symbolic" solution always exists: I just invented a new function intcos(x), which by definition is the antiderivative of 1/cos(x**2). Now this integral has a symbolic solution!
For the question to be rigorously answerable, one has to restrict the class of functions allowed in the answer. Typically one considers elementary functions. As the SymPy integration reference explains, the Risch algorithm it employs can prove that some functions do not have elementary antiderivatives. Use the option risch=True and check whether the return value is an instance of sympy.integrals.risch.NonElementaryIntegral:
from sympy.integrals.risch import NonElementaryIntegral
isinstance(integrate(1/exp(x**2), x, risch=True), NonElementaryIntegral) # True
However, since the Risch algorithm implementation is incomplete, in many cases (like 1/cos(x**2)) it returns an ordinary Integral object. This means it was able neither to find an elementary antiderivative nor to prove that one does not exist.
For this example, it helps to rewrite the trigonometric function in terms of exponential, with rewrite(cos, exp):
isinstance(integrate((1/cos(x**2)).rewrite(cos, exp), x, risch=True), NonElementaryIntegral)
returns True, so we know the integral is nonelementary.
Non-elementary antiderivatives
But often we don't really need an elementary function; something like the Gamma function, erf, or Bessel functions may be okay, as long as it's some "known" function (which of course is a fuzzy term). The question becomes: how to tell whether SymPy was able to integrate a specific expression or not? Use the .has(Integral) check for that:
integrate(2/cos(x**2), x).has(Integral) # True
(Not an isinstance(..., Integral) check, because the return value can be, as here, 2*Integral(1/cos(x**2), x).) This does not prove anything other than SymPy's failure to find the antiderivative: the antiderivative may well be a known function, even an elementary one.
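For convenience, here is a small sketch (not part of the answer above) that chains both checks into one helper; the function name and the returned strings are illustrative assumptions.

from sympy import Integral, Symbol, cos, exp, integrate
from sympy.integrals.risch import NonElementaryIntegral

def try_integrate(f, x):
    # First see whether SymPy finds any antiderivative (elementary or not)
    result = integrate(f, x)
    if not result.has(Integral):
        return result
    # Otherwise, try to prove that no elementary antiderivative exists
    if isinstance(integrate(f, x, risch=True), NonElementaryIntegral):
        return 'provably no elementary antiderivative'
    return 'SymPy failed, with no proof either way'

x = Symbol('x')
print(try_integrate(1/exp(x**2), x))   # sqrt(pi)*erf(x)/2
print(try_integrate(1/cos(x**2), x))   # SymPy failed, with no proof either way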

Related

pyOpt multi-objective minimax example

Abstract problem to be solved:
we have n d-dimensional design variables, say {k_0, k_1, ..., k_n}
maximize the minimum of [f(k_0), f(k_1), ..., f(k_n)], where f() is a nonlinear function, i.e. maximin
constraint: mean([k_0, k_1, ..., k_n]) == m, where m is a known constant
Can someone provide an example of how this can be solved (maximin, d-dim variables) via pyOpt?
EDIT: I tried this:
import scipy as sp
from pyOpt.pyOpt_optimization import Optimization
from pyOpt.pyALPSO.pyALPSO import ALPSO

def __objfunc(x, **kwargs):
    f = min([x[0] + x[1], x[2] + x[3]])
    g = [0.0]
    g[0] = (((x[0] + x[1]) + (x[2] + x[3])) / 2.0) - 5
    fail = 0
    return f, g, fail

if __name__ == '__main__':
    op = Optimization('test', __objfunc)
    op.addVarGroup('p0', 4, type='c')
    op.addObj('f')
    op.addCon('ineq', 'i')
    o = ALPSO()
    o(op)
    print(op._solutions[0])
The code above supposes 2-dimensional design variables. Is there any better way?
I would probably reformulate this by introducing an auxiliary variable z and maximizing z subject to z <= f(k_i) for each i.
The min() function you used is non-differentiable (and thus dangerous). Also, the mean() constraint can be replaced by a linear constraint (which is easier).
I am not familiar with the ALPSO solver, but this reformulation is usually helpful for more traditional solvers like SNOPT, NLPQL, and FSQP.
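Below is a hedged sketch of that reformulation in pyOpt style, reusing only the calls that already appear in the question's code. The placeholder objectives f1 = x[0] + x[1] and f2 = x[2] + x[3], and the variable and constraint names, are illustrative assumptions rather than part of the original problem.

from pyOpt.pyOpt_optimization import Optimization
from pyOpt.pyALPSO.pyALPSO import ALPSO

def objfunc(x, **kwargs):
    # x[0..3]: two 2-dimensional design variables, x[4]: auxiliary variable z
    f1 = x[0] + x[1]                      # placeholder for f(k_0)
    f2 = x[2] + x[3]                      # placeholder for f(k_1)
    z = x[4]
    f = -z                                # maximize z == minimize -z
    g = [0.0] * 3
    g[0] = z - f1                         # z <= f(k_0)
    g[1] = z - f2                         # z <= f(k_1)
    g[2] = ((f1 + f2) / 2.0) - 5.0        # mean constraint, as in the question
    fail = 0
    return f, g, fail

if __name__ == '__main__':
    op = Optimization('maximin sketch', objfunc)
    op.addVarGroup('p0', 5, type='c')     # 4 design variables plus z
    op.addObj('f')
    op.addCon('g0', 'i')
    op.addCon('g1', 'i')
    op.addCon('g2', 'i')                  # ideally an equality on the mean
    solver = ALPSO()
    solver(op)
    print(op._solutions[0])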

sympy - Composition of functions and operators

Let's assume that we have a function f and an operator L. In this case, it can be something simple, like,
L[f](x)=\sum_{k=1}^{4}f(x+k)
My main objective is to compute compositions of operators, like L above, using SymPy. SymPy has no problem handling compositions of functions, but we can quickly see that there is going to be a problem with the operator above.
For example, I can define it as,
class L(Function):
    @classmethod
    def eval(cls, f, x):
        k = Symbol('k')
        return summation(f(k + x), (k, 1, 4))
And this indeed computes L[f] but returns an evaluated object that is no longer a function of x, so computing L[L[f]] no longer makes sense.
Is there a way in sympy to convert what L returns to be a function of x? I think that would solve the problem, since then I would be able to re-apply L on the new object.
Thanks for your time.
This question had a simple answer after all. SymPy's Lambda does the trick in this case, and then I can re-apply L after the evaluation is done.
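As a rough sketch of that Lambda-based approach (rewriting the operator as a plain Python function rather than a Function subclass is my assumption about the intended usage):

from sympy import Function, Lambda, Symbol, summation

x = Symbol('x')
k = Symbol('k')

def L(f):
    # f is any SymPy callable (a Lambda or an undefined Function)
    return Lambda(x, summation(f(x + k), (k, 1, 4)))

f = Function('f')
Lf = L(f)      # Lambda(x, f(x + 1) + f(x + 2) + f(x + 3) + f(x + 4))
LLf = L(Lf)    # composition works, since Lf is again a function of x
print(LLf(x))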

Derivative of a conjugate in sympy

When I try to differentiate a symbol with SymPy I get the following
In : x=Symbol('x')
In : diff(x,x)
Out: 1
When I differentiate the symbol with respect to its conjugate, the result is
In [55]: diff(x,x.conjugate())
Out[55]: 0
However, when I try to differentiate the conjugate of the symbol SymPy doesn't do it
In : diff(x.conjugate(),x)
Out: Derivative(conjugate(x), x)
This is still correct, but the result should be zero. How can I make SymPy perform the derivative of a conjugate?
I'm not sure about the mathematics of whether diff(conjugate(x), x) should be zero. The fact that diff(x, x.conjugate()) gives zero has nothing to do with mathematics (and might even be considered a SymPy bug). It gives zero simply because x does not contain conjugate(x) (symbolically), so it sees it as a constant with respect to it. This is probably wrong, since x is not constant with respect to conjugate(x). The fact that SymPy lets you take derivatives with respect to defined functions is probably a bug, actually. It is supposed to allow things like diff(f(x)**2, f(x)), where f = Function('f') is an undefined function, but for defined functions it is probably mathematically incorrect (or at least not what you expect).
See http://docs.sympy.org/latest/modules/core.html?highlight=derivative#sympy.core.function.Derivative, particularly the section on derivatives wrt non-Symbols. To paraphrase, taking derivatives with respect to a function is just a notational convenience and does not represent a mathematical chain rule. Rather, something like diff(x, conjugate(x)) should be thought of as something like diff(x.subs(conjugate(x), dummy), dummy).subs(dummy, conjugate(x)).
Regarding conjugate(x).diff(x), this gives an unevaluated derivative because no derivative is defined for conjugate. I'm not sure if any closed-form answer is possible here anyway. Probably this is the most useful thing that SymPy could return. I can't find any good answers anywhere as to what a reasonable answer for this should be (you should ask on math SE to get a better answer about it).
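If you just want a derivative that treats x and conjugate(x) as independent variables (a Wirtinger-style derivative), one common workaround, in the spirit of the substitution paraphrased above, is to replace conjugate(x) with a dummy symbol before differentiating. A minimal sketch, not taken from the answer itself:

from sympy import Symbol, Dummy, conjugate, diff

x = Symbol('x')
xbar = Dummy('xbar')

expr = x * conjugate(x)

# d/dx with conjugate(x) held constant
d_dx = diff(expr.subs(conjugate(x), xbar), x).subs(xbar, conjugate(x))
print(d_dx)   # conjugate(x)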

SymPy - apply limits to an indefinite integral

In SymPy, is it possible to apply limits to an indefinite integral and evaluate it?
import sympy
from sympy.abc import theta
y = sympy.sin(theta)
Y_indef = sympy.Integral(y)
Y_def = sympy.Integral(y, (theta, 0, sympy.pi / 2))
Y_def.evalf() produces a number.
I'm looking for something like Y_indef.evalf((theta, 0, sympy.pi/2)) to get the same answer.
I do not know of a direct way; however, you can extract the information from Y_indef in order to create a definite integral:
>>> indef = Integral(x)
>>> to_be_integrated, (free_var,) = indef.args
>>> definite = Integral(to_be_integrated, (free_var, 1, 2))
.args is a general attribute containing anything needed to construct most SymPy objects.
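Applied to the question's Y_indef, that might look like the following (an illustrative sketch only):

import sympy
from sympy.abc import theta

Y_indef = sympy.Integral(sympy.sin(theta))
to_be_integrated, (free_var,) = Y_indef.args
Y_def = sympy.Integral(to_be_integrated, (free_var, 0, sympy.pi / 2))
print(Y_def.evalf())   # 1.00000000000000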
Edit: To address the comments to the questions.
SymPy may succeed in evaluating a definite integral and at the same time fail to solve its indefinite version, because there are additional algorithms that apply only to definite integrals.
Both definite and indefinite integrals are instances of the same class. The only difference is what they contain in their .args. The need for different classes is not yet felt, given that SymPy mostly uses Integral as a flag to say that it cannot solve the integral (i.e. the integrate function returns an Integral when all of the implemented algorithms fail).

Getting an expression over a horizon for a given recursive equation in sympy/numpy

The following example is given just to define the query precisely. Consider a recursive equation x[k+1] = a*x[k], where a is some constant. Now, is there an easier way, or an existing method within sympy/numpy, that does the following (i.e., gives an expression over a horizon for a given recursive equation):
from sympy import Symbol

def get_expr(init, num):
    a = Symbol('a')
    expr = init
    for i in range(num):
        expr = a*expr
    return expr

x0 = Symbol('x0')
get_expr(x0, 3)
The horizon above is 3.
I was going to suggest using SymPy's rsolve to try to find a closed form solution to your equation, but it seems that at least for this specific one, there is a bug that prevents it from working. See http://code.google.com/p/sympy/issues/detail?id=2943. Maybe if you really want to know for a more complicated expression you could try that. For this one, the closed form solution is just a**n*x0.
Aside from that, SymPy doesn't have any functions that would do this evaluation directly, but it does have some things that can help. There are some memoization decorators in sympy.utilities.memoization that are made for internal use but should work just fine for external uses. They can help make your evaluation more efficient by caching the results of previous evaluations. You'll need to write get_expr recursively for it to work effectively. Or you could just write your own cacher; it's not that complicated.
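As a small illustration of the "write it recursively and cache it" suggestion, here is a sketch using functools.lru_cache instead of SymPy's internal memoization helpers (the single-symbol recurrence is just the example from the question):

from functools import lru_cache
from sympy import Symbol

a = Symbol('a')
x0 = Symbol('x0')

@lru_cache(maxsize=None)
def get_expr(num):
    # x[k+1] = a*x[k] with x[0] = x0; results are cached across calls
    if num == 0:
        return x0
    return a * get_expr(num - 1)

print(get_expr(3))   # a**3*x0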
