I am trying to compute intersections, distances, and derivatives on 2D symbolic parametric curves (that is, curves defined on the plane by a function of a parameter), but I can't find any Python module that seems to do the job.
So far I have only found libraries that deal with plotting or numerical approximation, so I thought I could implement it myself as a light layer on top of a symbolic mathematics library.
I started experimenting with SymPy, but I can't wrap my head around it: it doesn't seem to be able to return intervals, even finite ones (for instance, solve(x = x) fails!), and it only finds a small number of solutions in some simple cases.
What tool would be suitable for the task?
I guess that parametric functions fall under the more advanced topics of mathematical analysis, and I haven't seen any library yet that matches your demands. However, you could try looking through the docs of the Sage project...
It would help if you gave an example of two curves that you want to define. solve is up to the task of finding intersections of all quadratic curves (it will actually solve quartics and some quintics, too).
When you say "distance", what do you mean - arc-length sort of distance, or the distance from a point to the curve?
As for tangents, those are easily handled with idiff (see its docstring for examples via help(idiff)).
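For example, a quick sketch with two made-up curves (a circle and a parabola in implicit form):

    from sympy import symbols, solve, idiff

    x, y = symbols('x y')

    # Two example curves, each written as expr == 0
    circle = x**2 + y**2 - 4
    parabola = y - x**2

    # Intersection points: solve the system of both equations
    points = solve([circle, parabola], [x, y])

    # Slope of the tangent along the circle, by implicit differentiation
    slope = idiff(circle, y, x)  # gives -x/y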
I have a problem: I have an expensive-to-compute 1D function (float -> float) that I want to approximate with splines for efficiency.
I know I can define a set of knot points on a uniform grid over the function's domain, evaluate the function on that grid, and compute a spline over that set. But the functions are special: they have huge dull regions and a few places with complex behavior. I want an algorithm that adaptively finds an optimal set of knot points, sampling more densely where the function has a difficult shape and less densely where the spline already does a good job of approximating it.
How can I find a library (preferably Python, but at this point I'll take anything open source) that does this with automatic knot selection? I've tried many Google searches and still found nothing.
I have finally found one that seems to work: splipy. Its fit method produces a B-spline interpolation of a function object, with a relatively precise way to control the accuracy of the interpolation.
The source is on GitHub.
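A minimal sketch of how I use it (the target function here is a made-up stand-in for the expensive one, and I'm assuming curve_factory.fit's (function, t0, t1, rtol) signature):

    import numpy as np
    from splipy import curve_factory

    # Cheap stand-in for the expensive function: flat almost everywhere,
    # with one sharp transition near t = 0.5
    def f(t):
        t = np.asarray(t, dtype=float)
        return np.tanh(60.0 * (t - 0.5)).reshape(-1, 1)  # (n, 1) as fit() expects

    # Adaptive fit on [0, 1]: knots are refined only where the tolerance
    # is not yet met, so they cluster around the difficult region
    curve = curve_factory.fit(f, 0.0, 1.0, rtol=1e-4)
    print(len(curve.knots(0)), 'knots selected')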
My goal is to build a temperature gradient map over a floor plan to display minute changes in temperature via uniformly distributed sensors.
As far as I understand, most heatmap tools available work with point density to produce heatmaps, whereas what I'm looking for is a gradient based on the varying values of the individual points (sensors) on the map, i.e. something like this...
which I nicked from here.
From what I've gathered, interpolation will definitely be involved, and it may well be radial basis function (RBF) interpolation, because it doesn't require a grid, as per this post.
I've used the Anaconda distribution so far. The sensor data will be extracted from TimescaleDB, and the sensor positions will be lat/long coordinates.
I've done some very minor experimentation with the code from the link above and got this result (image: radial basis function interpolation).
So here are my questions:
Multiple Python libraries have interpolation as a built-in function, but which of them would be the best for the task described above?
What parts of the documentation should I read up on in the libraries that can help with this specific problem?
Any good resource recommendations for this topic?
Would anything else be required for this apart from interpolation?
Thanks in advance!
P.S. This is a side project I'd like to work on as a student, not commercial in any way shape or form.
I like the scipy.interpolate library. It has a lot of nice functions; the simplest one that would work for you is probably scipy.interpolate.interp2d(), and if you have a non-uniform distribution of sensors, griddata() is very useful.
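For example, a minimal sketch with made-up sensor positions and temperatures (both griddata and Rbf live in scipy.interpolate):

    import numpy as np
    from scipy.interpolate import griddata, Rbf

    # Made-up sensor positions (x, y) and temperature readings
    pts = np.array([[0.1, 0.2], [0.8, 0.3], [0.5, 0.9], [0.2, 0.7], [0.9, 0.8]])
    temps = np.array([21.5, 23.1, 22.0, 20.8, 22.6])

    # Regular grid covering the floor plan
    gx, gy = np.mgrid[0:1:100j, 0:1:100j]

    # Option 1: griddata (cubic inside the convex hull of the sensors, NaN outside)
    t_grid = griddata(pts, temps, (gx, gy), method='cubic')

    # Option 2: radial basis functions (smooth, also extrapolates outside the hull)
    rbf = Rbf(pts[:, 0], pts[:, 1], temps, function='multiquadric')
    t_rbf = rbf(gx, gy)

Either array can then be passed to matplotlib's imshow or contourf to render the gradient map.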
I would like to render a large set of particles (~100k grains), for which I have the position (including rotation) and a discretized level-set function (i.e. the signed distance of every voxel from the surface). Due to the large sample, I'm searching for efficient ways to visualize it.
I first went for VTK, using its Python interface, but I'm not really sure it's the best (and simplest) way to do it since, as far as I know, there is no direct implementation for getting an isosurface from a 3D data set. At first I was thinking of using marching cubes, but then I would still have to apply a threshold, or interpolate, in order to find the voxels that lie on the surface and label them so they can be used by marching cubes.
Now I have found Mayavi, which has a Python function
mlab.pipeline.iso_surface()
However, I did not find much documentation on it and was wondering how it behaves in terms of performance.
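For reference, the minimal usage I have in mind looks something like this (a sketch with a made-up signed-distance field; I'm assuming iso_surface accepts a scalar_field source and a list of contour values):

    import numpy as np
    from mayavi import mlab

    # Made-up signed-distance field on a voxel grid (a sphere of radius 0.5)
    x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
    sdf = np.sqrt(x**2 + y**2 + z**2) - 0.5

    # Extract the zero level set (the grain surface) directly from the voxels
    src = mlab.pipeline.scalar_field(sdf)
    mlab.pipeline.iso_surface(src, contours=[0.0])
    mlab.show()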
Does anyone have experience with these kinds of tools? Which would be the best solution in terms of efficiency and, secondarily, in terms of simplicity? (I do not know the VTK library, but if there is a huge difference in performance I can dig into it, even without the Python interface.)
I need to calculate the area between two curves.
I have lots of data, so I'd like to do it programmatically.
Basically, I always have two normal distributions, each calculated from a mean value and a standard deviation. I would then like to calculate how much they overlap.
Here is an example of what I mean, along with some code in R (a language I don't know).
Is there already a function in matplotlib or scipy or some other module that does it for me?
In case I have to implement it myself, I think that I should do:
find the intersections (there will be at most 2)
see which function is lower before, between, and after the intersections
calculate the integral of the lower function over each region and add the results together
Is that right? How can I do the individual steps? Are there functions, modules, etc. that can help?
I don't know R either, but the answer seems to be in the link you provided: just integrate the minimum of your distributions.
You don't need to find intersections, just feed min(f(x), g(x)) to scipy.integrate.quad.
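For example, a minimal sketch with made-up means and standard deviations (norm comes from scipy.stats):

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    # Two made-up normal distributions, each from a mean and a standard deviation
    f = norm(loc=0.0, scale=1.0).pdf
    g = norm(loc=1.5, scale=0.8).pdf

    # Overlap area: integrate the pointwise minimum of the two densities
    overlap, err = quad(lambda x: min(f(x), g(x)), -np.inf, np.inf)
    print(overlap)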
I have some data that are the integrals of an unknown curve within bins. For context, the data are ocean wave energy, and the bins are directional, e.g. 0-15 degrees. If possible, I would like to fit a curve to the data that conserves the integrals within the bins. I've tried sketching it on a notepad with a pencil, and it seems like it should be possible. Does anyone know of a curve-fitting tool in Python that can do this, for example in the scipy interpolation sub-package?
Thanks in advance
Edit:
Thanks for the help. It looks like I will try the method recommended in section 4 of this paper: http://journals.ametsoc.org/doi/abs/10.1175/1520-0485%281996%29026%3C0136%3ATIOFFI%3E2.0.CO%3B2. In essence, it uses matrices to construct some 'fake' data from the known integrals within each band. When plotted, this data then produces an interpolated line graph that preserves the integrals.
It's a little outside my bailiwick, but I can suggest having a look at SciKits to see if there's anything there that might be useful. Other packages to browse would be pandas and StatsModels. Good luck!
If you have a curve f(x) which is an approximation to the integral of another curve g(x), i.e. f = int(g, x), then the two are related by the fundamental theorem of calculus: your original function is the derivative of the first curve, g = df/dx. As such, you can use numpy.diff, or any of the higher-order methods, to approximate df/dx and obtain an estimate of your original curve.
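For example, a minimal sketch with made-up cumulative values sampled at the bin edges:

    import numpy as np

    # Made-up cumulative integral f sampled at the bin edges (degrees)
    edges = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0])
    f = np.array([0.0, 1.2, 4.6, 9.7, 13.7, 15.9, 16.8])

    # g = df/dx via finite differences (central in the interior, one-sided at the ends)
    g = np.gradient(f, edges)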
One possibility: calculate the cumulative sum of the bin volumes (np.cumsum), fit an interpolating spline to it, and then take the derivative to get the curve.
scipy splines have methods to calculate the derivatives.
The only limitation, in case it is relevant in your case: the spline through the cumulative sum might not be monotonic, so the derivative might be negative over some intervals.
I guess that the literature on smoothing a histogram looks at similar constraints on the integral per bin, but I don't have any references handy.
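A minimal sketch of this approach, with made-up bin data (InterpolatedUnivariateSpline and its derivative() method are in scipy.interpolate):

    import numpy as np
    from scipy.interpolate import InterpolatedUnivariateSpline

    # Made-up directional bins (degrees) and the integral of the curve in each bin
    edges = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0])
    bin_integrals = np.array([1.2, 3.4, 5.1, 4.0, 2.2, 0.9])

    # The cumulative sum is the exact integral of the curve at each bin edge
    cum = np.concatenate(([0.0], np.cumsum(bin_integrals)))

    # Spline through the cumulative values; its derivative is the fitted curve,
    # and by construction it reproduces the integral within every bin
    spline = InterpolatedUnivariateSpline(edges, cum, k=3)
    curve = spline.derivative()

    x = np.linspace(edges[0], edges[-1], 200)
    y = curve(x)  # note: may dip negative where the spline is non-monotonic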
1/ fit2histogram
Your question is about fitting a histogram. I just came across the documentation of a Python package for multivariate pattern analysis, PyMVPA, which offers a function for histogram fitting. An example is here: PyMVPA.
However, I guess the set of available distributions is limited to the well-known ones.
2/ integral computation
As already mentioned, the other solution is to approximate the integral values and fit a model to the resulting set of data. Either you know an explicit expression for the derivative, or you compute it: by finite differences, or analytically.