Is the geoToH3 function available as pseudocode? - python

Is there a Python or pseudocode example of geoToH3 available? I just need this one function and would like to avoid installing the library on my target environment (AWS Glue, PySpark).
I tried to follow the JavaScript implementation, but even that used C magic internally.

There isn't a pseudocode implementation that I'm aware of, but there's a fairly thorough explanation in the documentation. Roughly:
1. Select the icosahedron face (0-19) the point lies on (using the squared distance to each face center in 3D space)
2. Project the point into face-oriented IJK coordinates
3. Convert the IJK coordinates to an H3 index by calculating the index digits at each resolution and setting the appropriate bits
The core logic lives in the H3 C source. It's not trivial to implement; unless there's a strong reason to avoid installing the library, that would be the far easier and more reliable option.
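To give a concrete feel for the first step only, here is a rough sketch. This is my own illustration, not code from the H3 project; the 20 face-center unit vectors would need to be copied from the face-center table in H3's faceijk.c and are not reproduced here.

import math

def geo_to_vec3(lat, lng):
    # Spherical coordinates (radians) to a 3D unit vector.
    return (math.cos(lat) * math.cos(lng),
            math.cos(lat) * math.sin(lng),
            math.sin(lat))

def select_face(lat, lng, face_centers):
    # face_centers: the 20 face-center unit vectors, which would have to
    # be copied from H3's C source (not shown here).
    px, py, pz = geo_to_vec3(lat, lng)
    best_face, best_sqd = None, float('inf')
    for face, (cx, cy, cz) in enumerate(face_centers):
        # squared Euclidean distance avoids the sqrt; ordering is the same
        sqd = (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2
        if sqd < best_sqd:
            best_face, best_sqd = face, sqd
    return best_face

Steps 2 and 3 (the projection into IJK space and the digit/bit packing) are considerably more involved, which is why installing the library is usually the better call.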

Related

Any python library for Bezier curve approximation with biarcs?

I am in need of a simple Python library that can perform Bezier curve approximation with biarcs (biarc interpolation). I am looking for something like the algorithm explained at approximating-bezier-curves-by-biarcs, which also includes a C# implementation.
I have tried searching for a similar implementation in Python. I found some, but they were embedded in CNC controller code like gcodetools, and extracting just the part I need seems complicated. I couldn't find any simple ones that just implement the algorithm.
Before I try to convert the C# code to Python, I want to check here whether any such Python script already exists. Please share anything you think might be helpful.

Slow Down SymPy's Computations into Smaller Steps

I'm playing around with SymPy and it is very powerful. However, I would like to get it to 'slow down' and solve pieces of an equation one at a time instead of the whole thing at once. For instance, given an input string equation (assuming the correct form) like
9x-((17-3)(4x)) - 8(34x)
I would like to first solve
9x-((14)(4x)) - 8(34x)
And then
9x-(56x) - 8(34x)
and then
9x-(56x) - 272x
And so on.
Another example,
from sympy import *
x = symbols('x')
s = (30*(5*(5 - 10) - 10*x)) + 10
s2 = expand(s, basic=False)
Gives me -300*x - 740 in one step, and I just want a single multiplication done at a time.
Judging by the ideas document produced for Google Summer of Code, this appears to be something yet to be added to the library. As it stands, there is no way of doing this for your example without coding something yourself.
The issue of breaking algorithms whose internal workings differ from how a human would compute into discrete, human-readable steps is discussed and highlighted in that document. I'm not sure whether that's an issue for the implementation of expansion, but it's certainly an issue for other algorithms, which machines compute differently for reasons of efficiency.
tl;dr This library doesn't support step-by-step breakdowns for your example. Only the manualintegrate function currently has step-by-step workings.
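If you want to experiment anyway, here is a minimal sketch of what "coding something yourself" could look like. The one_step helper is my own hypothetical construction, not a SymPy API; note that SymPy's auto-evaluation can merge neighbouring steps when the tree is rebuilt, so this only approximates true single-operation steps.

from sympy import sympify, preorder_traversal

def one_step(expr):
    # Find the first subexpression whose arguments are all plain numbers
    # and let SymPy evaluate just that one piece.
    for sub in preorder_traversal(expr):
        if sub.args and all(arg.is_Number for arg in sub.args):
            return expr.xreplace({sub: sub.func(*sub.args)})
    return expr  # no purely numeric piece left

# Parse without automatic simplification so intermediate forms survive.
e = sympify('9*x - ((17 - 3)*(4*x)) - 8*(34*x)', evaluate=False)
while True:
    nxt = one_step(e)
    if nxt == e:
        break
    print(nxt)
    e = nxt

Each pass prints one intermediate form, in the spirit of the 9x-((14)(4x)) - 8(34x) sequence above; combining the remaining symbolic terms would need a similar rule for non-numeric arguments.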

solving ODEs on networks with PyDSTool

After using scipy.integrate for a while, I am at the point where I need more functionality, like bifurcation analysis or parameter estimation. This is why I'm interested in using PyDSTool, but from the documentation I can't figure out how to work with ModelSpec, or whether it is actually what will lead me to the solution.
Here is a toy example of what I am trying to do: I have a network with two nodes, both having the same (SIR) dynamics, described by two ODEs, but different initial conditions. The equations are coupled between nodes via the epsilon parameters (see formula below).
Formulas as a picture for easier reading; the 'n' and 'm' are indices, not exponents:
http://image.noelshack.com/fichiers/2014/28/1404918182-odes.png
(I could not use the image upload on Stack, sadly.)
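(In case the image link goes dead: reading the equations back out of the varspecs in the code below, the system for node n appears to be

ds_n/dt = -alpha * s_n * sum_m(epsilon_m * i_m)
di_n/dt =  alpha * s_n * sum_m(epsilon_m * i_m)

with the sum running over all nodes m.)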
In the two node case my code (using PyDSTool) looks like this:
# multiple SIR metapopulations
# parameter and initial condition definition; a dict is a must
import PyDSTool as pdt

params = {'alpha': 0.7, 'beta': 0.1, 'epsilon1': 0.5, 'epsilon2': 0.5}
ini = {'s1': 0.99, 's2': 1, 'i1': 0.01, 'i2': 0.00}
DSargs = pdt.args(name='SIRtest_multi',
                  ics=ini,
                  pars=params,
                  tdata=[0, 20],
                  # the for-macro generates formulas for s1, s2 and i1, i2;
                  # sum works similarly but sums over the expressions in it
                  varspecs={'s[o]': 'for(o,1,2,-alpha*s[o]*sum(k,1,2,epsilon[k]*i[k]))',
                            'i[l]': 'for(l,1,2,alpha*s[l]*sum(m,1,2,epsilon[m]*i[m]))'})

# generator
DS = pdt.Generator.Vode_ODEsystem(DSargs)
# computation: a trajectory object is generated
trj = DS.compute('test')
# extraction of the points for plotting
pts = trj.sample()
# plotting; pylab is imported along with PyDSTool as plt
pdt.plt.plot(pts['t'], pts['s1'], label='s1')
pdt.plt.plot(pts['t'], pts['i1'], label='i1')
pdt.plt.plot(pts['t'], pts['s2'], label='s2')
pdt.plt.plot(pts['t'], pts['i2'], label='i2')
pdt.plt.legend()
pdt.plt.xlabel('t')
pdt.plt.show()
But in my original problem there are more than 1000 nodes with 5 ODEs each, every node is coupled to a different number of other nodes, and the epsilon values are not equal for all nodes. So tinkering with this syntax has not led me anywhere near a solution yet.
What I am actually thinking of is a way to construct a separate sub-model/solver for every node, each having its own parameters (epsilons, since they differ per node), and then link them to each other. This is the point where I do not know whether that is possible in PyDSTool, and whether it is the right way to handle this kind of problem.
I looked through the examples and the docs of PyDSTool but could not figure out how to do it, so help is much appreciated! If the way I'm trying to do things is unorthodox or plain stupid, you are welcome to suggest how to do it better. (Which is actually the more efficient/faster/better way to solve problems like this: subdividing into many small (still coupled) models/solvers, or one model containing all the ODEs at once?)
(I'm neither a mathematician nor a programmer, but willing to learn, so please be patient!)
The solution is definitely not to build separate simulation models. That won't work because so many variables will be continuously coupled between the sub-models. You absolutely must have all the ODEs in one place together.
It sounds like the solution you need is to use the ModelSpec object constructs. These let you hierarchically build the sub-model definitions out of symbolic pieces. They can have their own "epsilon" parameters, etc. You declare all the pieces when you're finished and let PyDSTool make the final strings containing the ODE definitions for you. I suggest you look at the tutorial example at:
http://www.ni.gsu.edu/~rclewley/PyDSTool/Tutorial/Tutorial_compneuro.html
and the provided examples: ModelSpec_test.py, MultiCompartments.py. But, remember that you still have to have a source for the parameters and coupling data (i.e., a big matrix or dictionary loaded from a file) to be able to automate the process of building the model, otherwise you'd still be writing it all out by hand.
You have to build some classes for the components that you want to have. You might also create a factory function (compare 'makeSoma' in the neuralcomp.py toolbox) that will take all your sub-components and create an ODE based on summing something up from each of the declared components. At the end, you can refer to the parameters by their position in the hierarchy. One might be 's1.epsilon' while another might be 'i4.epsilon'.
Unfortunately, to build models like this efficiently you will have to learn to do some more complex programming! So start by understanding all the steps in the tutorial. You can contact me directly through the SourceForge support discussions or by email once you've got started and have specific questions.
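For completeness: if ModelSpec turns out to be too heavyweight, the plain-string route from the question can also be automated directly. Below is a minimal sketch of my own (not a PyDSTool idiom), assuming a hypothetical neighbours dict and per-node epsilons loaded from your coupling data:

import PyDSTool as pdt

# Hypothetical coupling data; in practice load these from a file.
neighbours = {1: [1, 2], 2: [1, 2]}   # node -> nodes it is coupled to
epsilons = {1: 0.5, 2: 0.5}

pars = {'alpha': 0.7, 'beta': 0.1}
ics, varspecs = {}, {}
for n, nbrs in neighbours.items():
    pars['epsilon%d' % n] = epsilons[n]
    ics['s%d' % n] = 0.99 if n == 1 else 1.0
    ics['i%d' % n] = 0.01 if n == 1 else 0.0
    # force of infection felt by node n, summed over its coupled nodes
    coupling = ' + '.join('epsilon%d*i%d' % (k, k) for k in nbrs)
    varspecs['s%d' % n] = '-alpha*s%d*(%s)' % (n, coupling)
    varspecs['i%d' % n] = 'alpha*s%d*(%s)' % (n, coupling)

DSargs = pdt.args(name='SIR_network', ics=ics, pars=pars,
                  tdata=[0, 20], varspecs=varspecs)
DS = pdt.Generator.Vode_ODEsystem(DSargs)

Either way, everything ends up in a single Generator holding all of the coupled ODEs, which is what the recommendation above requires.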

Point in Polygon with geoJSON in Python

I have a GeoJSON database with lots of polygons (census tracts, specifically) and I have lots of (long, lat) points.
I am hoping there exists efficient Python code to identify which census tract a given coordinate is in; however, so far my googling hasn't revealed anything.
Thanks!
I found an interesting article describing how to do exactly what you are looking to do.
TL;DR: Use Shapely
You will find this code at the end of the article:
import json
from shapely.geometry import shape, Point

# load GeoJSON file containing sectors
with open('sectors.json') as f:
    js = json.load(f)

# construct point based on lon/lat returned by geocoder
point = Point(-122.7924463, 45.4519896)

# check each polygon to see if it contains the point
for feature in js['features']:
    polygon = shape(feature['geometry'])
    if polygon.contains(point):
        print('Found containing polygon:', feature)
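As a side note (my addition, not from the article): if you have many points to test against the same polygons, Shapely's prepared geometries make the repeated contains() tests considerably cheaper. A minimal sketch, reusing js and shape from the snippet above:

from shapely.prepared import prep

# Prepare each polygon once; prepared geometries speed up repeated
# containment tests against the same polygon.
prepared = [(prep(shape(f['geometry'])), f) for f in js['features']]

def find_tract(point):
    # 'point' is a shapely Point; returns the first containing feature.
    for poly, feature in prepared:
        if poly.contains(point):
            return feature
    return None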
A great option for working with this type of data is PostGIS, a spatial database extender for PostgreSQL. I personally keep all of my geo data in a PostGIS database and then reference it in Python using psycopg2. I know it's not pure Python, but it has unbelievable performance benefits (discussed below) over pure Python.
PostGIS has functionality built in to determine whether a point or shape is within another shape. The documentation on the ST_Within function is good and expands upon this simple example:
SELECT
    ST_Within({YOUR_POINT}, boundary)
FROM census;
-- returns true or false for each of your tracts
The benefit you'll gain from PostGIS that you likely won't achieve elsewhere is indexing, which can improve your speed 1,000x [1], making it better than even the best-written C program (unless the C program also creates an index for your data). The database, when properly set up, will cache information about your tracts, and when you ask whether a point is within a tract, it won't have to search everything... it can take advantage of its index.
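As a rough illustration, running such a query from Python with psycopg2 might look like the sketch below. The connection details and the tract_id/boundary column names are assumptions about your schema, not a fixed API:

import psycopg2

# Hypothetical connection parameters and table/column names.
conn = psycopg2.connect(dbname='census', user='postgres')
cur = conn.cursor()

lon, lat = -122.7924463, 45.4519896
cur.execute("""
    SELECT tract_id
    FROM census
    WHERE ST_Within(ST_SetSRID(ST_MakePoint(%s, %s), 4326), boundary);
""", (lon, lat))
print(cur.fetchall())  # the tract(s) containing the point, if any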
Getting data into and out of Postgres is pretty simple. A great tutorial that will walk you through the basics of PostGIS, with sample datasets not too different from yours, can be found here. It's reasonably long, but if you're new to PostGIS (as I was), you'll be very entertained and excited the entire time:
http://workshops.boundlessgeo.com/postgis-intro/
[1] Indexing decreased a nearest-neighbor search in one of my huge databases (20 m) from 53 seconds to 8.2 milliseconds.
One cannot write really fast geometric code in pure Python. Instead, the usual approach is to use a fast C/C++ library with Python wrappers.
For example, you can start with CGAL, a very comprehensive C++ geometric library. It has Python bindings for most of its routines; see http://code.google.com/p/cgal-bindings/.

Convex polyhedra intersection in Python [duplicate]

Are there any libraries that provide 3D polyhedra, and support calculating the intersection of two polyhedra?
If it makes a difference, the polyhedra I want to model do not have 'holes' in them.
The focus would be on correctness first and speed a close second!
Ideally this library would:
have existing, tidy Python bindings
be free-standing or have reasonable, small dependencies
support calculating the outline of the polyhedron when viewed from any given angle
CGAL offers rather more than you're asking for, but it does in particular include polyhedra and "boolean"-like operations on them. (I'm not sure about "view from any angle" as a primitive, though; as I recall it wasn't there when I last used it, but that was a while ago, so you may have to iterate, projecting the edges onto the appropriate plane.)
The Python bindings are in the cgal-bindings project (http://code.google.com/p/cgal-bindings/), and I believe the only "big" dependency is Boost.Python (used for the bindings).
