I currently use PostGIS as a backbone for a lot of spatial functions I perform in Python scripts. Specifically, I take several shapefile geometries, check whether they intersect, and then sort them into separate directories. I upload the shapefiles using shp2pgsql, correlate them using ST_Intersects, and then sort them using os/shutil functions in the script.
My problem is that one of our teams works only on government networks and cannot get Postgres/PostGIS approved by their system admins. Is there a Python module/function out there that performs the same correlation of geometries as ST_Intersects without the need for Postgres? Or, if I need to write this myself, is there a site for algorithms pertaining to geometries? For example, if I have an upper-left and a lower-right coordinate, how can I compute the other two points? I'm not asking for anyone to write code for me, just some help being pointed in the right direction.
Also, everything is in the WGS 1984 datum.
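For the corner question, a minimal sketch (axis-aligned box, lon/lat treated as plain x/y, coordinates made up for illustration):

ul = (-120.0, 45.0)   # upper left:  (min lon, max lat)
lr = (-118.0, 43.0)   # lower right: (max lon, min lat)

ur = (lr[0], ul[1])   # upper right: (max lon, max lat)
ll = (ul[0], lr[1])   # lower left:  (min lon, min lat)

# two such boxes intersect iff they overlap on both axes
def boxes_intersect(ul1, lr1, ul2, lr2):
    return not (lr1[0] < ul2[0] or lr2[0] < ul1[0] or
                ul1[1] < lr2[1] or ul2[1] < lr1[1])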
There are many tools to read shapefiles, which you can use to get their extents or bounds. These can be used to build an R-tree index with the Rtree package, which has some good examples in its documentation. With an R-tree index, you can use intersection to see where the bounding boxes intersect. This is similar to PostGIS's GiST index, except in my experience it is much faster to build and use. And if/when you need to do a detailed intersection of the geometries, you can use Shapely, which in turn uses GEOS, the same library used by PostGIS. So they are all related in similar ways.
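A rough sketch of that pipeline, assuming Fiona as the shapefile reader (the file names are made up):

import fiona
from rtree import index
from shapely.geometry import shape

# load candidate geometries from one shapefile
with fiona.open('candidates.shp') as src:
    candidates = [shape(feat['geometry']) for feat in src]

# build an R-tree on the candidates' bounding boxes
idx = index.Index()
for i, geom in enumerate(candidates):
    idx.insert(i, geom.bounds)  # (minx, miny, maxx, maxy)

# for each query geometry: cheap bounding-box filter, then exact test
with fiona.open('queries.shp') as src:
    for feat in src:
        query = shape(feat['geometry'])
        for i in idx.intersection(query.bounds):
            if query.intersects(candidates[i]):  # GEOS, same test as ST_Intersects
                print('hit:', i)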
See these related questions:
Looking for a fast way to find the polygon a point belongs to using Shapely
Faster way of polygon intersection with shapely
My goal is to build a temperature-gradient map over a floor plan to display minute changes in temperature via uniformly distributed sensors.
As far as I understand, most heatmap tools available work with point density to produce heatmaps, whereas what I'm looking for is a gradient based on the varying values of individual points (sensors) on the map, i.e. something like this...
which I nicked from here.
From what I've gathered, interpolation will definitely be involved, and it may well be radial basis function (RBF) interpolation, because it wouldn't require a grid as per this post.
I've used the Anaconda distribution thus far. The data from the sensors will be extracted from TimescaleDB, and the positions of the sensors will be lat/long.
I've done very minor experimentation with the code from the link above and got this result (radial basis function interpolation).
So here are my questions: multiple Python libraries have interpolation as a built-in function, but which of them would be best for the task described above? What parts of those libraries' documentation should I read up on for this specific problem? Any good resource recommendations for this topic? Would anything else be required for this apart from interpolation?
Thanks in advance!
P.S. This is a side project I'd like to work on as a student, not commercial in any way, shape, or form.
I like the scipy.interpolate library. It has a lot of nice functions; the simplest that would work for you would probably be scipy.interpolate.interp2d(), and if you want to go with a non-uniform distribution of sensors, griddata() is very useful.
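As a minimal sketch of the RBF route (the sensor positions and readings below are made up):

import numpy as np
from scipy.interpolate import Rbf

# hypothetical sensor positions (x, y) and temperature readings
x = np.array([0.0, 1.0, 2.0, 0.5, 1.5])
y = np.array([0.0, 0.5, 0.0, 1.5, 1.0])
temp = np.array([20.1, 21.4, 20.8, 19.7, 22.0])

# fit a radial basis function to the scattered readings
rbf = Rbf(x, y, temp, function='multiquadric')

# evaluate on a regular grid covering the floor plan
xi, yi = np.meshgrid(np.linspace(0, 2, 100), np.linspace(0, 1.5, 100))
ti = rbf(xi, yi)  # gradient field, ready for e.g. matplotlib's imshow/contourf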
I would like to represent a bunch of particles (~100k grains), for which I have the position (including rotation) and a discretized level-set function (i.e. the signed distance of every voxel from the surface). Due to the large sample, I'm searching for efficient solutions to visualize it.
I first went for VTK, using its Python interface, but I'm not really sure it's the best (and simplest) way to do it since, as far as I know, there is no direct implementation for getting an isosurface from a 3D data set. In the beginning I was thinking of using marching cubes, but then I would still have to use a threshold or interpolate in order to find the voxels that are on the surface and label them so they can be used by marching cubes.
Now I found Mayavi, which has a Python function, mlab.pipeline.iso_surface().
However, I did not find much documentation on it and was wondering how it behaves in terms of performance.
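The minimal usage I have in mind looks like this (the signed-distance grid below is a made-up sphere, standing in for one grain):

import numpy as np
from mayavi import mlab

# a toy signed-distance function sampled on a voxel grid: a unit sphere
x, y, z = np.mgrid[-2:2:40j, -2:2:40j, -2:2:40j]
sdf = np.sqrt(x**2 + y**2 + z**2) - 1.0

# extract the zero level set (the grain surface) as an isosurface
src = mlab.pipeline.scalar_field(sdf)
mlab.pipeline.iso_surface(src, contours=[0.0])
mlab.show()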
Does someone have experience with these kinds of tools? Which would be the best solution in terms of efficiency and, secondly, in terms of simplicity? I do not know the VTK library, but if there is a huge difference in performance I can dig into it, even without the Python interface.
Lowest/Highest Combined Surface(s)
I'm looking for a methodology (and/or preferably a software approach) to create what I'm calling the Lowest (or highest) combined surface for a set of polygons.
Given a number of "surfaces" (3D polygons): if our input was these two polygons, which partially overlap and definitely intersect, my Lowest Combined output would be these three polygons.
We've gone through a variety of approaches, and the best solution we could come up with involved applying a point grid to each polygon and performing calculations to return the lowest set of points at each grid location. The problem is that the original geometry is lost in this approach, so it doesn't give us a working solution.
Background
I'm looking at a variety of "surfaces" that can be represented by 3D faces (CAD speak) or polygons and are usually distributed in a shapefile (.shp). When two surfaces interact, I'm interested in taking either the lowest combined or the highest combined surface. I'm able to do this in CAD by manually tracing out new polygons for the interaction zones, but once I get beyond a handful of surfaces this becomes too labor-intensive.
The Current Approach
My current approach, which falls somewhere in the terrible category, is to generate a point cloud from each surface on a 1 m grid and then do a grid-cell-based comparison of the points.
I do this by using AutoCAD Civil 3D's surface generation tools to create a TIN from each polygon surface. This is then exported to a 1 m DEM file, which I believe is a gridded output format.
Each DEM file is then brought into Global Mapper, where I generate a single point at the center of each "elevation grid cell". This data is then exported to a .csv file in which each point carries a variety of attributes, such as the name of the surface it came from and its altitude.
Once I have a set of CSV files, I run them through a Python script that exports the lowest point (and its associated attributes) at each grid cell. I do everything in UTM because the UTM grid is based on meters, which makes everything easier.
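The per-cell reduction in that script is roughly this (file and column names are illustrative):

import csv

# one CSV per surface; columns: surface, easting, northing, elevation (UTM metres)
lowest = {}
for path in ['surface_a.csv', 'surface_b.csv']:
    with open(path, newline='') as f:
        for row in csv.DictReader(f):
            # snap each point to its 1 m grid cell
            cell = (round(float(row['easting'])), round(float(row['northing'])))
            z = float(row['elevation'])
            if cell not in lowest or z < lowest[cell][0]:
                lowest[cell] = (z, row['surface'])
# lowest now maps each grid cell to (elevation, source surface)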
Lastly, we bring the point file back into Global Mapper, coloring each point by the surface it started from.
There are a variety of issues with this approach: sometimes things don't line up perfectly and there is a variety of cleanup I have to do. Also, the edges end up jagged, which is to be expected, since I've converted nice straight lines into a point cloud.
Alternatively, we came up with a similar approach in ArcGIS using the Surface Comparison tool; however, it had similar limitations to the ones we ran into with my approach.
What I'm looking for is a way to do this automatically with a variable number of inputs. I'm willing to use just about any tool to get this done, as it seems like it shouldn't be too difficult a process.
Software?
When I look at this problem from a programmer's point of view it looks rather straightforward, but I'm at a total loss as to how to proceed. I'm assuming Stack Overflow is the correct Stack Exchange site for this question, but if it should be somewhere else, I'm happy to move it.
I wasn't sure whether something like Mathematica (with which I have zero experience) could handle this situation, or whether there is some fancy 3D math library in Python that could chop polygons up by how they interact and then give me the lowest for co-located polys.
In any case, I'm willing to try anything out, so if you have an idea of what tools and/or libraries I can use to do this, please share! I have to assume that there is SOMETHING out there that can handle this type of 3D geometric processing.
Thanks
EDIT
Because the commenters seem confused: I am not asking for code. I am asking for methodologies, libraries, supporting tools, or even software packages that can perform these operations. I plan to write software to do this; however, I am hoping I don't need to pull out my trig books and write all these operations by hand. I have to assume somebody out there has dealt with something similar before.
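To make the operation concrete, here is a rough two-surface sketch with Shapely (not a finished solution, just the kind of chopping involved). It assumes each surface is planar, so its elevation can be written as z = a*x + b*y + c; the footprints and coefficients are made up, and if the planes cross inside the overlap you would first have to split along their line of intersection:

from shapely.geometry import Polygon

def plane_z(coeffs, point):
    # elevation of a planar surface z = a*x + b*y + c at a 2D point
    a, b, c = coeffs
    return a * point.x + b * point.y + c

# hypothetical footprints (2D outlines) and plane coefficients
foot1, plane1 = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)]), (0.0, 0.0, 10.0)
foot2, plane2 = Polygon([(2, 2), (6, 2), (6, 6), (2, 6)]), (0.1, 0.0, 8.0)

# outside the overlap each surface is kept as-is
only1 = foot1.difference(foot2)
only2 = foot2.difference(foot1)

# inside the overlap keep the lower surface (tested at one interior point,
# which is enough only if the planes do not cross within the overlap)
overlap = foot1.intersection(foot2)
p = overlap.representative_point()
lower = plane1 if plane_z(plane1, p) < plane_z(plane2, p) else plane2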
I have a GeoJSON database with lots of polygons (census tracts, specifically), and I have lots of long/lat points.
I am hoping that efficient Python code exists to identify which census tract a given coordinate is in; however, so far my googling hasn't revealed anything.
Thanks!
I found an interesting article describing how to do exactly what you are looking to do.
TL;DR: Use Shapely
You will find this code at the end of the article:
import json
from shapely.geometry import shape, Point

# load GeoJSON file containing sectors
with open('sectors.json') as f:
    js = json.load(f)

# construct point based on lon/lat returned by geocoder
point = Point(-122.7924463, 45.4519896)

# check each polygon to see if it contains the point
for feature in js['features']:
    polygon = shape(feature['geometry'])
    if polygon.contains(point):
        print('Found containing polygon:', feature)
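If you have many points to test, looping over every polygon for each point gets slow. With Shapely 2.x you can put the polygons in an STRtree instead (a sketch, not from the article):

import json
from shapely.geometry import shape, Point
from shapely.strtree import STRtree

with open('sectors.json') as f:
    polygons = [shape(feat['geometry']) for feat in json.load(f)['features']]

tree = STRtree(polygons)
point = Point(-122.7924463, 45.4519896)

# indices of the polygons the point falls within (Shapely 2.x returns indices)
for i in tree.query(point, predicate='within'):
    print('Found containing polygon index:', i)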
A great option for working with these types of data is PostGIS, a spatial database extender for PostgreSQL. I personally keep all of my geo data in a PostGIS database and then reference it in Python using psycopg2. I know it's not pure Python, but it has unbelievable performance benefits (discussed below) over pure Python.
PostGIS has functionality built in to determine whether a point or shape is within another shape. The good documentation on the ST_Within function expands on this simple example:
SELECT
ST_WITHIN({YOUR_POINT},boundary)
FROM census;
-- returns true or false for each of your tracts
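From Python, the same check via psycopg2 might look roughly like this (the connection details, the table/column names census, id, and boundary, and the SRID are assumptions based on the example above):

import psycopg2

conn = psycopg2.connect('dbname=gis user=postgres')
cur = conn.cursor()

# find the tract(s) containing a lon/lat point; assumes boundary is stored
# with SRID 4326 and that a GiST index exists, e.g.:
#   CREATE INDEX census_boundary_gix ON census USING GIST (boundary);
cur.execute(
    """
    SELECT id
    FROM census
    WHERE ST_Within(ST_SetSRID(ST_MakePoint(%s, %s), 4326), boundary)
    """,
    (-122.7924463, 45.4519896),
)
print(cur.fetchall())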
The benefit you'll gain from PostGIS that you likely won't achieve elsewhere is indexing, which can improve your speed 1,000x [1], making it better than even the best-written C program (unless the C program also creates an index for your data). The database, when properly set up, will cache information about your tracts, and when you ask if a point is within a tract, it won't have to search everything... it can take advantage of its index.
Getting data into and out of Postgres is pretty simple. A great tutorial that will walk you through the basics of PostGIS, with sample datasets not too different from yours, can be found here. It's reasonably long, but if you're new to PostGIS (as I was), you'll be very entertained and excited the entire time:
http://workshops.boundlessgeo.com/postgis-intro/
[1] Indexing decreased a nearest-neighbor search in one of my huge databases (20 m) from 53 seconds to 8.2 milliseconds.
One cannot write really fast geometric code in pure Python. Instead, the usual approach is to use a fast C/C++ library with Python wrappers.
For example, you can start with CGAL, a very comprehensive C++ geometric library. It has Python bindings for most of its routines; see http://code.google.com/p/cgal-bindings/.
Are there any libraries that provide 3D polyhedra and support calculating the intersection of two polyhedra?
If it makes a difference, the polyhedra I want to model do not have 'holes' in them.
The focus would be on correctness first and speed a close second!
Ideally this library would:
have existing, tidy Python bindings
be free-standing or have reasonable and small dependencies
support calculating the outline of the polyhedron when viewed from any given angle
CGAL offers rather more than you're asking for, but it does in particular include polyhedra and "boolean"-like operations on them. I'm not sure about "view from any angle" as a primitive, though; as I recall it wasn't there when I last used it, but that was a while ago. You may have to iterate, projecting the edges onto the appropriate plane.
The Python bindings are here, and I believe the only "big" dependency is Boost.Python (used for the bindings).