For every address in my list, I would like to find the constituency that the address belongs to.
I have found this map, which lists constituencies by location:
https://www.google.com/maps/d/viewer?mid=12vZFyd7VqJyI2v5XOhBK5olnPnw&ll=1.343725467746546%2C103.87371266459195&z=12
My addresses are of the form
Marine Parade Central
Marine Vista
etc. I can geocode them to obtain latitude and longitude as well.
Manually, you can key an address into the map and find out which constituency it belongs to. Is there a way to automate this process? I've only used the heatmap layer before, and downloading the map is only available as KML. If only there were a simple API for this... A Python approach would be preferable, but anything that works, really.
Would be really grateful if someone can guide me on this! :)
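One workable offline approach: export the map as KML (Google My Maps allows this), geocode each address, and run a point-in-polygon test against the constituency boundaries. Below is a minimal standard-library sketch; the KML snippet, constituency names, and coordinates are invented placeholders, and a real KML export declares an XML namespace that the parser would also need to handle:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified KML: two made-up rectangular "constituencies".
KML = """<kml><Document>
<Placemark><name>East Coast</name><Polygon><outerBoundaryIs><LinearRing>
<coordinates>103.90,1.30,0 103.95,1.30,0 103.95,1.35,0 103.90,1.35,0</coordinates>
</LinearRing></outerBoundaryIs></Polygon></Placemark>
<Placemark><name>Marine Parade</name><Polygon><outerBoundaryIs><LinearRing>
<coordinates>103.85,1.28,0 103.90,1.28,0 103.90,1.32,0 103.85,1.32,0</coordinates>
</LinearRing></outerBoundaryIs></Polygon></Placemark>
</Document></kml>"""

def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: count how many edges a rightward ray crosses."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge straddles the point's latitude
            if lon < x1 + (lat - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def constituency_for(lon, lat, kml_text):
    """Return the name of the first Placemark polygon containing the point."""
    root = ET.fromstring(kml_text)
    for pm in root.iter("Placemark"):
        name = pm.find("name").text
        coords = pm.find(".//coordinates").text.split()
        polygon = [tuple(map(float, c.split(",")[:2])) for c in coords]
        if point_in_polygon(lon, lat, polygon):
            return name
    return None

print(constituency_for(103.87, 1.30, KML))  # → Marine Parade
```

With real data you would feed each geocoded address's (lon, lat) through `constituency_for`; libraries like shapely or geopandas do the polygon test more robustly if you can install them.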
Related
Let's say I want to create maps of crime, education, traffic, etc. for a street or a city. Which modules do I need to learn, and which are the best?
For some data I will be using Excel-like documents containing street names or building numbers that are not directly linked to Google Maps; these will be joined later in code. For other data, I want to pull information directly from Google Maps, such as store names or street numbers. I'm a beginner and a sociologist, and this is the main reason I want to learn programming. Drawing on a map image might be easier, but in the long term my aim is to use Google Maps, since it can supply the data itself. Thanks in advance.
I'm a beginner and need a long-term plan and some advice. I've watched some NumPy and pandas videos, and they seem doable so far.
There are several Python modules that can be used to work with Google Maps data. Some of the most popular ones include:
Google Maps API: This is the official API for working with Google Maps data. It allows you to access a wide range of data, including street maps, satellite imagery, and places of interest. You can use the API to search for addresses, get directions, and even create custom maps.
gmaps: This is a Python wrapper for the Google Maps API. It makes it easy to work with the API by providing a simple, Pythonic interface. It also includes support for several popular Python libraries, such as Pandas and Numpy.
folium: This is a library for creating leaflet maps in Python. It allows you to create interactive maps, add markers and other data, and customize the appearance of your maps.
geopandas: This library allows you to work with geospatial data in Python. It is built on top of the popular Pandas library and includes support for working with shapefiles, geojson, and more.
geopy: This is a Python library for working with geocoding and distance calculations. It can be used to convert addresses to latitude and longitude coordinates, as well as to perform distance calculations between two points.
In general, it's recommended to start with the Google Maps API, gmaps, and folium; you can add geopandas and geopy later when you need more advanced functionality. Start with simple examples and gradually increase the complexity of your projects.
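As a taste of the kind of distance calculation geopy handles (its geodesic routines are more accurate, since they model the Earth as an ellipsoid), the haversine great-circle formula fits in a few lines of plain Python; the function name here is mine:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points,
    assuming a spherical Earth of radius 6371 km."""
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# One degree of longitude at the equator is roughly 111 km.
print(round(haversine_km(0.0, 0.0, 0.0, 1.0), 1))
```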
I have a lot of UK data, and what I would like to do is extract this data based on a postcode, coordinates, grid reference, etc.
Is this possible using Python?
Yes. If you just have postcodes, you'll most likely need to convert them to coordinates first. This can be done with third-party tools such as Google's Geocoding API or the Royal Mail Postcode Address File. Once you have coordinates, you can plot them however you like using other tools such as Highcharts, or build your own.
I have a large collection (and growing) of geospatial data (lat, lon) points (stored in mongodb, if that helps).
I'd like to generate a choropleth map (http://vis.stanford.edu/protovis/ex/choropleth.html), which requires knowing the state that contains each point. Is there a database or algorithm that can do this without requiring calls to external APIs? (I'm aware of options like geopy and the Google Maps API.)
Actually, the web app you linked to contains the data you need.
If you look at http://vis.stanford.edu/protovis/ex/us_lowres.js, you'll see that for each state, borders[] contains a [lat,long] polyline outlining the state. Load this data and check for point-in-polygon: http://en.wikipedia.org/wiki/Point_in_polygon
Per Reverse Geocoding Without Web Access, you can speed this up considerably by pre-calculating a bounding box for each state and only testing point-in-polygon when the point is inside the bounding box.
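A sketch of that bounding-box pre-filter in plain Python. The two states below are conveniently near-rectangular, so their (approximate) outlines double as test data; the function names are mine:

```python
# Approximate outlines as (lon, lat) rings; real borders have many more vertices.
STATES = {
    "Colorado": [(-109.05, 37.0), (-102.04, 37.0), (-102.04, 41.0), (-109.05, 41.0)],
    "Wyoming":  [(-111.05, 41.0), (-104.05, 41.0), (-104.05, 45.0), (-111.05, 45.0)],
}

def bounding_box(polygon):
    xs = [x for x, _ in polygon]
    ys = [y for _, y in polygon]
    return min(xs), min(ys), max(xs), max(ys)

# Pre-compute one bounding box per state, once.
BOXES = {name: bounding_box(poly) for name, poly in STATES.items()}

def point_in_polygon(x, y, polygon):
    """Ray-casting (crossing number) test."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def state_of(lon, lat):
    for name, (x0, y0, x1, y1) in BOXES.items():
        if x0 <= lon <= x1 and y0 <= lat <= y1:            # cheap rejection first
            if point_in_polygon(lon, lat, STATES[name]):   # exact test only on hits
                return name
    return None

print(state_of(-104.99, 39.74))  # Denver → Colorado
```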
Here's how to do it in FORTRAN. Remember FORTRAN? Me neither. Anyway, it looks pretty simple, as every state has its own range.
EDIT: It's been pointed out to me that your starting point is lat/long, not a zip code.
The algorithm for converting a lat/long to a political division is called "a map". Seriously, that's all an ordinary map is: a mapping of every point in some range to the division it belongs to. A detailed digital map of all 48 contiguous states would be a big database, and you would need some (fairly simple) code to determine, for each state (described as a series of line segments outlining the border), whether a given point is inside it or not.
You can try the GeoNames database. It has lat/long as well as city, postal, and other location-type data, and it is free.
If you need to host the data locally or import it into your own database, the USGS and NGA provide a comprehensive list of cities with lat/lon. It's updated regularly, free, and reliable.
http://geonames.usgs.gov
http://earth-info.nga.mil/gns/html/index.html
Not sure about the quality of the data, but give this a shot: http://www.boutell.com/zipcodes/
If you don't mind a very crude solution, you could adapt the click-map here.
I would like to be able to associate various models (Venues, places, landmarks) with a City/Country.
But I am not sure what some good ways of implementing this would be.
I'm following a simple route, I have implemented a Country and City model.
Whenever a new city or country is mentioned it is automatically created.
Unfortunately I have various problems:
The database can easily be polluted
Django has no real knowledge of what those City/Countries really are
Any tips or ideas? Thanks! :)
A good starting place would be to get a location dataset from a service like GeoNames. There is also GeoDjango, which came up in this question.
As you encounter new location names, check them against your larger dataset before adding them. For your 2nd point, you'll need to design this into your object model and write your code accordingly.
Here are some other things you may want to consider:
Aliases & Abbreviations
These come up more often than you would think. People often use the names of suburbs or neighbourhoods that aren't official towns. Also consider abbreviations like LA -> Los Angeles, MTL -> Montreal, Mt. Forest -> Mount Forest, the many forms of Saint (St, St., ST-), etc.
Fuzzy Search
Looking up city names is much easier when differences in spelling are accounted for. This also helps reduce the number of duplicate names for the same place.
You can do this by pre-computing the Soundex or Double Metaphone values for the cities in your data set. When performing a lookup, compute the value for the search term and compare against the pre-computed values. This will work best for English, but you may have success with other romance language derivatives (unsure what your options are beyond these).
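The pre-computation idea can be sketched with a small pure-Python Soundex (Double Metaphone needs a third-party library); the city list and lookup helper here are made up for illustration:

```python
def soundex(name):
    """American Soundex: first letter plus three digits encoding consonant groups."""
    mapping = {}
    for letters, digit in [("BFPV", "1"), ("CGJKQSXZ", "2"), ("DT", "3"),
                           ("L", "4"), ("MN", "5"), ("R", "6")]:
        for ch in letters:
            mapping[ch] = digit
    name = name.upper()
    code = name[0]
    prev = mapping.get(name[0], "")
    for ch in name[1:]:
        if ch in "HW":            # H and W do not separate duplicate codes
            continue
        digit = mapping.get(ch)
        if digit:
            if digit != prev:     # collapse adjacent identical codes
                code += digit
            prev = digit
        else:
            prev = ""             # vowels (and other characters) act as separators
    return (code + "000")[:4]

# Pre-compute codes for the known city list, then look up by code.
cities = ["Montreal", "Mount Forest", "Melbourne"]
index = {}
for city in cities:
    index.setdefault(soundex(city), []).append(city)

def fuzzy_lookup(term):
    return index.get(soundex(term), [])

print(fuzzy_lookup("Montreall"))  # misspelling still resolves to ['Montreal']
```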
Location Equivalence/Inclusion
Be able to determine that Brooklyn is a borough of New York City.
At the end of the day, this is a hard problem, but applying these suggestions should greatly reduce the amount of data corruption and other headaches you have to deal with.
Geocoding datasets from Yahoo and Google can be a good starting point. Also take a look at the geopy library for use with Django.
I am working on an application where one of the requirements is that I be able to perform realtime reverse geocoding operations based on GPS data. In particular, I must be able to determine the state/province to which a latitude, longitude pair maps and detect when we have moved from one state/province to another.
I have a couple ideas so far but wondered if anyone had any ideas on either of the following:
What is the best approach for tackling this problem in an efficient manner?
Where is a good place to find North American state/province boundaries, and what is an appropriate format for them?
As a starter, here are the two main ideas I have:
Break North America into a grid, with each rectangle in the grid mapping to a particular state/province. Do a lookup in this table (which grows quickly the more precise you would like to be) based on the latitude and then the longitude (or vice versa).
Define polygons for each of the states and do some sort of calculation to determine in which polygon a lat/lon pair lies. I am not sure exactly how to go about this. HTML image maps come to mind as one way of defining the bounds for a state/province.
I am working in python for the interested or those that might have a nice library they would like to suggest.
To be clear... I do not have web access available to me, so using an existing reverse geocoding service is not an option at runtime.
I created an offline reverse geocoding module for countries: https://github.com/richardpenman/reverse_geocode
>>> import reverse_geocode
>>> coordinates = (-37.81, 144.96), (31.76, 35.21)
>>> reverse_geocode.search(coordinates)
[{'city': 'Melbourne', 'code': 'AU', 'country': 'Australia'},
{'city': 'Jerusalem', 'code': 'IL', 'country': 'Israel'}]
I will see if I can add data for states.
I suggest using a variant of your first idea: Use a spatial index. A spatial index is a data structure built from rectangles, mapping lat/long to the payload. In this case you will probably map rectangles to state-province pairs. An R-tree may be a good option. Here's an R-tree python package. You could detect roaming by comparing the results of consecutive searches.
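The consecutive-search comparison can be sketched without a real index. Below, a dict of bounding boxes stands in for the R-tree (the two boxes are approximate extents of two conveniently rectangular states); `region_of` and `crossings` are names I made up:

```python
# Stand-in for a spatial index: region name -> (min_lon, min_lat, max_lon, max_lat).
REGIONS = {
    "CO": (-109.05, 37.0, -102.04, 41.0),
    "WY": (-111.05, 41.0, -104.05, 45.0),
}

def region_of(lon, lat):
    """Linear scan here; an R-tree would answer the same query in O(log n)."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= lon <= x1 and y0 <= lat <= y1:
            return name
    return None

def crossings(fixes):
    """Yield (previous_region, new_region) each time the mapped region changes
    between consecutive GPS fixes; the first fix just sets the baseline."""
    prev = region_of(*fixes[0])
    for lon, lat in fixes[1:]:
        cur = region_of(lon, lat)
        if cur != prev:
            yield prev, cur
        prev = cur

# A made-up northbound track: Denver, northern Colorado, then into Wyoming.
track = [(-104.99, 39.74), (-105.0, 40.9), (-105.0, 41.2)]
print(list(crossings(track)))  # → [('CO', 'WY')]
```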
I would stay away from implementing your own solution from scratch. This is a pretty big undertaking and there are already tools out there to do this. If you're looking for an open source approach (read: free), take a look at this blog post: Using PostGIS to Reverse Geocode.
If you can get hold of state boundaries as polygons (for example, via OpenStreetMap), determining the current state is just a point-in-polygon test.
If you need address data, an offline solution would be to use Microsoft Mappoint.
You can get data for the entire United States from OpenStreetMap. You could then extract the data you need, such as city or state locations, into whatever format works best for your application. Note that although the data quality is good, it isn't guaranteed to be completely accurate, so if you need complete accuracy you may have to look elsewhere.
I have a database with all of this data and some access tools. I made mine from the Census TIGER data. I imagine it'd basically be an export of my database to SQLite and a bit of code translation.
The free reverse geocoding service I developed (www.feroeg.com) is based on SpatiaLite, an SQLite extension implementing SQL spatial capabilities (R-tree).
The data are imported from OpenStreetMap (nations, cities, streets, street numbers) and OpenAddresses (street numbers) using proprietary tools.
The entire world consumes about 250 GB.
There is a paper describing the architecture of the service:
https://feroeg.com/Feroeg_files/Feroeg Presentation.pdf
At the moment the project (importer and server) is closed source.
The reverse geocoding library (C++) and converting tools are available on request.