API to explore Transportation Mode given Geo-Points - python

I’m new to the HERE API, and I’m not sure whether my needs can be addressed by its service offerings (I'm open to other location services / utilities).
I have an ordered list of geo-points describing a transit that already took place in the past, and I would like to get a clue about a feasible transportation mode in which this transit could have taken place (e.g.: "this looks like a train trip!").
Input example:
lon         lat
121.240436  24.952392
121.24043   24.95239
121.240436  24.952392
121.23966   24.952605
121.23964   24.9526
121.23964   24.95227
121.23964   24.95227
121.239683  24.952316
121.23967   24.95232
121.240149  24.951126
121.24016   24.95111
I have thought about providing the list of points as a constraint and receiving the estimated duration for each transportation mode; then, on my side, I can compare each possible duration with the actual duration and conclude the transportation mode.
I currently understand that if I provide two points (e.g. start and end) I can get a duration estimate and a route, but I need more than that (e.g. if the actual transit is circular, providing only a start point and an end point will not be meaningful).
Any ideas?

You can make use of the via parameter to define multiple waypoints when calculating the route.
Please refer to the link below for more details:
https://developer.here.com/documentation/routing-api/dev_guide/topics/use-cases/via.html
The example below shows how the via parameter can be used in a request:
https://router.hereapi.com/v8/routes?apikey={YOUR_API_KEY}&origin=52.542948,13.37152&destination=52.523734,13.40671&via=52.529061,13.369288&via=52.522272,13.381991&return=polyline,summary,actions,instructions,turnByTurnActions&transportMode=car&departureTime=2022-04-05T20:45:00-07:00
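For the use case you describe, a rough sketch in Python of comparing estimated durations across transport modes, using the requests library against the endpoint above. The API key is a placeholder, the sample points are a de-duplicated subset of your table, and the assumption that each route section exposes summary.duration when return=summary is requested is mine; adjust the mode list to whatever modes your plan supports.

import requests

API_KEY = "YOUR_API_KEY"  # placeholder

# A few de-duplicated points from the question's table, as (lat, lon)
points = [
    (24.952392, 121.240436),
    (24.952605, 121.23966),
    (24.95227, 121.23964),
    (24.952316, 121.239683),
    (24.951126, 121.240149),
]

def estimated_duration_s(points, mode):
    """Total estimated duration (seconds) of a route through all points for one mode."""
    origin, *vias, destination = points
    params = [
        ("apikey", API_KEY),
        ("origin", f"{origin[0]},{origin[1]}"),
        ("destination", f"{destination[0]},{destination[1]}"),
        ("transportMode", mode),
        ("return", "summary"),
    ]
    # every intermediate point becomes a repeated via=lat,lon parameter
    params += [("via", f"{lat},{lon}") for lat, lon in vias]
    resp = requests.get("https://router.hereapi.com/v8/routes", params=params)
    resp.raise_for_status()
    sections = resp.json()["routes"][0]["sections"]
    return sum(s["summary"]["duration"] for s in sections)

# Compare these against the actual duration of the recorded transit
for mode in ("car", "pedestrian", "bicycle"):
    print(mode, estimated_duration_s(points, mode), "s")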

Related

Adding location data to Google Fit

I'm writing an application in Python to add data to Google Fit. I am successful in creating data sources and adding datasets with points for heart rate, cadence, speed, steps, etc. as well as sessions for those datasets.
I am now trying to add location data so that activities in Google Fit show maps of the activity, but I am not having any luck with that. Something that is unclear to me: while all of the above items carry a single value per data point, a location sample carries 4 values according to https://developers.google.com/fit/datatypes/location#location_sample.
Do these 4 different items in the data point need to be named in any way, or do I just add them as 4 fpVals, one after another, in the same order as described in the reference above? I.e., when building my array of points for the dataset's patch operation, do I just add them to the value array like this:
gfit_loc.append(dict(
    dataTypeName='com.google.location.sample',
    endTimeNanos=p.time.timestamp() * 1e9,
    startTimeNanos=p.time.timestamp() * 1e9,
    value=[dict(fpVal=p.latitude),    # latitude (degrees)
           dict(fpVal=p.longitude),   # longitude (degrees)
           dict(fpVal=10),            # accuracy (meters)
           dict(fpVal=p.elevation)]   # altitude (meters)
))
where the dataset is added with:
data = service.users().dataSources().datasets().patch(
    userId='me',
    dataSourceId='raw:com.google.location.sample:718486793782',
    datasetId='%s-%s' % (min_log_ns, max_log_ns),
    body=dict(
        dataSourceId='raw:com.google.location.sample:718486793782',
        maxEndTimeNs=max_log_ns,
        minStartTimeNs=min_log_ns,
        point=gfit_loc
    )
).execute()
So it turns out that I was doing everything correctly, with one small exception: I was setting the activity type to 95, which is defined as Walking (treadmill), for all activities in my prototype. I had not yet gotten around to letting the user specify the activity type, and since 95 is an indoor treadmill activity, Google Fit was simply not showing any location data for the activity in the form of a map.
Once I started using non-treadmill activity types, maps started showing up in my Google Fit activities.
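For reference, a minimal sketch of where the activity type lives in the session body. The service, min_log_ns and max_log_ns objects are the ones from the question's code; the session id, name and app name are placeholders, and 7 is the outdoor walking type from Google's activity-type list (double-check against that list for your activity).

# service, min_log_ns and max_log_ns come from the question's code above
session_body = dict(
    id='my-session-id',                      # placeholder
    name='Morning walk',                     # placeholder
    startTimeMillis=int(min_log_ns / 1e6),
    endTimeMillis=int(max_log_ns / 1e6),
    activityType=7,   # 7 = walking (outdoor); 95 = walking (treadmill) hides the map
    application=dict(name='my-app'),         # placeholder
)
service.users().sessions().update(
    userId='me',
    sessionId=session_body['id'],
    body=session_body,
).execute()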

Which pvlib function do I have to use in order to get the yearly in-plane irradiation, just like the one I get using the interface in PVGIS?

I simply need an example of computing the yearly in-plane irradiation with pvlib's irradiance functions, given the latitude, the longitude, the tilt of the surface and the azimuth, so that I can automate my process later.
I need this computation in order to compute the Actual vs. Theoretical Production Ratio of some plants in the central EU timezone.
My aim is to obtain the same number provided by the graphical user interface (the Summary section of the PVGIS Performance Tool), so I suppose they provide the "typical" annual irradiation, accounting for the effects of cloudy days.
Thank you very much!
Any kind of help would be really appreciated.
pvlib has a function for retrieving PVGIS TMY time series, which include GHI, DNI, DHI, temperature, wind speed, and a few other variables, for Europe and Africa.
I am currently working on a pull request that adds the ability to retrieve PVGIS's hourly radiation and PV power output (exactly the parameters shown in your image) to pvlib. You can find the code in the files part of the pull request; copy it to a file on your desktop and it should work smoothly. Let me know if you need any help using it.
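In the meantime, a minimal sketch using the TMY route, assuming a reasonably recent pvlib (one where get_pvgis_tmy accepts map_variables=True) and the standard solarposition/irradiance helpers. The site coordinates, tilt and azimuth are placeholders:

import pvlib

lat, lon = 48.2, 16.4      # placeholder site in central Europe
tilt, azimuth = 35, 180    # surface tilt and azimuth (180 = south in pvlib's convention)

# PVGIS TMY weather (a typical year, so cloudy days are already accounted for)
tmy, *_ = pvlib.iotools.get_pvgis_tmy(lat, lon, map_variables=True)

# Transpose GHI/DNI/DHI to the plane of array
solpos = pvlib.solarposition.get_solarposition(tmy.index, lat, lon)
poa = pvlib.irradiance.get_total_irradiance(
    surface_tilt=tilt,
    surface_azimuth=azimuth,
    solar_zenith=solpos['apparent_zenith'],
    solar_azimuth=solpos['azimuth'],
    dni=tmy['dni'],
    ghi=tmy['ghi'],
    dhi=tmy['dhi'],
)

# Hourly W/m2 summed over the year -> kWh/m2/year, comparable to the PVGIS summary value
print(poa['poa_global'].sum() / 1000)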

Hub and Spoke indication using Python

Situation
Our company generates waste at various locations in the US. The waste is taken to different locations depending on the suppliers' treatment methods and the facilities they operate nationally.
Consider a waste stream A that is generated at location X. The overall cost to take care of stream A includes the transportation cost from our site as well as the treatment method. This data is tabulated.
What I want to achieve
I would like my Python program to import an Excel table containing this data, plot the distance between our facility and the treatment facility, show it in a hub-and-spoke type picture just like airlines do, and also show data about the treatment method as a color or something similar, just like on Google Maps.
Can someone give me leads on where I should start, or which Python API or module might best suit my scenario?
This is a rather broad question and perhaps not the best fit for SO.
Now, to answer it: you can read an Excel export saved as CSV with the csv module. Plotting is best done with matplotlib.pyplot.
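A very rough sketch along those lines. The CSV file name and the column names (site_lat, site_lon, fac_lat, fac_lon, treatment) are assumptions; rename them to match your export:

import csv
from collections import defaultdict
import matplotlib.pyplot as plt

# Group legs (site -> treatment facility) by treatment method
routes = defaultdict(list)
with open("waste_routes.csv", newline="") as f:
    for row in csv.DictReader(f):
        routes[row["treatment"]].append((
            float(row["site_lon"]), float(row["site_lat"]),
            float(row["fac_lon"]), float(row["fac_lat"])))

# One color per treatment method, one line per leg (hub-and-spoke style)
for i, (treatment, legs) in enumerate(routes.items()):
    for j, (x1, y1, x2, y2) in enumerate(legs):
        plt.plot([x1, x2], [y1, y2], color=f"C{i % 10}",
                 label=treatment if j == 0 else None)

plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.legend(title="Treatment method")
plt.show()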

Display GTFS - Routes on a Map without Shapes

I am trying to consume some GTFS feeds and work with them.
I created a MySQL database and a Python script which downloads GTFS files and imports them into the right tables.
Now I am using the Leaflet map framework to display the stops on the map.
The next step is to display the routes of a bus or tram line on the map.
The GTFS archive contains no shapes.txt.
Is there a way to display the routes without the shapes.txt?
Thanks!
Kali
You will have to generate your own shape using underlying street data or public transit lines. See the detailed post by Anton Dubrau (he is an angel for writing this blog post).
https://medium.com/transit-app/how-we-built-the-worlds-prettiest-auto-generated-transit-maps-12d0c6fa502f
Specifically:
Here’s an example. In the diagram below, we have a trip with three stops, and no shape information whatsoever. We extract the set of tracks the trip uses from OSM (grey lines). Our matching algorithm then finds a trajectory (black line) that follows the OSM, while minimizing its length and the errors to the stops (e1, e2, e3).
The only alternative to using shapes.txt would be to use the stops along the route to define shapes. The laziest way would be to pick a single trip for that route, get the set of stops from stop_times.txt, and then get the corresponding stop locations from stops.txt.
If you wanted or needed to, you could get a more complete picture by finding the unique ordered sets of stops among all of the trips on that route, and define a shape for each ordered set in the same way.
Of course, these shapes would only be rough estimates because you don't have any information about the path taken by the vehicles between stops.
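As a rough illustration of the lazy approach described above (the GTFS directory path, the utf-8-sig encoding guess for files with a BOM, and the choice of route_id are assumptions), something like this gives you a [lat, lon] list you can hand straight to a Leaflet polyline:

import csv

def rough_shape(route_id, gtfs_dir="gtfs"):
    # Pick the first trip that belongs to the route
    with open(f"{gtfs_dir}/trips.txt", newline="", encoding="utf-8-sig") as f:
        trip_id = next(r["trip_id"] for r in csv.DictReader(f)
                       if r["route_id"] == route_id)

    # Ordered stop ids for that trip, sorted by stop_sequence
    with open(f"{gtfs_dir}/stop_times.txt", newline="", encoding="utf-8-sig") as f:
        rows = [r for r in csv.DictReader(f) if r["trip_id"] == trip_id]
        stop_ids = [r["stop_id"] for r in
                    sorted(rows, key=lambda r: int(r["stop_sequence"]))]

    # Look up coordinates in stops.txt
    with open(f"{gtfs_dir}/stops.txt", newline="", encoding="utf-8-sig") as f:
        coords = {r["stop_id"]: (float(r["stop_lat"]), float(r["stop_lon"]))
                  for r in csv.DictReader(f)}

    return [coords[s] for s in stop_ids]   # [[lat, lon], ...] for L.polyline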

How to convert from lat lon to zipcode or state to generate choropleth map

I have a large (and growing) collection of geospatial (lat, lon) data points (stored in MongoDB, if that helps).
I'd like to generate a choropleth map (http://vis.stanford.edu/protovis/ex/choropleth.html), which requires knowing the state that contains each point. Is there a database or algorithm that can do this without requiring calls to external APIs? (I'm aware of things like geopy and the Google Maps API.)
Actually, the web app you linked to contains the data you need.
If you look at http://vis.stanford.edu/protovis/ex/us_lowres.js, for each state borders[] contains a [lat,long] polyline which outlines the state. Load this data and check for point-in-polygon: http://en.wikipedia.org/wiki/Point_in_polygon
Per Reverse Geocoding Without Web Access, you can speed it up a lot by pre-calculating a bounding box for each state and only testing point-in-polygon when the point is inside the bounding box.
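A minimal sketch of that idea, assuming you have already parsed the borders into a dict mapping each state to a list of (lat, lon) vertices; the ray-casting test is the standard one from the Wikipedia page:

def in_bbox(pt, poly):
    """Cheap pre-check: is the point inside the polygon's bounding box?"""
    lat, lon = pt
    lats = [p[0] for p in poly]
    lons = [p[1] for p in poly]
    return min(lats) <= lat <= max(lats) and min(lons) <= lon <= max(lons)

def in_polygon(pt, poly):
    """Ray-casting point-in-polygon test on a list of (lat, lon) vertices."""
    lat, lon = pt
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        (lat_i, lon_i), (lat_j, lon_j) = poly[i], poly[j]
        if (lon_i > lon) != (lon_j > lon):
            if lat < (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i:
                inside = not inside
        j = i
    return inside

def state_of(pt, state_borders):
    """state_borders: {"CA": [(lat, lon), ...], ...}; returns the state name or None."""
    for state, poly in state_borders.items():
        if in_bbox(pt, poly) and in_polygon(pt, poly):
            return state
    return None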
Here's how to do it in FORTRAN. Remember FORTRAN? Me neither. Anyway, it looks pretty simple, as every state has its own range.
EDIT: It has been pointed out to me that your starting point is lat-long, not the zipcode.
The algorithm for converting a lat-long to a political division is called "a map". Seriously, that's all an ordinary map is: a mapping of every point in some range to the division it belongs to. A detailed digital map of all 48 contiguous states would be a big database, and then you would need some (fairly simple) code to determine, for each state (described as a series of line segments outlining the border), whether a given point is inside it or not.
You can try using the GeoNames database. It has long/lat as well as city, postal and other location-type data, and it is free as well.
If you need to host it locally or import it into your own database, the USGS and NGA provide comprehensive lists of cities with lat/lon. They are updated regularly, free, and reliable.
http://geonames.usgs.gov
http://earth-info.nga.mil/gns/html/index.html
Not sure about the quality of the data, but give this a shot: http://www.boutell.com/zipcodes/
If you don't mind a very crude solution, you could adapt the click-map here.
