There are several posts already about requesting directions from the Google Maps API when your waypoints exceed 23, but the answers do not account for the case where optimize_waypoints is True. The difference is that adding one stop to the list of addresses could throw off the whole route-optimization equation. Does anyone have any suggested workarounds?
I was thinking of running it once, dropping the stops that add the least mileage (say, if I send 26, drop the 3 smallest), running it again, and then somehow reconciling the two routes. Is there a more efficient or cleaner way? Any thoughts? Thanks!
I am using Python, but this is not a Python-specific question; it is more of a general programming and Google Maps Directions question. That said, answers referencing Python would be most appreciated!
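To make the constraint concrete: the naive workaround is to split the ordered stop list into legs of at most 25 locations each (origin + 23 waypoints + destination), with adjacent legs sharing a boundary stop, and optimize each leg in its own Directions request. This is only a sketch (the function and its sizing are my own, not part of any API), and it illustrates exactly the weakness in question: each leg is optimized locally, while the boundary stops stay fixed, so the result can differ from a true global optimization.

```python
def chunk_waypoints(stops, max_waypoints=23):
    """Split an ordered stop list into consecutive legs, each small enough
    for one Directions request (origin + max_waypoints + destination).
    Adjacent legs share a boundary stop so the partial routes join up."""
    leg_size = max_waypoints + 2  # origin + waypoints + destination
    legs = []
    i = 0
    while i < len(stops) - 1:
        legs.append(stops[i:i + leg_size])
        i += leg_size - 1  # reuse the last stop as the next leg's origin
    return legs
```

Each leg could then be sent as one request with optimize_waypoints=True, but nothing here rebalances stops across leg boundaries.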
We have an entry for this feature request (supporting more than 23 waypoints in Directions API requests) in the Google Issue Tracker, so that we can provide information on it to all Google Maps APIs users, as the technical aspects of this issue are of interest to a number of our customers.
Issues in the Issue Tracker are directly curated by Directions API specialists, who will provide updates in the Issue Tracker whenever there's news from the engineering team.
We would like to warmly invite you to view the issue in the Issue Tracker, and to star it to register your interest. This will subscribe you to receive technical updates on the issue. Starring the issue also provides us with valuable feedback on the importance of the issue to our customers, and increases the issue's priority with the product engineering team.
You can view and star the issue here:
- https://issuetracker.google.com/35824756
This Issue Tracker entry is the authoritative source for public information regarding this issue, and all publicly-relevant updates will be posted there.
Related
I want to get the route between 2 points using google maps api but I want also avoid some coordinates between them.
I have been investigating this feature but I do not know if it is possible to get this done. See these threads:
Is there a way to avoid a specific road or coordinate in Google Directions?
Avoid some coordinates in routes using Google Directions API Android
Anyone know if it is possible?
Thanks
The avoid feature has been introduced in the Google Maps Distance Matrix API; however, it can only be used to avoid tolls, highways, ferries, and indoor segments.
You can check this on its documentation page.
https://developers.google.com/maps/documentation/distance-matrix/intro
This feature is pretty popular and has been formally requested in the Google Issue Tracker. If you really need it, you can go ahead and support it by starring the issue.
https://issuetracker.google.com/issues/35816642
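As a sketch of how the avoid parameter fits into a request (the parameter names follow the public documentation; the key and locations here are placeholders):

```python
from urllib.parse import urlencode

BASE = "https://maps.googleapis.com/maps/api/distancematrix/json"

def distance_matrix_url(origins, destinations, avoid=None, key="YOUR_KEY"):
    """Build a Distance Matrix request URL. `avoid` may be one of
    'tolls', 'highways', 'ferries', or 'indoor'; arbitrary coordinates
    cannot be avoided this way."""
    params = {
        "origins": "|".join(origins),
        "destinations": "|".join(destinations),
        "key": key,
    }
    if avoid:
        params["avoid"] = avoid
    return BASE + "?" + urlencode(params)
```

The point is that avoid is a fixed enumeration, not a list of coordinates, which is why the feature request above exists.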
I have a simple piece of Python code that runs a Radar Search with the Places API and returns the result coordinates in a list. I run into three problems. First, the results pulled this way do not match a search on Google Maps itself using the same coordinates and parameters; specifically, I get MANY more results from Radar Search. Within a radius of 1 km, I get more than 200 results for a restaurant chain name.
Second, the results go beyond my specified 1 km radius; the furthest is 1.3 km away by the haversine formula.
Third, the results are wrong. The keyword field has no effect on the results: for example, searching for "McDonalds" or "Car" with the same parameters yields the exact same results, and one of the results points to an Adidas store when I use the Place ID to look up Google's description.
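For reference, the radius check above used a standard haversine implementation along these lines:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points,
    using a mean Earth radius of 6371 km."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))
```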
This is code-independent; these problems appear if I just copy and paste this into the URL bar:
https://maps.googleapis.com/maps/api/place/radarsearch/json?location=39.876186,116.439424&radius=1000&keyword=McDonalds&key=KEY
I have seen another similar post on Places API malfunctioning recently. Any help is appreciated. Thanks
I have a support ticket open with Google about this, as we're Enterprise customers, and they have confirmed there is an issue and they're working on it. From my conversations with them over the last few days:
There have been a few other reports of this issue and we've reported the problem to the Places API team. I'll get back to you as soon as we have more information from them.
We've received some other reports of this and the API engineers are looking at the issue with the highest priority. There's no obvious cause yet, but we'll let you know when they're done investigating and have determined a fix.
I'm sorry to hear about the complaints that you're receiving, but unfortunately the engineers haven't been able to give me an ETA yet. I expect to hear back from them soon but can't give an estimate yet.
I'll post updates here as I get them.
UPDATE 9/8: Google's support is saying this issue will be fixed by end of the week.
UPDATE 9/12: Google fixed it. It was being tracked here: https://code.google.com/p/gmaps-api-issues/issues/detail?id=7082
I need to get the number of unique visitors (say, for the last 5 minutes) that are currently looking at an article, so I can display that number and sort the articles by most popular.
E.g., similar to how most forums display "There are n people viewing this thread".
How can I achieve this on Google App Engine? I am using Python 2.7.
Please try to explain in a simple way because I recently started learning programming and I am working on my first project. I don't have lots of experience. Thank you!
Create a counter (a property within an entity) and increase it transactionally for every page view. If you have more than a few page views a second, you need to look into sharded counters.
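A minimal sketch of the sharded-counter idea (a plain dict stands in for the datastore here; on App Engine each shard would be its own entity, incremented inside a transaction, so that no single entity takes every write):

```python
import random

NUM_SHARDS = 20

# One slot per shard; writes are spread randomly across them so any
# single shard sees only ~1/NUM_SHARDS of the write traffic.
_shards = {i: 0 for i in range(NUM_SHARDS)}

def increment():
    """Add one page view to a randomly chosen shard."""
    _shards[random.randrange(NUM_SHARDS)] += 1

def get_count():
    """Total views = sum over all shards."""
    return sum(_shards.values())
```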
There is no way to tell when someone stops viewing a page unless you use Javascript to inform the server when that happens. Forums etc typically assume that someone has stopped viewing a page after n minutes of inactivity, and base their figures on that.
For minimal resource use, I would suggest using memcache exclusively here. If the value gets evicted, the count will be incorrect, but the consequences of that are minimal, and other solutions will use a lot more resources.
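A minimal sketch of the inactivity-window idea, with an in-memory dict standing in for memcache (on App Engine you would store the per-article mapping under a memcache key instead; the function names here are my own):

```python
import time

WINDOW_SECONDS = 5 * 60  # "currently viewing" = seen in the last 5 minutes

# Maps (article_id, visitor_id) -> last-seen timestamp.
_last_seen = {}

def record_view(article_id, visitor_id, now=None):
    """Call on every page view; repeat visits just refresh the timestamp."""
    now = time.time() if now is None else now
    _last_seen[(article_id, visitor_id)] = now

def current_viewers(article_id, now=None):
    """Count distinct visitors seen within the window for this article."""
    now = time.time() if now is None else now
    return sum(1 for (aid, _vid), ts in _last_seen.items()
               if aid == article_id and now - ts <= WINDOW_SECONDS)
```

As noted above, if the memcache value is evicted the count resets, but for a "people viewing now" badge that is usually an acceptable trade.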
Did you consider Google Analytics service for getting statistics? Read this article about real-time monitoring using this service. Please note: a special script must be embedded on every page you want to monitor.
I am working on a website for which it would be useful to know the number of links shared by a particular Facebook page (e.g., http://www.facebook.com/cocacola) so that the user can know whether they are 'liking' a firehose of information or a dribble of goodness. What is the best way to get the number of links/status updates shared by a particular page?
+1 for implementations that use Python (this is a Django website), but any solutions are welcome! I tried using fbconsole to accomplish this but have come up a little short.
For what it is worth, this unanswered question seems relevant. So does the fact that, as of 2012.04.18, you can export your data to CSV from the Insights management page on the Facebook site. The information is in there; I just don't know how to get it out...
Thanks for your help!
In the event that anyone else finds this useful, I thought I'd post my gist example here. fbconsole makes it fairly simple to extract data through the Facebook Graph API.
The caveat is that it was not terribly easy to programmatically extract data through fbconsole, so I wrote fbconsole.automatically_authenticate to make it much easier to access this information in a systematic way. This addition has not yet been incorporated into the master branch of fbconsole (it was just posted this morning), but it is available here in the meantime for those who are interested.
What readily available algorithms could I use to data-mine Twitter to find the degrees of separation between two people on Twitter?
How does this change when the social graph keeps changing and updating constantly?
And is there any dump of Twitter's social graph data that I could use, rather than making so many API calls to start with?
From the Twitter API
What's the Data Mining Feed and can I have access to it?
The Data Mining Feed is an expanded version of our /statuses/public_timeline REST API method. It returns 600 recent public statuses, cached for a minute at a time. You can request it up to once per minute to get a representative sample of the public statuses on Twitter. We offer this for free (and with no quality of service guarantees) to researchers and hobbyists. All we ask is that you provide a brief description of your research or project and the IP address(es) you'll be requesting the feed from; just fill out this form. Note that the Data Mining Feed is not intended to provide a contiguous stream of all public updates on Twitter; please see above for more information on the forthcoming "firehose" solution.
and also see: Streaming API Documentation
There was a company offering a dump of the social graph, but it was taken down and is no longer available. As you already realized, this is kind of hard, as the graph is changing all the time.
I would recommend checking out their social_graph api methods as they give the most info with the least API calls.
There might be other ways of doing it but I've just spent the past 10 minutes looking at doing something similar and stumbled upon this Q.
I'd use an undirected (and weighted, since I want to look at location too) graph. Use JGraphT or a similar library in Python; JGraphT is Java-based but includes a variety of prewritten algorithms.
You can then use the Bellman-Ford algorithm. Unlike Dijkstra's, it handles negative edge weights, and because it relaxes edges in rounds, you can also cap the number of rounds to bound the number of edges (hops) in the path.
http://en.wikipedia.org/wiki/Bellman%E2%80%93Ford_algorithm
I used it recently in a project for flight routing, iterating upward to find the shortest path with the fewest 'hops' (edges).
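That said, for a plain degrees-of-separation count on an unweighted follower graph, breadth-first search is enough; here is a rough sketch over an adjacency dict (the graph data itself would still have to come from the API's social graph methods):

```python
from collections import deque

def degrees_of_separation(graph, start, target):
    """BFS over an adjacency dict {user: [neighbours]}; returns the
    minimum number of hops from start to target, or None if unreachable."""
    if start == target:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        user, dist = queue.popleft()
        for friend in graph.get(user, ()):
            if friend == target:
                return dist + 1
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None  # no path in the crawled portion of the graph
```

Since the graph changes constantly, the number is only as fresh as the crawl behind it.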