I've tried the following. Input: lat/lon data. I then calculate a box around it of, let's say, 50 m, i.e. +/- 50 m on the easting/northing values. Now I convert back to lat/lon with this script:
http://robotics.ai.uiuc.edu/~hyoon24/LatLongUTMconversion.py and I get a result that just can't be right: the longitude is around 7 before and around 2 afterwards.
zone, easting, northing = LLtoUTM(23, location.get_lat(), location.get_lon())
topUTM = northing + error
bottomUTM = northing - error
leftUTM = easting - error
rightUTM = easting + error
left, top = UTMtoLL(23, leftUTM, topUTM, zone)
Is the error in my code, or might the script be flawed?
So I've tried pyproj instead, just converting lat/lon to UTM and back to lat/lon to see what happens:
>>> p = pyproj.Proj(proj='utm', zone=32, ellps='WGS84')
>>> p
<pyproj.Proj object at 0x7ff9b8487dd0>
>>> x,y = p(47.9941214, 7.8509671)
>>> print x,y
5159550.36822 1114087.43925
>>> print p(x,y,inverse=True)
(47.971558538495991, 7.8546573140162605)
And here it's not as extremely far off as with the script above, but it still seems incorrect enough to be unusable. How come? What can I do to get more exact results?
EDIT:
I ran test() and all tests passed.
In the epsg file there is no such entry. The closest I've found was this:
<32632> +proj=utm +zone=32 +ellps=WGS84 +datum=WGS84 +units=m +no_defs <>
No tmerc. Also, what would I need to pass as the towgs84 parameters? The ones above?
I created a small UTM conversion library for Python last week and uploaded it to the Python Package Index: http://pypi.python.org/pypi/utm
I have compared it to pyproj and it is faster and more accurate. Given your sample data, this is the result:
>>> import utm
>>> u = utm.from_latlon(47.9941214, 7.8509671)
>>> print u
(414278, 5316285, 32, 'T')
>>> print utm.to_latlon(*u)
(47.994157948891505, 7.850963967574302)
UPDATE: Richard's answer below describes the real solution to this issue.
The error is in your code.
First off, the PyProj issue listed in one of the other answers is real. You should check your epsg file and make sure it includes the line
<2392> +proj=tmerc +lat_0=0 +lon_0=24 +k=1.000000 +x_0=2500000 +y_0=0 +ellps=intl +towgs84=-90.7,-106.1,-119.2,4.09,0.218,-1.05,1.37 +units=m +no_defs no_defs <>
Note the towgs84 parameter.
Your problem with PyProj stems from mis-using the projection command.
If we take 47.9941214N, 7.8509671E and convert to UTM we get Zone 32, 414278 Easting, 5316286 Northing.
You perform the following PyProj operations:
p = pyproj.Proj(proj='utm', zone=32, ellps='WGS84')
>>> x,y = p(47.9941214, 7.8509671)
>>> print x,y
5159550.36822 1114087.43925
>>> print p(x,y,inverse=True)
(47.971558538495991, 7.8546573140162605)
But, if we consult the PyProj documentation, we see the following:
Calling a Proj class instance with the arguments lon, lat will convert
lon/lat (in degrees) to x/y native map projection coordinates (in
meters).
Let's try running the OP's PyProj operations again, but switch the order of the lon/lat arguments:
p = pyproj.Proj(proj='utm', zone=32, ellps='WGS84')
>>> x,y = p(7.8509671, 47.9941214)
>>> print x,y
414278.16731 5316285.59492
>>> print p(x,y,inverse=True)
(7.850967099999812, 47.994121399999784)
The operation inverts itself (pretty much) perfectly!
To answer the first part of your question, if you look in http://robotics.ai.uiuc.edu/~hyoon24/LatLongUTMconversion.py at the definition of UTMtoLL, you find the following:
UTMtoLL(ReferenceEllipsoid, northing, easting, zone)
Yet you call UTMtoLL(23, leftUTM, topUTM, zone), where leftUTM is an easting and topUTM is a northing.
Therefore, in the case of both your first script and PyProj, you've used the wrong order of arguments.
It's a good reminder to always double- (or triple-) check your work before suggesting that someone else's is wrong. That said, Python library documentation is not always the greatest, and PyProj's documentation in this instance is cryptic at best. A nice web-based explanation of this command, accompanied by examples of its usage, would probably have prevented some angst on your part.
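To close the loop on the original bounding-box code, here is a sketch of the ±50 m box done with PyProj and the correct lon/lat ordering. The values come from the question; the variable names are mine.

```python
import pyproj

lat, lon = 47.9941214, 7.8509671
error = 50.0  # half-width of the box in metres

p = pyproj.Proj(proj='utm', zone=32, ellps='WGS84')
easting, northing = p(lon, lat)  # lon first, lat second

# the inverse transform likewise returns (lon, lat)
left_lon, top_lat = p(easting - error, northing + error, inverse=True)
right_lon, bottom_lat = p(easting + error, northing - error, inverse=True)
```

With the arguments in the documented order, the forward transform reproduces the expected Zone 32 coordinates (414278 Easting, 5316286 Northing) and the corners land tightly around the input point.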
I have no problem with pyproj; try the following code:
from pyproj import Proj
Lat = 52.063098675
Lon = -114.132980348 #Calgary
ZoneNo = "11"  # manually input, or calculated from Lat/Lon
myProj = Proj("+proj=utm +zone=" + ZoneNo +
              " +north +ellps=WGS84 +datum=WGS84 +units=m +no_defs")  # +north for the northern hemisphere
UTMx, UTMy = myProj(Lon, Lat)
########################################
#UTM ==> Lat Lon:
ZoneNo = "11"  # manually input or from other sources
myProj = Proj("+proj=utm +zone=" + ZoneNo +
              " +north +ellps=WGS84 +datum=WGS84 +units=m +no_defs")
Lon2, Lat2 = myProj(UTMx, UTMy,inverse=True)
print Lat2
print Lon2
Your issue with pyProj sounds just like the one described here:
http://code.google.com/p/pyproj/issues/detail?id=3
which is resolved:
Solved! In the epsg file there must be:
<2392> +proj=tmerc +lat_0=0 +lon_0=24 +k=1.000000 +x_0=2500000 +y_0=0 +ellps=intl
+towgs84=-90.7,-106.1,-119.2,4.09,0.218,-1.05,1.37 +units=m +no_defs no_defs <>
note the towgs84 parameter!
Check that thread out if you want to continue to use pyproj.
Also, does the test() function of the module work? Have you tried any of the scripts that come with it in the test directory?
I am using this Mapillary endpoint: https://tiles.mapillary.com/maps/vtp/mly1_public/2/{zoom_level}/{x}/{y}?access_token={} and getting responses like the one in the photo. Also, here is the Mapillary documentation.
It is not quite clear to me what the nested coordinate lists in the response represent. By the looks of it, I initially thought they might be pixel coordinates, but judging by the context (the API documentation) and the endpoint I am using, I would say that is not the case. Also, I am not sure the JSON response you see in the picture is valid GeoJSON; some online formatters did not accept it as valid.
I would like to find the bounding box of the "sequence". For context, that is the minimal-area rectangle, defined by two (lat, lon) positions, that fully encompasses the geometry of the so-called "sequence"; a "sequence" is basically a series of photos taken during a vehicle or on-foot trip, together with the metadata associated with the photos (the metadata is available through another endpoint, but that is just for context).
My question is: is it possible to turn the coordinates you see in the pictures into (lat, lon)? Having those, it would be easy for me to find the bounding box of the sequence. If so, how? Also, please note that some of the nested lists are of type LineString while others are MultiLineString (I read about the difference here: help.arcgis.com; hope this helps).
Minimal reproducible code snippet:
import json
import requests
import mercantile
import mapbox_vector_tile as mvt
ACCESS_TOKEN = 'XXX' # can be provided from here: https://www.mapillary.com/dashboard/developers
z6_tiles = list(mercantile.tiles(  # us_west_coast_bbox
    west=-125.066423,
    south=42.042594,
    east=-119.837770,
    north=49.148042,
    zooms=6
))
# pprint(z6_tiles)
vector_tiles_url = 'https://tiles.mapillary.com/maps/vtp/mly1_public/2/{}/{}/{}?access_token={}'
for tile in z6_tiles:
    res = requests.get(vector_tiles_url.format(tile.z, tile.x, tile.y, ACCESS_TOKEN))
    res_json = mvt.decode(res.content)
    with open('idea.json', 'w+') as f:
        json.dump(res_json, f, indent=4)
I think this get_normalized_coordinates is the solution I was looking for. Please take it with a grain of salt, as I have not fully tested it yet; I will try to, and then update my answer. Also, please be cautious: for tiles closer to either the South or the North Pole, the Z14_TILE_DMD_HEIGHT constant will not be the value you see below, but something more like 0.0018958715374282065.
Z14_TILE_DMD_WIDTH = 0.02197265625
Z14_TILE_DMD_HEIGHT = 0.018241950298914844

def get_normalized_coordinates(bbox: mercantile.LngLatBbox,
                               target_lat: int,
                               target_lon: int,
                               extent: int = 4096):  # 4096 is Mapillary's default
    """
    Returns a (lon, lat) tuple representing the real position on the world map of a map feature.
    """
    min_lon, min_lat, _, _ = bbox
    return (min_lon + target_lon / extent * Z14_TILE_DMD_WIDTH,
            min_lat + target_lat / extent * Z14_TILE_DMD_HEIGHT)
And if you are wondering how I came up with the constants you see, I simply iterated over the list of tiles that I am interested in and checked that they all have the same width/height. (This might not have been the case, keeping in mind what I mentioned above about tiles closer to one of the poles; I think this is called "distortion", but I am not sure.) Also, for context: the tiles I iterated over are within this bbox: (-125.024414, 31.128199, -108.896484, 49.152970) (min_lon, min_lat, max_lon, max_lat; US west coast), which I believe is also why all the tiles have the same width/height.
set_test = set()
for tile in relevant_tiles_set:
    curr_bbox = mercantile.bounds(tile)
    dm_width_diff: float = curr_bbox.east - curr_bbox.west
    dm_height_diff: float = curr_bbox.north - curr_bbox.south
    set_test.add((dm_width_diff, dm_height_diff))
set_test
output:
{(0.02197265625, 0.018241950298914844)}
UPDATE: I forgot to mention that you do not actually need to compute those WIDTH/HEIGHT constants: you can simply replace them with (max_lon - min_lon) and (max_lat - min_lat) respectively. I used the constants for testing purposes only.
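For reference, the generalised version can be written without any fixed constants at all; this sketch computes the tile bounds itself with the standard Web Mercator formulas (equivalent to mercantile.bounds) and then interpolates. One caveat: depending on the decoder, the tile-local y axis may need flipping, since the MVT spec puts the y origin at the top of the tile. The function names here are mine.

```python
import math

def tile_bounds(z, x, y):
    # Web Mercator tile bounds in degrees: (west, south, east, north)
    n = 2 ** z
    def lon(xt):
        return xt / n * 360.0 - 180.0
    def lat(yt):
        return math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * yt / n))))
    return lon(x), lat(y + 1), lon(x + 1), lat(y)

def tile_to_lnglat(z, x, y, px, py, extent=4096):
    # px, py are tile-local feature coordinates in [0, extent]
    west, south, east, north = tile_bounds(z, x, y)
    return (west + px / extent * (east - west),
            south + py / extent * (north - south))
```

For example, the centre of the single zoom-0 tile, `tile_to_lnglat(0, 0, 0, 2048, 2048)`, comes out as (0.0, 0.0).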
I want to extract all coordinates out of a table which are inside a given radius.
How do I need to set the for loop?
I use the haversine formula for this: I enter the lat and lon values of the centre point, and the lat and lon values of the point to be tested against the given radius.
So I thought I need a for loop where I run the haversine formula for each row of the lat and lon columns and, if the coordinates are inside the radius, save them in a list.
# Get coordinates
# Center coordinates = nearest road location
lat1 = float(lowParkingUtilization.iloc[roadIndex].toLat)
lon1 = float(lowParkingUtilization.iloc[roadIndex].toLon)
# Test coordinates = scooter coordinates
insideRadius = []
radius = 2.50  # in kilometres
for i in eScooterVOI['lat']:
    lat2 = float(eScooterVOI['lat'][i])
    lon2 = float(eScooterVOI['lon'][i])
    a = haversine(lon1, lat1, lon2, lat2)
    if a <= radius:
        insideRadius += str(lon2) + "," + str(lat2)
    else:
With the given code I get the following error message:
File "<ipython-input-574-02dadebee55c>", line 18
^
SyntaxError: unexpected EOF while parsing
The correct answer to the question "How do I need to set the for loop?" is: you don't. Pandas DataFrames are not meant for looping over their rows. What you DO need to do is create two new columns in the DataFrame, one with the calculated distance, and one with the names in the format you want:
eScooterVOI['dist'] = eScooterVOI.apply(lambda x: haversine(lon1, lat1, x['lon'], x['lat']), axis=1)
eScooterVOI['name'] = eScooterVOI['lon'].astype(str) + ',' + eScooterVOI['lat'].astype(str)
And then, to get a list with only the names of the coordinates whose distance is less than the radius use:
insideRadius = list(eScooterVOI[eScooterVOI['dist'] <= radius]['name'])
By the way: the haversine function can be written so that it receives a pandas Series instead of a single value, which would make it much faster than using df.apply, but that would require changing some code which is not here in the question.
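For illustration, such a vectorised haversine might look like the sketch below. The function name, the sample coordinates, and the 6371 km Earth radius are my choices, not from the question.

```python
import numpy as np
import pandas as pd

def haversine_vec(lon1, lat1, lon2, lat2):
    # lon2/lat2 can be pandas Series; returns distances in kilometres
    lon1, lat1, lon2, lat2 = map(np.radians, (lon1, lat1, lon2, lat2))
    dlon = lon2 - lon1
    dlat = lat2 - lat1
    h = np.sin(dlat / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2
    return 2 * 6371.0 * np.arcsin(np.sqrt(h))

# distance from Berlin to each row, computed in one vectorised call
df = pd.DataFrame({'lat': [52.52, 48.86], 'lon': [13.40, 2.35]})
df['dist'] = haversine_vec(13.40, 52.52, df['lon'], df['lat'])
```

Because all the trigonometry runs inside NumPy, the whole column is computed at once instead of once per row as with df.apply.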
The SyntaxError: unexpected EOF while parsing error message means that a code block was left incomplete when the end of the code was reached.
Your else block requires at least one line of code inside it.
For example:
else:
    lots_of_code_to_write_here
You had this error because of your else block: when Python reads it, it expects some code inside, finds none, and an error occurs.
Your code might already be working; just delete the else block, since you can use an if block without being bound to use an else one.
Anyway, if you absolutely want to use an else block, try something like this:
if a <= radius:
    insideRadius += str(lon2) + "," + str(lat2)
else:
    pass
But I do not think it is recommended.
I have a list of position coordinates given in UK grid reference format (X,Y) and I would like to convert them into latitude and longitude.
I am using OSGridConverter (a Python library), but it is not converting them correctly. For example, the input for one location is X = 517393.6563 and Y = 194035.5469.
from OSGridConverter import grid2latlong
l=grid2latlong('TG 517393.6563 194035.5469')
The above gives me an error: OSGridConverter error: Invalid grid reference
So, dropping the decimals (which is presumably wrong), I try:
>>>l=grid2latlong('TG 517393 194035')
>>>(l.latitude,l.longitude)
(52.71367793063314, 1.7297510074170983)
This ends up at a location that is not correct. Most probably it is due to data formats, but I am not sure how to solve it.
You should probably use something like pyproj:
import pyproj
crs_british = pyproj.Proj(init='EPSG:27700')
crs_wgs84 = pyproj.Proj(init='EPSG:4326')
long, lat = pyproj.transform(crs_british, crs_wgs84, 517393.6563, 194035.5469)
print(lat, long)
In your case this will give: 51.63289090467179, -0.3052119183057834
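As an aside, on pyproj 2+ the init= syntax shown above is deprecated; the same conversion can be sketched with the Transformer API, where always_xy=True keeps the axis order predictable (easting/northing in, lon/lat out):

```python
from pyproj import Transformer

# British National Grid (EPSG:27700) to WGS84 (EPSG:4326)
transformer = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)
lon, lat = transformer.transform(517393.6563, 194035.5469)
print(lat, lon)
```

This gives essentially the same result as the pyproj.transform call above, without the deprecation warning.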
Firstly, a definition:
OS grid references (OSGGRs) are grid references prefixed with a two-letter grid-square identifier, as explained rather well here.
As you found, OSGGRs aren't floating point numbers; see the link above.
Moving on, the numbers in an OSGGR have their origin at the south-west corner of the grid square, which is very different from the origin of the OSGB BNG projection (whose coordinates are eastings and northings). See the above link for a good explanation.
You seem to be taking the OSGGR's numeric portion as an easting/northing, which is incorrect.
So, armed with this knowledge, we can use OSGridConverter to convert from OSGGRs to eastings and northings, using a conversion to lat/long as an intermediary. But note the author's comments on errors at the bottom of the PyPI package page:
import OSGridConverter

cvt_wgs84 = OSGridConverter.grid2latlong('SN5117013189')
print(cvt_wgs84.latitude, cvt_wgs84.longitude)
# 51.79749210128498 -4.160451113839529

cvt_EN = OSGridConverter.latlong2grid(cvt_wgs84.latitude, cvt_wgs84.longitude)
print(cvt_EN.E, cvt_EN.N)
# 251116 213191

# Back to an OSGGR; however, note the error of tens of metres.
str(cvt_EN)
# 'SN 51116 13191'
I did a quick by-eye check on a map of the agreement between the OSGGR coordinates and the lat/long, and the error is in metres. That may be acceptable to some.
The code below tries to compute the first equinox of 2019 manually.
It returns
('d1=', 2019/3/20 21:43:48)
('d2=', 2019/3/20 21:43:49)
2019/3/20 21:58:31
that is, a discrepancy of 15 minutes from the real equinox. Is this normal?
Did I forget something? The problem also occurs with the solstices, and also if I use the integrated Newton method. Could it have something to do with the epoch of computation?
Thanks,
Dennis
import ephem

sun = ephem.Sun()
# computing the spring equinox:
d1 = ephem.Date('2019/03/15')
d2 = ephem.Date('2019/03/25')
a = ephem.degrees('180.0')
for i in range(20):
    # middle date
    d3 = (d1 + d2) / 2
    sun.compute(d3)
    if sun.hlon > a:
        d2 = d3
    else:
        d1 = d3
print("d1=", ephem.Date(d1))
print("d2=", ephem.Date(d2))
d1 = ephem.next_equinox('2019')
print(d1)
d1 = ephem.next_equinox('2019')
print(d1)
It looks like the difference is because PyEphem's underlying astronomy library always measures heliocentric longitude relative to the coordinates of J2000, which by the date you are asking about is noticeably different from the coordinates-of-date which are used to define the equinox.
Try running this as your compute step:
sun.compute(d3, epoch=d3)
and then look for when sun.ra is zero degrees; the result should be the equinox. I'll see about getting the PyEphem Quick Reference updated to note that heliocentric coordinates don't seem to pay attention to the epoch= parameter.
Many thanks, Brandon, this is very helpful and I am finally getting the correct value! In fact, it seems that the equinoxes (and solstices) are defined by the right ascension being equal to 0h, 12h (and 6h, 18h), and not by the heliocentric longitude being 0, 90, 180, 270; there is a slight difference between ra and hlon when you run the code below. But this leads to another question: the Wikipedia page https://en.wikipedia.org/wiki/Equinox says that the equinoxes are defined by the longitude being 0 or 180. So who is correct?
import ephem

sun = ephem.Sun()
d1 = ephem.Date('2019/03/15')
d2 = ephem.Date('2019/03/25')
a = ephem.degrees('0.0')  # or 90, or 180, or 270

def spring_equinox(date):
    sun.compute(date)
    return ephem.degrees(sun.ra - a).znorm

d = ephem.newton(spring_equinox, d1, d2)
print(ephem.Date(d))
print(sun.ra)
print(sun.hlon)
I'm looking to determine the alt/az of (un-famous) stars at given RA/Dec at specific times from Mauna Kea. I'm trying to compute these parameters using pyephem, but the resulting alt/az don't agree with other sources. Here's the calculation for HAT-P-32 from Keck:
import ephem
telescope = ephem.Observer()
telescope.lat = '19.8210'
telescope.long = '-155.4683'
telescope.elevation = 4154
telescope.date = '2013/1/18 10:04:14'
star = ephem.FixedBody()
star._ra = ephem.degrees('02:04:10.278')
star._dec = ephem.degrees('+46:41:16.21')
star.compute(telescope)
print star.alt, star.az
which returns -28:43:54.0 73:22:55.3, though according to Stellarium, the proper alt/az should be: 62:26:03 349:15:13. What am I doing wrong?
EDIT: Corrected latitude and longitude, which were formerly reversed.
First, you've got longitude and latitude backwards; second, you need to provide the strings in sexagesimal (degrees:minutes:seconds) form; and third, you need to provide the RA as hours, not degrees:
import ephem
telescope = ephem.Observer()
# Reversed longitude and latitude for Mauna Kea
telescope.lat = '19:49:28' # from Wikipedia
telescope.long = '-155:28:24'
telescope.elevation = 4154.
telescope.date = '2013/1/18 00:04:14'
star = ephem.FixedBody()
star._ra = ephem.hours('02:04:10.278') # in hours for RA
star._dec = ephem.degrees('+46:41:16.21')
star.compute(telescope)
This way, you get:
>>> print star.alt, star.az
29:11:57.2 46:43:19.6
PyEphem always uses UTC for time, so that programs operate the same and give the same output wherever they are run. You simply need to convert the date you are using to UTC instead of using your local time zone, and the results agree fairly closely with Stellarium; use:
telescope.date = '2013/1/18 05:04:14'
The result is this alt/az:
62:27:19.0 349:26:19.4
To know where the small remaining difference comes from, I would have to look into how the two programs handle each step of their computation; but does this get you close enough?
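For completeness, the local-to-UTC conversion itself is simple, since Hawaii Standard Time is UTC-10 with no daylight saving; a sketch with a fixed offset, where the example instant is chosen so that it yields the UTC date used above:

```python
from datetime import datetime, timedelta

hst = datetime(2013, 1, 17, 19, 4, 14)  # local time at Mauna Kea (HST = UTC-10)
utc = hst + timedelta(hours=10)          # add the 10-hour offset to get UTC
print(utc.strftime('%Y/%m/%d %H:%M:%S'))  # 2013/01/18 05:04:14
```

The resulting string can be assigned directly to telescope.date.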