Obtain Latitude and Longitude from a GeoTIFF File - python

Using GDAL in Python, how do you get the latitude and longitude of a GeoTIFF file?
GeoTIFFs do not appear to store any latitude/longitude information directly. Instead, they store the XY origin coordinates, which do not by themselves give the latitude and longitude of the top left and bottom left corners.
It appears I will need to do some math to solve this problem, but I don't have a clue where to start.
What procedure is required to do this?
I know that the GetGeoTransform() method is important here; however, I don't know what to do with it from there.

To get the coordinates of the corners of your geotiff, do the following:
from osgeo import gdal
ds = gdal.Open('path/to/file')
width = ds.RasterXSize
height = ds.RasterYSize
# geotransform: (origin x, pixel width, row rotation, origin y, column rotation, pixel height)
gt = ds.GetGeoTransform()
minx = gt[0]                                # x of the upper-left corner
miny = gt[3] + width*gt[4] + height*gt[5]   # y of the lower-right corner
maxx = gt[0] + width*gt[1] + height*gt[2]   # x of the lower-right corner
maxy = gt[3]                                # y of the upper-left corner
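As a quick sanity check against the gdalinfo example below: with an origin of (440720, 3751320), 60 m pixels, and a 512 x 512 raster, maxx = 440720 + 512*60 = 471440 and miny = 3751320 - 512*60 = 3720600, which match the corner coordinates gdalinfo reports.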
However, these might not be in latitude/longitude format. As Justin noted, your geotiff will be stored with some kind of coordinate system. If you don't know what coordinate system it is, you can find out by running gdalinfo:
gdalinfo ~/somedir/somefile.tif
Which outputs:
Driver: GTiff/GeoTIFF
Size is 512, 512
Coordinate System is:
PROJCS["NAD27 / UTM zone 11N",
GEOGCS["NAD27",
DATUM["North_American_Datum_1927",
SPHEROID["Clarke 1866",6378206.4,294.978698213901]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433]],
PROJECTION["Transverse_Mercator"],
PARAMETER["latitude_of_origin",0],
PARAMETER["central_meridian",-117],
PARAMETER["scale_factor",0.9996],
PARAMETER["false_easting",500000],
PARAMETER["false_northing",0],
UNIT["metre",1]]
Origin = (440720.000000,3751320.000000)
Pixel Size = (60.000000,-60.000000)
Corner Coordinates:
Upper Left ( 440720.000, 3751320.000) (117d38'28.21"W, 33d54'8.47"N)
Lower Left ( 440720.000, 3720600.000) (117d38'20.79"W, 33d37'31.04"N)
Upper Right ( 471440.000, 3751320.000) (117d18'32.07"W, 33d54'13.08"N)
Lower Right ( 471440.000, 3720600.000) (117d18'28.50"W, 33d37'35.61"N)
Center ( 456080.000, 3735960.000) (117d28'27.39"W, 33d45'52.46"N)
Band 1 Block=512x16 Type=Byte, ColorInterp=Gray
This output may be all you need. If you want to do this programmatically in Python, however, this is how you get the same info.
If the coordinate system is a PROJCS like the example above, you are dealing with a projected coordinate system. A projected coordinate system is a representation of the spheroidal earth's surface, flattened and distorted onto a plane. If you want the latitude and longitude, you need to convert the coordinates to the geographic coordinate system that you want.
Sadly, not all latitude/longitude pairs are created equal, being based upon different spheroidal models of the earth. In this example, I am converting to WGS84, the geographic coordinate system favoured by GPS and used by all the popular web mapping sites. The coordinate system is defined by a well-known text (WKT) string. A catalogue of them is available from Spatial Reference; see for example WGS84.
from osgeo import osr, gdal

# get the existing coordinate system
ds = gdal.Open('path/to/file')
old_cs = osr.SpatialReference()
old_cs.ImportFromWkt(ds.GetProjectionRef())

# create the new coordinate system
wgs84_wkt = """
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.01745329251994328,
        AUTHORITY["EPSG","9122"]],
    AUTHORITY["EPSG","4326"]]"""
new_cs = osr.SpatialReference()
new_cs.ImportFromWkt(wgs84_wkt)

# create a transform object to convert between coordinate systems
transform = osr.CoordinateTransformation(old_cs, new_cs)

# get the point to transform: the lower-left corner of the raster in this case
width = ds.RasterXSize
height = ds.RasterYSize
gt = ds.GetGeoTransform()
minx = gt[0]
miny = gt[3] + width*gt[4] + height*gt[5]

# get the coordinates in lat/long
latlong = transform.TransformPoint(minx, miny)
Hopefully this will do what you want.
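If you need all four corners rather than just the lower left one, the same transform object can be reused for each. Below is a short sketch building on the variables defined above. One caveat: with GDAL 3 and PROJ 6+, a transformation targeting EPSG:4326 may return coordinates in latitude/longitude order rather than the traditional x/y order; calling SetAxisMappingStrategy on the coordinate systems before building the transform switches this back if needed.
# a sketch reusing ds, gt, width, height and transform from above
# on GDAL 3+, to keep the traditional (lon, lat) output order, you can call
# new_cs.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)
# before constructing the CoordinateTransformation
corners = {
    'upper left':  (gt[0], gt[3]),
    'upper right': (gt[0] + width*gt[1], gt[3] + width*gt[4]),
    'lower left':  (gt[0] + height*gt[2], gt[3] + height*gt[5]),
    'lower right': (gt[0] + width*gt[1] + height*gt[2],
                    gt[3] + width*gt[4] + height*gt[5]),
}
for name, (x, y) in corners.items():
    print(name, transform.TransformPoint(x, y))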

I don't know if this is a full answer, but this site says:
The x/y map dimensions are called easting and northing. For datasets in a geographic coordinate system these would hold the longitude and latitude. For projected coordinate systems they would normally be the easting and northing in the projected coordinate system. For ungeoreferenced images the easting and northing would just be the pixel/line offsets of each pixel (as implied by a unity geotransform).
so they may actually be longitude and latitude.
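If you are unsure which case applies to your file, a quick check (a small sketch using the osr API) is to ask whether the dataset's coordinate system is geographic:
from osgeo import gdal, osr
ds = gdal.Open('path/to/file')
sr = osr.SpatialReference(wkt=ds.GetProjection())
print(sr.IsGeographic())  # 1 if the geotransform values are already lon/lat, 0 if projected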

Related

Python Rotation of Polygon in Longitude/Latitude Coordinate System

I need to rotate a polygon (which is provided to me as a KML file) on a map, so that its shape remains the same.
Here is the function I use on each coordinate in turn to perform the rotation:
import numpy as np

def rotate(points, origin, angle):
    # Convert to radians:
    rotation_angle = np.deg2rad(angle)
    # Build the 2D rotation matrix:
    R = np.array([[np.cos(rotation_angle), -np.sin(rotation_angle)],
                  [np.sin(rotation_angle),  np.cos(rotation_angle)]])
    # Convert lists to at least 2-dimensional arrays:
    o = np.atleast_2d(origin)
    p = np.atleast_2d(points)
    # Return transformed points:
    return np.squeeze((R @ (p.T - o.T) + o.T).T)
When I perform this on all the coordinates of a rectangle and rotate by 90 degrees anti-clockwise, I get a parallelogram. I am thinking this is because the ground distance spanned by a degree of longitude differs from that spanned by a degree of latitude.
Here is an example of what I am seeing:
The green rectangle is the original shape, the red parallelogram is the rotated shape, and the origin is shown for ease of understanding.
I have spent too long trying to make this work and there is obviously a simple way to achieve this which both I and trusty Google are missing!
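A common workaround for exactly this symptom is to make the coordinates locally "square" before rotating: scale the longitude differences by the cosine of the origin's latitude, rotate, then unscale. The sketch below is a hypothetical variant of the rotate() function above; it assumes points are (lon, lat) pairs and the polygon is small relative to the Earth. Reprojecting to a local metric CRS (e.g. with pyproj) would be the more rigorous route.
import numpy as np

def rotate_scaled(points, origin, angle):
    # hypothetical variant of rotate(): compensate for the differing ground
    # scale of a degree of longitude vs a degree of latitude near the origin
    rotation_angle = np.deg2rad(angle)
    R = np.array([[np.cos(rotation_angle), -np.sin(rotation_angle)],
                  [np.sin(rotation_angle),  np.cos(rotation_angle)]])
    o = np.atleast_2d(origin)               # (lon, lat) of the pivot
    p = np.atleast_2d(points)
    scale = np.array([np.cos(np.deg2rad(o[0, 1])), 1.0])
    q = (p - o) * scale                     # shrink the lon axis
    q = (R @ q.T).T                         # rotate about the origin
    return np.squeeze(q / scale + o)        # unscale and shift back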

Finding nearest geographic raster element

I have a large list of longitude/latitude points and want to find the nearest rectangle (i.e., the rectangle that contains the point) in a given raster of geographic coordinates.
However, for the raster I only have the centroids of each rectangle (polygon) in the raster. I know though that the rectangles have a size of 250m x 250m.
Just checking for absolute distance or geographic distance to the centers does not work, as the rectangles are not necessarily aligned. I am happy to get ideas.
I think you could generate a raster of geographic coordinates representing the raster cells following this approach: https://gis.stackexchange.com/questions/177061/ascii-file-with-latitude-longitude-and-data-to-geotiff-using-python
Then, if you created a shapefile of your latitude and longitude points, you could get the raster cell ID for each point using this approach:
import pandas as pd
from osgeo import gdal, ogr

def GetRasterValueAtPoints(rasterfile, shapefile, fieldname):
    '''
    __author__ = "Marc Weber <weber.marc@epa.gov>"
    Original code attribution: https://gis.stackexchange.com/a/46898/2856
    Returns raster values at points in a point shapefile.
    Assumes the shapefile and the raster file are in the same projection.
    Arguments
    ---------
    rasterfile : a raster file with full pathname and extension
    shapefile : a shapefile with full pathname and extension
    fieldname : field name in the shapefile to identify values
    '''
    src_ds = gdal.Open(rasterfile)
    no_data = src_ds.GetRasterBand(1).GetNoDataValue()
    gt = src_ds.GetGeoTransform()
    rb = src_ds.GetRasterBand(1)
    df = pd.DataFrame(columns=(fieldname, "RasterVal"))
    i = 0
    ds = ogr.Open(shapefile)
    lyr = ds.GetLayer()
    for feat in lyr:
        geom = feat.GetGeometryRef()
        name = feat.GetField(fieldname)
        mx, my = geom.GetX(), geom.GetY()  # point coordinates in map units
        # Convert from map to pixel coordinates.
        # Only works for geotransforms with no rotation.
        px = int((mx - gt[0]) / gt[1])  # x pixel
        py = int((my - gt[3]) / gt[5])  # y pixel
        intval = rb.ReadAsArray(px, py, 1, 1)[0, 0]
        if intval == no_data:
            intval = -9999
        df.loc[i, fieldname] = name
        df.loc[i, "RasterVal"] = float(intval)
        i += 1
    return df
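Usage would look something like this (the file paths and field name below are placeholders):
df = GetRasterValueAtPoints('path/to/cells.tif', 'path/to/points.shp', 'Id')
print(df.head())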

Converting xy coordinates in relation to pixels to latitude and longitude

Here is my situation:
The company I work for are using some third party software that plots the location of devices on a wifi network. The software requires you to upload a floor plan of the area (this is just a 2000 x 1187 px jpeg). The software spits out xy coordinates for every device location which is visualised over the floor plan.
I have been asked to take these xy coordinates and display them using the Google maps API, so therefore need to somehow convert them to latitude and longitude.
According to the limited documentation, the third party software uses the pixel dimensions of the jpeg as the reference for calculating the xy coordinates.
I have attempted to use Python and the pyproj module to make the conversion (there are over half a million coordinates to convert), but the results just seem to plot the lat/long in roughly the same position every time.
Here is what I have so far:
from pyproj import Proj, transform

# wrapped in a function, since the original snippet ends with a return
def convert(xCoordinate, yCoordinate):
    projection = Proj("+proj=lcc +lat_1=53.466082 +lat_2=53.464971 +lat_0=53.465482 +lon_0=-2.338944 +x_0=0 +y_0=0 +no_defs +a=6378137 +rf=298.257222101 +to_meter=1 +k_0=200")
    wgs84 = Proj(init='epsg:4326')
    lat = xCoordinate
    lon = yCoordinate
    xx, yy = transform(projection, wgs84, lat, lon)
    latLong = {'lat': xx, 'long': yy}
    return latLong
I admittedly have very little experience with mapping etc and have very little time to get my head around the topic.
I hope this is enough information to go on. If anyone can help in pointing me in the right direction then I would be eternally grateful.
Cheers.
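For what it's worth, if the floor plan covers a small area, the pixel-to-coordinate mapping is essentially linear, so a full projection may not be needed at all. A minimal sketch, assuming the real-world lat/lon of the image's top-left and bottom-right corners are known (the corner values below are made up for illustration):
def pixel_to_latlon(x, y, width=2000, height=1187,
                    top_left=(53.4661, -2.3394), bottom_right=(53.4650, -2.3385)):
    # linear interpolation between known corner coordinates;
    # the corner values are hypothetical placeholders
    lat = top_left[0] + (y / height) * (bottom_right[0] - top_left[0])
    lon = top_left[1] + (x / width) * (bottom_right[1] - top_left[1])
    return lat, lon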

Why are Basemap south polar stereographic map projection coordinates not agreeing with those of data sets in the same projection?

Some satellite-based earth observation products provide latitude/longitude information while others provide the X/Y coordinates within a given grid projection (and some provide both; see the example below).
My approach in the second case is to set up a Basemap map with the same parameters (projection, ellipsoid, origin of map) as given by the data provider, so that the given X/Y values equal the Basemap coordinates. However, if I do so, the geolocation does not agree with other data sets, including the Basemap coastline.
I have experienced this with three different data sets from different trustworthy sources. For the minimal example I use Landsat data provided by the U.S. Geological Survey, which includes both the X/Y coordinates of a South Polar Stereographic grid and the corresponding lat/lon coordinates for all four corners of the image.
From a Landsat metafile we get (ID: LC82171052016079LGN00):
CORNER_UL_LAT_PRODUCT = -66.61490
CORNER_UL_LON_PRODUCT = -61.31816
CORNER_UR_LAT_PRODUCT = -68.74325
CORNER_UR_LON_PRODUCT = -58.04533
CORNER_LL_LAT_PRODUCT = -67.68721
CORNER_LL_LON_PRODUCT = -67.01109
CORNER_LR_LAT_PRODUCT = -69.94052
CORNER_LR_LON_PRODUCT = -64.18581
CORNER_UL_PROJECTION_X_PRODUCT = -2259300.000
CORNER_UL_PROJECTION_Y_PRODUCT = 1236000.000
CORNER_UR_PROJECTION_X_PRODUCT = -1981500.000
CORNER_UR_PROJECTION_Y_PRODUCT = 1236000.000
CORNER_LL_PROJECTION_X_PRODUCT = -2259300.000
CORNER_LL_PROJECTION_Y_PRODUCT = 958500.000
CORNER_LR_PROJECTION_X_PRODUCT = -1981500.000
CORNER_LR_PROJECTION_Y_PRODUCT = 958500.000
...
GROUP = PROJECTION_PARAMETERS
MAP_PROJECTION = "PS"
DATUM = "WGS84"
ELLIPSOID = "WGS84"
VERTICAL_LON_FROM_POLE = 0.00000
TRUE_SCALE_LAT = -71.00000
FALSE_EASTING = 0
FALSE_NORTHING = 0
GRID_CELL_SIZE_PANCHROMATIC = 15.00
GRID_CELL_SIZE_REFLECTIVE = 30.00
GRID_CELL_SIZE_THERMAL = 30.00
ORIENTATION = "NORTH_UP"
RESAMPLING_OPTION = "CUBIC_CONVOLUTION"
END_GROUP = PROJECTION_PARAMETERS
By using Basemap with the right map projection we should be able to derive the corner lat/lon values from the X/Y values:
import numpy as np
from mpl_toolkits.basemap import Basemap

m = Basemap(resolution='h', projection='spstere', ellps='WGS84',
            boundinglat=-60, lon_0=180, lat_ts=-71)
x_crn = np.array([-2259300, -1981500, -2259300, -1981500])  # upper left, upper right, lower left, lower right
y_crn = np.array([1236000, 1236000, 958500, 958500])        # upper left, upper right, lower left, lower right
x0, y0 = m(0, -90)
# Basemap coordinates at the south pole.
# Note that (0,0) of the Basemap is in a corner of the map,
# while other data sets use the south pole.
# This is easy to take into account:
lon_crn, lat_crn = m(x0 - x_crn, y0 - y_crn, inverse=True)
print('lon_crn: ' + str(lon_crn))
print('lat_crn: ' + str(lat_crn))
Which returns:
lon_crn: [-61.31816102 -58.04532791 -67.01108782 -64.1858106 ]
lat_crn: [-67.23548626 -69.3099076 -68.28071626 -70.47651326]
As you can see, the longitudes agree to the given precision with those from the metafile, but the latitudes are too low.
I can approximate the latitudes by:
lat_crn=(lat_crn+90.)*1.0275-90.
But this is really not satisfactory.
This is how the image is located when using the X/Y corner coordinates from the metafile (in red, the Basemap drawcoastlines()):
and this is how it looks using the corner lat/lon coordinates:
In this case I can simply use the lat/lon coordinates, but as mentioned before there are datasets (like this) which are provided with X/Y coordinates only, which makes it very important to be able to rely on the Basemap projection. I know that there are other modules that could re-project the data as a potential workaround, but it should work without other modules, and a re-projection could introduce errors itself.
As this problem appears with different data sets, I am inclined to believe that it is a bug in the Basemap module, but I might also be making the same mistake again and again, or have wrong expectations.
I did some experimentation and it seems that changing lat_ts has no effect with projection='spstere'. In fact, it seems as if the projection latitude is implicitly assumed to be lat_ts=-90 regardless of what value you assign.
I had more success using projection='stere' instead, so that you would construct the Basemap in your example as follows:
m = Basemap(width=5400000., height=5400000., projection='stere',
            ellps='WGS84', lon_0=180., lat_0=-90., lat_ts=-71.)
You may prefer to set the latitude and longitude of the corners instead of the width and height of the plot for your application.
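With that workaround, the corner check from the question can be rerun unchanged (a sketch; the pole-offset trick is the same as before):
x0, y0 = m(0., -90.)
lon_crn, lat_crn = m(x0 - x_crn, y0 - y_crn, inverse=True)
print('lat_crn: ' + str(lat_crn))  # should now agree with the metafile latitudes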

Find the intersection between two geographical data points

I have two pairs of lat/lon coordinates (expressed in decimal degrees), each with a radius (expressed in metres). What I am trying to achieve is to find whether an intersection between these two points exists (of course, it is obvious that it doesn't hold here, but the plan is to try this algorithm on many other data points). To check this I am using Shapely's intersects() function. My question, however, is how I should deal with the different units. Should I apply some sort of transformation/projection first (same units for both lat/lon and radius)?
48.180759,11.518950,19.0
47.180759,10.518950,10.0
EDIT:
I found this library (https://pypi.python.org/pypi/utm) which seems helpful. However, I am not 100% sure whether I am applying it correctly. Any ideas?
import utm
from shapely import geometry

X = utm.from_latlon(38.636782, 21.414384)
A = geometry.Point(X[0], X[1]).buffer(30.777)
Y = utm.from_latlon(38.636800, 21.414488)
B = geometry.Point(Y[0], Y[1]).buffer(23.417)
A.intersects(B)
SOLUTION:
So, I finally managed to solve my problem. Here are two different implementations that both solve the same problem:
# implementation 1: using the latlonbuffer() helper from the answer below
print(latlonbuffer(48.180759, 11.518950, 19.0).intersects(latlonbuffer(47.180759, 10.518950, 19.0)))
print(latlonbuffer(48.180759, 11.518950, 100000.0).intersects(latlonbuffer(47.180759, 10.518950, 100000.0)))
# implementation 2: projecting to UTM with utm.from_latlon() and buffering in metres
X = utm.from_latlon(48.180759, 11.518950)
Y = utm.from_latlon(47.180759, 10.518950)
print(geometry.Point(X[0], X[1]).buffer(19.0).intersects(geometry.Point(Y[0], Y[1]).buffer(19.0)))
print(geometry.Point(X[0], X[1]).buffer(100000.0).intersects(geometry.Point(Y[0], Y[1]).buffer(100000.0)))
Shapely only uses the Cartesian coordinate system, so in order to make sense of metric distances, you would need to either:
1. project the coordinates into a local projection system that uses distance units in metres, such as a UTM zone; or
2. buffer a point at (0,0), and use a dynamic azimuthal equidistant projection centred on the lat/lon point to project the buffer to geographic coordinates.
Here's how to do #2, using shapely.ops.transform and pyproj:
import pyproj
from shapely.geometry import Point
from shapely.ops import transform
from functools import partial

WGS84 = pyproj.Proj(init='epsg:4326')

def latlonbuffer(lat, lon, radius_m):
    # an azimuthal equidistant projection centred on the point, so that a
    # buffer of radius_m metres around (0, 0) is a true circle on the ground
    proj4str = '+proj=aeqd +lat_0=%s +lon_0=%s +x_0=0 +y_0=0' % (lat, lon)
    AEQD = pyproj.Proj(proj4str)
    project = partial(pyproj.transform, AEQD, WGS84)
    return transform(project, Point(0, 0).buffer(radius_m))
A = latlonbuffer(48.180759, 11.518950, 19.0)
B = latlonbuffer(47.180759, 10.518950, 10.0)
print(A.intersects(B)) # False
Your two buffered points don't intersect. But these do:
A = latlonbuffer(48.180759, 11.518950, 100000.0)
B = latlonbuffer(47.180759, 10.518950, 100000.0)
print(A.intersects(B)) # True
As shown by plotting the lon/lat coords (which distorts the circles):
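The plot itself is not reproduced here, but something similar can be drawn with a few lines of matplotlib (a sketch using the A and B polygons from above):
import matplotlib.pyplot as plt

for geom, color in ((A, 'blue'), (B, 'green')):
    xs, ys = geom.exterior.xy   # boundary of the buffered polygon, in lon/lat
    plt.plot(xs, ys, color=color)
plt.xlabel('longitude')
plt.ylabel('latitude')
plt.show()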
