If I have a line defined by start and end coordinates, how do I get n equally spaced points on that line, taking the curvature of the earth into account?
I'm looking for an algorithm and/or a Python library that implements this.
Using geographiclib, a Python implementation of GeographicLib, I was able to do this:
from geographiclib.geodesic import Geodesic

number_points = 10
# Solve the inverse geodesic problem to get the distance and azimuth between the endpoints
gd = Geodesic.WGS84.Inverse(35, 0, 35, 90)
# Build a geodesic line starting at the first point along the computed azimuth
line = Geodesic.WGS84.Line(gd['lat1'], gd['lon1'], gd['azi1'])
for i in range(number_points + 1):
    # Step along the line in equal fractions of the total distance s12
    point = line.Position(gd['s12'] / number_points * i)
    print((point['lat2'], point['lon2']))
output:
(35.0, -7.40353472481637e-21)
(38.29044006500327, 7.8252809205988445)
(41.01134777655358, 16.322054184499173)
(43.056180665524245, 25.451710440063902)
(44.328942450747135, 35.08494460239694)
(44.76147256654079, 45.00000000000001)
(44.328942450747135, 54.91505539760305)
(43.05618066552424, 64.54828955993611)
(41.01134777655356, 73.67794581550085)
(38.290440065003274, 82.17471907940114)
(34.99999999999999, 90.0)
You can use the npts method from pyproj's Geod class.
See
https://jswhit.github.io/pyproj/pyproj.Geod-class.html
Given a single initial point and terminus point (specified by python
floats lon1,lat1 and lon2,lat2), returns a list of longitude/latitude
pairs describing npts equally spaced intermediate points along the
geodesic between the initial and terminus points.
Emphasis mine because I missed that at first.
First you create a Geod instance, specifying the ellipsoid you want it to use. Then you can call the method.
from pyproj import Geod

geod = Geod("+ellps=WGS84")
points = geod.npts(lon1=-89.6627,
                   lat1=39.7658,
                   lon2=147.2800,
                   lat2=-42.8500,
                   npts=100)
points is now a list of tuples on the geodesic line between your start and end point:
[(-91.27649937899028, 39.21278457275204),
(-92.86468478264302, 38.6377120347621),
(-94.42723159402209, 38.04136774269571),
(-95.96421169120758, 37.42453136174509),
(-97.47578514283185, 36.78797425216882),
...
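Note that, as the quoted docs say, npts returns only the intermediate points; the start and end coordinates themselves are not in the list. If you need them included, a small sketch (reusing the example values above) would be:

full_line = [(-89.6627, 39.7658)] + points + [(147.2800, -42.8500)]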
I'm trying to fit a line segment to a set of points, but I'm having trouble finding an algorithm for it. I have a 2D line segment L and a set of 2D points C. L can be represented in any suitable way (I don't care), e.g. as a support and direction vector, two points, a linear equation with left and right bounds, ... The only important thing is that the line has a beginning and an end, so it's not infinite.
I want to fit L to C, so that the sum of all distances of c to L (where c is a point in C) is minimized. This is a least-squares problem, but I think I cannot use polynomial fitting, because L is only a segment. My mathematical knowledge in that area is a bit lacking, so any hints on further reading would be appreciated as well.
Here is an illustration of my problem:
The orange line should be fitted to the blue points so that the sum of squares of distances of each point to the line is minimal. I don't mind if the solution is in a different language or not code at all, as long as I can extract an algorithm from it.
Since this is more of a mathematical question, I'm not sure if it's OK for SO or should be moved to Cross Validated or Math Stack Exchange.
This solution is relatively similar to one already posted here, but I think it is slightly more efficient, elegant and understandable, which is why I posted it despite the similarity.
As was already written, the min(max(...)) formulation makes it hard to solve this problem analytically, which is why scipy.optimize fits well.
The solution is based on the mathematical formulation for the distance between a point and a finite line segment outlined in https://math.stackexchange.com/questions/330269/the-distance-from-a-point-to-a-line-segment
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import minimize, NonlinearConstraint


def calc_distance_from_point_set(v_):
    # v_ is accepted as a 1d array to make it easier to use with scipy.optimize
    # Reshape into the two segment endpoints
    v = (v_[:2].reshape(2, 1), v_[2:].reshape(2, 1))
    # Calculate t* for s(t*) = v_0 + t*(v_1 - v_0), for the line segment w.r.t. each point
    t_star_matrix = np.minimum(np.maximum(np.matmul(P - v[0].T, v[1] - v[0]) / np.linalg.norm(v[1] - v[0])**2, 0), 1)
    # Calculate s(t*)
    s_t_star_matrix = v[0] + ((t_star_matrix.ravel()) * (v[1] - v[0]))
    # Take the distance between every point and its respective point on the segment
    distance_from_every_point = np.linalg.norm(P.T - s_t_star_matrix, axis=0)
    return np.sum(distance_from_every_point)


if __name__ == '__main__':
    # Random points from a bounding box
    box_1 = np.random.uniform(-5, 5, 20)
    box_2 = np.random.uniform(-5, 5, 20)
    P = np.stack([box_1, box_2], axis=1)

    segment_length = 3
    segment_length_constraint = NonlinearConstraint(
        fun=lambda x: np.linalg.norm(np.array([x[0], x[1]]) - np.array([x[2], x[3]])),
        lb=[segment_length], ub=[segment_length])
    point = minimize(calc_distance_from_point_set, (0.0, -.0, 1.0, 1.0),
                     options={'maxiter': 100, 'disp': True},
                     constraints=segment_length_constraint).x

    plt.scatter(box_1, box_2)
    plt.plot([point[0], point[2]], [point[1], point[3]])
Example result:
Here is a proposition in Python. The distance between the points and the line is computed based on the approach proposed here: Fit a line segment to a set of points
The fact that the segment has a finite length, which imposes the use of min and max functions (or if tests) to decide whether we have to use the perpendicular distance or the distance to one of the end points, makes it really difficult (impossible?) to get an analytic solution.
The proposed solution will thus use an optimization algorithm to approach the best solution. It uses scipy.optimize.minimize, see: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html
Since the segment length is fixed, we have only three degrees of freedom. In the proposed solution I use the x and y coordinates of the segment's starting point and the segment slope as free parameters. The getCoordinates function reconstructs the starting and ending points of the segment from these three parameters and the length.
import numpy as np
from scipy.optimize import minimize
import matplotlib.pyplot as plt
import math as m
from scipy.spatial import distance


# Plot the points and the segment
def plotFunction(points, x1, x2):
    'Plotting function for plane and iterations'
    plt.plot(points[:, 0], points[:, 1], 'ro')
    plt.plot([x1[0], x2[0]], [x1[1], x2[1]])
    plt.xlim(0, 1)
    plt.ylim(0, 1)
    plt.show()
# Get the sum of the distances between all the points and the segment
# The segment is defined by guess and length, where:
#   guess[0] = x coordinate of the starting point
#   guess[1] = y coordinate of the starting point
#   guess[2] = slope
# Since distance is always > 0 there is no need to use root mean square values
def getDist(guess, points, length):
    start_pt = np.array([guess[0], guess[1]])
    slope = guess[2]
    [x1, x2] = getCoordinates(start_pt, slope, length)
    total_dist = 0
    # Loop over each point to get the distance between the point and the segment
    for pt in points:
        total_dist += minimum_distance(x1, x2, pt, length)
    return total_dist
# Return the minimum distance between line segment x1-x2 and point pt
# Adapted from https://stackoverflow.com/questions/849211/shortest-distance-between-a-point-and-a-line-segment
def minimum_distance(x1, x2, pt, length):
    length2 = length**2  # i.e. |x1-x2|^2 - avoids a sqrt, we reuse the length we already know
    if length2 == 0.0:
        return distance.euclidean(pt, x1)
    # Consider the line extending the segment, parameterized as x1 + t (x2 - x1).
    # We find the projection of point pt onto that line.
    # It falls where t = [(pt-x1) . (x2-x1)] / |x2-x1|^2
    # We clamp t to [0,1] to handle points that project outside the segment.
    t = max(0, min(1, np.dot(pt - x1, x2 - x1) / length2))
    projection = x1 + t * (x2 - x1)  # Projection falls on the segment
    return distance.euclidean(pt, projection)
# Get the coordinates of the start and end points of the segment from start_pt,
# slope and length, obtained by solving slope = dy/dx, dx^2 + dy^2 = length^2
def getCoordinates(start_pt, slope, length):
    x1 = start_pt
    dx = length / m.sqrt(slope**2 + 1)
    dy = slope * dx
    x2 = start_pt + np.array([dx, dy])
    return [x1, x2]
if __name__ == '__main__':
    # Generate random points
    num_points = 20
    points = np.random.rand(num_points, 2)

    # Starting position
    length = 0.5
    start_pt = np.array([0.25, 0.5])
    slope = 0

    # Use scipy.optimize.minimize to find the best start_pt and slope combination
    res = minimize(getDist, x0=[start_pt[0], start_pt[1], slope], args=(points, length), method="Nelder-Mead")

    # Retrieve the best parameters
    start_pt = np.array([res.x[0], res.x[1]])
    slope = res.x[2]
    [x1, x2] = getCoordinates(start_pt, slope, length)

    print("\n** The best segment found is defined by:")
    print("\t** start_pt:\t", x1)
    print("\t** end_pt:\t", x2)
    print("\t** slope:\t", slope)
    print("** The total distance is:", getDist([x1[0], x1[1], slope], points, length), "\n")

    # Plot results
    plotFunction(points, x1, x2)
I am trying to perform a simple task using simple math in Python, and I suspect the inherent error in converting from radians to degrees, a result of floating-point math, is to blame (as garnered from another question on the topic; please don't mark this as a duplicate, it's not).
I am trying to extend a line by 500m. To do this I am taking the endpoint coordinates from a supplied line and using the existing heading of said line to generate the coordinates of the point which is 500m further along the same heading.
Heading is important in this case as it is the source of my error. Or so I suspect.
I use the following function to calculate the interior angle of my right angle triangle, built using the existing line, or in this case my hypotenuse:
import math

def intangle(xypoints):
    angle = []
    for i in xypoints:
        x1 = i[0][0]
        x2 = i[1][0]
        y1 = i[0][1]
        y2 = i[1][1]
        gradient = (x1 - x2) / (y1 - y2)
        radangle = math.atan(gradient)
        angle.append(math.degrees(radangle))
    return angle
My input points are, for example:
(22732.23679147904, 6284399.7935522054)
(20848.591367954294, 6281677.926560438)
I know going into this that my angle is 35°, as these coordinates are programmatically generated by a separate function, and when plotted they are out by around 3.75" per km; another error resulting from converting radians to degrees, but acceptable in its scope.
The error generated by the above function, however, results in an angle that plots my new endpoint in such a place that the line is no longer perfectly straight when I connect the dots, and I absolutely have to have a straight line.
How can I go about doing this differently to account for the floating-point error? Is it even possible? If not, what would be an acceptable method of extending my line by however many meters using Euclidean geometry?
To add to this, I have already done all relevant geographic conversions and I am 100% sure that I am working on a 2D plane so the ellipsoid and such do not play a role in this at all.
Using angles is unnecessary, and there are problems in the way you do it: atan will only give you angles between -pi/2 and pi/2, and you will get the same angle value for opposite directions.
You should rather use Thales:
import math

a = (22732.23679147904, 6284399.7935522054)
b = (20848.591367954294, 6281677.926560438)


def extend_line(a, b, length):
    """Return the coordinates of point C at `length` beyond B, in the direction A->B."""
    ab = math.sqrt((a[0] - b[0])**2 + (a[1] - b[1])**2)
    coeff = (ab + length) / ab
    return (a[0] + coeff * (b[0] - a[0]), a[1] + coeff * (b[1] - a[1]))


print(extend_line(a, b, 500))
# (20564.06031560228, 6281266.7792872535)
I was trying to create river cross-section profiles based on terrestrial point measurements. When trying to create a Shapely LineString from a Series of points with a common id, I realized that the order of the given points really matters, as the LineString just connects the given points 'indexwise' (in the list-given order). The code below illustrates the default behaviour:
from shapely.geometry import Point, LineString
import geopandas as gpd
import numpy as np
import matplotlib.pyplot as plt
# Generate random points
x=np.random.randint(0,100,10)
y=np.random.randint(0,50,10)
data = zip(x,y)
# Create Point and default LineString GeoSeries
gdf_point = gpd.GeoSeries([Point(j,k) for j,k in data])
gdf_line = gpd.GeoSeries(LineString(zip(x,y)))
# plot the points and "default" LineString
ax = gdf_line.plot(color='red')
gdf_point.plot(marker='*', color='green', markersize=5,ax=ax)
That would produce the image:
Question: Is there any built-in method within Shapely that would automatically create the most logical (a.k.a.: the shortest, the least complicated, the least criss-cross,...) line through the given list of random 2D points?
Below can you find the desired line (green) compared to the default (red).
Here is what solved my cross-section LineString simplification problem. However, my solution doesn't correctly address the computationally more complex task of finding the ultimately shortest path through the given points. As the commenters suggested, there are many libraries and scripts available to solve that particular problem, but in case anyone wants to keep it simple, you can use what did the trick for me. Feel free to use and comment!
def simplify_LineString(linestring):
    '''
    Reorders the LineString vertices so that each vertex is followed by the nearest remaining vertex.
    Caution: this doesn't calculate the shortest possible path (travelling salesman problem!) and it performs badly
    on very random points, since it doesn't see the bigger picture.
    It is tested only with positive Cartesian coordinates. Feel free to upgrade and share a better function!
    Input must be a Shapely LineString and the function returns a Shapely LineString.
    '''
    from shapely.geometry import Point, LineString
    import math

    if not isinstance(linestring, LineString):
        raise IOError("Argument must be a LineString object!")
    # create a point list
    points_list = list(linestring.coords)

    ####
    # DECIDE WHICH POINT TO START WITH - THE WESTMOST OR SOUTHMOST? (IT DEPENDS ON THE GENERAL DIRECTION OF ALL POINTS)
    ####
    points_we = sorted(points_list, key=lambda x: x[0])
    points_sn = sorted(points_list, key=lambda x: x[1])

    # calculate the azimuth of the general direction
    westmost_point = points_we[0]
    eastmost_point = points_we[-1]
    deltay = eastmost_point[1] - westmost_point[1]
    deltax = eastmost_point[0] - westmost_point[0]
    alfa = math.degrees(math.atan2(deltay, deltax))
    azimut = (90 - alfa) % 360

    if 45 < azimut < 135:
        # general direction is west-east
        points_list = points_we
    else:
        # general direction is south-north
        points_list = points_sn
    ####
    # ITERATIVELY FIND THE NEAREST VERTEX FOR EACH REMAINING VERTEX
    ####
    # Create a new, ordered points list, starting with the west- or southmost point.
    ordered_points_list = points_list[:1]
    for iteration in range(0, len(points_list[1:])):
        current_point = ordered_points_list[-1]  # current point that we are looking for the nearest neighbour to
        possible_candidates = [i for i in points_list if i not in ordered_points_list]  # remaining (not yet sorted) points
        distance = float("inf")
        best_candidate = None
        for candidate in possible_candidates:
            current_distance = Point(current_point).distance(Point(candidate))
            if current_distance < distance:
                best_candidate = candidate
                distance = current_distance
        ordered_points_list.append(best_candidate)
    return LineString(ordered_points_list)
There is no built in function, but shapely has a distance function.
You could easily iterate over the points and calculate the shortest distance between them and construct the 'shortest' path.
There are some examples in the official GitHub repo.
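For example, a minimal greedy sketch of that idea (this is nearest-neighbour ordering, not a true shortest path; the function name is my own):

from shapely.geometry import Point, LineString

def greedy_order(coords):
    # Start from the first coordinate and repeatedly append the nearest remaining point
    remaining = list(coords)
    ordered = [remaining.pop(0)]
    while remaining:
        last = Point(ordered[-1])
        nearest = min(remaining, key=lambda c: last.distance(Point(c)))
        remaining.remove(nearest)
        ordered.append(nearest)
    return LineString(ordered)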
Google's OR-Tools offer a nice and efficient way for solving the Travelling Salesman Problem: https://developers.google.com/optimization/routing/tsp.
Following the tutorial on their website would give you a solution from this (based on your example code):
to this:
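As a rough sketch of what that tutorial boils down to, adapted to a plain list of (x, y) points (the helper name is mine, and distances are scaled and rounded because the routing solver works with integer costs):

import math
from ortools.constraint_solver import pywrapcp, routing_enums_pb2

def tsp_order(points):
    # Integer distance matrix between all pairs of points
    dist = [[int(round(math.dist(p, q) * 100)) for q in points] for p in points]

    manager = pywrapcp.RoutingIndexManager(len(points), 1, 0)  # 1 vehicle, depot = node 0
    routing = pywrapcp.RoutingModel(manager)

    def distance_callback(from_index, to_index):
        return dist[manager.IndexToNode(from_index)][manager.IndexToNode(to_index)]

    transit_index = routing.RegisterTransitCallback(distance_callback)
    routing.SetArcCostEvaluatorOfAllVehicles(transit_index)

    params = pywrapcp.DefaultRoutingSearchParameters()
    params.first_solution_strategy = routing_enums_pb2.FirstSolutionStrategy.PATH_CHEAPEST_ARC

    solution = routing.SolveWithParameters(params)
    # Walk the route; note this is a closed tour that starts and ends at node 0
    order, index = [], routing.Start(0)
    while not routing.IsEnd(index):
        order.append(manager.IndexToNode(index))
        index = solution.Value(routing.NextVar(index))
    return [points[i] for i in order]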
I have a list of coordinates (lat/lon) representing a route.
Given a certain radius and another coordinate, I need to check whether the coordinate is on the route (within the given radius of any point on it) and, if so, find its distance from the beginning of the route.
I looked at Shapely and it looks like a good solution.
I started off by creating a LineString
from shapely.geometry import LineString

route = LineString([(x, y), (x1, y1), ...])
Then, to check if the point is near the route, I added a buffer and checked for intersection:
from shapely.geometry import Point

p = Point(x, y)
r = 0.5
intersection = p.buffer(r).intersection(route)

if intersection.is_empty:
    print("Point not on route")
else:
    # Calculate p's distance from the beginning of the route
I'm stuck calculating the distance. I thought of splitting the route at p and measuring the length of the first half but the intersection result I get is a HeterogeneousGeometrySequence which I'm not sure what I can do with.
I believe I found a solution:

if p.buffer(r).intersects(route):
    return route.project(p)
Rather than buffering a geometry, which is expensive and imperfect (since buffering requires a number of segments and many other options), just see if the point is within a distance threshold:
if route.distance(p) <= r:
    return route.project(p)
Also, you probably realised by now that your distance units are in degrees. If you want linear distances, like meters, you would need to make it much more complicated using different libraries.
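For instance, a sketch of one way to get metric distances (assuming pyproj 2+ and that your route fits in a single UTM zone; EPSG:32633 here is just a placeholder, pick the zone that matches your data):

import pyproj
from shapely.ops import transform

# Project the route and the point from lon/lat (EPSG:4326) to a metric CRS (a UTM zone)
to_utm = pyproj.Transformer.from_crs("EPSG:4326", "EPSG:32633", always_xy=True).transform

route_m = transform(to_utm, route)   # route and p as defined above, in lon/lat order
p_m = transform(to_utm, p)

r_m = 500.0                          # radius in metres
if route_m.distance(p_m) <= r_m:
    print(route_m.project(p_m))      # distance along the route, in metres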
I have two pairs of lat/lon (expressed in decimal degrees), along with their radii (expressed in meters). What I am trying to achieve is to find out whether an intersection between these two buffered points exists (of course, it is obvious that it doesn't here, but the plan is to run this algorithm on many other data points). To check this I am using Shapely's intersects() function. My question, however, is how I should deal with the different units. Should I make some sort of transformation/projection first (same units for both lat/lon and radius)?
48.180759,11.518950,19.0
47.180759,10.518950,10.0
EDIT:
I found this library here (https://pypi.python.org/pypi/utm) which seems helpful. However, I am not 100% sure if I'm applying it correctly. Any ideas?
import utm
from shapely import geometry

X = utm.from_latlon(38.636782, 21.414384)
A = geometry.Point(X[0], X[1]).buffer(30.777)
Y = utm.from_latlon(38.636800, 21.414488)
B = geometry.Point(Y[0], Y[1]).buffer(23.417)
A.intersects(B)
SOLUTION:
So, I finally managed to solve my problem. Here are two different implementations that both solve the same problem:
# Implementation 1: using the latlonbuffer() helper from the answer below
print(latlonbuffer(48.180759, 11.518950, 19.0).intersects(latlonbuffer(47.180759, 10.518950, 19.0)))
print(latlonbuffer(48.180759, 11.518950, 100000.0).intersects(latlonbuffer(47.180759, 10.518950, 100000.0)))

# Implementation 2: projecting to UTM with utm.from_latlon() and buffering in metres
X = from_latlon(48.180759, 11.518950)
Y = from_latlon(47.180759, 10.518950)
print(geometry.Point(X[0], X[1]).buffer(19.0).intersects(geometry.Point(Y[0], Y[1]).buffer(19.0)))
print(geometry.Point(X[0], X[1]).buffer(100000.0).intersects(geometry.Point(Y[0], Y[1]).buffer(100000.0)))
Shapely only uses the Cartesian coordinate system, so in order to make sense of metric distances, you would need to either:
1. project the coordinates into a local projection system that uses distance units in metres, such as a UTM zone; or
2. buffer a point from (0,0), and use a dynamic azimuthal equidistant projection centered on the lat/lon point to project to geographic coords.
Here's how to do #2, using shapely.ops.transform and pyproj:

import pyproj
from shapely.geometry import Point
from shapely.ops import transform
from functools import partial

WGS84 = pyproj.Proj(init='epsg:4326')

def latlonbuffer(lat, lon, radius_m):
    # Azimuthal equidistant projection centred on the lat/lon point
    proj4str = '+proj=aeqd +lat_0=%s +lon_0=%s +x_0=0 +y_0=0' % (lat, lon)
    AEQD = pyproj.Proj(proj4str)
    project = partial(pyproj.transform, AEQD, WGS84)
    # Buffer around the projection origin (0,0) in metres, then project back to lon/lat
    return transform(project, Point(0, 0).buffer(radius_m))
A = latlonbuffer(48.180759, 11.518950, 19.0)
B = latlonbuffer(47.180759, 10.518950, 10.0)
print(A.intersects(B)) # False
Your two buffered points don't intersect. But these do:
A = latlonbuffer(48.180759, 11.518950, 100000.0)
B = latlonbuffer(47.180759, 10.518950, 100000.0)
print(A.intersects(B)) # True
As shown by plotting the lon/lat coords (which distorts the circles):
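If you want to reproduce that plot, a minimal matplotlib sketch (my own, not part of the original answer) would be:

import matplotlib.pyplot as plt

for poly, color in [(A, 'tab:blue'), (B, 'tab:orange')]:
    x, y = poly.exterior.xy   # boundary of the buffered polygon, in lon/lat
    plt.plot(x, y, color=color)
plt.gca().set_aspect('equal')
plt.show()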