I am calculating the angle between two lines in the following image with this code:
import math

# the lines are in the format (x1, y1, x2, y2)
def getAngle(line_1, line_2):
    angle1 = math.atan2(line_1[1] - line_1[3], line_1[0] - line_1[2])
    angle2 = math.atan2(line_2[1] - line_2[3], line_2[0] - line_2[2])
    result = math.degrees(abs(angle1 - angle2))
    if result < 0:
        result += 360
    return result
Now the function works between the two red lines (almost on top of each other) and between the red and the green line. However, between the red and the blue line the function returns 239.1083 when it should be ~300. Since it works in some cases but not others, I am not sure what the problem is.
Some example inputs and outputs:
getAngle((316,309,316,-91), (316,309,421,209)) = 46.3971 # working
getAngle((316,309,316,-91), (199,239,316,309)) = 239.108 # should be around 300
For the example getAngle((316,309,316,-91), (199,239,316,309)), the culprit is how the angles are measured.
The angles are calculated w.r.t. the positive x-axis. The formula you have defined calculates phi in the image below, rather than the theta you are expecting. Since that rotation is negative in nature (observe the arrow for phi), any subsequent calculation must ensure a positive rotation rather than a negative one; otherwise you'll be off by roughly the complementary angle.
In the given example, the correct angle of line2 should be about +210 degrees, or equivalently about -150 degrees. Similarly, the angle of line1 could be +90 or -90 degrees. Now it's all a game of which ones to add or subtract, and how.
The 239.something (call it 240) comes from abs(90 - (-150)). The 300 you are expecting comes from abs(-90 - (+210)).
The difference of 60 degrees is the complement of theta = 30 degrees.
So it's not so much a bad formula as bad argument passing: you have to check the signs so that you get the positive rather than the negative angles.
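For reference, a minimal sketch of one way to apply this (my own rewrite, not the original code): write both lines so they start at the shared vertex (note the blue line's endpoints are swapped relative to the question), subtract the two atan2 directions first, and only then wrap into [0, 360). Taking abs() before wrapping, as in the question, discards the sense of rotation.

import math

def get_angle(line_1, line_2):
    # Each direction is measured from the line's first endpoint to its second,
    # so here both rays point away from the common vertex (316, 309).
    angle1 = math.atan2(line_1[3] - line_1[1], line_1[2] - line_1[0])
    angle2 = math.atan2(line_2[3] - line_2[1], line_2[2] - line_2[0])
    # Subtract first, then wrap into [0, 360) to keep the rotation direction.
    return math.degrees(angle2 - angle1) % 360

print(get_angle((316, 309, 316, -91), (316, 309, 421, 209)))  # ~46.4, as before
print(get_angle((316, 309, 316, -91), (316, 309, 199, 239)))  # ~300.9, as expected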
Related
I am trying to perform a simple task using simple math in Python, and I suspect the culprit is the inherent error in converting from radians to degrees as a result of floating-point math (as gathered from another question on the topic; please don't mark this as a duplicate, it's not).
I am trying to extend a line by 500m. To do this I am taking the endpoint coordinates from a supplied line and using the existing heading of that line to generate the coordinates of the point which is 500m further along the same heading.
Heading is important in this case, as it is the source of my error. Or so I suspect.
I use the following function to calculate the interior angle of my right-angle triangle, built using the existing line, in this case my hypotenuse:
import math

def intangle(xypoints):
    angle = []
    for i in xypoints:
        x1 = i[0][0]
        x2 = i[1][0]
        y1 = i[0][1]
        y2 = i[1][1]
        gradient = (x1 - x2) / (y1 - y2)
        radangle = math.atan(gradient)
        angle.append(math.degrees(radangle))
    return angle
My input points are, for example:
(22732.23679147904, 6284399.7935522054)
(20848.591367954294, 6281677.926560438)
I know going into this that my angle is 35°, as these coordinates are programmatically generated by a separate function, and when plotted they are out by around 3.75" for each km: another error resulting from converting radians to degrees, but acceptable in its scope.
The error generated by the above function, however, results in an angle that plots my new endpoint in such a place that the line is no longer perfectly straight when I connect the dots, and I absolutely must have a straight line.
How can I go about doing this differently to account for the floating-point error? Is it even possible? If not, what would be an acceptable method of extending my line by however many metres using Euclidean geometry?
To add to this: I have already done all the relevant geographic conversions, and I am 100% sure that I am working on a 2D plane, so the ellipsoid and such play no role in this at all.
Using angles is unnecessary, and there are problems with the way you do it: atan will only give you angles between -pi/2 and pi/2, and you will get the same angle value for opposite directions.
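A two-line demonstration of that ambiguity (my own illustration):

import math

# Opposite directions (dx, dy) = (1, 1) and (-1, -1) produce the same
# gradient, so atan cannot tell them apart...
print(math.atan(1 / 1), math.atan(-1 / -1))    # 0.785... twice
# ...whereas atan2 keeps the signs and distinguishes them.
print(math.atan2(1, 1), math.atan2(-1, -1))    # 0.785... and -2.356...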
You should rather use Thales' theorem (similar triangles):
import math

a = (22732.23679147904, 6284399.7935522054)
b = (20848.591367954294, 6281677.926560438)

def extend_line(a, b, length):
    """Return the coordinates of the point at `length` beyond b, in the direction a -> b."""
    ab = math.sqrt((a[0] - b[0])**2 + (a[1] - b[1])**2)
    coeff = (ab + length) / ab
    return (a[0] + coeff*(b[0] - a[0]), a[1] + coeff*(b[1] - a[1]))

print(extend_line(a, b, 500))
# (20564.06031560228, 6281266.7792872535)
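This scales the direction vector (b - a) directly by the ratio of lengths, so there is no radians-to-degrees round trip at all, and the extended point stays collinear with a and b to within the floating-point rounding of the coordinates themselves.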
I'm scripting in Python, for starters.
To keep this example simple: I have one edge with UV coordinates ([0,0],[1,1]), so it's at a 45-degree angle. I have another edge that is ([0,0],[0,1]), so its angle is 0/360 degrees. My goal is to compare the angles of those two edges in order to get the difference, so I can modify the angle of the second edge to match the angle of the first. Is there a way to do this via vector math?
The easiest formula to reconstruct, and thus to remember, is IMO the complex-number picture. To compute the angle from a = a.x + i*a.y to b = b.x + i*b.y, rotate b back by multiplying with the conjugate of a; this gives an angle measured from the zero angle, i.e. the positive real axis:
arg((a.x - i*a.y)*(b.x + i*b.y))
  = arg((a.x*b.x + a.y*b.y) + i*(a.x*b.y - a.y*b.x))
  = atan2(a.x*b.y - a.y*b.x, a.x*b.x + a.y*b.y)
Note that screen coordinates use the opposite orientation to the Cartesian/complex plane, so apply a sign switch, from atan2(y, x) to atan2(-y, x), to get an angle in the usual direction.
To produce a vector b rotated by an angle w (in radians) from a, multiply by cos(w) + i*sin(w) to obtain
b.x = cos(w)*a.x - sin(w)*a.y
b.y = cos(w)*a.y + sin(w)*a.x
You will have to rescale to get a specified length of b.
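Both formulas translate directly into Python; here is a small self-contained sketch (my own illustration, the function names are mine):

import math

def angle_from_to(a, b):
    # Signed angle from vector a to vector b, via arg(conj(a)*b):
    # atan2(cross product, dot product).
    return math.atan2(a[0]*b[1] - a[1]*b[0], a[0]*b[0] + a[1]*b[1])

def rotate(a, w):
    # Multiply a by cos(w) + i*sin(w), i.e. rotate a by w radians.
    return (math.cos(w)*a[0] - math.sin(w)*a[1],
            math.cos(w)*a[1] + math.sin(w)*a[0])

edge1 = (1, 1)  # direction of the ([0,0],[1,1]) edge from the question
edge2 = (0, 1)  # direction of the ([0,0],[0,1]) edge
w = angle_from_to(edge2, edge1)
print(math.degrees(w))   # -45.0: rotating edge2 by w aligns it with edge1
print(rotate(edge2, w))  # about (0.707, 0.707); rescale if you need edge1's length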
I am performing motion tracking of an object, and I am trying to identify the front and back of the object. The object is asymmetrical, which means that the centroid of the contour is closer to the front than the back. Using this information, I am approaching this as follows:
Draw contours of object
Find centroid
centroidx, centroidy = int(moments['m10']/moments['m00']), int(moments['m01']/moments['m00'])
Draw bounding ellipse
cv2.fitEllipse(contour)
Calculate major axis length as follows (and as shown in the figure)
MAx, MAy = int(0.5 * ellipseMajorAxisx*math.sin(ellipseAngle)), int(0.5 * ellipseMajorAxisy*math.cos(ellipseAngle))
Calculate beginning and ending x, y coordinates of the major axis
MAxtop, MAytop = int(ellipseCentrex + MAx), int(ellipseCentrey + MAy)
MAxbot, MAybot = int(ellipseCentrex - MAx), int(ellipseCentrey - MAy)
Identify which of the points is closer to the centroid of the contour
distancetop = math.sqrt((centroidx - MAxtop)**2 + (centroidy - MAytop)**2)
distancebot = math.sqrt((centroidx - MAxbot)**2 + (centroidy - MAybot)**2)
min(distancetop, distancebot)
The problem I am encountering is that, while I get the "front" end of the ellipse correct most of the time, occasionally the point is a little way off. As far as I have observed, this seems to happen such that the x value is correct but the y value is different (in effect, I think this represents the major axis of an ellipse that is perpendicular to mine). I am not sure if this is an issue with OpenCV's calculation of angles or (more than likely) my calculations are incorrect. I do realize this is a complicated example; I hope my figures help!
EDIT: When I get the wrong point, it is not from a perpendicular ellipse, but from a mirror image of my ellipse. And it happens with the x values too, not just y.
After following ssm's suggestion below, I am getting the desired point most of the time. The point still goes wrong occasionally, but "snaps back" into place soon after. For example, here are a few frames where this happens:
By the way, the above images are after "correcting" for angle by using this code:
if angle > 90:
    angle = 180 - angle
If I do not do the correction, I get the wrong point at other times, as shown below for the same frames.
So it looks like I get it right for some angles with angle correction and the other angles without correction. How do I get all the right points in both conditions?
(White dot inside the ellipse is the centroid of the contour, whereas the dot on or outside the ellipse is the point I am getting)
I think your only problem is MAytop. You can consider doing the following:
if ycen < yc:
    # switch MAytop and MAybot
    MAytop, MAybot = MAybot, MAytop
You may have to do a similar check on the x scale.
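For what it's worth, here is a hedged sketch of the whole endpoint computation (my own consolidation; the numeric values are hypothetical stand-ins for real cv2.fitEllipse output, and OpenCV's angle convention should be verified on a known frame):

import math

# Hypothetical stand-ins for ((xc, yc), (d1, d2), angle) = cv2.fitEllipse(contour);
# angle is in degrees.
(xc, yc), (d1, d2), angle = (320.0, 240.0), (40.0, 100.0), 30.0
r_major = max(d1, d2) / 2.0

# Assumed convention: the returned angle describes the first axis, so the
# major axis lies 90 degrees away from it; verify for your OpenCV version.
theta = math.radians(angle + 90)
ma_top = (xc + r_major * math.cos(theta), yc + r_major * math.sin(theta))
ma_bot = (xc - r_major * math.cos(theta), yc - r_major * math.sin(theta))

# The "front" is whichever endpoint is nearer the contour centroid.
centroid = (330.0, 250.0)  # hypothetical centroid from the image moments
front = min((ma_top, ma_bot),
            key=lambda p: math.hypot(p[0] - centroid[0], p[1] - centroid[1]))
print(front)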
I am trying to estimate the value of pi using a Monte Carlo simulation. I need to use two unit circles that are a user-input distance from the origin. I understand how this problem works with a single circle; I just don't understand how I am meant to use two circles. Here is what I have so far (this is modified code I used for a previous problem that used one circle with radius 2):
import random
import math
import sys

def main():
    numDarts = int(sys.argv[1])
    distance = float(sys.argv[2])
    print(montePi(numDarts, distance))

def montePi(numDarts, distance):
    if distance >= 1:
        return 0
    inCircle = 0
    for i in range(numDarts):
        x = (2 * random.random()) - 2
        y = random.random()
        d = math.sqrt(x**2 + y**2)
        if d <= 2 and d >= -2:
            inCircle = inCircle + 1
    pi = inCircle / numDarts * 4
    return pi

main()
I need to change this code to work with two unit circles, but I do not understand how to use trigonometry to do this. Or am I overthinking the problem? Either way, help will be appreciated as I continue trying to figure this out.
What I do know is that I need to change the x coordinate, as well as the equation that determines d (d = math.sqrt(x**2 + y**2)); I'm just not sure how.
These are my instructions:
Write a program called mcintersection.py that uses the Monte Carlo method to
estimate the area of this shape (and prints the result). Your program should take
two command-line parameters: distance and numDarts. The distance parameter
specifies how far away the circles are from the origin on the x-axis. So if distance
is 0, then both circles are centered on the origin, and completely overlap. If
distance is 0.5 then one circle is centered at (-0.5, 0) and the other at (0.5, 0). If
distance is 1 or greater, then the circles do not overlap at all! In that last case, your
program can simply output 0. The numDarts parameter should specify the number
of random points to pick in the Monte Carlo process.
In this case, the rectangle should be 2 units tall (with the top at y = 1 and the
bottom at y = -1). You could also safely make the rectangle 2 units wide, but this
will generally be much bigger than necessary. Instead, you should figure out
exactly how wide the shape is, based on the distance parameter. That way you can
use as skinny a rectangle as possible.
If I understand the problem correctly, you have two unit circles centered at (distance, 0) and (-distance, 0) (that is, one is slightly to the right of the origin and one is slightly to the left). You're trying to determine if a given point, (x, y) is within both circles.
The simplest approach might be to simply compute the distance between the point and the center of each of the circles. You've already done this in your previous code; just repeat the computation twice, once with the offset distance inverted, then use Python's and operator to see if your point is in both circles.
But a more elegant solution would be to notice how your two circles intersect each other exactly on the y-axis. To the right of the axis, the left circle is completely contained within the right one. To the left of the y-axis, the right circle is entirely within the left circle. And since the shape is symmetrical, the two halves are of exactly equal size.
This means you can limit your darts to only hitting on one side of the axis, and then get away with just a single distance test:
import random
import math

def circle_intersection_area(num_darts, distance):
    if distance >= 1:
        return 0
    in_circle = 0
    width = 1 - distance  # this is enough to cover half of the target
    for i in range(num_darts):
        x = random.random() * width   # random value from 0 to 1-distance
        y = random.random() * 2 - 1   # random value from -1 to 1
        d = math.sqrt((x + distance)**2 + y**2)  # distance from (-distance, 0)
        if d <= 1:
            in_circle += 1
    sample_area = width * 2
    target_area = sample_area * (in_circle / num_darts)
    return target_area * 2  # double, since we were only testing half the target
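As a quick sanity check of the function above (my addition): with distance 0 the two circles coincide, so the estimate should approach pi; with distance 0.5 the exact lens-shaped overlap area is about 1.228.

random.seed(1)  # any seed; just makes the run reproducible
print(circle_intersection_area(1_000_000, 0.0))  # roughly 3.14
print(circle_intersection_area(1_000_000, 0.5))  # roughly 1.23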
I'm trying to calculate the number of line segments needed to best fit an ellipse, given a desired error margin (the maximum allowed distance between a segment and the boundary).
My solution for a unit circle was this:
def f(u, v, r):
    # u, v: 2D vectors on the circle; normalized() is assumed to return
    # a unit-length copy of its argument.
    mid_uv = (u + v) * 0.5   # midpoint of the chord u-v
    N = normalized(mid_uv)
    return N * r             # project the midpoint back onto the circle
And repeat v = f(u, v, r) until radius - |v| < error.
Then simply take 2^i (i being the number of iterations) as the number of segments required.
This algorithm could probably be O(1), and does not work for ellipses (which is what I need it for).
How can I adapt it?
Or better yet, is there another solution?
I cannot formulate a nice neat answer - working with ellipses is quite a bit more challenging than circles - but here goes, in steps:
First - I would tighten up the algorithm for the circle by using a bit of trig. If you draw a chord (line segment) that spans an angle angle through a unit circle, the maximum distance from the circle to the chord (the sagitta) is calculated thus:
error = 1 - math.cos( angle / 2 )
(You can see this if you draw a diagram with the circle, chord, and chord's bisector.) Inverting this formula, you can calculate the angle given the tolerable error. The first line of code gives the precise angle; the second line shrinks the angle if needed so that it is an exact fraction of the whole circle.
angle = 2 * math.acos( 1 - error )
angle = (2*math.pi) / math.ceil( (2*math.pi) / angle )
Once you have the angle, it's simple to calculate the points around the unit circle for your chord end-points: [(1,0), (cos(angle),sin(angle)), (cos(2*angle),sin(2*angle)), ... ]. You will end up with a regular polygon.
Second - For a circle with radius radius, run the above formulas adjusted as follows:
angle = 2 * math.acos( 1 - error/radius )
angle = (2*math.pi) / math.ceil( (2*math.pi) / angle )
And calculate the chord end-points by multiplying the sin and cos values by the radius.
Third - For an ellipse with maximal and minimal radii major and minor, I would use the circle formula to again calculate an angle:
radius = max( major, minor )
angle = 2 * math.acos( 1 - error/radius )
angle = (2*math.pi) / math.ceil( (2*math.pi) / angle )
If the major radius is in the x direction and the minor radius is in the y direction, then you can calculate the chord end-points like this:
[ (major, 0),
(major*cos(angle), minor*sin(angle)),
(major*cos(2*angle), minor*sin(2*angle)),
... ]
This does not always give you the minimal polygon for an ellipse (it will have more chords than necessary near the minor axis, especially for very squashed ellipses), but you only have to do the angle calculation once. If you really need to minimize the number of chords, then you will need to re-calculate the angle after drawing each chord, and the formula is not straightforward (where "not straightforward" = "difficult for me to figure out").
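Putting the third step into runnable form (my own consolidation of the formulas above; the radii and tolerance are made-up example values):

import math

def ellipse_polygon(major, minor, error):
    # Conservative angle step from the circle formula, using the larger radius.
    radius = max(major, minor)
    angle = 2 * math.acos(1 - error / radius)
    angle = (2 * math.pi) / math.ceil((2 * math.pi) / angle)
    n = int(round((2 * math.pi) / angle))
    # Chord end-points, major radius along x and minor radius along y.
    return [(major * math.cos(k * angle), minor * math.sin(k * angle))
            for k in range(n)]

pts = ellipse_polygon(5.0, 2.0, 0.01)  # hypothetical radii and tolerance
print(len(pts))  # 50 vertices for this example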
There is an O(1) solution for the circle: you can directly calculate the number of equal segments needed to obtain the required sagitta. The ellipse is a harder case. The maximum sagitta occurs for a chord perpendicular to the larger semi-axis (near the foci), so it seems reasonable to place the junction points of the segments at the ends of the larger semi-axis (at least as a first approximation).
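For the circle part of this answer, the O(1) count is a one-liner under the same sagitta formula as above (a sketch; the names are mine):

import math

def circle_segments(radius, error):
    # Sagitta of a chord spanning angle a is radius*(1 - cos(a/2)), so the
    # widest usable angle is 2*acos(1 - error/radius).
    return math.ceil((2 * math.pi) / (2 * math.acos(1 - error / radius)))

print(circle_segments(1.0, 0.01))  # 23 segments keep the sagitta under 0.01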