calculate anti-clockwise angle between 2 points - python

I have a robot with a red LED and a green LED mounted at the front and back respectively. I want to calculate the head direction of the robot, i.e. which direction the greenLED - redLED vector points in.
How can I code it so that the points marked 1 and 2 in the image below have the same angle, i.e. 45 degrees anti-clockwise, whereas point 3 should be at 225 degrees?
I used the following script, but it is giving me wrong results:
import numpy as np

def headDirectionAngle(redLEDCoords, greenLEDCoords, referenceVector):
    greenRedLEDVector = np.array(greenLEDCoords) - np.array(redLEDCoords)
    angle = np.math.atan2(np.linalg.det([referenceVector, greenRedLEDVector]),
                          np.dot(referenceVector, greenRedLEDVector))
    return np.degrees(angle)

referenceVector = np.array([0, 240])
How should I proceed? Thanks for the help.

Back to basics, without numpy.
atan2 already gives you an anticlockwise angle, but between -180 and 180. You can add 360 and calculate modulo 360 to get an angle between 0 and 360:
from math import atan2, degrees
def anti_clockwise(x, y):
    alpha = degrees(atan2(y, x))
    return (alpha + 360) % 360
print(anti_clockwise(480, 480))
# 45.0
print(anti_clockwise(-480, -480))
# 225.0
x should just be the difference in X coordinates between the green and red LEDs. Same goes for y.
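For completeness, a minimal sketch of wiring this into the robot problem (the coordinates below are placeholders, and it assumes the usual mathematical convention of x growing right and y growing up; with image coordinates, where y grows down, negate dy):

from math import atan2, degrees

def anti_clockwise(x, y):
    alpha = degrees(atan2(y, x))
    return (alpha + 360) % 360

def head_direction_angle(red_led, green_led):
    # Head direction is the angle of the (green - red) vector.
    dx = green_led[0] - red_led[0]
    dy = green_led[1] - red_led[1]
    return anti_clockwise(dx, dy)

# Placeholder coordinates:
print(head_direction_angle((0, 0), (480, 480)))    # 45.0
print(head_direction_angle((0, 0), (-480, -480)))  # 225.0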

Related

Raycasting to calculate the volume of a space

I'm trying to use ray casting to gather all the surfaces in a room and determine its volume.
I have a centroid location where the rays will be coming from, but I'm drawing a blank on how to get the rays in all 360 degrees (in 3D space).
I'm not getting any points on the floors or ceilings, it's like it's doing a 60 degree spread rotated about the Z axis.
I think I have the rest of it working, but this is stumping me.
for y in range(360):
    for x in range(360):
        vector = DB.XYZ(math.sin(math.radians(x)),
                        math.cos(math.radians(x)),
                        math.cos(math.radians(y))).Normalize()
        prox = ri.FindNearest(origin, vector).Proximity
        point = origin + (vector * prox)
Look at it this way: x and y of vector are created from angle x (-> a circle in the plane) and then you add a z component which lies between -1 and 1 (which cos does). So it's obvious that you end up with a cylindrical distribution.
What you might want are spherical coordinates. Modify your code like this:
for y in range(-90, 91):
    for x in range(360):
        vector = DB.XYZ(math.sin(math.radians(x)) * math.cos(math.radians(y)),
                        math.cos(math.radians(x)) * math.cos(math.radians(y)),
                        math.sin(math.radians(y)))
        # Normalize unnecessary, since vector² = sin²*cos² + cos²*cos² + sin² = 1
        prox = ri.FindNearest(origin, vector).Proximity
        point = origin + (vector * prox)
But be aware that the angle distribution of rays is not uniform in spherical coordinates: it is denser at the poles than at the equator. You can mitigate this, e.g., by scaling the number of x samples down depending on y; the surface element scales with cos(y), so scale the sample count by cos(y).
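A minimal sketch of that density correction in plain Python (the helper name sphere_directions is mine; the Revit-specific DB.XYZ and ray cast are left out, each generated tuple d would be passed to DB.XYZ(*d)):

import math

def sphere_directions():
    """Yield unit vectors with roughly uniform density over the sphere."""
    for y in range(-90, 91):
        # Fewer azimuth samples near the poles, proportional to cos(latitude).
        n_x = max(1, int(round(360 * math.cos(math.radians(y)))))
        for k in range(n_x):
            x = 360.0 * k / n_x
            yield (math.sin(math.radians(x)) * math.cos(math.radians(y)),
                   math.cos(math.radians(x)) * math.cos(math.radians(y)),
                   math.sin(math.radians(y)))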

How to get the degrees on a circle for each point

I am trying to determine the number of degrees between two points on a circle, and I am having trouble doing so. I have looked at other posts, but I cannot seem to figure out how to implement it.
I am trying to get the distance between the two in degrees.
Hopefully the model below gives you a better understanding.
Here is the data chart
def get_angle(x, y):
    # 75 being the center circle coordinate
    (dx, dy) = (75-x, y-75)
    angle = degrees(atan2(float(dy), float(dx)))
    if angle < 0:
        angle += 180
    return angle
The values it returns don't seem right; they are suspiciously similar (e.g. 157 and 120) when they clearly shouldn't be. I am somewhat new to image processing, so I could be looking at it wrong.
Center coordinates: cx, cy
Point coordinates: ax, ay for point A and bx, by for point B
Pseudocode for the angle:
dp = dot(A-C, B-C) = (ax-cx)*(bx-cx) + (ay-cy)*(by-cy)
cross = vectorproduct(A-C, B-C) = (ax-cx)*(by-cy) - (ay-cy)*(bx-cx)
angle = degrees(atan2(cross, dp))
if angle < 0:
    angle += 360
If the angle direction is not what you expect, change the sign of cross.
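A runnable version of that pseudocode (the function name and the test points are mine):

from math import atan2, degrees

def angle_between(cx, cy, ax, ay, bx, by):
    """Anti-clockwise angle in degrees from point A to point B around center C."""
    dp = (ax - cx) * (bx - cx) + (ay - cy) * (by - cy)      # dot product
    cross = (ax - cx) * (by - cy) - (ay - cy) * (bx - cx)   # cross product
    angle = degrees(atan2(cross, dp))
    if angle < 0:
        angle += 360
    return angle

# Points at 3 o'clock and 12 o'clock around center (75, 75):
print(angle_between(75, 75, 150, 75, 75, 150))  # 90.0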

Python program to rotate a line not working

I am trying to figure out the direction vectors of the arrowheads of an arrow. Basically I'm given a normalized direction vector (u,v,w) and I need the normalized direction vectors of its two arrowheads, which make a 15 degree angle with it.
My plan is to first start off with a simple normalized vector (0,0,1). The direction vectors of its arrowheads are (-sin(15), 0, -cos(15)) and (sin(15), 0, -cos(15)). I then rotate (0,0,1) so it's parallel to the given (u,v,w): I project (u,v,w) onto the x-axis and get its angle relative to (0,0,1), project onto the y-axis and get its angle relative to (0,0,1), and then use the 3D rotation matrices with those angles to rotate the arrowhead direction vectors.
I have the code below, but it's not working properly. Does anyone see what's wrong?
Thanks
ra = 15
ca = math.cos(ra)
sa = math.sin(ra)

px = (0, v, w)
if u != 1:
    px = [i/float(math.sqrt(v**2 + w**2)) for i in px]
py = (u, 0, w)
if v != 1:
    py = [i/float(math.sqrt(u**2 + w**2)) for i in py]

pxangle = math.acos(px[2])
pyangle = math.acos(py[2])
cpx = math.cos(pxangle)
spx = math.sin(pxangle)
cpy = math.cos(pyangle)
spy = math.sin(pyangle)

def rotatefunction(ah):
    xr = (ah[0], -spx*ah[2], cpx*ah[2])
    return (cpy*xr[0]+spy*xr[2], xr[1], -spy*xr[0]+cpy*xr[2])

lah = rotatefunction((-sa, 0, -ca))
rah = rotatefunction((sa, 0, -ca))
First of all, you need to convert degrees into radians (radians = degrees * (pi / 180)) before passing them to math.cos, math.sin, etc.
Suppose the normalized direction vector of the arrow is given by d = (u, v, w).
The unit vector parallel to the z-axis is given by z = (0, 0, 1).
The unit vector perpendicular to both of them is given by the cross product
û = (z × d) / |z × d| = (-v, u, 0) / sqrt(u² + v²)
and the angle between the two vectors is given by the scalar product
cos(θ) = z · d = w
Basically the rotation is about this û (uhat) axis by angle θ. The corresponding rotation matrix is given by Rodrigues' rotation formula,
R = I + sin(θ) K + (1 - cos(θ)) K²
where K is the skew-symmetric cross-product matrix of û, so this is the matrix you need to multiply the arrowhead vectors with.
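A minimal sketch of the whole procedure in plain Python (the names arrowheads and rotate are mine; it assumes (u, v, w) is normalized and not exactly (0, 0, ±1), where the cross product vanishes and no rotation is needed):

import math

def arrowheads(u, v, w, spread_deg=15.0):
    """Rotate the two reference arrowhead vectors so they follow (u, v, w)."""
    ra = math.radians(spread_deg)          # degrees -> radians first!
    sa, ca = math.sin(ra), math.cos(ra)
    heads = [(-sa, 0.0, -ca), (sa, 0.0, -ca)]

    # Rotation axis: normalized cross product of (0,0,1) and (u,v,w).
    n = math.sqrt(u*u + v*v)
    kx, ky, kz = -v/n, u/n, 0.0
    cos_t, sin_t = w, n                    # angle between (0,0,1) and (u,v,w)

    def rotate(p):
        # Rodrigues' formula: p*cosθ + (k × p)*sinθ + k*(k·p)*(1-cosθ)
        px, py, pz = p
        cx = ky*pz - kz*py
        cy = kz*px - kx*pz
        cz = kx*py - ky*px
        dot = kx*px + ky*py + kz*pz
        return (px*cos_t + cx*sin_t + kx*dot*(1 - cos_t),
                py*cos_t + cy*sin_t + ky*dot*(1 - cos_t),
                pz*cos_t + cz*sin_t + kz*dot*(1 - cos_t))

    return [rotate(h) for h in heads]

# Arrow pointing along +x: heads tilt back 15 degrees from the tip.
print(arrowheads(1.0, 0.0, 0.0))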

How to measure distance at angle in image python

I'm working on a particle filter for an autonomous robot right now, and am having trouble producing the expected distance measurements by which to filter the particles. I have an image that I'm using as a map. Each pixel represents a certain scaled area in the environment. Space the robot can occupy is white, walls are black, and areas that are exterior to the environment are grey.
If you are unfamiliar with what a particle filter is, my python code will create a predetermined number of random guesses as to where it might be (x,y,theta) in the white space. It will then measure the distance to the nearest wall with ultrasonic sensors at several angles. The script will compare these measurements with the measurements that would have been expected at each angle for each guessed location/orientation. Those that most closely match the actual measurements will survive while guesses that are less likely to be right will be eliminated.
My problem is finding the nearest wall AT a given angle. Say the sensor is measuring at 60°. For each guess, I need to adjust the angle to account for the guessed robot orientation, and then measure the distance to the wall at that angle. It's easy enough to find the nearest wall in the x direction:
from PIL import Image
#from matplotlib._png import read_png
from matplotlib.pyplot import *

mapp = Image.open("Map.png")
pixels = mapp.load()
width = mapp.size[0]
height = mapp.size[1]
imshow(mapp)

pixelWidth = 5
for x in range(width):
    if mapp.getpixel((x, 100)) == (0,0,0,255):  # Identify the first black pixel
        distance = x*pixelWidth - self.x
The problem is that I can't tell the script to search one pixel at a time going at 60°, or 23°, or whatever angle. Right now the best thing I can think of is to go in the x direction first, find a black pixel, and then use the tangent of the angle to determine how many pixels I need to move up or down, but there are obvious problems with this, mostly having to do with corners, and I can't imagine how many if statements it's going to take to work around them. Is there another solution?
Okay, I think I found a good approximation of what I'm trying to do, though I'd still like to hear if anyone else has a better solution. By checking, before each pixel move, the tangent of the angle I've actually traveled so far, I can decide whether to move one pixel in the x-direction or in the y-direction.
for i in range(len(angles)):
    angle = self.orientation + angles[i]
    if angle > 360:
        angle -= 360
    x = self.x
    y = self.y
    x1 = x
    y1 = y
    xtoy_ratio = tan(angle*math.pi/180)
    if angle < 90:
        xadd = 1
        yadd = 1
    elif 90 < angle < 180:
        xadd = -1
        yadd = 1
    elif 180 < angle < 270:
        xadd = -1
        yadd = -1
    else:
        xadd = 1
        yadd = -1
    while mapp.getpixel((x, y)) != (0,0,0,255):
        # Guard x == x1 to avoid dividing by zero on the first step.
        if x != x1 and (y-y1)/(x-x1) < xtoy_ratio:
            y += yadd
        else:
            x += xadd
    distance = sqrt((y-y1)**2 + (x-x1)**2)*pixel_width
The accuracy of this method of course depends a great deal on the actual length represented by each pixel. As long as pixel_width is small, accuracy will be pretty good, but if not, it will generally go pretty far before correcting itself.
As I said, I welcome other answers.
Thanks
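For reference, one common alternative is to march along the ray in fixed sub-pixel steps instead of choosing between x and y moves. A minimal sketch under the same assumptions as above (RGBA map, black walls, and a ray that is guaranteed to hit a wall before leaving the image):

import math

def ray_distance(mapp, x0, y0, angle_deg, pixel_width, step=0.5):
    """March along the ray until a black (wall) pixel is hit."""
    dx = math.cos(math.radians(angle_deg)) * step
    dy = math.sin(math.radians(angle_deg)) * step
    x, y = float(x0), float(y0)
    while mapp.getpixel((int(x), int(y))) != (0, 0, 0, 255):
        x += dx
        y += dy
    return math.hypot(x - x0, y - y0) * pixel_width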

python: elegant way of finding the GPS coordinates of a circle around a certain GPS location

I have a set of GPS coordinates in decimal notation, and I'm looking for a way to find the coordinates in a circle with variable radius around each location.
Here is an example of what I need. It is a circle with a 1 km radius around the coordinate 47,11.
What I need is the algorithm for finding the coordinates of the circle, so I can use it in my kml file using a polygon. Ideally for python.
See also Adding distance to a GPS coordinate for simple relations between lat/lon and short-range distances.
This works:
import math

# inputs
radius = 1000.0  # m - the following code is an approximation that stays reasonably accurate for distances < 100 km
centerLat = 30.0  # latitude of circle center, decimal degrees
centerLon = -100.0  # longitude of circle center, decimal degrees

# parameters
N = 10  # number of discrete sample points to be generated along the circle

# generate points
circlePoints = []
for k in range(N):
    # compute
    angle = math.pi * 2 * k / N
    dx = radius * math.cos(angle)
    dy = radius * math.sin(angle)
    point = {}
    point['lat'] = centerLat + (180 / math.pi) * (dy / 6378137)
    point['lon'] = centerLon + (180 / math.pi) * (dx / 6378137) / math.cos(centerLat * math.pi / 180)
    # add to list
    circlePoints.append(point)

print(circlePoints)
Use the formula for "Destination point given distance and bearing from start point" here:
http://www.movable-type.co.uk/scripts/latlong.html
with your centre point as start point, your radius as distance, and loop over a number of bearings from 0 degrees to 360 degrees. That will give you the points on a circle, and will work at the poles because it uses great circles everywhere.
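A sketch of that destination-point formula in Python (the function name and the mean Earth radius R are my choices; the formula is the standard great-circle one from the page above):

import math

def destination(lat, lon, bearing_deg, distance_m, R=6371000.0):
    """Great-circle destination point from (lat, lon) in decimal degrees."""
    lat1 = math.radians(lat)
    lon1 = math.radians(lon)
    theta = math.radians(bearing_deg)
    delta = distance_m / R  # angular distance

    lat2 = math.asin(math.sin(lat1) * math.cos(delta) +
                     math.cos(lat1) * math.sin(delta) * math.cos(theta))
    lon2 = lon1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(lat1),
                             math.cos(delta) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# 1 km circle around (47, 11) as 36 points:
circle = [destination(47.0, 11.0, b, 1000.0) for b in range(0, 360, 10)]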
It is a simple trigonometry problem.
Set your coordinate system XOY at your circle centre. Start from y = 0 and x = r. Then rotate your radius around the origin by an angle a (in radians); the coordinates of the i-th point on the circle are Xi = r * cos(i*a), Yi = r * sin(i*a). Repeat 2 * Pi / a times.
That's all.
UPDATE
Taking the comment of @poolie into account, the problem can be solved in the following way (assuming the Earth is a perfect sphere). Consider a cross-section of the Earth with its largest diameter D through our point (call it L). The 1-km-long diameter of our circle then becomes a chord (call it AB) of the Earth's cross-section circle. The length of the arc AB is then arc(AB) = (D / 2) * Theta, where Theta = 2 * arcsin(|AB| / D). From there it is easy to find all the other dimensions.
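As a quick sanity check of that relation (the numbers are mine, using D ≈ 12742 km): for a 1 km chord the arc is longer by only about a micrometer, so the planar approximation above is more than accurate at this scale.

import math

D = 12742.0                       # Earth's diameter, km
chord = 1.0                       # straight-line chord |AB|, km
theta = 2 * math.asin(chord / D)  # central angle
arc = (D / 2) * theta             # arc length along the surface
print(arc - chord)                # ~1e-9 km, about a micrometer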
