I am trying to determine the angle, in degrees, between two points on a circle, but I am having trouble doing so. I have looked at other posts, but I cannot figure out how to implement it.
I want the angular distance between the two points, in degrees.
Hopefully the example below gives you a better understanding. Here is my current code:
from math import atan2, degrees

def get_angle(x, y):
    # (75, 75) is the centre coordinate of the circle
    (dx, dy) = (75 - x, y - 75)
    angle = degrees(atan2(float(dy), float(dx)))
    if angle < 0:
        angle += 180
    return angle
The values it returns don't seem right: they are very similar for some reason, such as 157 and 120, when they clearly shouldn't be. I am somewhat new to image processing, so I could be looking at this the wrong way.
Center coordinates: cx, cy
Point coordinates: ax, ay for point A and bx, by for point B
Pseudocode for angle:
dp = dot(A-C, B-C) = (ax-cx)*(bx-cx) + (ay-cy)*(by-cy)
cross = vectorproduct(A-C, B-C) = (ax-cx)*(by-cy) - (ay-cy)*(bx-cx)
angle = degrees(atan2(cross, dp))
if angle < 0:
    angle += 360
If the angle direction is not what you expect, change the sign of cross.
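A minimal Python sketch of the pseudocode above (the function name and the example points are just illustrative; the (75, 75) centre is taken from the question):

from math import atan2, degrees

def angle_between(cx, cy, ax, ay, bx, by):
    # Vectors from the centre C to the points A and B
    dax, day = ax - cx, ay - cy
    dbx, dby = bx - cx, by - cy
    dp = dax * dbx + day * dby         # dot product
    cross = dax * dby - day * dbx      # z component of the cross product
    angle = degrees(atan2(cross, dp))  # signed angle from A to B, -180..180
    if angle < 0:
        angle += 360                   # map into 0..360
    return angle

print(angle_between(75, 75, 150, 75, 75, 150))  # 90.0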
I am developing code in Python to check whether a sphere with center (x, y, z) and radius R intersects a unit cube, i.e. l = 1, b = 1, h = 1. As mentioned above, I want to know whether the sphere intersects the cube at any point, in any direction, and to any extent.
I have a list of sphere centers (x, y, z) that must each be checked for intersection. I have done some research on it, but I couldn't resolve my doubts about how to approach this problem.
I would love to understand both the math and the coding part of it. Can someone please help me solve it?
Edit: The cube is axis-aligned and placed at the origin.
To detect an intersection, you can calculate the distance from the cube to the sphere center and compare it with the sphere radius. The 2D case is described here; it can easily be extended to 3D.
Take the cube center (rcx, rcy, rcz) and find the coordinate differences from the cube center to the sphere center:
dx, dy, dz = x - rcx, y - rcy, z - rcz
Let SquaredDist = 0, and for every coordinate difference d in (dx, dy, dz) do:
t = d + 0.5   # 0.5 is the half-size of your cube
if t < 0:
    SquaredDist += t * t
else:
    t = d - 0.5
    if t > 0:
        SquaredDist += t * t
Finally, compare SquaredDist with R*R: the sphere intersects the cube when SquaredDist <= R*R.
Some explanation in response to a comment:
Look at the picture in the linked answer. For rectangle ABCD we have center G and coordinate differences GK and GJ; they include half of the width and half of the height. The squared distance (EC here) is the sum of the squared distances to the proper side lines (planes in the 3D case). When the closest point to the sphere center is a cube corner, we take three planes into account; when the closest point lies on an edge, we take two planes into account; when the closest point lies on a facet, we take only one plane into account; and when the sphere center is inside the cube, SquaredDist remains zero.
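A minimal Python sketch of this test, assuming a unit cube centred at the origin (so the half-size is 0.5); the function name and default arguments are just illustrative:

def sphere_intersects_unit_cube(x, y, z, R, rcx=0.0, rcy=0.0, rcz=0.0, half=0.5):
    # Squared distance from the sphere centre to the closest point of the cube
    squared_dist = 0.0
    for d in (x - rcx, y - rcy, z - rcz):
        t = d + half
        if t < 0:
            squared_dist += t * t
        else:
            t = d - half
            if t > 0:
                squared_dist += t * t
    return squared_dist <= R * R

# A sphere of radius 0.6 centred at (1, 0, 0) reaches past the cube face at x = 0.5
print(sphere_intersects_unit_cube(1.0, 0.0, 0.0, 0.6))  # True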
I'm trying to use ray casting to gather all the surfaces in a room and determine its volume.
I have a centroid location where the rays will come from, but I'm drawing a blank on how to cast the rays in all directions (in 3D space).
I'm not getting any points on the floors or ceilings; it's as if it were doing a 60-degree spread rotated about the Z axis.
I think I have the rest of it working, but this is stumping me.
for y in range(360):
    for x in range(360):
        vector = DB.XYZ(math.sin(math.radians(x)), math.cos(math.radians(x)), math.cos(math.radians(y))).Normalize()
        prox = ri.FindNearest(origin, direction).Proximity
        point = origin + (direction * prox)
Look at it this way: the x and y components of the vector are created from the angle x (a circle in the plane), and then you add a z component that lies between -1 and 1 (which is what cos gives you). So it's clear that you end up with a cylindrical distribution.
What you might want are spherical coordinates. Modify your code like this:
for y in range(-90, 91):
    for x in range(360):
        vector = DB.XYZ(math.sin(math.radians(x)) * math.cos(math.radians(y)),
                        math.cos(math.radians(x)) * math.cos(math.radians(y)),
                        math.sin(math.radians(y)))
        # Normalize() is unnecessary, since |vector|^2 = sin^2*cos^2 + cos^2*cos^2 + sin^2 = 1
        prox = ri.FindNearest(origin, direction).Proximity
        point = origin + (direction * prox)
But be aware that the angular distribution of rays is not uniform with spherical coordinates: it is denser at the poles than at the equator. You can mitigate this, e.g., by scaling the density of the x samples down depending on y. The spherical surface element scales with cos(y), so scaling the number of x samples by cos(y) gives a roughly uniform distribution.
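A rough sketch of that idea in plain Python (tuples stand in for DB.XYZ, which is assumed to come from the Revit API; the default 1-degree step is arbitrary):

import math

def sphere_directions(step_deg=1):
    # Roughly uniformly distributed unit direction vectors over the sphere
    directions = []
    for y in range(-90, 91, step_deg):                 # elevation
        cos_y = math.cos(math.radians(y))
        # Fewer azimuth samples near the poles, proportional to cos(y)
        n_azimuth = max(1, int(round(360 / step_deg * cos_y)))
        for i in range(n_azimuth):
            x = 360.0 * i / n_azimuth                  # azimuth
            directions.append((math.sin(math.radians(x)) * cos_y,
                               math.cos(math.radians(x)) * cos_y,
                               math.sin(math.radians(y))))
    return directions

print(len(sphere_directions(10)))  # a few hundred directions for a 10-degree step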
I have a robot with a red LED and a green LED mounted at the front and back respectively. I want to calculate the heading direction of the robot, i.e. which direction the greenLed - redLed vector points in.
How can I code it so that the points marked 1 and 2 in the image below have the same angle, i.e. 45 degrees anticlockwise, whereas point 3 should be at 225 degrees?
I used the following script, but it is giving me wrong results:
import numpy as np

def headDirectionAngle(redLEDCoords, greenLEDCoords, referenceVector):
    greenRedLEDVector = np.array(greenLEDCoords) - np.array(redLEDCoords)
    angle = np.math.atan2(np.linalg.det([referenceVector, greenRedLEDVector]),
                          np.dot(referenceVector, greenRedLEDVector))
    return np.degrees(angle)

referenceVector = np.array([0, 240])
How should I proceed? Thanks for the help.
Back to basics, without numpy.
atan2 already gives you an anticlockwise angle, but between -180 and 180. You can add 360 and calculate modulo 360 to get an angle between 0 and 360:
from math import atan2, degrees

def anti_clockwise(x, y):
    alpha = degrees(atan2(y, x))
    return (alpha + 360) % 360

print(anti_clockwise(480, 480))
# 45.0
print(anti_clockwise(-480, -480))
# 225.0
x should just be the difference in X coordinates between the green and red LEDs. Same goes for y.
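For example, with the names from the question (a sketch; the coordinate values are made up):

greenLEDCoords = (480, 480)
redLEDCoords = (0, 0)
heading = anti_clockwise(greenLEDCoords[0] - redLEDCoords[0],
                         greenLEDCoords[1] - redLEDCoords[1])
print(heading)  # 45.0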
I have a set of GPS coordinates in decimal notation, and I'm looking for a way to find the coordinates of a circle with a variable radius around each location.
Here is an example of what I need: a circle with a 1 km radius around the coordinate 47, 11.
What I need is the algorithm for finding the coordinates of the circle, so I can use them as a polygon in my KML file. Ideally in Python.
See also Adding distance to a GPS coordinate for simple relations between lat/lon and short-range distances.
This works:
import math

# inputs
radius = 1000.0     # m - the following code is an approximation that stays reasonably accurate for distances < 100 km
centerLat = 30.0    # latitude of circle center, decimal degrees
centerLon = -100.0  # longitude of circle center, decimal degrees

# parameters
N = 10  # number of discrete sample points to be generated along the circle

# generate points
circlePoints = []
for k in range(N):
    # compute
    angle = math.pi * 2 * k / N
    dx = radius * math.cos(angle)
    dy = radius * math.sin(angle)
    point = {}
    point['lat'] = centerLat + (180 / math.pi) * (dy / 6378137)  # 6378137 m = WGS-84 equatorial Earth radius
    point['lon'] = centerLon + (180 / math.pi) * (dx / 6378137) / math.cos(centerLat * math.pi / 180)
    # add to list
    circlePoints.append(point)

print(circlePoints)
Use the formula for "Destination point given distance and bearing from start point" here:
http://www.movable-type.co.uk/scripts/latlong.html
with your centre point as start point, your radius as distance, and loop over a number of bearings from 0 degrees to 360 degrees. That will give you the points on a circle, and will work at the poles because it uses great circles everywhere.
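A sketch of that formula in Python, following the spherical-Earth formulation on the linked page (the 6371 km mean Earth radius is an assumption):

import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical approximation

def destination_point(lat_deg, lon_deg, bearing_deg, distance_m):
    # Great-circle destination point given a start point, bearing and distance
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    theta = math.radians(bearing_deg)
    delta = distance_m / EARTH_RADIUS_M  # angular distance
    lat2 = math.asin(math.sin(lat1) * math.cos(delta) +
                     math.cos(lat1) * math.sin(delta) * math.cos(theta))
    lon2 = lon1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(lat1),
                             math.cos(delta) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# 1 km circle around (47, 11), sampled every 10 degrees of bearing
circle = [destination_point(47.0, 11.0, b, 1000.0) for b in range(0, 360, 10)]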
It is a simple trigonometry problem.
Set your coordinate system XOY at the circle centre. Start from y = 0, x = r. Then rotate the radius around the origin by an angle a (in radians): the i-th point on the circle is Xi = r * cos(i * a), Yi = r * sin(i * a). Repeat this 2 * Pi / a times.
That's all.
UPDATE
Taking the comment of @poolie into account, the problem can be solved in the following way (assuming the Earth is a perfect sphere). Consider a cross-section of the Earth with its largest diameter D through our point (call it L). The 1 km diameter of our circle then becomes a chord (call it AB) of the Earth's cross-section circle. The chord subtends a central angle Theta = 2 * arcsin(|AB| / D), so the length of the arc AB is (D / 2) * Theta. Further, it is easy to find all other dimensions.
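A small numeric sketch of that correction (the 12,742 km value for D assumes a spherical Earth):

import math

D = 12742000.0      # Earth diameter in metres (spherical approximation)
chord = 1000.0      # the 1 km straight-line diameter of our circle
theta = 2 * math.asin(chord / D)   # central angle subtended by the chord
arc = (D / 2) * theta              # corresponding arc length along the surface
print(arc - chord)  # the difference is only about a micrometre at this scale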
I have this code:
import math

def getAngle(x1, y1, x2, y2):
    rise = y1 - y2
    run = x1 - x2
    angle = math.atan2(run, rise)    # get the angle in radians
    angle = angle * (180 / math.pi)  # convert to degrees
    angle = angle % 360              # adjust for a right-facing sprite
    return angle
... which returns an angle depending on the mouse position on the screen.
I want to set an interval so that the rotation of my object stops at a specific point. For example: if the angle is bigger than 90°, I want my object to stop increasing its angle. In this case 90° should act as a border where the rotation stops.
I think I need 2 conditions because the angle shouldn't be higher from 90° on the left and right.
Does anyone have an idea how to solve this?
This part of the code is in the game loop (it uses the getAngle function defined above):
mousex, mousey = pygame.mouse.get_pos()
for cannonx, cannony in (((width/2)-45, height-25), ((width/2)-45, height-25)):
    degrees = getAngle(cannonx, cannony, mousex, mousey)
    rotcannonImg = pygame.transform.rotate(cannonImg, degrees)
    rotcannonRect = rotcannonImg.get_rect()
    rotcannonRect.center = (cannonx, cannony)
    windowSurface.blit(rotcannonImg, rotcannonRect)
The phrase “angle shouldn't be higher from 90° on the left and right” does not have a well-defined meaning in English, so I am not sure what you intend. However, the following figure shows the relations between various angles and the signs of rise and run when the point (x1, y1) is located where the lines cross and (x2, y2) lies in particular octants. Note that dy = rise and dx = run. That is, you may be able to test the signs of rise and run to get the information you want.
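One possible interpretation as a sketch (only an assumption, since the intended behaviour is not fully specified): convert the angle returned by getAngle to a signed value in -180°..180° and clamp it to ±90° before rotating the sprite. The 90° limit comes from the question; the helper name is made up:

def clamp_angle(angle, limit=90):
    # angle is in 0..360 as returned by getAngle
    signed = angle if angle <= 180 else angle - 360   # map 0..360 to -180..180
    signed = max(-limit, min(limit, signed))          # stop at the 90-degree border
    return signed % 360                               # back to 0..360 for pygame

degrees = clamp_angle(getAngle(cannonx, cannony, mousex, mousey))
rotcannonImg = pygame.transform.rotate(cannonImg, degrees)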