I have this code:
import math

def getAngle(x1, y1, x2, y2):
    rise = y1 - y2
    run = x1 - x2
    angle = math.atan2(run, rise)    # get the angle in radians
    angle = angle * (180 / math.pi)  # convert to degrees
    angle = angle % 360              # adjust for a right-facing sprite
    return angle
... which is returning an angle depending on mouse position on the screen.
I want to set a limit where the rotation of my object stops at a specific point. For example: if the angle gets bigger than 90°, I want my object to stop rotating any further. In this case 90° acts as a border where the rotation stops.
I think I need two conditions, because the angle shouldn't go past 90° on either the left or the right side.
Anyone got an idea how to solve that?
this part of the code is in the game loop (it uses defined getAngle):
mousex, mousey = pygame.mouse.get_pos()
for cannonx, cannony in (((width/2)-45, height-25), ((width/2)-45, height-25)):
    degrees = getAngle(cannonx, cannony, mousex, mousey)
    rotcannonImg = pygame.transform.rotate(cannonImg, degrees)
    rotcannonRect = rotcannonImg.get_rect()
    rotcannonRect.center = (cannonx, cannony)
    windowSurface.blit(rotcannonImg, rotcannonRect)
The phrase “angle shouldn't be higher from 90° on the left and right” does not have a well-defined meaning in English, so I am not sure what you intend. However, the following figure shows the relations between various angles and the signs of rise and run when point (x1, y1) is located where the lines cross and (x2, y2) is in particular octants. Note, dy = rise, dx = run. That is, you might be able to test signs of rise and run to get the information you want.
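If the intent is simply to cap the result of getAngle before rotating the sprite, a plain clamp may be all that's needed. This is only a sketch under that reading of the question; clamp_angle and the 0 to 90 degree limits are hypothetical and would need adjusting for the separate left/right cases:

```python
def clamp_angle(angle, lower=0.0, upper=90.0):
    """Clamp an angle in degrees into [lower, upper].
    Hypothetical helper; the exact limits depend on the sprite."""
    return max(lower, min(upper, angle))
```

The game loop would then pass the clamped value to pygame.transform.rotate instead of the raw angle.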
I am calculating the angle between two lines on the following image
with this code
import math

# the lines are in the format (x1, y1, x2, y2)
def getAngle(line_1, line_2):
    angle1 = math.atan2(line_1[1] - line_1[3], line_1[0] - line_1[2])
    angle2 = math.atan2(line_2[1] - line_2[3], line_2[0] - line_2[2])
    result = math.degrees(abs(angle1 - angle2))
    if result < 0:
        result += 360
    return result
Now the function works between the two red lines (almost on top of each other) and between the red and the green line. However, between the red and the blue line the function returns 239.1083 when it should be ~300. Since it works in some of the cases and not others, I am not sure what the problem is.
Some example inputs and outputs:
getAngle((316,309,316,-91), (316,309,421,209)) = 46.3971 # working
getAngle((316,309,316,-91), (199,239,316,309)) = 239.108 # should be around 300
For the example getAngle((316,309,316,-91), (199,239,316,309)), the culprit is how the angles are measured.
Angles are calculated with respect to the positive X axis. The angle you have defined here calculates phi in the image below rather than theta, which is what you should be expecting. Since the rotation is negative in nature (observe the arrow for phi), any subsequent calculation must ensure positive rotation rather than negative. Otherwise, you'll be short by roughly the complementary angle.
In the given example, the correct angle of line2 should be about +210 degrees, or about -150 degrees. Similarly, the angle of line1 could be +90 or -90 degrees. Now it all comes down to which of these to add or subtract, and how.
The 239.108 you got (call it 240) comes from abs(90 - (-150)); the 300 you are expecting comes from abs(-90 - (+210)). The difference of 60 degrees is the complement of theta = 30 degrees.
So it's not so much a bad formula as bad argument passing: you need to check for positive versus negative angles consistently.
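One way to make that argument handling consistent, as a sketch: orient both lines away from the endpoint they share before calling atan2, then normalize the signed difference into [0, 360) instead of taking abs(). The helper name angle_between and the endpoint-matching logic are my own additions, not the original poster's code:

```python
import math

def angle_between(line_1, line_2):
    """Counter-clockwise angle from line_1 to line_2 in [0, 360),
    with both direction vectors pointed away from the endpoint the
    two lines share. One possible convention, not the only one."""
    p1, q1 = line_1[:2], line_1[2:]
    p2, q2 = line_2[:2], line_2[2:]
    # find the shared vertex and the far end of line_2
    if p2 == p1 or p2 == q1:
        vertex, far2 = p2, q2
    else:
        vertex, far2 = q2, p2
    far1 = q1 if p1 == vertex else p1
    a1 = math.atan2(far1[1] - vertex[1], far1[0] - vertex[0])
    a2 = math.atan2(far2[1] - vertex[1], far2[0] - vertex[0])
    return math.degrees(a2 - a1) % 360
```

With the question's inputs this returns about 46.397 for the working pair and about 300.9 for the red/blue pair, matching the expected ~300.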
I am trying to determine the amount of degrees between two points on a circle. I am having trouble doing so. I have looked at other posts but I cannot seem to figure out how to implement it.
I am trying to get the distance between the two in degrees.
Hopefully the model below gives you a better understanding.
Here is the data chart
from math import atan2, degrees

def get_angle(x, y):
    # 75 being the center circle coordinate
    (dx, dy) = (75-x, y-75)
    angle = degrees(atan2(float(dy), float(dx)))
    if angle < 0:
        angle += 180
    return angle
The values it returns don't seem right: they come out suspiciously similar, such as 157 and 120, when it clearly shouldn't return that. I am somewhat new to image processing, so I could be looking at it wrong.
Center coordinates: cx, cy
Point coordinates: ax, ay for point A and bx, by for point B
Pseudocode for angle:
dp = dot(A-C, B-C) = (ax-cx)*(bx-cx) + (ay-cy)*(by-cy)
cross = vectorproduct(A-C, B-C) = (ax-cx)*(by-cy) - (ay-cy)*(bx-cx)
angle = degrees(atan2(cross, dp))
if angle < 0:
    angle += 360
If angle direction is not what you expect, change sign of cross
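A direct Python rendering of the pseudocode above (angle_at_center is a made-up name; it uses the same atan2/degrees from math as the question's code):

```python
import math

def angle_at_center(cx, cy, ax, ay, bx, by):
    """Angle at center C between points A and B, in [0, 360),
    measured counter-clockwise from A to B (flip the sign of
    cross for the other direction)."""
    dp = (ax - cx) * (bx - cx) + (ay - cy) * (by - cy)     # dot product
    cross = (ax - cx) * (by - cy) - (ay - cy) * (bx - cx)  # z of cross product
    angle = math.degrees(math.atan2(cross, dp))
    if angle < 0:
        angle += 360
    return angle
```

Using atan2(cross, dp) rather than atan2 of raw coordinate differences measures the angle between the two radius vectors directly, so no separate per-point angles need to be subtracted.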
I've been trying to make a code with pygame to simulate simple gravity. At the moment, there is only one object (HOM) which is orbiting the sun. However, for reasons unknown to me, whenever I run the code, HOM travels round the sun in an orbit at the start, but then accelerates away from the sun when it reaches ~135 degrees from vertical.
Does anyone know why this is happening and how I can fix it? I have been printing some variables to try and source the problem, but have had no luck so far.
Code:
import pygame, sys, time
from math import *

screen = pygame.display.set_mode((800,600))

G = 5

class Object: #Just an object, like a moon or planet
    def __init__(self, mass, init_cds, init_vel, orbit_obj='Sun'):
        self.mass = mass
        self.cds = init_cds
        self.velocity = init_vel
        self.accel = [0,0]
        self.angle = 0
        self.orb_obj = orbit_obj

    def display(self):
        int_cds = (round(self.cds[0]), round(self.cds[1])) #Stores its co-ordinates as floats, has to convert to integers for draw function
        pygame.draw.circle(screen, (255,0,0), int_cds, 10)

    def calc_gravity(self):
        if self.orb_obj == 'Sun':
            c_x, c_y = 400, 300
            c_mass = 10000
        else:
            c_x, c_y = self.orb_obj.cds
            c_mass = self.orb_obj.mass
        d_x = self.cds[0] - c_x
        d_y = self.cds[1] - c_y
        dist = sqrt(d_x**2 + d_y**2) #Find direct distance
        angle = atan(d_x/d_y) #Find angle
        print(d_x, d_y)
        print(dist, degrees(angle))
        if dist == 0:
            acc = 0
        else:
            acc = G*c_mass/(dist**2) #F=G(Mm)/r^2, a=F/m -> a=GM/r^2
        print(acc)
        acc_x = acc*sin(angle) #Convert acceleration from magnitude+angle -> x and y components
        acc_y = acc*cos(angle)
        self.accel = [acc_x, acc_y]
        print(self.accel)
        self.velocity = [self.velocity[0]+self.accel[0], self.velocity[1]+self.accel[1]] #Add acceleration to velocity
        print(self.velocity)
        self.cds = (self.cds[0]+self.velocity[0], self.cds[1]+self.velocity[1]) #Change co-ordinates by velocity
        print(self.cds)
        print('-------------------') #For separating each run of the function when printing variables

HOM = Object(1000000, (400,100), [10,0]) #The problem planet

clock = pygame.time.Clock()

while True:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            pygame.quit()
            sys.exit()
    screen.fill((0,0,0))
    pygame.draw.circle(screen, (255,255,0), (400,300), 25)
    HOM.display()
    HOM.calc_gravity()
    clock.tick(30)
    pygame.display.flip()
Your main issue has to do with this line:
angle = atan(d_x/d_y) #Find angle
The atan function is very limited in its ability to compute angles because it can't tell the signs of the coordinates you combined in your division. For instance, it will give the same result for atan(1/1) and atan(-1/-1), since both divisions compute the same slope (1).
Instead you should use atan2, and pass the coordinates separately. Since this will let the code see both coordinates, it can pick an angle in the right quadrant of the circle every time.
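A quick demonstration of the difference:

```python
from math import atan, atan2, degrees

# atan only sees the quotient, so opposite quadrants collapse together:
print(round(degrees(atan(1 / 1)), 1))    # 45.0
print(round(degrees(atan(-1 / -1)), 1))  # 45.0 -- same result, wrong quadrant

# atan2 receives the signs separately and resolves the quadrant:
print(round(degrees(atan2(1, 1)), 1))    # 45.0
print(round(degrees(atan2(-1, -1)), 1))  # -135.0
```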
But there's an even better fix. Instead of computing an angle and then immediately converting it back to a unit vector (by calling sin and cos on it), why not compute the unit vector directly? You already have the original vector's length! Instead of:
acc_x = acc*sin(angle) #Convert acceleration from magnitude+angle -> x and y components
acc_y = acc*cos(angle)
Use:
acc_x = acc * d_x / distance
acc_y = acc * d_y / distance
The d_x / distance and d_y / distance values are the same as the sin and cos values you were getting before (for the angles when they were working correctly), but there's no need for the trigonometry. You can get rid of the line I quoted up top completely!
Note that you might need to reverse the way you're computing d_x and d_y, so that you get a vector that points from the orbiting object towards the object it's orbiting around (instead of pointing the other way, from the center of the orbit towards the orbiting object). I'm not sure if I'm reading your code correctly, but it looks to me like you have it the other way around right now. That means that you were actually getting the wrong results from atan in the cases where your current code was working the way you expected, and the bad behavior (flying off into nowhere) is the code working "correctly" (from a mathematical point of view). Alternatively, you could compute acc to be negative, rather than positive.
As several commenters mentioned, you may have other issues related to your choice of integration algorithm, but those errors are not going to be as large as the main issue with the acceleration angle. They'll crop up as you run your simulation over longer time periods (and try to use larger time steps to make the simulation go faster). Your current algorithm is good enough for an orbit or two, but if you're simulating dozens or hundreds of orbits, you'll start seeing errors accumulate and so you should pick a better integrator.
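Combining the two fixes (no trigonometry, and a direction vector that points from the planet toward the attractor), the heart of calc_gravity could look like the sketch below. gravity_accel is a hypothetical free function, factored out of the class so it is easy to test; the sun's position and mass would come from the question's code:

```python
from math import sqrt

def gravity_accel(px, py, cx, cy, G, c_mass):
    """Acceleration on a body at (px, py) toward an attractor of mass
    c_mass at (cx, cy): a = G*M/r^2, directed along the unit vector
    from the body to the attractor -- no angles needed."""
    d_x = cx - px  # vector FROM the planet TOWARD the attractor
    d_y = cy - py
    dist = sqrt(d_x**2 + d_y**2)
    if dist == 0:
        return (0.0, 0.0)
    acc = G * c_mass / dist**2
    # scale the unit vector (d_x/dist, d_y/dist) by the magnitude
    return (acc * d_x / dist, acc * d_y / dist)
```

Inside calc_gravity this would replace everything from the d_x/d_y lines through the self.accel assignment.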
I am using Garden-MapView in my Kivy app.
My issue is that the user can move outside the bounds of the map (pulled from OpenStreetMap) and continue to pan into the surrounding blue 'sea' area. Image below:
I'm not sure if the issue is specific to Garden-MapView, or if a generic kivy/widget answer could solve it.
My best attempt to solve this (out of many) is some crude code posted below. When the map extents move past the edge of the screen, the code calculates the center screen coordinate and pulls the center of the map to it. It works better for longitude. But this can slow down the app significantly due to the frequency of on_map_relocated event calls. I have also set MapView.min_zoom = 2:
class CustMapView(MapView):
    def on_map_relocated(self, *kwargs):
        x1, y1, x2, y2 = self.get_bbox()
        centerX, centerY = Window.center
        latRemainder = self.get_latlon_at(centerX, centerY, zoom=self.zoom)[0] - (x1+x2)/2
        if x1 < -85.8: self.center_on((x1+x2)/2+latRemainder+.01, self.lon)
        if x2 > 83.6: self.center_on((x1+x2)/2+latRemainder-.01, self.lon)
        if y1 == -180: self.center_on(self.lat, (y1+y2)/2+0.01)
        if y2 == 180: self.center_on(self.lat, (y1+y2)/2-0.01)
Full code to reproduce yourselves: https://pastebin.com/xX0GtPUb
I'm working on a particle filter for an autonomous robot right now, and am having trouble producing expected distance measurements by which to filter the particles. I have an image that I'm using as a map. Each pixel represents a certain scaled area in the environment. Space the robot can occupy is white, walls are black, and areas exterior to the environment are grey.
If you are unfamiliar with what a particle filter is, my python code will create a predetermined number of random guesses as to where it might be (x,y,theta) in the white space. It will then measure the distance to the nearest wall with ultrasonic sensors at several angles. The script will compare these measurements with the measurements that would have been expected at each angle for each guessed location/orientation. Those that most closely match the actual measurements will survive while guesses that are less likely to be right will be eliminated.
My problem is finding the nearest wall AT a given angle. Say the sensor is measuring at 60°. For each guess, I need to adjust the angle to account for the guessed robot orientation, and then measure the distance to the wall at that angle. It's easy enough to find the nearest wall in the x direction:
from PIL import Image
#from matplotlib._png import read_png
from matplotlib.pyplot import *

mapp = Image.open("Map.png")
pixels = mapp.load()
width = mapp.size[0]
height = mapp.size[1]
imshow(mapp)

pixelWidth = 5
for x in range(width):
    if mapp.getpixel((x, 100)) == (0,0,0,255): #Identify the first black pixel
        distance = x*pixelWidth - self.x
The problem is that I can't tell the script to search one pixel at a time along a 60°, or 23°, or whatever-angle line. Right now the best thing I can think of is to go in the x direction first, find a black pixel, and then use the tangent of the angle to determine how many pixels I need to move up or down, but there are obvious problems with this, mostly having to do with corners, and I can't imagine how many if statements it would take to work around them. Is there another solution?
Okay, I think I found a good approximation of what I'm trying to do, though I'd still like to hear if anyone else has a better solution. By checking the tangent of the angle I've actually traveled so far between each pixel move, I can decide whether to move one pixel in the x-direction, or in the y-direction.
for i in range(len(angles)):
    angle = self.orientation + angles[i]
    if angle > 360:
        angle -= 360
    x = self.x
    y = self.y
    x1 = x
    y1 = y
    xtoy_ratio = tan(angle*math.pi/180)
    if angle < 90:
        xadd = 1
        yadd = 1
    elif 90 < angle < 180:
        xadd = -1
        yadd = 1
    elif 180 < angle < 270:
        xadd = -1
        yadd = -1
    else:
        xadd = 1
        yadd = -1
    while mapp.getpixel((x, y)) != (0,0,0,255):
        if x == x1:  # first step: avoid dividing by zero
            x += xadd
        elif (y-y1)/(x-x1) < xtoy_ratio:
            y += yadd
        else:
            x += xadd
    distance = sqrt((y-y1)**2+(x-x1)**2)*pixel_width
The accuracy of this method of course depends a great deal on the actual length represented by each pixel. As long as pixel_width is small, accuracy will be pretty good, but if not, it will generally go pretty far before correcting itself.
As I said, I welcome other answers.
Thanks
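Since other answers are welcome: a different approach is to march along the ray in fixed sub-pixel increments using the angle's cosine and sine, which avoids the tangent-ratio bookkeeping and the corner cases entirely. A sketch only; is_wall is a hypothetical callback standing in for the mapp.getpixel test:

```python
import math

def ray_distance(is_wall, x0, y0, angle_deg, pixel_width, max_steps=100000):
    """March along a ray from (x0, y0) at angle_deg until a wall is hit.
    is_wall(x, y) is a hypothetical callback returning True for wall
    pixels (e.g. wrapping mapp.getpixel); returns None if nothing is
    hit within max_steps."""
    theta = math.radians(angle_deg)
    dx, dy = math.cos(theta), math.sin(theta)
    x, y = float(x0), float(y0)
    for _ in range(max_steps):
        x += dx * 0.5  # half-pixel steps so thin walls and corners aren't skipped
        y += dy * 0.5
        if is_wall(int(round(x)), int(round(y))):
            return math.hypot(x - x0, y - y0) * pixel_width
    return None
```

The trade-off is the same one noted above: the step size bounds the error, so smaller steps give better accuracy at the cost of more iterations.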