I'm trying to create an app using wxPython that has a sideways swipe gesture in it, e.g. on the Mac trackpad. I've had a search around and I can't find any reference to multitouch support in wx. Does anybody know if there is a class that will allow me to get this input? Much appreciated.
Multi-touch really isn't supported. If the touch can be translated to normal mouse events, then wx can handle it. So theoretically wx should support a swipe, but you'll have to do it yourself by watching EVT_MOTION and checking which direction the mouse movement is going in.
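For example, here is a minimal sketch of that idea (the event choice and threshold are my own assumptions; it treats a swipe as a horizontal drag of more than 80 pixels):

    import wx

    class SwipePanel(wx.Panel):
        # Sketch: detect a sideways swipe from plain mouse events, on the
        # assumption that the trackpad gesture reaches wx as a drag.
        SWIPE_THRESHOLD = 80  # pixels of horizontal travel that count as a swipe

        def __init__(self, parent):
            super().__init__(parent)
            self._start = None
            self.Bind(wx.EVT_LEFT_DOWN, self.on_down)
            self.Bind(wx.EVT_MOTION, self.on_motion)
            self.Bind(wx.EVT_LEFT_UP, self.on_up)

        def on_down(self, event):
            self._start = event.GetPosition()
            event.Skip()

        def on_motion(self, event):
            if self._start is None or not event.Dragging():
                return
            dx = event.GetPosition().x - self._start.x
            if abs(dx) >= self.SWIPE_THRESHOLD:
                print("swipe", "right" if dx > 0 else "left")  # your handler here
                self._start = None  # report one swipe per gesture

        def on_up(self, event):
            self._start = None
            event.Skip()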
See also the following links:
https://groups.google.com/forum/?fromgroups=#!topic/wxpython-users/E4SMMUPwgNI
http://pymt.eu/
I'm trying to embed multitouch events into my wxPython GUI; however, I'm a bit lost as to the best approach.
Currently, I have a TUIO server which transmits the multitouch events to be captured. I then use the pytuio library to receive the multitouch events in a separate thread for my GUI. My GUI is composed of a wxFrame with multiple matplotlib panels and a single OpenGL panel.
The problem is that I have had to manually write code to determine how many fingers are being used, the locations and the touch type. I then send a custom event which can be received by my GUI.
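The event plumbing for that is roughly the following (simplified; the names are made up):

    import wx
    import wx.lib.newevent

    # Custom event carrying the processed touch data (hypothetical fields).
    TouchEvent, EVT_TOUCH = wx.lib.newevent.NewEvent()

    def on_tuio_update(frame, fingers, touch_type):
        # Called from the pytuio listener thread. wx.PostEvent is one of
        # the few wx calls that is safe from a non-GUI thread.
        wx.PostEvent(frame, TouchEvent(fingers=fingers, touch_type=touch_type))

    # In the frame's __init__:
    #     self.Bind(EVT_TOUCH, self.on_touch)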
This works fine for the matplotlib panels (albeit I have to apply a very small constant offset to the reported finger locations), but for the OpenGL panel the finger locations seem to be incorrect. This is a problem because the offset of the touch locations in the OpenGL panel is not even constant; it seems to vary depending on where on the panel the touch event occurs, so I cannot compensate for it.
I feel like there must be a more comprehensive multitouch library which does all the hard work of determining the number of fingers and the touch type (tap, double tap, drag, release, etc.), and which might also overcome my issue with the OpenGL panel. I have looked, but I've not seen a library that can distinguish the touch type; they just seem to provide the number of fingers and their locations.
The only comprehensive GUI library supporting:
Python
More than one OS
Multitouch
is Kivy. I was able to cobble together something which works for Windows 7 and higher and wxPython (by extracting the relevant part from Kivy for processing WM_TOUCH events), so in principle it could be done. But none of this would solve your specific problem.
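For what it's worth, opting a wx window in to raw WM_TOUCH messages on Windows 7 and higher starts with a single Win32 call (a sketch; actually reading the messages still requires hooking the window procedure):

    import ctypes
    import wx

    app = wx.App()
    frame = wx.Frame(None, title="touch demo")
    # Ask Windows to deliver raw WM_TOUCH messages to this window instead
    # of synthesizing mouse events from the touches.
    ctypes.windll.user32.RegisterTouchWindow(frame.GetHandle(), 0)
    frame.Show()
    app.MainLoop()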
I'm helping to implement an experiment using PsychoPy on a Windows 8 tablet. It doesn't seem to be possible to get direct access to touch events through PsychoPy itself, or through the pyglet or PyGame interfaces.
Most other sources I've found refer to using mouse move events in place of touch events. This works fine for recording position, but not for recording time: we would like to collect the time at which the touch starts, whereas the mouse event only arrives at the end of the touch.
Does anyone know of a way to do this, either in PsychoPy or by importing another library into the experiment?
Update: Logging ioHub mouse events, it looks like press and release mouse events are both sent at the end of the touch. This makes sense as this is the point at which the OS is sure that the touch is not a swipe. (Also, it will decide whether the touch is a left- or right-click depending on the duration of the touch).
I've managed this using a hook into the WndProc; it's not pretty, but it works. The solution, for posterity:
https://github.com/alisdt/pywmtouchhook
A brief summary:
I used a combination of ctypes and pywin32 (unfortunately neither alone could do the job) to register the target HWND to receive touch messages, and replace its WndProc, passing through all non-touch messages to the original WndProc.
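A condensed, ctypes-only sketch of that approach (it assumes 64-bit Python, where SetWindowLongPtrW is exported, and it omits the GetTouchInputInfo call that extracts per-finger positions and timestamps):

    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32

    WM_TOUCH = 0x0240
    GWLP_WNDPROC = -4
    LRESULT = ctypes.c_ssize_t
    WNDPROC = ctypes.WINFUNCTYPE(LRESULT, wintypes.HWND, ctypes.c_uint,
                                 wintypes.WPARAM, wintypes.LPARAM)

    SetWindowLongPtr = user32.SetWindowLongPtrW  # use SetWindowLongW on 32-bit
    SetWindowLongPtr.restype = ctypes.c_void_p
    SetWindowLongPtr.argtypes = [wintypes.HWND, ctypes.c_int, WNDPROC]
    CallWindowProc = user32.CallWindowProcW
    CallWindowProc.restype = LRESULT
    CallWindowProc.argtypes = [ctypes.c_void_p, wintypes.HWND, ctypes.c_uint,
                               wintypes.WPARAM, wintypes.LPARAM]

    def hook_touch(hwnd, on_touch):
        # Opt the window in to WM_TOUCH instead of synthesized mouse events.
        user32.RegisterTouchWindow(hwnd, 0)
        old_proc = None

        def wnd_proc(hwnd, msg, wparam, lparam):
            if msg == WM_TOUCH:
                # Low word of wParam is the number of touch points; full
                # data (including the touch-down time) comes from
                # GetTouchInputInfo(lparam, ...), omitted for brevity.
                on_touch(wparam & 0xFFFF)
                return 0
            # Pass every non-touch message to the original WndProc.
            return CallWindowProc(old_proc, hwnd, msg, wparam, lparam)

        new_proc = WNDPROC(wnd_proc)  # keep a reference so it isn't GC'd
        old_proc = SetWindowLongPtr(hwnd, GWLP_WNDPROC, new_proc)
        return new_proc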
I need to get a joystick's velocity data, but pygame (and also my OS) only tells me whether a button on a given controller is pressed or not.
I'm trying to convert Guitar Hero drum controller signals into MIDI. The controller is seen by the OS as a normal joystick, and the pads are buttons. I know for sure that the controller sends out information about the velocity at which a pad is hit, but I cannot find a way to get that info.
I'm writing my "translator" in Python, but it's not a big piece of code, so I can easily switch to another language that provides the right libraries. Is there a way to get the buttons' velocities?
I found this on a site I used to use.
Using horiz_axis_pos = my_joystick.get_axis(0) and vert_axis_pos = my_joystick.get_axis(1), I think you can read the velocity, because it reads both the horizontal and vertical axes, from which you can also get an angle.
Try it! (Honestly, this seemed to work for me, but I don't know about your setup.)
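If you want to test it, a minimal polling loop looks like this (a sketch; whether the axis values actually carry hit velocity depends on the controller's driver):

    import time
    import pygame

    pygame.init()
    pygame.joystick.init()
    my_joystick = pygame.joystick.Joystick(0)
    my_joystick.init()

    while True:
        pygame.event.pump()  # keep pygame's internal device state fresh
        if my_joystick.get_button(0):  # while pad 0 is down
            horiz_axis_pos = my_joystick.get_axis(0)
            vert_axis_pos = my_joystick.get_axis(1)
            print("axes: %.3f %.3f" % (horiz_axis_pos, vert_axis_pos))
        time.sleep(0.005)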
I'm working on a python application that controls mouse movement.
I have absolute mouse position working perfectly, using the Quartz.CoreGraphics library, which exposes some CGEvent commands for mouse control (like "CGEventCreateMouseEvent" and "CGEventPost").
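Roughly, the absolute version looks like this (a sketch):

    import Quartz

    def move_mouse_absolute(x, y):
        # Build a mouse-moved event at an absolute screen coordinate and
        # post it to the system-wide HID event tap.
        event = Quartz.CGEventCreateMouseEvent(
            None, Quartz.kCGEventMouseMoved, (x, y), Quartz.kCGMouseButtonLeft)
        Quartz.CGEventPost(Quartz.kCGHIDEventTap, event)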
However, I can't find anything in the docs about relative mouse movement. I would really like to simulate actual mouse movements (i.e. "x sideways, y up" instead of "x,y"), because some of the people using my application have multiple monitors, and I imagine it would be a lot easier just to inform the OS that there was a mouse movement rather than setting the position myself.
The nature of my interface also lends itself to relative mouse movement.
In Windows, there is a function in the win32 API that allows for "raw" mouse commands that can do exactly what I am looking for. Is there any way to achieve this in OS X?
I do not think that it's possible with the way that events are managed. You need to capture the old (x, y) and calculate the delta yourself. This means that you'll have a problem when you hit the edge of the screen.
The good news is that you can move the mouse. So if the mouse hits the edge, you can reposition it to the center. You can also make the mouse pointer invisible. Finally, you can catch the mouse movements with a tracking rectangle that covers the entire screen. So the good news is that you can simulate precisely what you want to do, but the bad news is that it will take some work (see the sketch after the API list below).
Useful APIs for this include:
CGEventTap (See Quartz Event Services Reference)
CGPostEvent
CGDisplayMoveCursorToPoint (See Quartz Display Services Reference)
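A minimal sketch of the recenter trick, polling the cursor location and warping it back with CGWarpMouseCursorPosition (the center point is assumed here; in practice derive it from CGDisplayBounds):

    import time
    import Quartz

    CENTER = (640.0, 400.0)  # assumed screen midpoint

    Quartz.CGWarpMouseCursorPosition(CENTER)
    while True:
        # CGEventCreate(None) snapshots the current cursor location.
        loc = Quartz.CGEventGetLocation(Quartz.CGEventCreate(None))
        dx, dy = loc.x - CENTER[0], loc.y - CENTER[1]
        if dx or dy:
            print("relative move: %+.0f %+.0f" % (dx, dy))  # your handler here
            # Warp back to the center so the cursor never pins on an edge.
            Quartz.CGWarpMouseCursorPosition(CENTER)
        time.sleep(0.01)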
Other SO references include:
Limiting mouse to one display on Mac (Potentially using Cocoa)
Cocoa: Limit mouse to screen
Cocoa Move Mouse
Is there any way I could register button pushes on a joystick without using any functions from pygame?
If you're looking for a way to do it with just the standard Python library, the answer is probably "no" - or at the very least, "not in any straightforward manner". There are many things in the standard library, but gaming hardware support is not one of them.
If you're on Linux, you could check out Python Joystick Class using Gobject.
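On Linux you can also skip libraries entirely and read the kernel joystick device directly; every event on /dev/input/js0 is a fixed 8-byte struct (u32 timestamp in ms, s16 value, u8 type, u8 number):

    import struct

    JS_EVENT_BUTTON = 0x01  # button pressed/released
    JS_EVENT_INIT = 0x80    # synthetic events describing initial state

    with open("/dev/input/js0", "rb") as dev:
        while True:
            data = dev.read(8)
            if not data:
                break
            timestamp, value, etype, number = struct.unpack("<IhBB", data)
            if etype & JS_EVENT_BUTTON:
                state = "pressed" if value else "released"
                print("button %d %s at %d ms" % (number, state, timestamp))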