A window can be moved by holding down CTRL and Mouse1. I would love to deactivate this, or deactivate window moving in general.
My problem is that I use the CTRL key to mark multiple images in my explorer.
After an image is marked, it is processed in PIL, which takes about 100-200 ms.
So when the user holds down CTRL and clicks Mouse1, the image gets processed in PIL. When the user then releases Mouse1 and moves on to the next element, the whole window is moved a fair amount (~100 pixels) toward wherever the user moves the mouse. (I guess the Mouse1 click is still buffered and does not get updated to "is_released" while the program is busy.)
My code is split across multiple files and too long to post, but I think my question is straightforward and simple enough: how can I toggle/block/unblock the move-window ability? Preferably by deactivating CTRL-move in general, so the user can still move the window via the title bar.
I searched the docs and found window.move(), but putting the window back to its original position would not be a pretty solution.
Thank you.
Refer to the PySimpleGUI Call Reference,
option in Window:
grab_anywhere_using_control
If True, CONTROL key + left mouse button can be used to click and drag to move the window. DEFAULT is True. Unlike normal grab anywhere, it works on all elements.
window = sg.Window('Title', layout, grab_anywhere_using_control=False)
Related
I have had this problem for ages with many different projects, so I really want to find out how to fix it. For example, in one of my projects I created a level editor for a game. The level editor had the option to save and load levels from a file using tkinter.filedialog.
Now after I select the file, the game will still work, but the 'X' close button doesn't work anymore, and I can't move the window.
The game itself works as usual, and I can still interact with everything inside of the window, but I can't move or close the window.
Thanks in advance.
OK, I figured this one out. The problem went away when I called the tkinter functions from a key press, not a mouse press.
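A minimal sketch of that workaround, with the event handling reduced to a plain dispatcher so the idea is clear. The event dict, the 'l' key, and the injected `open_dialog` callable are all assumptions, not the original code; in the real editor `open_dialog` would be `tkinter.filedialog.askopenfilename`:

```python
def dispatch(event, open_dialog):
    """Open the load dialog only from a KEY event, never a mouse event.

    Routing the tkinter dialog through a key press instead of a mouse
    press is the fix: the window no longer ends up unmovable/unclosable.
    open_dialog is injected (e.g. tkinter.filedialog.askopenfilename)
    so this logic can be exercised without a display.
    """
    if event.get("type") == "keydown" and event.get("key") == "l":
        return open_dialog()
    return None  # mouse events deliberately do nothing here
```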
SPECS:
Using PsychoPy v1.90.3
Surface Pro 6 running Windows 10
BACKGROUND:
I am programming a touchscreen task to use with children (reference image linked below). In the task, children need to press and hold the red 'home' button at the bottom of the screen for 1.5 seconds to advance through trials (i.e., opening the windows at the top of the screen to reveal boxes they can open). I also collect the times at which the mouse button is pressed and released (used to calculate response time).
I originally programmed this task on my desktop using a mouse, so pressing and holding the home button was no problem using this code:
mouse = event.Mouse(visible=True)
while not homePressed:
    if mouse.isPressedIn(home) and home.contains(mouse):
        core.wait(1.5, hogCPUperiod=1.5)  # when home button is pressed, wait for 1.5 s
        if mouse.isPressedIn(home) and home.contains(mouse):  # check if home button is still pressed
            homePressed = True
When I tried to run the task on the Surface Pro, I ran into a problem with the touchscreen not registering a 'press and hold'. I've learned that the touchscreen doesn't register mouse clicks unless the screen has been pressed AND released, because a press and hold could be (1) a right click or (2) a swipe. I've tried disabling the option that makes 'press and hold' register as a right click on the Surface Pro, but this has not solved my issue.
QUESTIONS:
Is there a way to get the Surface Pro or PsychoPy to register a press and hold on the touchscreen the same way it does using a mouse so children press the 'home' button down to continue the trials?
If yes, can I get PsychoPy to output the 'press' (when the screen is touched) and 'release' (when the touch is no longer present) output the same way it does for a mouse click?
If this cannot be accomplished with the PsychoPy library, are there possible python solutions outside of PsychoPy I could try?
SOLUTIONS TRIED:
using only home.contains(mouse) solution found here
resetting the mouse location solution found here
fixing the 'double tap' issue solution found here
Disabling the Surface Pro's right-click function for touch
task set up image
You could try writing a loop that checks the hold time yourself. For example, in a gaze-contingent study we have loops to check how long someone has been looking at something, similar to your "hold" requirement.
tCueOn = expClock.getTime()
while True:
    curtime = expClock.getTime() - tCueOn
    eventType = eyelink.getNextData()
    sample = eyelink.getNewestSample()
    # <<... a bunch of sample processing cut out ...>>
    if curtime >= cueTime:
        break
In your case you detect the press, get the time, and enter a loop where you repeatedly check that the press is still held and that the elapsed time is less than 1.5 s. When it exceeds 1.5 s you break out of the loop; if they let go before 1.5 s you return to wherever you need to in your use case. You might find it convenient to bundle this logic in a function that you can call whenever the home button is pressed.
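That logic could be sketched as a self-contained function. This is a PsychoPy-free version for clarity: `is_pressed` and `now` are injected stand-ins for the real calls (in your task, roughly `lambda: mouse.isPressedIn(home) and home.contains(mouse)` and `expClock.getTime`), so the names are assumptions:

```python
def held_for(is_pressed, now, hold_time=1.5):
    """Return True once is_pressed() has stayed True for hold_time seconds.

    is_pressed: callable, True while the home button is currently down
    now:        callable returning the current time in seconds
    Returns False as soon as the press is released early.
    """
    t_start = now()
    while is_pressed():
        if now() - t_start >= hold_time:
            return True   # held long enough -> advance the trial
    return False          # released before hold_time elapsed
```

Unlike `core.wait(1.5)`, this keeps polling the press status during the hold, so an early release is caught immediately rather than only at the end of the wait.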
Hope this helps, cheers.
I have a kivy file with several widgets, one of which is a switch.
The problem is that wherever I press on the screen, it flips the switch.
I have two other widgets, a checkbox and a set of radio buttons, which also have some problems; I think these occur because of the switch.
The problem is that in order to press them I need to press on a different part of the screen, not on the widget itself.
Any help would be appreciated.
UPDATE
I am using Windows only for development.
You probably have a switch that is larger than you expect. From the documentation of the switch: "The entire widget is active, not just the part with graphics. As long as you swipe over the widget’s bounding box, it will work." I would try changing the background of the switch to see how big it really is.
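If that is the case, one hedged fix is to stop the switch from filling its parent by giving it an explicit size instead of the default size_hint of (1, 1). A minimal kv fragment (layout and values are illustrative, not your file; the size roughly matches the switch graphic):

```kv
FloatLayout:
    Switch:
        size_hint: None, None
        size: 83, 32                      # approx. size of the switch graphic
        pos_hint: {'center_x': 0.5, 'center_y': 0.5}
```

With the bounding box shrunk to the graphic, touches elsewhere on the screen should reach the checkbox and radio buttons instead of being swallowed by the switch.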
Does anybody know how to move the mouse pointer to a specific Gtk widget from Python code? Google does not seem to yield any results. I want to hover the pointer over the Print button when the GtkPrintOperation dialog pops up; in theory, the code should work with any Gtk widget. What I have found so far: you can get the position of the mouse, but I need to set it. If I need to rewrite the whole Gtk stack, it probably won't happen. One alternative would be to set the GtkButton as the default, so that ENTER would print the document without moving the mouse to select the button.
Why I need this: the project is for my brother-in-law, who uses OpenOffice Basic and has this feature there. I would definitely like to make an impression with Gtk (which is way more powerful). As he prints a lot of documents, this is very much needed. Any suggestions?
In X-Windows, you can use xdotool:
$ xdotool mousemove x y
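From Python you could simply shell out to it. A sketch, assuming xdotool is installed and an X session is running (the helper names are mine):

```python
import subprocess

def build_mousemove_cmd(x, y):
    """Build the xdotool invocation that warps the pointer to (x, y)."""
    return ["xdotool", "mousemove", str(x), str(y)]

def move_pointer(x, y):
    # Actually execute it; requires xdotool on PATH under X11.
    subprocess.run(build_mousemove_cmd(x, y), check=True)
```

To aim at a specific widget you would first translate the widget's position to root-window coordinates (e.g. via the toplevel's Gdk window origin plus the widget allocation) and pass those to move_pointer.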
A part of a small project I am working on involves 'calibrating' the coordinates of the screen of which to take a screen capture of.
By the 'screen', I refer to the entire desktop, not my GUI window.
The coordinates are calibrated when a QDialog window appears (which I've subclassed).
The user is prompted to click several locations on the screen.
I need the program to record the locations of all mouse clicks occurring anywhere on the screen, including ones that don't natively trigger a QDialog mouseEvent because they land outside the window.
Obviously, overriding the mouseEvent method does not work, since the QDialog doesn't receive those clicks.
How can I capture global mouse clicks, so that an event is triggered and sent to the QDialog when any part of the screen is clicked?
(I'd prefer a Qt based solution, but am open to other libraries if need be).
Thanks!
There are some cross-platform examples of how to do this with http://pypi.python.org/pypi/autopy/0.51
I've assumed this isn't possible with Qt alone and am instead using pyHook, letting Qt pump the messages.
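For readers hitting this later: a cross-platform alternative is pynput, which runs a global mouse hook on a background thread. A sketch, the callback signature is pynput's `on_click(x, y, button, pressed)`; forwarding into the QDialog via a Qt signal is my assumption about how you would hand the points to the GUI thread:

```python
clicks = []

def on_click(x, y, button, pressed):
    """pynput-style global click callback: record press positions.

    In the Qt app you would emit a signal here (instead of appending to
    a list) so the QDialog receives the point on the GUI thread.
    """
    if pressed:
        clicks.append((x, y))

def start_global_listener():
    # Requires: pip install pynput. The hook runs in its own thread;
    # call .stop() on the returned listener when calibration is done.
    from pynput import mouse
    listener = mouse.Listener(on_click=on_click)
    listener.start()
    return listener
```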