I have a Qt program with many buttons, user-interactable widgets, etc.
At one stage in the program, I would like all the widgets to temporarily 'stop working': to stop responding to mouse clicks and instead pass the event on to one function.
(This is so the User can select a widget to perform meta operations. Part explanation here: Get variable name of Qt Widget (for use in Stylesheet)? )
The User would pick a widget (to do stuff with) by clicking it, and of course clicking a button must not cause the button's bound function to run.
What is the correct (most abstracted, sensible) method of doing this?
(one which doesn't involve too much new code, i.e. not subclassing every widget)
Is there anything in Qt designed for this?
So far, I am able to retrieve a list of all the widgets in the program (by calling
QObject.findChildren(QtGui.QWidget)
), so the solution can incorporate this.
My current horrible ideas are:
Somehow dealing with all the application's events all the time in one function, and not letting events through when I need the application to be dormant.
When I need dormancy, make a new transparent widget which receives mouse clicks and stretch it over the entire window. Take the coordinates of the click and figure out the widget underneath.
Somehow create a new 'shell' instance of the window.
THANKS!
(Sorry for the terrible write-up; in a slight rush)
python 2.7.2
PyQt4
Windows 7
You can intercept events sent to specific widgets with QObject::installEventFilter.
graphite answered this one first so give credit where credit is due.
For an actual example in PySide, here's some code you might draw from:
my_app.py
import sys
from PySide.QtGui import QApplication
from KeyPressEater import KeyPressEater

if __name__ == "__main__":
    app = QApplication(sys.argv)
    eater = KeyPressEater()
    app.installEventFilter(eater)
    sys.exit(app.exec_())
KeyPressEater.py
from PySide.QtCore import QObject

class KeyPressEater(QObject):
    # subclass QObject so we can override eventFilter
    def __init__(self, parent=None):
        super(KeyPressEater, self).__init__(parent)
        self.ignore_input = False

    def eventFilter(self, obj, event):
        if self.ignore_input:
            # swallow the event: returning True stops it reaching its target
            return True
        else:
            # let the event through to its normal handling
            return QObject.eventFilter(self, obj, event)
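Applied to the original question (swallowing mouse clicks and handing the clicked widget to a single function), a filter along these lines might work. This is only a sketch: WidgetPicker and on_widget_picked are hypothetical names, and the same idea should translate between PySide and PyQt4.

from PySide.QtCore import QObject, QEvent

class WidgetPicker(QObject):
    # while "picking" is active, swallow mouse presses and report the
    # widget that was clicked instead of letting it react normally
    def __init__(self, on_widget_picked, parent=None):
        super(WidgetPicker, self).__init__(parent)
        self.on_widget_picked = on_widget_picked
        self.picking = False

    def eventFilter(self, obj, event):
        if self.picking and event.type() == QEvent.MouseButtonPress:
            self.on_widget_picked(obj)   # obj is the object about to receive the click
            return True                  # swallow the click so buttons don't fire
        return QObject.eventFilter(self, obj, event)

You would install it once with app.installEventFilter(picker) and toggle picker.picking whenever the UI should go 'dormant'.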
Considering a very basic HelloWorld PyQt5 application like:
import sys
from PyQt5.QtWidgets import QApplication, QWidget, QLabel

app = QApplication(sys.argv)
window = QWidget()
window.setWindowTitle('PyQt5 app')
window.setGeometry(100, 100, 280, 80)
window.move(60, 15)
helloMsg = QLabel('<h1>Hello World!</h1>', parent=window)
helloMsg.move(60, 15)
window.show()
sys.exit(app.exec_())
It constructs a QApplication, a parent-less QWidget that becomes the main window, adds a QLabel to it, and shows it.
My question is: how does the QApplication know about the main window?
There is nothing in this code connecting the two.
Perhaps it is a naive question but just looking at this, it seems like magic.
How is the main window's paint event added to the application's event queue without this being stated anywhere in the source code? How does the QApplication instance know about what is going to be created later in the source code?
tl;dr
There's no "magic" involved: sub-modules can access their "main" modules, and each module of Qt can know if a QApplication instance is running.
Long version
I think that's an interesting question, especially for those who are not that into low-level programming. For instance, I've always taken the QApplication as some sort of 'Cartesian' assumption: «it exists».
As a premise, I'm not going to give you a very technical and low-level explanation: I don't have enough skills to do so (and I really welcome any other answer or edit to this), but I'm assuming that's not what you're looking for.
[Almost] technically speaking, you have to remember that Qt - and PyQt along with it - is an environment (the exact term is framework). As such, each one of its sub-elements (classes, and eventually instances of them) "knows" about that environment.
QApplication (and its base classes QGuiApplication and QCoreApplication) is a class that is internally accessible from any "sub" Qt module.
It's something like the built-in types (str, int, bool, etc.) that are accessible from any module. For example, os.path is a Python module that you can import on its own, but it knows what the main os module is, and each function of os.path actually uses some part of that main module.
Like most frameworks, Qt has what is called an event loop, which is usually started as soon as you call Q[*]Application.exec(). An event loop is something that generally blocks, waiting for something to happen (an event), and eventually reacts to it.
Whenever a Qt class needs it, it internally calls the Q[*]Application.instance() method to ensure that an instance of the application is running, meaning that an event loop is active. For example, Qt widgets need that to show the interface and interact with it: Qt tells the operating system that a new window has been created and has to be drawn on the screen; the OS answers "ok, let's show it" by sending Qt an event requesting the drawing; Qt then sends that event to the window, which finally draws itself and tells Qt how it is painted; finally, Qt tells the OS what's going to be shown. At the same time, that window might need to know whether some keyboard or mouse event has been sent to it, and react in some way.
You can see this in the Qt sources: whenever a new QWidget is created, it ensures that a QApplication exists by calling QCoreApplication.instance().
The same happens for other Qt objects that require an application event loop running. This is the case of QTimer (that doesn't require a graphical interface, but has to interface with the system for correct timing) and QPixmap (which needs to know about the graphical environment to correctly show its image), but in some specific cases it also depends on the platform (for example, creation of a QIcon on MacOS requires a running event loop, while that's not necessary on Linux and Windows).
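You can observe that mechanism directly; here is a minimal check, assuming PyQt5 (the exact module paths differ slightly between bindings):

import sys
from PyQt5.QtCore import QCoreApplication
from PyQt5.QtWidgets import QApplication

print(QCoreApplication.instance())         # None: no application exists yet

app = QApplication(sys.argv)
print(QCoreApplication.instance() is app)  # True: any Qt class can now find it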
So, finally, that's what (roughly) happens when you run your code:
# create an application instance; at this point the loop is not "running"
# (but that might be enough to let know most classes about the current system
# environment, such as available desktop geometries or cursor position)
app = QApplication(sys.argv)
# create a widget; an application exists and the widget can "begin" to create its
# interface using the information provided by it, like the system default font
# (it's actually a bit more complicated due to cross-platform issues, but let's
# ignore those things now)
window = QWidget()
window.setWindowTitle('PyQt5 app')
window.setGeometry(100, 100, 280, 80)
window.move(60, 15)
helloMsg = QLabel('<h1>Hello World!</h1>', parent=window)
helloMsg.move(60, 15)
# "ask Qt to prepare" the window that is going to be shown; at this point the
# widget's window is not shown yet even if it's "flagged as shown" to Qt, meaning
# that "window.isVisible()" will return True even if it's not actually visible yet
window.show()
# start the event loop by running app.exec(); sys.exit will just "wait" for the
# application to return its value as soon as it actually exits, while in the
# meantime the "exec" function will run its loop almost as a "while True" cycle
# would do; at this point the loop will start telling the OS that a new window
# has to be mapped and wait from the system to tell what to do: it will probably
# "answer" that it's ok to show that window, then Qt will tell back the widget
# that it can go on by "polishing" (use the current style and app info to finally
# "fix" its size) and begin drawing itself, then Qt will give back those drawing
# information allowing the OS to actually "paint" it on the screen; then it will
# be probably waiting for some user (keyboard/mouse) interaction, but the event
# loop might also tell the OS that the window is willing to close itself (as a
# consequence of a QTimer calling "widget.close", for instance) which could
# possibly end with ending the whole event loop, which is the case of
# https://doc.qt.io/qt-5/qguiapplication.html#quitOnLastWindowClosed-prop
# which would also cause the application to, finally, return "0" to sys.exit()
sys.exit(app.exec_())
I have a Python 3.7 tkinter GUI, and within it I have implemented up/down arrow key controls for the main part of the application. Next to it I have a listbox that also controls the application, but in a different way. By default, after a listbox selection has been made, the listbox selection scrolls with the up and down arrows. So, after I've used the listbox, an arrow key triggers an event both in the main part of the application and in the listbox, and my application then responds to the change in listbox selection by loading new data into the main app. This is obviously unacceptable.
How can I disable the arrow key controls feature of tkinter's Listbox?
I've tried configuring the listbox to not take focus, but this doesn't seem to disable the feature.
Edit:
I solved this by binding the listbox's FocusIn event to a function that immediately focuses something else. This is far from ideal, as the code now focuses the widget and then changes focus for no reason. If there is a way to disable focus on a widget completely, or to disable the listbox key bindings, that would be a preferred solution.
from tkinter import *

class App:
    def __init__(self):
        self.root = Tk()
        self.dummy_widget = Label(master=self.root)
        self.lb = Listbox(master=self.root)
        # whenever the listbox gains focus, immediately push focus elsewhere
        self.lb.bind("<FocusIn>", lambda event: self.dummy_widget.focus())
        # Additional setup and packing widgets...

if __name__ == '__main__':
    app = App()
    app.root.mainloop()
This seems very "hacky", although it does the job perfectly.
How can I disable the arrow key controls feature of tkinter's Listbox?
Create your own bindings for the events you want to override, in the widget in which you want them overridden. Do anything you want in that function (including nothing), and then return the string "break", which is the documented way to prevent events from being processed any further.
For a more extensive description of how bindings work, see this answer to the question Basic query regarding bindtags in tkinter. It describes how a character is inserted into an entry widget, but the mechanism is identical for all events and widgets in tkinter.
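A minimal sketch of that approach applied to the listbox arrow keys (widget names here are just illustrative):

import tkinter as tk

root = tk.Tk()
lb = tk.Listbox(root)
for item in ("one", "two", "three"):
    lb.insert("end", item)
lb.pack()

def ignore_arrows(event):
    # Returning "break" stops the event from reaching the Listbox
    # class bindings, so the selection no longer moves with the arrows.
    return "break"

lb.bind("<Up>", ignore_arrows)
lb.bind("<Down>", ignore_arrows)

root.mainloop()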
I have tried looking through the documentation but not sure which to use.
I am basically looking for an event that is called when the panel containing the combobox is shown.
My program is split into multiple panels, which the user switches between with buttons. I haven't been able to get the button which switches panels to be able to interact with the combobox, so I've been trying to get it to update when the panel is shown.
class SomePanel(wx.Panel):
... # Panel initialisation / event listeners
def panelShown(self, event):
# update combobox
Edit: I have found it. Leaving question up in case anyone else needs it.
For anyone with code as weird as mine.
In the SomePanel class:
self.Bind(wx.EVT_SHOW, self.panelShown)
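Putting the pieces together, a rough sketch of how this might look (the combobox contents here are just placeholders):

import wx

class SomePanel(wx.Panel):
    def __init__(self, parent):
        super(SomePanel, self).__init__(parent)
        self.combo = wx.ComboBox(self, choices=[])
        # EVT_SHOW fires when this panel is shown or hidden
        self.Bind(wx.EVT_SHOW, self.panelShown)

    def panelShown(self, event):
        if event.IsShown():   # on older wxPython classic this is event.GetShow()
            # refresh the combobox each time the panel becomes visible
            self.combo.SetItems(["placeholder 1", "placeholder 2"])
        event.Skip()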
With wxPython, how does one trigger an event whenever the whole window goes into/out of focus?
To elaborate, I'm building a serial terminal GUI and would like to close down the connection whenever the user doesn't have my application selected, and re-open the connection whenever the user brings my app back into the foreground. My application is just a single window derived from wx.Frame.
The correct answer for this case is to use an EVT_ACTIVATE handler bound to the frame. There will be an event whenever the frame is activated (brought into the foreground relative to other windows currently open) or deactivated. You can use the event object's GetActive method to tell which just happened.
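A minimal sketch of that, assuming a wx.Frame subclass (the frame name and the connection handling are placeholders):

import wx

class TerminalFrame(wx.Frame):
    def __init__(self, *args, **kwargs):
        super(TerminalFrame, self).__init__(*args, **kwargs)
        self.Bind(wx.EVT_ACTIVATE, self.on_activate)

    def on_activate(self, event):
        if event.GetActive():
            pass  # frame brought to the foreground: re-open the serial connection
        else:
            pass  # frame deactivated: close the serial connection
        event.Skip()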
As a WxPerl programmer, I know there are
EVT_SET_FOCUS
EVT_KILL_FOCUS
If you register this event handler with the frame as the first parameter, it should work as in Perl, since the API is almost the same.
Interesting article at http://www.blog.pythonlibrary.org/2009/08/27/wxpython-learning-to-focus/
Gist of it: wx.EVT_KILL_FOCUS works fine, but wx.EVT_SET_FOCUS behaves a little oddly for any panel containing widgets (the child's set-focus prevents the panel's set-focus event from firing as expected?)
In addition to what these fellows are saying, you might also want to try EVT_ENTER_WINDOW and EVT_LEAVE_WINDOW. I think these are fired when you move the mouse into and out of the frame widget, although I don't think the frame has to be in focus for those events to fire.
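For completeness, a tiny sketch of those enter/leave bindings on a bare frame (the handler bodies are just placeholders):

import wx

def on_enter(event):
    print("mouse entered the frame")
    event.Skip()

def on_leave(event):
    print("mouse left the frame")
    event.Skip()

app = wx.App(False)
frame = wx.Frame(None, title="hover demo")
frame.Bind(wx.EVT_ENTER_WINDOW, on_enter)
frame.Bind(wx.EVT_LEAVE_WINDOW, on_leave)
frame.Show()
app.MainLoop()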
# Hugh - thanks for the readership!
I use GtkAboutDialog and everything works fine except the close button of this widget. All the other buttons work fine; I don't know how, but all the buttons have default callbacks and they create and destroy the windows.
But the "Close" button of the GtkAboutDialog widget does not work. I cannot even see its widget. So, how can I access it?
[CLARIFICATION] What you're looking at is gtk.AboutDialog, a popup window displaying information about an application (new in PyGTK 2.6). This window contains the 'close' button widget, which is contained in a GtkHButtonBox widget. The GtkHButtonBox widget is the highest-level widget I am able to access, for some reason. Any ideas on how to get to the "close" button and connect a handler for a callback signal?
You don't connect signals in the same way for a dialog as you do for a window. I made the same mistake when learning PyGTK.
The most basic form of a dialog is you display and run the dialog with:
aboutdialog.run()
Often you will then immediately call:
aboutdialog.destroy()
The .run() line is a loop, which runs until something happens within the dialog.
There is a working example here.
The gtk.AboutDialog is just a gtk.Dialog, and you handle responses from it in the same way. Instead of connecting to the clicked signal of the buttons, the dialog code handles that for you and returns a response from your run() call. You can check the value of the response returned to figure out what button was clicked.
If you're trying to override some behaviour instead, you can connect to the response signal of gtk.Dialog.
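As a rough sketch with PyGTK (the program name and version are placeholders; compare the returned id against whichever gtk.RESPONSE_* constants you care about):

import gtk

about = gtk.AboutDialog()
about.set_program_name("My App")   # placeholder
about.set_version("1.0")           # placeholder

# run() blocks until a button is clicked or the window is closed,
# then returns a response id identifying what happened
response = about.run()
print("dialog returned response id: %s" % response)

about.destroy()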
This is an old question, but since it's one of the first hits from Google, I thought I'd throw in the solution that I found. You need an event handler to show the about dialog and one to close it. The first will likely be connected to your Help->About menu item's activate signal. The latter should be connected to the response signal of the about dialog. The two handlers will look something like this:
def on_menuitemHelpAbout_activate(self, *args):
    self.builder.get_object('aboutdialog').show()

def on_aboutdialog_response(self, *args):
    self.builder.get_object('aboutdialog').hide()
In the example above, I'm using the GtkBuilder to find my about dialog because I've constructed the interface with glade. Note that I'm using .show() over .run() because I don't see the sense in pausing program execution until the dialog is closed. Finally, the response handler can be made to take whatever action depending upon the response, but I'm ignoring it here.