Zenity not working in python script with crontab - python

I have a python script in which I have used Zenity to display some notification. The code snippet is as follows:
if message_list:
    pretty_print(message_list)
    os.system("/usr/bin/zenity --notification --text='You have unread messages'")
When I run this script normally, everything works fine, i.e. the dialog box appears and the message gets displayed. But when I schedule this script in crontab, nothing appears. Any solution to this?

There is no sane way to run interactive commands from cron. There is no guarantee that there is a user, there is no guarantee that there is a single user, there is no guarantee that the user(s) who are on want to, or are even able to, interact with your software; some of them may be pseudo-users or remote on expensive metered Internet access or just idle or whatever.
The usual solution is a server/client architecture where whatever runs from cron makes its results available via some IPC mechanism, and users who want the results run a client from within their X11 session (or shell, or what have you).
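A minimal sketch of that idea (not from the original answer; the flag-file path and polling interval are arbitrary): the cron job writes its findings to a file, and a small watcher started from the user's X session polls that file and raises the zenity notification itself.
import os
import time

FLAG = "/tmp/unread_messages.flag"   # hypothetical path shared by the two scripts

# cron side: run from crontab, needs no X access
def cron_job(message_list):
    if message_list:
        with open(FLAG, "w") as f:
            f.write("You have %d unread messages" % len(message_list))

# user side: started from the X session (e.g. an autostart entry)
def session_watcher():
    while True:
        if os.path.exists(FLAG):
            with open(FLAG) as f:
                text = f.read().strip()
            os.remove(FLAG)
            # zenity works here because DISPLAY is inherited from the session
            os.system("/usr/bin/zenity --notification --text='%s'" % text)
        time.sleep(60)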

Create a script info.sh (remember to grant it execute rights):
#!/bin/bash
# Cron jobs have no X environment, so point at the session's display first.
# (Assumes the primary display is :0; adjust if yours differs.)
export DISPLAY=:0
xhost +  # note: this disables X access control for all hosts
/usr/bin/zenity --notification --text='You have unread messages'
And in your script:
if message_list:
    pretty_print(message_list)
    os.system("./info.sh")
That's if you want to use the solution you mentioned.

Related

running commands in an external gnome-terminal using subprocess.Popen

I am trying to execute commands, using communicate(), in the terminal that I spawned.
import subprocess
sitecreate_proc = subprocess.Popen(['gnome-terminal'], stdout=subprocess.PIPE, stdin=subprocess.PIPE)
out = sitecreate_proc.communicate("pwd")
print out
The "out" variable is always empty.
Displaying the terminal is necessary.
gnome-terminal is a graphical application and, as such, likely doesn't use the standard streams it inherited from the parent process.
You need to run console applications instead to communicate with them -
either the commands themselves:
>>> subprocess.check_output("pwd")
'/c/Users/Ivan\n'
or an interactive shell command, then send input to it and receive responses as per Interacting with bash from python
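A rough sketch of that second option, talking to a plain bash process instead of a terminal emulator (note that communicate() is a one-shot exchange, not an ongoing conversation):
import subprocess

# bash is a console application, so its standard streams really are the pipes we pass in.
shell = subprocess.Popen(['bash'],
                         stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)

# communicate() sends the input, closes stdin, waits for bash to exit and collects the output.
out, err = shell.communicate("pwd\n")
print out    # e.g. '/home/ivan\n'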
If you just need to send the subprocess's output to the same console that Python is using, you can simply write the data out as you receive it - either automatically with tee, or by hand at appropriate moments.
If, instead, you need to launch an independent terminal emulator window on a desktop and interact with it via IPC, that's another matter entirely - namely, UI automation, and has nothing to do with standard console streams.
The most common way for that in Linux is D-Bus (there are other options outlined at the previous link). People report, however (as of 2012), that gnome-terminal doesn't support D-Bus and that you have to jump through hoops to interact with it. There is an article on controlling konsole via D-Bus, though.
As I remember, communicate() returns a tuple:
communicate() returns a tuple (stdoutdata, stderrdata)
so you can't use communicate("pwd") like that. Wait until gnome-terminal returns, then fetch the result with sitecreate_proc.communicate()[0] for stdoutdata or sitecreate_proc.communicate()[1] for stderrdata.

Registry handles leaked?

We're running a Python script (which uses multithreading) to do some work on an Amazon-EC2 based Windows Server 2008 machine. When the machine starts, I can see that it starts executing the Python script, and then I start seeing messages like the following in the event log:
Windows detected your registry file is still in use by other applications or services. The file will be unloaded now. The applications or services that hold your registry file may not function properly afterwards.
DETAIL -
19 user registry handles leaked from \Registry\User\S-1-5-21-2812493808-1934077838-3320662659-500_Classes:
Process 2872 (\Device\HarddiskVolume1\Python27\python.exe) has opened key \REGISTRY\USER\S-1-5-21-2812493808-1934077838-3320662659-500_CLASSES
Process 2844 (\Device\HarddiskVolume1\Python27\python.exe) has opened key \REGISTRY\USER\S-1-5-21-2812493808-1934077838-3320662659-500_CLASSES
Process 2408 (\Device\HarddiskVolume1\Python27\python.exe) has opened key \REGISTRY\USER\S-1-5-21-2812493808-1934077838-3320662659-500_CLASSES
What exactly does this mean, and how do I stop Windows from killing some of the threads?
When a scheduled task is configured to run as a particular user, that user's account is logged on non-interactively in order to run the task. When the task is finished, the user's registry hive is unloaded. For some reason, this is happening prematurely.
From your description, you have a single scheduled task, which launches various subprocesses. It seems likely that the parent process is exiting before the subprocesses are finished, and that this is causing the user's registry hive to be unloaded. You can verify this theory by turning on auditing for process creation and termination (in Group Policy under Advanced Audit Policy Configuration) or by using a tool such as Process Monitor (available from the MS website).
Assuming this is the cause, the fix is for the parent process to wait for the subprocesses to exit before itself exiting; alternatively, depending on your circumstances, it may be sensible for the parent task to simply never exit.
If you don't have direct control over the relationship between the parent process and the subprocesses then you'll need to create a new parent process to launch the script for you, and then either wait for all subprocesses to complete or sleep forever, as appropriate.
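For instance, if the parent launches its workers with subprocess, the fix can be as small as this (a sketch; worker.py and its arguments are placeholders):
import subprocess

# Launch the workers...
workers = [subprocess.Popen(["python", "worker.py", arg])
           for arg in ["a", "b", "c"]]   # placeholder arguments

# ...and keep the parent alive until every worker has finished,
# so the user's registry hive stays loaded for the whole run.
for proc in workers:
    proc.wait()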
It may be that some of your files are corrupted. Try the following:
Perform an SFC (System File Checker) scan and see if it helps:
Press Windows key + X.
Select Command Prompt(Admin).
Type sfc /scannow and hit enter.
Also perform a chkdsk:
Press Windows Logo + C to open the Charms bar.
Now click Settings and then More PC Settings.
Now click General and then click Restart Now under Advanced Startup.
Now Click Troubleshoot.
Now click Advanced options and select Command prompt.
Type chkdsk /r and hit enter.
Last but not least, if the above doesn't work, you can perform a startup repair:
Press Windows logo + W to open the search box.
Type Advanced Startup options, hit enter.
Then Click Restart Now under Advanced Startup.
Now Click Troubleshoot.
Then click Advanced options and then Automatic Repair.
Hope it helps.

Python script will not run in Task Scheduler for "Run whether user is logged on or not"

I have written a python script and wanted to have it run at a set period every day with the use of Task Scheduler. I have had no problems with Task Scheduler for running programs while logged off, before creating this task.
If I select "Run only when user is logged on" my script runs as expected with the desired result and no error code (0x0).
If I select "Run whether user is logged on or not" with "Run with highest privileges" and then leave it overnight or log off to test it, it does not do anything and has an error code of 0x1.
I have the action to "Start a program" with the Details as follows:
Program/script: C:\Python27\python2.7.exe
Add arguments: "C:\Users\me\Desktop\test.py"
I think it has to do with permissions to use python while logged off but I can't figure this one out. Wondering if anyone has suggestions or experience on this.
This is on Windows 7 (fyi)
Thanks,
JP
I think I have found the solution to this problem. My script is used to create a powerpoint slide deck and needs to open MS PPT.
I stumbled upon a post from another forum with a link to MS's policy on this. It basically boils down to the following:
"Microsoft does not currently recommend, and does not support, Automation of Microsoft Office applications from any unattended, non-interactive client application or component (including ASP, ASP.NET, DCOM, and NT Services), because Office may exhibit unstable behaviour and/or deadlock when Office is run in this environment.
Automating PowerPoint from a scheduled task falls under the unsupported scenario when scheduled task is run with the option "Run whether user logged on or not". But, using it with "Run only when the user is logged on" option falls under the supported category."
From here
I would try it with the script not in your Users directory
I have experience supporting PowerPoint automation under the Task Scheduler by way of a C++ app called p3icli (available on sourceforge). This is the approach I successfully used:
1) Add a command-line (-T) switch that indicates p3icli will run under Task Scheduler.
2) The command-line switch forces p3icli to start an instance of powerpnt.exe using CreateProcess() and then wait X milliseconds for that instance to stabilize.
3) After X milliseconds elapse, p3icli connects to the running PPT instance created in step 2 and processes automation commands.
I would guess that a similar approach can be used with Python.
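For example, with pywin32 the three steps might look roughly like this (a sketch only; the PowerPoint path, the delay and the file names are guesses you would have to tune):
import subprocess
import time
import win32com.client

# Step 2: start PowerPoint ourselves and give it time to stabilize.
subprocess.Popen([r"C:\Program Files\Microsoft Office\Office15\POWERPNT.EXE"])  # path varies per install
time.sleep(10)  # the "X milliseconds" from step 2; tune for your machine

# Step 3: attach to the running instance and drive it through COM.
ppt = win32com.client.GetActiveObject("PowerPoint.Application")
try:
    pres = ppt.Presentations.Open(r"C:\scripts\deck.pptx")  # hypothetical deck
    # ... build or modify slides here ...
    pres.SaveAs(r"C:\scripts\deck_out.pptx")
    pres.Close()
finally:
    ppt.Quit()  # see the NB below: never leave orphaned powerpnt.exe instances behind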
Task Scheduler compatibility is easily the most troublesome feature I ever added to p3icli. For example, manipulating multiple presentations by changing the active window simply does not work. And as I'm sure you've discovered, debugging problems is no fun at all.
NB: Your python solution must include code that forces PowerPoint to unconditionally close when your python script is complete (modulo a python crash). Otherwise, orphaned instances of PowerPoint will appear in Task Manager.
Click the link for some thoughts on the Task Scheduler from a p3icli point of view.

pause system functionality until my python script is done

I have written a simple python script that runs as soon as a certain user on my linux system logs in. It asks for a password... however, the problem is they just exit out of the terminal or minimize it and continue using the computer. So basically it is a password authentication script. What I am curious about is how to make the python script stay up and not let them exit or do anything else until they have entered the correct password. Is there some module I need to import or some command that can pause the system functions until my python script is done?
Thanks
I am doing it just out of interest and I know a lot could go wrong but I think it would be a fun thing to do. It can even protect 1 specific system process. I am just curious how to pause the system and make the user do the python script before anything else.
There will always be a way for the user to get past your script.
Let's assume for a moment that you actually manage to block the X-server, without blocking input to your program (so the user can still enter the password). The user could just alt-f1 out of the X-server to a console and kill "your weird app". If you manage to block that too he could ssh to the box and kill your app.
There is most certainly no generic way to do something like this; this is what the login commands for the console and the session managers (like gdm) for the graphical display are for: they require a user to enter his password before giving him some form of interactive session. After that, why would you want yet another password to do the same thing? The system is designed to not let users use it without a password (or another form of authentication), but there is no API to let programs block the system whenever they feel like it.
You want the equivalent of a "modal" window, but this is not (directly) possible in a multiuser, multitasking environment.
The next best thing is to prevent the user from accessing the system. For example, if you create an invisible window as large as the display, it will intercept any mouse events, and whatever is "behind" it will be inaccessible.
At that point you have the problem of preventing the user from using the keyboard to terminate the application, or to switch to another application, or to another virtual console (this last is maybe the most difficult). So you need to access and lock the keyboard, not only the "standard" keyboard but the low-level keys as well.
And to do this, your application needs to have administrative rights, and yet run in the user environment. Which starts to look like a recipe for disaster, unless you really know what you are doing.
What you want to do should be done through a Pluggable Authentication Module (PAM) that will integrate with your display manager. Maybe, you can find some PAM module that will "outsource" or "callback" some external program, i.e., your Python script.
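For what it's worth, pam_exec is one module in that spirit: during authentication it can call an external program and hand it the entered password on stdin. A very rough sketch (the script path is a placeholder, and which file under /etc/pam.d/ to edit depends on your display manager; a mistake here can lock you out, so test carefully):
auth    required    pam_exec.so expose_authtok /usr/local/bin/extra_check.py
and the called script could be as simple as:
#!/usr/bin/env python
import sys
# pam_exec with expose_authtok supplies the typed password on stdin;
# exit 0 to accept, anything else to reject.
token = sys.stdin.read().rstrip("\n\0")
sys.exit(0 if token == "extra secret" else 1)   # placeholder check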
Since this thing is just for fun, here's a workaround: make the script log the user out if he ignores the prompt for some time, or closes the terminal/kills the process. Here's how it might look for GNOME:
import os

def set_exit_handler(func):
    import signal
    signal.signal(signal.SIGHUP, func)   # on closing the terminal
    signal.signal(signal.SIGTERM, func)  # on killing the process
    import atexit
    atexit.register(func)                # on Ctrl-C, Ctrl-D and other proper exits

if __name__ == "__main__":
    def on_exit(a=None, b=None):
        print "exit handler triggered"
        os.system("gnome-session-quit --logout --no-prompt")
    set_exit_handler(on_exit)

    print "Enter password:"
    raw_input()
    # ... some verification/timeout code ...
If the user now closes the terminal or kills the process, he'll be logged out instantly :)

How do I know if jobs have been/are being run? - Crontab

I have followed the suggestion in this question.
As I am using Django, I have set the script to store the date and time of each run in the db, but no entry has been stored in the database yet.
Is there a way to figure out whether the jobs are actually running, other than typing "top" and searching through the output?
First, I would probably configure cron to mail you any output by setting MAILTO:
In /etc/crontab:
MAILTO=username
Second, I usually add something to my script that (almost) cannot possibly fail, like the following:
#!/bin/sh
echo "$0 ran on `date +%c`" >> /tmp/crontab_test.log
# ... rest of program
If you're calling a python script directly from cron, you could do something similar or create a wrapper shell script.
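If you'd rather do it inside the Python script itself, the equivalent (with the same hypothetical log path) is:
import sys
import time

# First thing in the script: leave a trace that cron actually started it.
with open("/tmp/crontab_test.log", "a") as log:
    log.write("%s ran on %s\n" % (sys.argv[0], time.strftime("%c")))

# ... rest of program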
If you have sendmail installed, you can add the following to /etc/aliases:
root: your_name@domain.com
After you do that, update the aliases by running this command:
sudo newaliases
Cron will automatically email you every time a job is run. No need to specify that in the crontab file.
Also, make sure you test your email capabilities (e.g. make sure you are able to send emails from the server) and lastly, create a trivial cronjob and test if you receive an email.
Do not assume!
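As a quick end-to-end check, a throwaway entry like this in your crontab (crontab -e) runs every minute and always prints something, so cron has output to mail:
MAILTO=username
* * * * * echo "cron mail test"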
In addition to setting up cron to send email, you can send the output of cron to a separate syslog facility by adding the following to your /etc/syslog.conf.
# Log cron stuff
cron.* /var/log/cron.log
This should log a message to /var/log/cron.log each time a job is run.
