I've got two versions of Python installed (2.7 and 3.5) via ArcGIS (Desktop 10.4 and Pro). I have multiple scripts that must run daily. The Python 2 scripts all run perfectly via Windows Task Scheduler; however, my Python 3 scripts simply refuse to run from it.
I originally had them set to run from .bat files, which work fine when run manually.
@echo off
python3 \\Myserver\Pythonstuff\myscript.py
These .bat files will not run via Task Scheduler. Some things I've tried:
setting python3.exe as the program/script and adding the script location as the argument
setting cmd as the program/script and adding python3 \\Myserver\Pythonstuff\myscript.py as the argument
Nothing seems to work here. The other scripts must keep running in Python 2 and I can't risk breaking them, so I'd like to keep the Python 2 and Python 3 scripts cleanly separated. Any ideas?
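One thing worth trying first (my suggestion, not from the original post): have the scheduled task run a tiny diagnostic script that logs which interpreter actually launched it, since Task Scheduler does not use the same PATH and working directory as an interactive session. The log path here is illustrative:

# diagnose_env.py - records which interpreter and environment ran this script
import os
import sys

with open(r'C:\Temp\scheduler_diag.txt', 'w') as f:  # any writable path
    f.write('interpreter: %s\n' % sys.executable)
    f.write('version: %s\n' % sys.version)
    f.write('cwd: %s\n' % os.getcwd())
    f.write('PATH: %s\n' % os.environ.get('PATH', ''))

If the log never appears, or shows the 2.7 interpreter, point the task's Program/script field at the full path of the ArcGIS Pro python.exe instead of relying on python3 being on the PATH.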
Related
Recently I needed to affect the bash shell that runs a Python script from within the script itself. I'm developing a Python utility package that adds some additional functionality to pip. One of the workflows in this package needs to activate a virtualenv to work as planned. Here is the problem.
When you run something like:
os.system('/bin/bash ./venv/bin/activate')
Or:
subprocess.Popen(['/bin/bash', './venv/bin/activate'])
Neither does anything to the shell the script was launched from, apparently because these commands are executed in separate child processes and therefore cannot affect the parent bash process.
Question: how can you affect the parent shell that executes a Python script from inside that script (set some environment variables, run another script, etc.)?
Thanks in advance.
how can you affect the parent shell that executes a Python script from inside that script (set some environment variables, run another script, etc.)?
It is not possible to do that, unless your operating system is broken. Process isolation is one of the very basic concepts of an operating system.
Instead, research what venv does, how it works, and what the activate script does, and just add the proper directory to the Python module search path.
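A minimal sketch of that approach, with illustrative paths and a hypothetical target_script.py: either launch the work through the venv's own interpreter, or put the venv's site-packages on the module search path; neither requires touching the parent shell.

import subprocess
import sys

# Option 1: run a script with the venv's interpreter; no activation is
# needed, because the interpreter's own location selects the environment.
subprocess.check_call(['./venv/bin/python', 'target_script.py'])

# Option 2: make the venv's packages importable in the current process
# by prepending its site-packages directory (version dir is illustrative).
sys.path.insert(0, './venv/lib/python3.5/site-packages')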
How can I use a Python script which needs certain versions of packages without forcing each new user of the script to install every package separately? I read about virtual environments, but how do I create one for all users on Windows 10, and how do I run the script with the respective environment?
Have you come across a similar challenge?
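A sketch of one way to do this (paths and file names are assumptions, not from the question): create the environment once in a location every user can read, pin the package versions in a requirements.txt, and always launch the script with that environment's own python.exe, so no per-user installs are needed.

# setup_env.py - one-time setup of a shared environment (run once, as admin)
import subprocess
import venv

ENV_DIR = r'C:\SharedEnvs\myscript-env'           # hypothetical shared location
REQUIREMENTS = r'C:\SharedEnvs\requirements.txt'  # pinned package versions

venv.EnvBuilder(with_pip=True).create(ENV_DIR)
subprocess.check_call([ENV_DIR + r'\Scripts\python.exe',
                       '-m', 'pip', 'install', '-r', REQUIREMENTS])

Each user then runs the script as C:\SharedEnvs\myscript-env\Scripts\python.exe myscript.py (e.g. via a shared shortcut), which needs no activation step.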
I am looking to run my new Python script as a service. The problem is the users most definitely would not have Python installed. After doing research I found the two main options: nssm and the win32serviceutil code, but as far as I understand both require a Python installation on the PC, since you have to specify the path to your python.exe.
Is there any other way to make a Python script to run as soon as Windows is started and run in the background, which doesn't require an existing Python installation?
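The usual workaround (my suggestion, not something the nssm or pywin32 docs promise out of the box) is to freeze the script with a tool like PyInstaller, which bundles an interpreter into a standalone .exe, and then install that .exe as a service, e.g. with nssm. For reference, a minimal pywin32 service skeleton that can be frozen this way; the service names are hypothetical:

# my_service.py - minimal Windows service skeleton (requires pywin32 to build)
import win32event
import win32service
import win32serviceutil

class MyService(win32serviceutil.ServiceFramework):
    _svc_name_ = 'MyPythonService'          # hypothetical service name
    _svc_display_name_ = 'My Python Service'

    def __init__(self, args):
        win32serviceutil.ServiceFramework.__init__(self, args)
        self.stop_event = win32event.CreateEvent(None, 0, 0, None)

    def SvcStop(self):
        self.ReportServiceStatus(win32service.SERVICE_STOP_PENDING)
        win32event.SetEvent(self.stop_event)

    def SvcDoRun(self):
        # Do periodic work until the stop event is signalled.
        while win32event.WaitForSingleObject(self.stop_event, 5000) \
                == win32event.WAIT_TIMEOUT:
            pass  # replace with the actual background work

if __name__ == '__main__':
    win32serviceutil.HandleCommandLine(MyService)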
We're using Python 2.7 and Fabric 1.13 on Windows Server 2012, on which Git-Bash is also installed for its almost Linux-like terminal. Amazingly, we've gotten it all to work - almost. It's a hodgepodge, but seems less clunky than Cygwin.
I have a script in my user's ~/.bash_profile which works fine when I run it manually inside the Git-Bash CLI. This script is a local preparation for what comes next.
When I try to use Fabric's local command to run this function, it just fails with return code 1. I'm assuming that my function, defined in my user's environment, is not present in the shell that Fabric spawns.
How do I find the user that my fabric script runs as (again, on Windows)? And more to the point, how do I inject this code (function) into that user's environment? Or, how do I make it global to all users?
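A sketch of both steps, assuming Fabric 1.x's fabric.api.local and a hypothetical my_prep_function defined in ~/.bash_profile: local() runs its command through a non-login shell by default, so profile functions are never sourced; forcing a login shell should work around that (untested on the Git-Bash setup described above):

from fabric.api import local
import getpass

def prep():
    # Fabric's local() runs as whoever launched fab; confirm which user:
    print('fab is running as: %s' % getpass.getuser())
    # Force a login shell so ~/.bash_profile is sourced before calling
    # the (hypothetical) function defined there.
    local('bash -l -c "my_prep_function"')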
I am trying to daemonize celery and celerybeat. I have downloaded the celeryd and celerybeat files from GitHub and placed them in /etc/init.d/ (celery and celerybeat) with the corresponding config files under /etc/default/.
My problem is that when I run these two scripts, celeryd and celerybeat use the system Python (2.4), and as a result cannot find the applications installed under Python 2.7. Python 2.7 is set up in my ~/.bashrc and ~/.bash_profile files, so I do not have any problems running other applications; it is only the workers that fail. When I run python ...../manage.py celery (with all options), everything works like a charm.
Please let me know how I can force the /etc/init.d/ scripts to run Python 2.7.
I have tried to implement #! /bin/sh python, but it does not work.
Scripts in /etc/init.d are usually run as root at system startup. root's ~/.bashrc (that is, /root/.bashrc) will look totally different from yours (e.g. /home/reza/.bashrc).
Also, your shell behaves slightly differently depending on whether it is running interactively or not.
Hence there is no use in trying to run the Python interpreter through /bin/sh; it only adds overhead.
What you want is a proper shebang that tells the system which interpreter to use for your script.
e.g.
#!/usr/bin/python2.7
will use the python2.7 binary installed in /usr/bin.
(so whenever you run /etc/init.d/foo.py the system really runs /usr/bin/python2.7 /etc/init.d/foo.py)
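A quick way to verify the shebang is honored (my own check, with a hypothetical file name): make the file executable with chmod +x and run it directly; it should report 2.7.x.

#!/usr/bin/python2.7
# check_interpreter.py - prints which interpreter the shebang selected
import sys
print(sys.version)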