Running bash commands from Python in a single session

How do I run bash commands from Python while the session is maintained?
For example, if I run pwd, then cd .., then pwd, it should end up in the parent of the starting directory. I don't want to run all of these as a single command joined with | or &; I want to run them on individual lines.

In general, processes can't modify the environment of their parent process, or of any other existing process. So you can't do this easily in the way you're describing, unless you deliberately save the environment from the child process somehow (e.g. end every bash command by redirecting env to a file, prefix each entry in that file with export, and source the file at the start of every subsequent command), as sketched below.
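A minimal sketch of that save-and-restore trick, assuming bash is available; the helper name run_persistent and the /tmp scratch path are my own inventions, not part of the original answer:

import subprocess

STATE = "/tmp/bash_state.sh"  # hypothetical scratch file

def run_persistent(command):
    # Restore the saved variables and working directory (if any),
    # run the command, then save the state again for the next call.
    script = (
        'if [ -f "{f}" ]; then . "{f}"; fi\n'
        '{cmd}\n'
        '{{ export -p; echo "cd $(pwd)"; }} > "{f}"\n'
    ).format(f=STATE, cmd=command)
    subprocess.call(["bash", "-c", script])

run_persistent("cd ..")
run_persistent("pwd")  # prints the parent of the original working directory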
Alternatives:
Add all of your interdependent bash commands to a single bash script and run that script from Python, rather than running the commands one by one.
Use os.chdir and os.environ to change the Python process's working directory and environment variables as needed before running each bash command (see the sketch below).
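A sketch of that second alternative; the variable name is purely illustrative:

import os
import subprocess

os.chdir("..")                    # the Python process itself tracks the cwd
os.environ["MY_VAR"] = "value"    # and its own environment variables
subprocess.call(["pwd"])          # children inherit both: prints the parent dir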

How do I create a cronjob for these commands?

I want to set up a cron job to run a Python script each day within a virtual environment, but the cron job does not seem to execute.
If I were to run the program from the terminal normally, I would type:
source ig/venv/bin/activate to activate my virtual environment
cd ig/mybot/src/ to navigate to my directory
python ultimate.py to run my program
So far this is my cron job. I've set it to 1 to run every minute, just so I can see that it is working, but nothing happens.
1 * * * * source ig/venv/bin/activate && cd ig/mybot/src/ && python ultimate.py
Edit: I have updated my program so that no command-line prompts are required. I just need to run these three simple commands.
You can wrap this up in another Python script and have cron run that instead. Have a look at the subprocess module.
Note that source is a shell builtin, not an executable, so it has to go through a shell. Inside the wrapper script your command would become something like
subprocess.call('source ig/venv/bin/activate && cd ig/mybot/src && python ultimate.py', shell=True, executable='/bin/bash')
Two other cron pitfalls: 1 * * * * runs at minute 1 of every hour, not every minute (use * * * * * for that), and cron does not start in your home directory, so use absolute paths.
If you ever need user input again, input("Enter the value") will prompt for it. With the above, your problem can be solved pythonically.
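A minimal wrapper sketch along those lines; the absolute paths are placeholders for wherever ig/ actually lives on your machine:

#!/usr/bin/env python
import subprocess

# Run all three steps in one shell so the activation carries over.
subprocess.call(
    "source /home/youruser/ig/venv/bin/activate"
    " && cd /home/youruser/ig/mybot/src"
    " && python ultimate.py",
    shell=True, executable="/bin/bash",
)

The crontab entry then just runs the wrapper, e.g. * * * * * /usr/bin/python /home/youruser/run_bot.py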
I'm not sure if it's a good idea, but you could do a script like this.
#!/usr/bin/env bash
PYTHON_PROJECT_DIR=/path/to/python/project/dir
pushd "${PYTHON_PROJECT_DIR}"
VALUES="first line of stream\nsecond line of stream\n"
pipenv run /path/to/your/script.py < <(echo -e "$VALUES")
popd
pushd and popd move you around the directory stack: pushd puts you in the directory you name (saving your previous location on the stack), and popd takes you back to where you started.
Using pipenv lets you run the script inside the virtual environment (it's not that hard to configure). That way you'll get the environment variables from the project's .env file and use only this project's dependencies (Python-related).
If you pass the values like this, then whenever the Python script requests a value from stdin it will read the values you echoed, line by line: the first line is the first input, and so on.
This could be one way to do it. Personally, whenever I set up cron jobs I like to run bash scripts directly, because I can add extra logging, so having the wrapper script doesn't seem that unreasonable.
Another thing you could do is find the path of the Python executable inside the virtual environment and use that as the interpreter, replacing #!/usr/bin/env python with #!/path/to/the/virtualenvs/python. But then you won't get the .env variables (though there may be a way to load them).
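To illustrate that last point, a crontab entry that skips activation entirely by invoking the virtualenv's own interpreter; the paths here are assumptions, adjust them to wherever the venv actually lives:

* * * * * /home/youruser/ig/venv/bin/python /home/youruser/ig/mybot/src/ultimate.py >> /home/youruser/cron.log 2>&1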

How to run a command inside a virtual environment using Python

I have the virtualenv created and installed. I have also installed the jsnapy tool inside my virtual env.
This is the script that we are using:
Filename: venv.py
import os
os.system('/bin/bash --rcfile ~/TestAutomation/End2EndAutomation/bin/activate')
os.system('End2EndAutomation/bin/jsnapy')
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@sdno-server:~/TestAutomation$ ^C
What we need to know is how we can get into the virtualenv, run a command, and deactivate it, all from a Python script.
[EDIT1]
I used the code given in the comment. It just enters the virtual env; when I issue exit, it runs the jsnapy command.
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@server:~/TestAutomation$ exit
exit
usage:
This tool enables you to capture and audit runtime environment of
networked devices running the Junos operating system (Junos OS)
Tool to capture snapshots and compare them
It supports four subcommands:
--snap, --check, --snapcheck, --diff
1. Take snapshot:
jsnapy --snap pre_snapfile -f main_configfile
Each call to os.system() starts a new shell, which exits when the call returns. To run all the commands in one shell instance you could put them inside a single bash script and call that from os.system():
run.sh
source ~/TestAutomation/End2EndAutomation/bin/activate
End2EndAutomation/bin/jsnapy
deactivate
Python
os.system('bash run.sh')
(source is a shell builtin and os.system runs /bin/sh, so execute the script with bash; the activation is sourced inside run.sh itself.)
Alternatively, you could write a multiline bash command, as long as it's all in one os.system() call.
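For instance, a single-call sketch using the same paths as above:

import os

os.system('bash -c "source ~/TestAutomation/End2EndAutomation/bin/activate && End2EndAutomation/bin/jsnapy"')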
Two successive calls to os.system() will create two independent processes, one after the other. The second will run when the first finishes. Any effects of commands executed in the first process will have been forgotten and flushed when the second runs.
You want to run the activation and the command which needs to be run in the virtualenv in the same process, i.e. the same single shell instance.
To do that, you can use bash -c '...' to run a sequence of commands. See below.
However, a better solution is to simply activate the virtual environment from within Python itself.
import os
import subprocess

p = os.path.expanduser('~/TestAutomation/End2EndAutomation/bin/activate_this.py')
execfile(p, dict(__file__=p))
subprocess.check_call(['./End2EndAutomation/bin/jsnapy'])
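(execfile is Python 2 only; on Python 3 the equivalent is exec(open(p).read(), dict(__file__=p)). Note also that activate_this.py is shipped by the virtualenv package but not by the standard-library venv module.)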
For completeness, here is the Bash solution, with comments.
import subprocess
subprocess.check_call(['bash', '-c', """
. ~/TestAutomation/End2EndAutomation/bin/activate
./End2EndAutomation/bin/jsnapy"""])
Preferring subprocess over os.system is recommended even in the os.system documentation.
There is no need to explicitly deactivate; when the bash command finishes, that will implicitly also deactivate the virtual environment.
The --rcfile trick is a nice idea, but it doesn't work when the shell you are calling isn't interactive.

Use python to dynamically create bash aliases

I'm attempting to use python to dynamically create bash aliases (like, for example, aliases to log in to a set of servers). I'd love to be able to do something like this:
from subprocess import call

SERVERS = [
    ("example", "user@example.com"),
    # more servers in list
]

for server in SERVERS:
    call('alias %s="ssh %s"' % (server[0], server[1]), shell=True)
The problem is that subprocess launches the jobs in a separate shell session, so the program runs fine, but does nothing to the shell session I run it from.
The same problem occurs with python's os.system or attempting to print the commands and pipe them to bash (all of these create the aliases, but in a new shell that is promptly destroyed after the program finishes).
Ultimately, the goal of this is to run this script from .bashrc
How does one do this?
You should write the alias commands to stdout (e.g. just use print). Then the shell that is calling the Python script can run the alias commands itself.
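A sketch of the script rewritten that way, reusing the made-up server list from the question:

from __future__ import print_function

SERVERS = [
    ("example", "user@example.com"),
    # more servers in list
]

for name, host in SERVERS:
    # Emit alias definitions for the calling shell to eval,
    # instead of running them in a throwaway child shell.
    print('alias %s="ssh %s"' % (name, host))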
As commented by @that other guy
eval "$(python yourscript.py)"
in your .bashrc should do it

problem with python script

I want to run a csh file from a Python script, for example:
#!/usr/bin/python
import os
os.system("source path/to/file.csh")
and I want this file to run in the same shell as the one running the Python script, because file.csh sets some environment variables that I need.
Does anyone know how to do this in Python?
A child process cannot affect the environment of its parent process. The best you can do is run your csh script in a separate process, capture the environment variables it defines, and then set each of those variables in your Python script, as sketched below.
Even then, the Python script won't be able to affect the shell from which you ran it.
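A minimal sketch of that capture-and-adopt approach; the script path is taken from the question, and the parsing is deliberately naive (it will mishandle values that contain newlines):

import os
import subprocess

# Source the csh script in a csh child, then print the resulting environment.
out = subprocess.check_output(
    ['csh', '-c', 'source path/to/file.csh && env']).decode()

for line in out.splitlines():
    key, _, value = line.partition('=')
    if key:
        os.environ[key] = value  # adopt the child's variables in this process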
The common way to solve this (AFAIK) is to have your script emit shell commands that set the environment, then have the main shell run the script and eval its output.
For more information you might want to check out this question: can a shell script set environment variables of the calling shell
You can kludge it this way:
#!/usr/bin/env python
# This is kludge.py
print "setenv VARNAME \"the value\""
In your case, you can have file.csh print the setenv line.
Then from csh:
$ eval `./kludge.py`
$ echo $VARNAME
the value
This isn't clean, but it is the only way to have a child process affect the environment of its parent, and only because the parent process explicitly allows it with eval.
