Python3 executes a terminal command differently than manual input - python

I have a small piece of Python 3 code which runs a command in the terminal:
import os
os.system('"C:/directory/program.exe" -k "C:/directory/options.txt" & pause')
When I run this code in IDLE, I get the following error:
The filename, directory name, or volume label syntax is incorrect.
Both of the paths are valid, so that's not the problem. In addition, running:
"C:/directory/program.exe" -k "C:/directory/options.txt" & pause
from the terminal works correctly.

You don't need quotes around the system paths; this should work:
import os
os.system("C:/directory/program.exe -k C:/directory/options.txt & pause")
Hopefully that helps.
[Edit] Handling paths that contain spaces this way with os.system is, to my knowledge, not possible — see this Python bug tracker thread.
A solution might be to use the subprocess module instead.
import subprocess
subprocess.call(["C:/direc tory/program.exe", "-k", "C:/direc tory/options.txt"])
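Passing the arguments as a list lets subprocess handle the quoting for you. A minimal sketch (the paths with spaces are hypothetical stand-ins) showing the Windows command line subprocess would build from such a list:

```python
import subprocess

# subprocess.list2cmdline shows the command line that subprocess would
# construct on Windows from a list of arguments: elements containing
# spaces are quoted automatically, so no manual quoting is needed.
cmd = ["C:/direc tory/program.exe", "-k", "C:/direc tory/options.txt"]
print(subprocess.list2cmdline(cmd))
# → "C:/direc tory/program.exe" -k "C:/direc tory/options.txt"
```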

Related

Untar file with subprocess.call is running successfully but with no effect

Does anyone know what I am doing wrong with this command:
import subprocess
subprocess.call('tar -zvxf %s -C %s' % ("file.tar.gz", '/folder'), shell=True)
The code runs without any errors but the file is only unzipped on random occasions. I know I can use tarfile, but I would like to run it as a Linux command. Any ideas?
If you read the man page, you'll see that the -C parameter is order-sensitive: it only affects operations that come after it on the command line. So your file is being unzipped into whatever directory you happen to be in at the time.
You don't need shell for this. Do:
import os
import subprocess
os.chdir('/folder')
subprocess.call(['tar', 'xvzf', 'file.tar.gz'])
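If changing the parent process's working directory is undesirable, an alternative sketch is to pass subprocess's cwd argument, which runs only the child in the target directory. The example below builds a throwaway archive first so it is self-contained; it assumes a tar binary is on PATH:

```python
import os
import subprocess
import tarfile
import tempfile

# Build a small archive so the example can run anywhere.
src = tempfile.mkdtemp()
dest = tempfile.mkdtemp()
with open(os.path.join(src, 'hello.txt'), 'w') as f:
    f.write('hi\n')
archive = os.path.join(src, 'file.tar.gz')
with tarfile.open(archive, 'w:gz') as tf:
    tf.add(os.path.join(src, 'hello.txt'), arcname='hello.txt')

# cwd runs tar inside dest, without an os.chdir in this process.
subprocess.call(['tar', 'xzf', archive], cwd=dest)
print(os.path.exists(os.path.join(dest, 'hello.txt')))  # → True
```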

running bash script from python file

I have a bash script which changes the path on my command line,
This one,
#!/usr/bin/env python
cd /mnt/vvc/username/deployment/
I have a Python script which I wish to run after changing to the desired directory,
The script,
#!/usr/bin/env python
import subprocess
import os
subprocess.call(['/home/username/new_file.sh'])
for folder in os.listdir(''):
    print('deploy_predict' + ' ' + folder)
I get this
File "/home/username/new_file.sh", line 2
cd /mnt/vvc/username/deployment/
^
SyntaxError: invalid syntax
Any suggestions on how I can fix this? Thanks in advance.
You need to explicitly tell subprocess which shell to run the sh file with. Probably one of the following:
subprocess.call(['sh', '/home/username/new_file.sh'])
subprocess.call(['bash', '/home/username/new_file.sh'])
However, this will not change the Python program's working directory, because the command runs in a separate process.
If you want to change the Python program's own working directory as it runs, use:
os.chdir('/mnt/vvc/username/deployment/')
But that's not really great practice. Probably better to just pass the path into os.listdir, and not change working directories:
os.listdir('/mnt/vvc/username/deployment/')
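The distinction above can be shown end-to-end. A minimal sketch, using a temporary directory as a stand-in for the question's paths: a cd inside a child shell does not move the parent Python process, so the path is best passed straight to os.listdir:

```python
import os
import subprocess
import tempfile

# Temporary directory with one entry, standing in for the question's
# /mnt/vvc/username/deployment/ path.
workdir = tempfile.mkdtemp()
open(os.path.join(workdir, 'app1'), 'w').close()

before = os.getcwd()
# The cd happens inside the child shell only ($1 is workdir here).
subprocess.call(['sh', '-c', 'cd "$1"', 'sh', workdir])
print(os.getcwd() == before)           # → True: parent unchanged

# So pass the path directly instead of changing directories.
for folder in os.listdir(workdir):
    print('deploy_predict' + ' ' + folder)  # → deploy_predict app1
```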

exec() python2 script from python3

I would like to know if it is possible to execute a python2 script from a python3 script.
I have a file written using py3 that must execute legacy code written in py2 to obtain dictionaries for processing within the initial file.
The line in py3 to call the mentioned py2 script is
exec(open('python2script.py').read())
The script runs without error until it begins processing python2script.py, at which point it crashes at the first syntactic difference from Python 3.
As the comments pointed out, exec() uses the current python implementation, so you can't execute python 2 code from python 3 using it.
Unless you port it, your best bet is simply to call it as a subprocess, either using os.system:
./py3.py
#!/usr/bin/env python3
import os
print('running py2')
os.system('./py2.py')
print('done')
./py2.py
#!/usr/bin/env python2.7
print "hello from python2!"
Then (after making them both executable) run:
$ ./py3.py
Or alternatively you can use the more flexible subprocess module, which lets you pass data back and forth more easily using a serialising module such as json, so that you can get the results from the python2 script into your python3 code:
./py3.py
#!/usr/bin/env python3
import json
from subprocess import PIPE, Popen
print('running py2')
py2_proc = Popen(['./py2.py'], stdout=PIPE)
# do not care about stderr
stdout, _ = py2_proc.communicate()
result = json.loads(stdout.decode())
print('value1 was %s, value2 was %s' % (result['value1'], result['value2']))
./py2.py
#!/usr/bin/env python2.7
import json
my_result = {
'value1': 1,
'value2': 3
}
print json.dumps(my_result)
That way it is easy to pack up the data you need and transport it over.
Note: I have used a very simple environment setup here using my system's python2.7 and python3. In the real world the most painful thing about getting this sort of thing to work properly is configuring the environment correctly. Perhaps, e.g., you are using virtual environments. Perhaps you are running as a user which doesn't have the right python2 version in their path. Perhaps you can't make the files executable and so have to specify the path to python in your subprocess / os.system call. There are many options and it is very complicated, but out of the scope of the question. You just have to read the doc pages very carefully and try a few things out!
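One of the points in the note above, specifying the interpreter explicitly instead of relying on the shebang and execute bits, can be sketched like this (sys.executable stands in for a real python2 path such as /usr/bin/python2.7):

```python
import subprocess
import sys

# When the script isn't executable, or the shebang can't be trusted,
# name the interpreter explicitly as the first list element.
# sys.executable is used here so the sketch runs anywhere.
ret = subprocess.call([sys.executable, '-c',
                       'print("hello from a child interpreter")'])
print(ret)  # → 0 on success
```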

python: how to run a program with a command line call (that takes a user's keystroke as input) from within another program?

I can run one program by typing: python enable_robot.py -e in the command line, but I want to run it from within another program.
In the other program, I imported subprocess and had subprocess.Popen(['enable_robot', 'baxter_tools/scripts/enable_robot.py','-e']), but I get an error message saying something about a callback.
If I comment out this line, the rest of my program works perfectly fine.
Any suggestions on how I could change this line to get my code to work or if I shouldn't be using subprocess at all?
If enable_robot.py requires user input, it probably wasn't meant to be run from another Python script. You might want to import it as a module (import enable_robot) and call the functions you need from there.
If you want to stick with subprocess, note that the first element of the argument list should be the program to actually run (the Python interpreter here, not 'enable_robot'), and the child's stdin must be a pipe for communicate to pass it input:
p = subprocess.Popen(['python', 'baxter_tools/scripts/enable_robot.py', '-e'],
                     stdin=subprocess.PIPE)
p.communicate(input=b'whatever string\nnext line')
communicate documentation, example.
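A runnable sketch of the communicate approach; the key detail is that the child's stdin must be opened as a pipe for input= to have any effect. The child here is a stand-in for enable_robot.py that just echoes its input uppercased:

```python
import subprocess
import sys

# Stand-in child process: reads stdin, prints it uppercased.
p = subprocess.Popen(
    [sys.executable, '-c', 'import sys; print(sys.stdin.read().upper())'],
    stdin=subprocess.PIPE,   # without this, communicate(input=...) fails
    stdout=subprocess.PIPE)
out, _ = p.communicate(input=b'e\n')
print(out.decode().strip())  # → E
```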
Your program enable_robot.py should meet the following requirements:
The first line is a shebang indicating which interpreter is used to run the script; in this case, the path to python.
The script should be executable.
A very simple example. We have two python scripts: called.py and caller.py
Usage: caller.py will execute called.py using subprocess.Popen()
File /tmp/called.py
#!/usr/bin/python
print("OK")
File /tmp/caller.py
#!/usr/bin/python
import subprocess
proc = subprocess.Popen(['/tmp/called.py'])
Make both executable:
chmod +x /tmp/caller.py
chmod +x /tmp/called.py
caller.py output:
$ /tmp/caller.py
$ OK

python scripts issue (no module named ...) when starting in rc.local

I'm facing a strange issue, and after a couple of hours of research I'm looking for help / an explanation.
It's quite simple, I wrote a cgi server in python and I'm working with some libs including pynetlinux for instance.
When I start the script from a terminal as any user, it works fine: no bugs, no dependency issues. But when I try to start it from a script in rc.local, the following code produces an error.
import sys, cgi, pynetlinux, logging
It produces the following error:
Traceback (most recent call last):
File "/var/simkiosk/cgi-bin/load_config.py", line 3, in <module>
import cgi, sys, json, pynetlinux, loggin
ImportError: No module named pynetlinux
Other dependencies produce similar issues. I suspect a few things, like the user executing the script in rc.local (root normally), and I have tried some fixes found on the web without success.
Can somebody help me?
Thanks in advance.
Regards.
Ollie314
First of all, you need to make sure the module you want to import is installed properly. You can check whether the module's name appears in the output of pip list.
Then, in a python shell, check what the paths are where Python is looking for modules:
import sys
sys.path
In my case, the output is:
['', '/usr/lib/python3.4', '/usr/lib/python3.4/plat-x86_64-linux-gnu', '/usr/lib/python3.4/lib-dynload', '/usr/local/lib/python3.4/dist-packages', '/usr/lib/python3/dist-packages']
Finally, append those paths to the PYTHONPATH variable in /etc/rc.local (PATH only controls where executables are looked up; PYTHONPATH is the variable Python consults when importing modules). Here is an example of my rc.local:
#!/bin/sh -e
#
# rc.local
#
# This script is executed at the end of each multiuser runlevel.
# Make sure that the script will "exit 0" on success or any other
# value on error.
#
# In order to enable or disable this script just change the execution
# bits.
#
# By default this script does nothing
export PYTHONPATH="$PYTHONPATH:/usr/lib/python3.4:/usr/lib/python3.4/plat-x86_64-linux-gnu:/usr/lib/python3.4/lib-dynload:/usr/local/lib/python3.4/dist-packages:/usr/lib/python3/dist-packages"
# Do stuff
exit 0
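A quick way to verify that a directory exported this way actually becomes importable is to inspect a child interpreter's sys.path, since PYTHONPATH is what Python itself consults for imports. A minimal sketch (the /opt/mylibs directory is hypothetical):

```python
import os
import subprocess
import sys

# PYTHONPATH entries are prepended to sys.path in a child interpreter;
# this is how rc.local can make extra module directories importable.
env = dict(os.environ, PYTHONPATH='/opt/mylibs')  # hypothetical dir
out = subprocess.check_output(
    [sys.executable, '-c', 'import sys; print("/opt/mylibs" in sys.path)'],
    env=env)
print(out.decode().strip())  # → True
```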
The path where your modules are installed is probably normally set up by .bashrc or something similar. .bashrc doesn't get sourced when the shell is not interactive. /etc/profile is one place you can put system-wide path changes. Depending on the Linux version/distro, it may use /etc/profile.d/, in which case /etc/profile runs all the scripts in /etc/profile.d; add a new shell script there with execute permissions and a .sh extension.
