I have a mysqldump command that I would like to run from the Windows shell or command prompt. I have used it from the shell and it does work.
import datetime

d = 'BkSql_' + datetime.datetime.now().strftime("%Y-%m-%d") + ".sql"
fn = dn + d  # dn is the destination directory, defined elsewhere
cmd = """mysqldump -u hapopdy -p > %s""" % fn
print cmd
Edit:
The password for -p needs to be read interactively, e.g. via raw_input().
Using the subprocess module
import subprocess
subprocess.call(cmd)
If you're running a shell command, add shell=True:
subprocess.call(cmd, shell=True)
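Since the command in the question relies on shell redirection (>), here is a minimal sketch of the whole backup call. It assumes mysqldump is on PATH; dn comes from the question's code, so the value below is only a hypothetical placeholder, and -p with no value makes mysqldump prompt for the password itself:
import datetime
import subprocess

dn = "C:/backups/"  # hypothetical destination directory (dn is defined in the question's code)
d = 'BkSql_' + datetime.datetime.now().strftime("%Y-%m-%d") + ".sql"
fn = dn + d

# Redirect stdout into the dump file ourselves instead of using "> fn",
# so shell=True is not needed at all.
with open(fn, "w") as dump_file:
    subprocess.call(["mysqldump", "-u", "hapopdy", "-p"], stdout=dump_file)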
You should save the password in MySQL's local configuration file for the user (on Unix it's ~/.my.cnf), or you can give it on the command line with --password=MYPASSWORD.
Either way, the password will be visible to a large audience. In the .my.cnf case, it will be visible to anyone with read access to the file. In the second case, it will be visible to anyone who can get a process listing on the system, in addition to those who have read access to your script.
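For the ~/.my.cnf route, a minimal entry looks roughly like this (the password is a placeholder; keep the file readable only by you, e.g. chmod 600):
[client]
user=hapopdy
password=YOUR_PASSWORD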
Related
I am trying to open a file in gnome-terminal using Python, but I am not able to do it. It just opens the terminal and does not open the file.
I have tried this:
import os
os.system('gnome-terminal --working-directory = "folder_path" + "[-e, --command=" kate aaa.txt""')
Can anyone please help?
The problem is the + "[-e, --command=" kate aaa.txt"" part: gnome-terminal doesn't know how to parse this + "[ and "". According to the manual, -e and --command mean the same thing:
man gnome-terminal
...
--command, -e=COMMAND
Split the argument to this option into a program and arguments in the same way a shell
would, and execute the resulting command-line inside the terminal.
This option is deprecated. Instead, use -- to terminate the options, and put the program
and arguments to execute after it: for example, instead of gnome-terminal -e "python3 -q",
prefer to use gnome-terminal -- python3 -q.
Note that the COMMAND is not run via a shell: it is split into words and executed as a
program. If shell syntax is required, use the form gnome-terminal -- sh -c '...'.
This works for me on Arch Linux:
import os
os.system('gnome-terminal --working-directory=/home/ramsay --command="kate os"')
I'm trying to write a script that opens a new terminal then runs a separate python script from that terminal.
I've tried:
os.system("gnome-terminal 'python f.py'")
and
p = Popen("/usr/bin/gnome-terminal", stdin=PIPE)
p.communicate("python f.py")
but both methods only open a new terminal and do not run f.py. How would I go about opening the terminal AND running a separate script?
Edit:
I would like to open a new terminal window because f.py is a simple server that runs serve_forever(). I'd like the original terminal window to stay "free" to run other commands.
Like most terminals, gnome-terminal needs options to execute commands:
gnome-terminal [-e, --command=STRING] [-x, --execute]
You probably need to add the -x option:
-x, --execute
Execute the remainder of the command line inside the terminal.
so:
os.system("gnome-terminal -x python f.py")
Note that this will not run your process in the background unless you add & to your command line, by the way.
Your communicate attempt would also need a newline at the end of the input, but complex processes like terminals don't "like" having their input redirected; it amounts to using an interactive tool backwards. And again, communicate blocks until termination. You could try p.stdin.write("python f.py\n") instead, to hand control back to your Python script, but in that case it's unlikely to work reliably.
So it seems that you don't even need Python to do what you want. You just need to run
python f.py &
in a shell.
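If the only goal is to keep the original terminal free, here is a sketch that skips the extra terminal entirely and just starts the server in the background from Python (f.py is the script named in the question):
import subprocess

# Start the server as a background child process; Popen returns immediately,
# so the current terminal stays usable.
server = subprocess.Popen(["python", "f.py"])

# ... run other commands ...

# Stop the server when you are done with it.
server.terminate()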
As of GNOME Terminal 3.24.2 (using VTE version 0.48.4 +GNUTLS -PCRE2):
Option “-x” is deprecated and might be removed in a later version of gnome-terminal.
Use “-- ” to terminate the options and put the command line to execute after it.
Thus the preferred syntax appears to be
gnome-terminal -- echo hello
rather than
gnome-terminal -x echo hello
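So, from Python, the non-deprecated way to launch the question's f.py in a new terminal would look roughly like this (a sketch; it assumes gnome-terminal 3.24+ and that f.py is in the current directory):
import subprocess

# "--" ends gnome-terminal's own options; everything after it is the command
# to run inside the new terminal window.
subprocess.Popen(["gnome-terminal", "--", "python", "f.py"])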
Here is a complete example of how you would call an executable Python file with subprocess.call, using argparse to properly parse the input.
The target process will print the input you give it.
Your python file to be called:
import argparse
parser = argparse.ArgumentParser()
parser.add_argument("--file", help="Just A test", dest='myfile')
args = parser.parse_args()
print args.myfile
Your calling python file:
from subprocess import call
#call(["python","/users/dev/python/sandboxArgParse.py", "--file", "abcd.txt"])
call(["gnome-terminal", "-e", "python /users/dev/python/sandboxArgParse.py --file abcd.txt"])
Just for information:
You probably don't need Python calling another Python script just to run a process in a terminal window; you could do it directly:
gnome-terminal -e "python /yourfile.py -f yourTestfile.txt"
The following code will open a new terminal and execute the process:
import subprocess

process = subprocess.Popen(
    "sudo gnome-terminal -x python f.py",
    stdout=subprocess.PIPE,
    stderr=None,
    shell=True
)
I am running a uWS server with this. In my case Popen didn't help (even though it ran the executable, it still couldn't communicate with a client: the socket connection was broken). This works. Also, they now recommend using "--" instead of "-e".
subprocess.call(['gnome-terminal', "--", "python3", "server_deployment.py"])
# server_deployment.py
import os

def run():
    execution_cmd = "./my_executable arg1 arg2 dll_1 dll_2"
    os.system(execution_cmd)

run()
The subprocess.Popen() function has an env parameter, but it doesn't seem to have the desired effect with sudo. This is what I get when I try it in the interactive Python shell:
>>> import subprocess
>>> env = {"CVS_RSH": "ssh"}
>>> command = "sudo -u user cvs -d user#1.8.7.2:/usr/local/ncvs co file.py"
>>> p = subprocess.Popen(command, stdout=subprocess.PIPE,
...                      stderr=subprocess.PIPE, env=env, shell=True)
>>> (command_output, error_output) = p.communicate()
>>> p.wait()
1
>>> error_output
b'cvs [checkout aborted]: cannot exec rsh: No such file or directory\ncvs [checkout aborted]: end of file from server (consult above messages if any)\n'
The message is distracting, so let me explain. I'm forced to use ancient CVS, and the environment variable tells it to use ssh to connect to the server rather than the default, which sadly is rsh. It also needs an environment variable called CVSROOT, but fortunately there's a -d option for that; there's no equivalent option for CVS_RSH that I know of.
Interestingly enough, if I do:
command = "sudo -u user echo $CVS_RSH"
env={"CVS_RSH":"something_else"}
p = subprocess.Popen(command, stdout=subprocess.PIPE,
stderr=subprocess.PIPE,env=env,shell=True)
(command_output, error_output) = p.communicate()
p.wait()
0
>>> command_output
b'something_else\n'
Maybe this worked because echo wasn't actually started as a child process? Is it possible to pass an environment to a process executed as another user with sudo?
This doesn't seem possible using the env parameter: sudo scrubs the environment it hands to the target command by default, and in the echo test the $CVS_RSH was expanded by the shell that shell=True spawns (which does receive env=) before sudo ever ran echo. The solution seems to be to just pass the environment on the command line, as I was doing in the shell, for example:
command = "sudo -u user CVS_RSH=ssh
CVSROOT=:ext:user#2.8.7.2:/usr/local/ncvs cvs co dir/file.py"
p = subprocess.Popen(command, stdout=subprocess.PIPE,
stderr=subprocess.PIPE,env=env,shell=True)
The weird thing is, if I do this in a Python CGI script, I can see:
cvs [checkout aborted]: cannot exec ssh: Permission denied
cvs [checkout aborted]: end of file from server (consult above messages if any)
But if I try it in the interactive Python shell, it gets past this point, so it must be some other oddity (the user does have permission to run ssh), unrelated to this question.
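As a variation on the same fix, here is a sketch that avoids shell=True by passing the assignments as ordinary arguments to sudo. This relies on the sudoers policy allowing those variables to be set (e.g. SETENV or env_keep), which is an assumption here:
import subprocess

# sudo accepts VAR=value arguments before the command and sets them in the
# target user's environment, so no shell and no env= are needed.
args = ["sudo", "-u", "user",
        "CVS_RSH=ssh", "CVSROOT=:ext:user#2.8.7.2:/usr/local/ncvs",
        "cvs", "co", "dir/file.py"]
p = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
command_output, error_output = p.communicate()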
I'm trying to run a Python script that will open a command prompt (OSGeo4W.bat is a command-prompt shell). I can get it to open, but now I would like to send commands to that prompt.
import subprocess
myProcess = subprocess.Popen(['C:\OSGeo4W64\OSGeo4W.bat'],shell = False) #opens command prompt
myProcess.communicate('gdal2tiles -p raster -z 0-1 new.jpg abc')
myProcess.wait()
print("my process has terminated")
I've also tried
subprocess.check_call('gdal2tiles -p raster -z 0-1 new.jpg abc', shell=False)
I keep getting errors that say "WindowsError: [Error 2] The system cannot find the file specified"
although if I keep the command prompt that it opens and type in 'gdal2tiles -p raster -z 0-1 new.jpg abc', it works just as I wanted. Help would be great, thanks!
Try:
check_call('gdal2tiles -p raster -z 0-1 new.jpg abc', shell=True)
shell=True changes how the executable is searched for on Windows: the command is run through cmd.exe, which can also resolve .bat/.cmd wrappers.
Or if gdal2tiles works only in the environment created by OSGeo4W.bat:
shell = Popen(r'C:\OSGeo4W64\OSGeo4W.bat', stdin=subprocess.PIPE)
shell.communicate('gdal2tiles -p raster -z 0-1 new.jpg abc')
# you don't need shell.wait() here
Notice: r"" literal. It is necessary to avoid escaping the backslashes in the path.
For those of you who are still trying to figure this one out, this is what I found. The stdin=subprocess.PIPE method above works with some of the command-line tools in OSGeo4W64, but not all of them. It works with gdal_translate but not with pdal translate, for example; I'm not sure why.
My Solution:
import subprocess

OSGeo4Wenv = r'CALL "C:/OSGeo4W64/bin/o4w_env.bat"'
pdal_translate_String = r'c:/OSGeo4W64/bin/pdal translate c:\inputFile c:\outputFile radiusoutlier --filters.radiusoutlier.min_neighbors=2 --filters.radiusoutlier.radius=8.0 --filters.radiusoutlier.extract=true'
Cmd = OSGeo4Wenv + ' & ' + pdal_translate_String
shell = subprocess.call(Cmd, stdout=None, shell=True)
What is going on?
1) Open a shell and set up the OSGeo4W environment by calling "OSGeo4Wenv". This is normally done by the OSGeo4W.bat file; without it, the command-line programs don't know where to look for their libraries.
2) The pdal translate command is then sent to the same shell, because on Windows multiple commands can be separated by the "&" symbol. I use the .call method of Python 2.7; it has the advantage that it waits for the end of the process, which is nice if you use the multiprocessing Pool.map method to launch several processes at the same time.
Hope this helps others!
Nicolas
I need to execute this script from my Python script.
Is it possible? The script generates some output, with some files being written. How do I access those files? I have tried with the subprocess call function, but without success.
fx#fx-ubuntu:~/Documents/projects/foo$ bin/bar -c somefile.xml -d text.txt -r aString -f anotherString >output
The application "bar" also references to some libraries, it also create the file "bar.xml" besides the output. How do I get access to these files? Just by using open()?
Thank you,
Edit:
The error from the Python runtime is only this line.
$ python foo.py
bin/bar: bin/bar: cannot execute binary file
For executing the external program, do this:
import subprocess
args = ("bin/bar", "-c", "somefile.xml", "-d", "text.txt", "-r", "aString", "-f", "anotherString")
#Or just:
#args = "bin/bar -c somefile.xml -d text.txt -r aString -f anotherString".split()
popen = subprocess.Popen(args, stdout=subprocess.PIPE)
output = popen.stdout.read()  # read before wait(), so a full pipe can't deadlock
popen.wait()
print output
And yes, assuming your bin/bar program wrote some other assorted files to disk, you can open them as normal with open("path/to/output/file.txt"). Note that you don't need to rely on a subshell to redirect the output to a file on disk named "output" if you don't want to. I'm showing here how to directly read the output into your python program without going to disk in between.
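If you do want the on-disk "output" file from the original shell command, a small sketch (reusing the same hypothetical bin/bar arguments as above) is to hand subprocess an open file object as stdout; bar.xml is then read back with open() like any other file:
import subprocess

args = ("bin/bar", "-c", "somefile.xml", "-d", "text.txt", "-r", "aString", "-f", "anotherString")

# Redirect bar's stdout straight into a file named "output", as "> output" did.
with open("output", "w") as outfile:
    subprocess.call(args, stdout=outfile)

# Files that bar wrote alongside, such as bar.xml, are opened normally.
with open("bar.xml") as f:
    contents = f.read()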
The simplest way is:
import os
cmd = 'bin/bar --option --otheroption'
os.system(cmd) # returns the exit status
You access the files in the usual way, by using open().
If you need to do more complicated subprocess management then the subprocess module is the way to go.
For executing a Unix executable file, I did the following on my Mac OS X and it worked for me:
import os
cmd = './darknet classifier predict data/baby.jpg'
so = os.popen(cmd).read()
print so
Here print so outputs the result.