Can we find the directory of running Python processes using psutil?

I want to kill all running "python" processes that were launched from a particular directory, i.e. whose script files (e.g. sample.py) reside in a specific folder.
For example: C:\myFolder\*
Using psutil, can we find the path of each process, or kill all the processes started from C:\myFolder\* except some processes?
import psutil
for process in psutil.process_iter():
    print(process.cmdline())

As per the comment, if you want to find the file locations of running python scripts: filter the python processes with psutil.Process.name() == 'python', then use os.path.abspath() on the script argument to get the full path.
The following code example might work:
import psutil
import os

"""
Python script paths using psutil
"""
pids = [p for p in psutil.pids() if psutil.Process(p).name() == "python"]
scripts = []
paths = []
for pid in pids:
    try:
        scripts.append(psutil.Process(pid).cmdline()[1])
    except IndexError:
        pass
for script in scripts:
    paths.append(os.path.abspath(script))
print(paths)

If the script path in cmdline()[1] is not absolute, use cwd() from psutil to get the process's working directory, join that directory with the script path, and you get the location of the python script.
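Putting the pieces together for the original goal (killing the processes under C:\myFolder while sparing a few), a hedged sketch along these lines might work; the folder and the keep list are placeholders, and the kill logic is wrapped in a function so nothing runs on import:

```python
import os
import psutil  # third-party: pip install psutil

def is_in_folder(path, folder):
    """True if `path` lies inside `folder` (or any of its subfolders)."""
    path = os.path.normcase(os.path.abspath(path))
    folder = os.path.normcase(os.path.abspath(folder))
    return path.startswith(folder + os.sep)

def kill_python_scripts_in(folder, keep=()):
    """Kill python processes whose script lives under `folder`,
    sparing any script whose basename is listed in `keep`."""
    for proc in psutil.process_iter(['name', 'cmdline', 'cwd']):
        try:
            if 'python' not in (proc.info['name'] or ''):
                continue
            cmdline = proc.info['cmdline'] or []
            if len(cmdline) < 2:
                continue  # bare interpreter with no script argument
            # resolve a relative script path against the process's cwd
            script = os.path.join(proc.info['cwd'] or '', cmdline[1])
            if is_in_folder(script, folder) and os.path.basename(script) not in keep:
                proc.kill()
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass

# hypothetical usage:
# kill_python_scripts_in(r"C:\myFolder", keep={"keeper.py"})
```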

Related

How to run Open Pose binary (.exe) from within a Python script?

I am making a body tracking application where I want to run Open Pose if the user chooses to track their body movements. The OpenPose binary file can be run like so:
bin\OpenPoseDemo.exe --write_json 'path\to\dump\output'
So, in my Python script, I want to have a line of code that would run Open Pose, instead of having to ask the user to manually run OpenPose by opening a separate command line window. For that, I have tried:
import os
os.popen(r"C:\path\to\bin\OpenPoseDemo.exe --write_json 'C:\path\to\dump\output'")
But this gives the following error:
Error:
Could not create directory: 'C:\Users\Admin\Documents\Openpose\. Status error = -1. Does the parent folder exist and/or do you have writing access to that path?
Which I guess means that OpenPose can be opened only by going inside the openpose directory where the bin subdirectory resides. So, I wrote a shell script containing this line:
bin\OpenPoseDemo.exe --write_json 'C:\path\to\dump\output'
and saved it as run_openpose_binary.sh in the openpose directory (i.e., the same directory where bin is located).
I then tried to run this shell script from within my Python script like so:
import subprocess
subprocess.call(['sh', r'C:\path\to\openpose\run_openpose_binary.sh'])
and this gives the following error:
FileNotFoundError: [WinError 2] The system cannot find the file specified
I also tried the following:
os.popen(r"C:\path\to\openpose\run_openpose_binary.sh")
and
os.system(r"C:\path\to\openpose\run_openpose_binary.sh")
These do not produce any error; they just pop up a blank window that closes immediately.
So, my question is, how do I run the OpenPoseDemo.exe from within my Python script?
For your last method, you're missing the return value from os.popen, which is a pipe. So, what you need is something like:
# untested as I don't have access to a Windows system
import os

with os.popen(r"/full/path/to/sh C:/path/to/openpose/run_openpose_binary.sh") as p:
    # pipes work like files
    output_of_command = p.read().strip()  # this is a string
or, if you want to future-proof yourself, the alternative is:
# untested as I don't have access to a Windows system
import subprocess

popen = subprocess.Popen([r'/full/path/to/sh.exe', r'/full/path/to/run_openpose_binary.sh'],
                         stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE, encoding='utf-8')
stdout, stderr = popen.communicate(input='')
Leave a comment if you have further difficulty.
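To see the pipe behaviour in isolation, the same pattern works with any command; in this sketch python itself stands in for sh so the snippet runs anywhere:

```python
import os
import sys

# os.popen returns a file-like pipe; reading it captures the command's stdout
with os.popen('"{}" -c "print(6 * 7)"'.format(sys.executable)) as p:
    output = p.read().strip()  # the command's output, as a string
```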
I've had to fight this battle several times and I've found a solution. It's likely not the most elegant solution but it does work, and I'll explain it using an example of how to run OpenPose on a video.
You've got your path to the openpose download and your path to the video; from there it's a three-line solution. First, change the current working directory to the openpose folder, then build your command, then call subprocess.run. (I tried subprocess.call and that did not work. I did not try shell=False, which I have heard is safer; I'll leave that up to you.)
import os
import subprocess
openpose_path = "C:\\Users\\me\\Desktop\\openpose-1.7.0-binaries-win64-gpu-python3.7-flir-3d_recommended\\openpose\\"
video_path = "C:\\Users\\me\\Desktop\\myvideo.mp4"
os.chdir(openpose_path)
command = "".join(["bin\\OpenPoseDemo.exe", " -video ", video_path])
subprocess.run(command, shell=True)
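If you would rather not change the parent process's working directory at all, subprocess.run also accepts a cwd argument that applies only to the child process. A small sketch; the commented OpenPose invocation is hypothetical usage mirroring the paths above:

```python
import subprocess

def run_in_dir(command, directory):
    """Run `command` (a list of args) with `directory` as the child's
    working directory, leaving this process's own cwd untouched."""
    return subprocess.run(command, cwd=directory,
                          capture_output=True, text=True)

# hypothetical usage mirroring the answer above:
# run_in_dir(["bin\\OpenPoseDemo.exe", "-video", video_path], openpose_path)
```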

Running all Python scripts with the same name across many directories

I have a file structure that looks something like this:
Master:
    First
        train.py
        other1.py
    Second
        train.py
        other2.py
    Third
        train.py
        other3.py
I want to be able to have one Python script that lives in the Master directory that will do the following when executed:
Loop through all the subdirectories (and their subdirectories if they exist)
Run every Python script named train.py in each of them, in whatever order necessary
I know how to execute a given python script from another file (given its name), but I want to create a script that will execute whatever train.py scripts it encounters. Because the train.py scripts are subject to being moved around and being duplicated/deleted, I want to create an adaptable script that will run all those that it finds.
How can I do this?
You can use os.walk to recursively collect all train.py scripts and then run them in parallel using ProcessPoolExecutor and the subprocess module.
import os
import subprocess
from concurrent.futures import ProcessPoolExecutor

def list_python_scripts(root):
    """Find all 'train.py' scripts in the given directory recursively."""
    scripts = []
    for root, _, filenames in os.walk(root):
        scripts.extend([
            os.path.join(root, filename) for filename in filenames
            if filename == 'train.py'
        ])
    return scripts

def main():
    # Make sure to change the argument here to the directory you want to scan.
    scripts = list_python_scripts('master')
    with ProcessPoolExecutor(max_workers=len(scripts)) as pool:
        # Run each script in parallel and accumulate CompletedProcess results.
        results = pool.map(subprocess.run,
                           [['python', script] for script in scripts])
    for result in results:
        print(result.returncode, result.stdout)

if __name__ == '__main__':
    main()
Which OS are you using?
If Ubuntu/CentOS, you can lean on the shell's find to locate and execute the scripts:
import subprocess
# run this from Master: find lists every train.py in all subdirectories
# and -exec runs each one with python
subprocess.call("find . -name train.py -exec python {} \\;", shell=True)
If you are using Windows you could try running them from a PowerShell script. You can run two python scripts at once with just this:
python Test1.py
python Folder/Test1.py
And then add a loop and or a function that goes searching for the files. Because it's Windows Powershell, you have a lot of power when it comes to the filesystem and controlling Windows in general.
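A cross-platform variant of the same idea, running the scripts sequentially with pathlib instead of os.walk, might look like this sketch (the sequential ordering and per-script cwd are my assumptions, not part of the question):

```python
import subprocess
import sys
from pathlib import Path

def run_all_train_scripts(root):
    """Run every train.py under `root` sequentially; return {path: exit code}."""
    results = {}
    for script in sorted(Path(root).rglob("train.py")):
        # use the script's own folder as cwd so its relative paths still work
        proc = subprocess.run([sys.executable, str(script)], cwd=script.parent)
        results[str(script)] = proc.returncode
    return results
```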

Running .vbs script inside Python doesn't do anything

Idea
Basically, my script checks C:/SOURCE for .txt files and adds a timestamp to each one. To replicate it, create that folder and put some .txt files in it. The script is then supposed to run a .vbs file, which in turn runs a .bat file with some rclone commands that don't matter here. I did it this way because no CMD window opens when the rclone command runs through the .vbs file.
Python code
import time, os, subprocess

while True:
    print("Beginning checkup")
    print("=================")
    timestamp = time.strftime('%d_%m_%H_%M')  # only underscores: no naming issues
    the_dir = "C:/SOURCE"
    for fname in os.listdir(the_dir):
        if fname.lower().endswith(".txt"):
            print("found " + fname)
            time.sleep(0.1)
            new_name = "{}-{}.txt".format(os.path.splitext(fname)[0], timestamp)
            os.rename(os.path.join(the_dir, fname), os.path.join(the_dir, new_name))
            time.sleep(0.5)
    else:
        subprocess.call(['cscript.exe', "copy.vbs"])
    time.sleep(60)
VBScript code
Set WshShell = CreateObject("WScript.Shell" )
WshShell.Run Chr(34) & "copy.bat" & Chr(34), 0
Set WshShell = Nothing
The only important part for the Python script is below the very last else, where subprocess.call() is supposed to run the .vbs file. When I run the script, it shows the first two lines that always come up when running CMD, but then nothing happens.
How could I fix that? I tried:
subprocess.call("cscript copy.vbs")
subprocess.call("cmd /c copy.vbs")
both with the same outcome: nothing happens.
Anyone have an idea?
Why are you invoking a VBScript to invoke a batch script from Python? You should be able to simply run whatever the batch script does directly from your Python code. But even if you want to keep the batch script, something like this should do just fine without VBScript as an intermediary:
subprocess.call(['cmd', '/c', 'copy.bat'])
You may want to give the full path of the batch file, though, to avoid issues like the working directory not being what you think it is.
If your batch script resides in the same directory as the Python script, you can build the path with something like this:
import os
import subprocess
scriptdir = os.path.dirname(__file__)
batchfile = os.path.join(scriptdir, 'copy.bat')
subprocess.call(['cmd', '/c', os.path.realpath(batchfile)])
There is no operation here that could not be done in plain Python. Scanning a directory and copying a file are both covered by the standard library; see the os.path and shutil modules.
Adding VBScripts and launching subprocesses makes your code complex and difficult to debug.
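As a sketch of that plain-Python route (assuming shutil.copy is an acceptable stand-in for whatever copy.bat did; an actual rclone upload would still need one subprocess call for that step, and the destination folder is a placeholder):

```python
import os
import shutil
import time

def stamp_and_copy(source, dest):
    """Timestamp every .txt file in `source`, then copy it to `dest`,
    with no cscript/cmd intermediary."""
    timestamp = time.strftime('%d_%m_%H_%M')  # only underscores: no naming issues
    os.makedirs(dest, exist_ok=True)
    for fname in os.listdir(source):
        if fname.lower().endswith(".txt"):
            new_name = "{}-{}.txt".format(os.path.splitext(fname)[0], timestamp)
            os.rename(os.path.join(source, fname), os.path.join(source, new_name))
            shutil.copy(os.path.join(source, new_name), dest)

# hypothetical usage:
# stamp_and_copy("C:/SOURCE", "C:/DEST")
```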

Detecting a Python script's name in the list of processes

I know I can use psutil to get a list of running processes' names like this
import psutil
for i in psutil.pids():
    print(psutil.Process(i).name())
However, if I run a python script with python, psutil only shows me that an instance of python is running.
So, my question is - if I run a python script:
python script_name
is it possible to detect script_name by psutil?
Look at the psutil.Process(i).cmdline() docs. Your example would return
['python', 'script_name']
The psutil documentation states that the cmdline() method returns the command line of the process. If the command line is python script_name, the second word will be the actual script name. To get this information I'd change psutil.Process(i).name() to psutil.Process(i).cmdline().
You will need to read the script name from the command line arguments:
import psutil
for proc in psutil.process_iter():
    try:
        if "python" in proc.name():
            print(proc.cmdline()[1])
    except (IndexError, psutil.NoSuchProcess, psutil.AccessDenied):
        pass

Python subprocess not executing when daemonized

I am trying to execute a script inside a method of a class that is daemonized.
autogamma.sh is a script that requires ImageMagick to be installed (it uses convert) and can be found here: http://www.fmwconcepts.com/imagemagick/autogamma/index.php
import os
import subprocess
import daemon

class MyClass():
    def __init__(self):
        self.myfunc()

    def myfunc(self):
        script = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'autogamma.sh')
        cmd = ('/bin/sh %s -c average /tmp/c.jpg /tmp/d.jpg' % script).split(' ')
        ret = subprocess.Popen(cmd).communicate()

with daemon.DaemonContext():
    process = MyClass()
    process.run()
The script executes correctly when running MyClass on its own. I think there is a problem with the environment or something similar, but I cannot pin it down.
The problem also happens with rsync, mediainfo and ffprobe.
Using Python 2.7.3 with python-daemon 1.6; tested on Mac OS, CentOS 5.5 and Ubuntu 12.04 LTS.
The script is pretty short; if you exclude the code for reading command line arguments, comments and other color modes, it is less than 75 lines. I would just convert it to Python.
Like the comments suggest, the best way would be to use one of the python wrappers for ImageMagick.
You could also call convert directly, though it's probably going to be painful. Here is a small snippet of what that would look like:
import subprocess

def image_magick_version():
    output = subprocess.check_output("/usr/local/bin/convert -list configure", shell=True)
    for line in output.split('\n'):
        if line.startswith('LIB_VERSION_NUMBER'):
            _, version = line.split(' ', 1)
            return tuple(int(i) for i in version.split(','))

im_version = image_magick_version()

if im_version < (6,7,6,6) or im_version > (6,7,7,7):
    cspace = "RGB"
else:
    cspace = "sRGB"

if im_version < (6,7,6,7) or im_version > (6,7,7,7):
    setcspace = "-set colorspace RGB"
else:
    setcspace = ""
When I suspect environment issues, I do one of two things:
1. Run the script in the foreground with "env - whatever-script". This should clear out the environment and send errors to your terminal's stderr.
2. Run the script normally, but with stdout and stderr redirected to a file in /tmp: whatever-script > /tmp/output 2>&1
These make tty-less scripts a bit less opaque.
I finally found the problem. It was effectively a path problem. After looking into the lib I found this useful param:
`working_directory`
:Default: ``'/'``
Full path of the working directory to which the process should
change on daemon start.
Since a filesystem cannot be unmounted if a process has its
current working directory on that filesystem, this should either
be left at default or set to a directory that is a sensible “home
directory” for the daemon while it is running.
So I set up the daemon like this:
with daemon.DaemonContext(working_directory='.'):
process = MyClass()
process.run()
And now I have the correct path and my script is properly executed.
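The failure mode is easy to reproduce without python-daemon at all: DaemonContext chdirs to '/' by default, so every relative path in a subprocess command stops resolving. A minimal stdlib demonstration, with a throwaway helper.py standing in for autogamma.sh:

```python
import os
import subprocess
import sys
import tempfile

# create a tiny helper script standing in for autogamma.sh
workdir = tempfile.mkdtemp()
helper = os.path.join(workdir, "helper.py")
with open(helper, "w") as f:
    f.write("print('ok')\n")

os.chdir("/")  # what DaemonContext does by default

# a relative path now resolves against '/' and fails...
rel = subprocess.run([sys.executable, "helper.py"],
                     capture_output=True, text=True)
# ...while an absolute path keeps working regardless of cwd
absolute = subprocess.run([sys.executable, helper],
                          capture_output=True, text=True)
```

This is why either setting working_directory or resolving paths with os.path.realpath(__file__) before daemonizing fixes the issue.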
