Running a Python script in a Windows shell multiple times - python

I'd like to run the following shell command 10 times
./file.py 1111x
with the 'x' ranging from 0 to 9
i.e. a different port for each file.py instance. I need each instance running in its own shell. I already tried creating a batch file and a Python script that calls the Windows shell, but both without success.

What about this...
import os
import subprocess
for x in range(10):
    command = './file.py 1111' + str(x)
    os.system(command)
    # or:
    subprocess.call('cmd ' + command, shell=True)
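Note that os.system runs the commands one after another and blocks on each. If each instance really needs its own window, here is a hedged sketch using subprocess.Popen with the Windows-only CREATE_NEW_CONSOLE flag (the file.py path and the 11110 port base are assumptions taken from the question; on non-Windows the flag falls back to 0):

```python
import subprocess
import sys

SCRIPT = "file.py"  # hypothetical path to the script from the question

def build_commands(base_port=11110, count=10):
    # One command per port: 11110, 11111, ..., 11119.
    return [[sys.executable, SCRIPT, str(base_port + x)] for x in range(count)]

def launch_all(commands):
    # CREATE_NEW_CONSOLE opens a separate console window per process (Windows only;
    # getattr falls back to 0 elsewhere so the sketch stays importable).
    flags = getattr(subprocess, "CREATE_NEW_CONSOLE", 0)
    return [subprocess.Popen(cmd, creationflags=flags) for cmd in commands]

if __name__ == "__main__":
    for proc in launch_all(build_commands()):
        proc.wait()
```

Unlike the os.system loop, Popen returns immediately, so all ten instances run concurrently.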

What you're looking for is a PowerShell job. You may need to tweak this a bit to accommodate your specific requirements, but this should do what you need.
[ScriptBlock]$PyBlock = {
    param (
        [int]$x,
        [string]$pyfile
    )
    try {
        [int]$Port = (11110 + $x)
        python $pyfile $Port
    }
    catch {
        Write-Error $_
    }
}

try {
    0..9 | ForEach-Object {
        Start-Job -Name "PyJob $_" -ScriptBlock $PyBlock -ArgumentList @($_, 'path/to/file.py')
    }
    Get-Job | Wait-Job -Timeout <int>
    # If you do not specify a timeout then it will wait indefinitely.
    # If you use -Timeout then make sure it's long enough to accommodate the runtime of your script.
    Get-Job | Receive-Job
}
catch {
    throw $_
}

Related

How to run a PowerShell cmdlet in Python to get a list of connected USB devices?

I'm trying to list connected USB devices in my Python project.
I tried using os.system() with the Command Prompt, but I cannot find a Command Prompt command that lists connected USB devices (by name).
I found a PowerShell command:
Get-PnpDevice -PresentOnly | Where-Object { $_.InstanceId -match '^USB' }
That works fine.
I want to know whether there is a Command Prompt command to list connected USB devices with os.system(), or how to run the PowerShell cmdlet from Python using os.system() or any other call.
There is a module called pyUSB that works really well.
Alternatively, to run PowerShell commands, you can use the subprocess package.
import subprocess
result = subprocess.run(["powershell", "-Command", MyCommand], capture_output=True)
This is my poor take at Python, but if you want to work with the output produced by PowerShell, you might want to serialize the objects in PowerShell and deserialize them in Python; hence the use of ConvertTo-Json.
import subprocess
import json
cmd = '''
Get-PnpDevice -PresentOnly |
Where-Object { $_.InstanceId -match '^USB' } |
ConvertTo-Json
'''
result = json.loads(
    subprocess.run(["powershell", "-Command", cmd], capture_output=True).stdout
)

padding = 0
properties = []
for i in result[0].keys():
    if i in ['CimClass', 'CimInstanceProperties', 'CimSystemProperties']:
        continue
    properties.append(i)
    if len(i) > padding:
        padding = len(i)

for i in result:
    for property in properties:
        print(property.ljust(padding), ':', i[property])
    print('\n')
Use the subprocess module.
In your case it will look like this (invoking powershell explicitly, since the pipeline is a PowerShell command, and with text=True so stdout comes back as a string):
import subprocess

devices_raw: str = subprocess.run(
    ["powershell", "-Command",
     "Get-PnpDevice -PresentOnly | Where-Object { $_.InstanceId -match '^USB' }"],
    capture_output=True,
    text=True,
).stdout
# ... work with devices_raw as a string; you'll probably need to split it or similar.
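Once devices_raw is captured as text, here is a small hedged sketch for turning the tabular output into rows (the two-line header is an assumption about PowerShell's default table formatting, not something guaranteed by Get-PnpDevice):

```python
def parse_table(raw):
    # Keep non-empty lines; skip the assumed header row and dashed separator row.
    lines = [ln.rstrip() for ln in raw.splitlines() if ln.strip()]
    return lines[2:]

# Tiny stand-in for real Get-PnpDevice output:
sample = "Status  Class  Name\n------  -----  ----\nOK      USB    Hub\n"
rows = parse_table(sample)
# rows == ["OK      USB    Hub"]
```

For anything beyond quick scripting, the ConvertTo-Json route in the other answer is more robust than parsing columns.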

How can I use Python path in my Visual Studio Code extension?

I'm writing my first VSCode extension. In short, the extension opens a terminal (PowerShell) and executes a command:
term = vscode.window.activeTerminal;
term.sendText("ufs put C:\\Users\\<userID>\\AppData\\Local\\Programs\\Python\\Python38-32\\Lib\\site-packages\\mymodule.py");
After selecting a Python environment, VSCode should know where Python is located (python.pythonPath). But the path to Python will obviously vary depending on the Python installation, version and so on. So I was hoping that I could do something like:
term.sendText("ufs put python.pythonPath\\Lib\\site-packages\\mymodule.py");
But how can I do this in my extension (TypeScript)? How do I refer to python.pythonPath?
My configuration:
Windows 10
Python 3.8.2
VSCode 1.43.2
Microsoft's Python extension 2020.3.69010
Node.js 12.16.1
UPDATE:
Nathan, thank you for your comment. I used a child process as suggested. It executes a command to look for pip. Relying on the location of pip is not bulletproof, but it works for now:
var pwd = 'python.exe -m pip --version';
const cp = require('child_process');
cp.exec(pwd, (err, stdout, stderr) => {
    console.log('Stdout: ' + stdout);
    console.log('Stderr: ' + stderr);
    if (err) {
        console.log('error: ' + err);
    }
});
Not sure where to go from here to process stdout, but I tried child_process.spawn using this accepted answer:
function getPath(cmd, callback) {
    var spawn = require('child_process').spawn;
    var command = spawn(cmd);
    var result = '';
    command.stdout.on('data', function (data) {
        result += data.toString();
    });
    command.on('close', function (code) {
        return callback(result);
    });
}

let runCommand = vscode.commands.registerCommand('extension.getPath', function () {
    var resultString = getPath("python.exe -m pip --version", function (result) { console.log(result) });
    console.log(resultString);
});
Hopefully, this would give me stdout as a string. But the only thing I got was undefined. I'm way beyond my comfort zone now. Please advise me how to proceed.

Run a perl script with Python on multiple files at once in a folder

This is my perl script at the moment:
#!/usr/bin/perl
use open qw/:std :utf8/;
use strict;
use warnings;
if (defined $ARGV[0]) {
    my $filename = $ARGV[0];
    my %count;
    open (my $fh, $filename) or die "Can't open '$filename' $!";
    while (<$fh>) {
        $count{ lc $1 }++ while /(\w+)/g;
    }
    close $fh;
    my $array = 0;
    foreach my $word ( sort { $count{$b} <=> $count{$a} } keys %count) {
        print "$count{$word} $word\n" if $array++ < 10;
    }
} else {
    print "Please enter the name of the file: ";
    my $filename = ($_ = <STDIN>);
    my %count;
    open (my $fh, $filename) or die "Can't open '$filename' $!";
    while (<$fh>) {
        $count{ lc $1 }++ while /(\w+)/g;
    }
    close $fh;
    my $array = 0;
    foreach my $word ( sort { $count{$b} <=> $count{$a} } keys %count) {
        print "$count{$word} $word\n" if $array++ < 10;
    }
}
And this is my Python script at the moment:
#!/usr/bin/env python3
import os
perlscript = "perl " + " perlscript.pl " + " /home/user/Desktop/data/*.txt " + " >> " + "/home/user/Desktop/results/output.txt"
os.system(perlscript)
Problem: When there are multiple txt-files in the data folder the script only runs on one file and ignores all the other txt-files. Is there a way to run the perlscript on all the txt-files at once?
Another problem: I'm also trying to delete the txt-files with os.remove after they have been processed, but they get deleted before the perlscript has a chance to run.
Any ideas? :)
That Perl script processes only one file. And while the shell invoked by os.system does expand the * glob into a file list, the script only ever looks at $ARGV[0], so all the other files are ignored.
Instead, build the file list in Python, using os.listdir or glob.glob or os.walk. Then iterate over the list and call that Perl script on each file, if it must process only one file at a time. Or, modify the Perl script to process multiple files and run it once with the whole list.
To keep the current Perl script and run it on each file
import os
data_path = "/home/user/Desktop/data/"
output_path = "/home/user/Desktop/result/"
for file in os.listdir(data_path):
    if not file.endswith(".txt"):
        continue

    print("Processing " + file)  # better to use subprocess

    run_perlscript = "perl " + " perlscript.pl " + \
        data_path + file + " >> " + output_path + "output.txt"

    os.system(run_perlscript)
The Perl script should be rewritten to remove that unneeded code duplication.
However, it is better to use the subprocess module to run and manage external commands. This is advised even in the os.system documentation itself.
For instance
import os
import subprocess

with open(output_path + "output.txt", "a") as fout:
    for file in os.listdir(data_path):
        if not file.endswith(".txt"):
            continue
        subprocess.run(["perl", "script.pl", data_path + file], stdout=fout)
where the file is opened in the append mode ("a") following the question's >> redirection.
The recommended subprocess.run has been available since Python 3.5; on older versions use Popen.
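For instance, here is a minimal sketch of that Popen fallback (shown with a portable stand-in command instead of the Perl script, so it runs anywhere):

```python
import subprocess
import sys

def run_and_append(cmd, outfile):
    # Pre-3.5 equivalent of subprocess.run(cmd, stdout=fout): Popen + wait().
    with open(outfile, "a") as fout:
        proc = subprocess.Popen(cmd, stdout=fout)
        return proc.wait()

# Stand-in for ["perl", "script.pl", data_path + file]:
rc = run_and_append([sys.executable, "-c", "print('hello')"], "output.txt")
```

wait() returns the child's exit code, so you can check rc == 0 before deleting the input file, which also addresses the premature os.remove problem from the question.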
Another, and arguably "right," option is to adjust the Perl script so that it can process multiple files. Then you only need run it once, with the whole file list.
use strict;
use warnings;
use feature 'say';
use open ':std', ':encoding(UTF-8)';

foreach my $filename (@ARGV) {
    say "Processing $filename";

    my %count;
    open my $fh, '<', $filename or do {
        warn "Can't open '$filename': $!";
        next;
    };
    while (<$fh>) {
        $count{ lc $1 }++ while /(\w+)/g;
    }
    close $fh;

    my $prn_cnt = 0;
    foreach my $word ( sort { $count{$b} <=> $count{$a} } keys %count) {
        print "$count{$word} $word\n" if $prn_cnt++ < 10;
    }
}
This prints a warning for any file it can't open and skips to the next one. If you'd rather have the script exit on any unexpected file, replace the or do { ... }; with the original die.
Then, and using glob.glob as an example now
import glob
import subprocess

data_path = "/home/user/Desktop/data/"
output_path = "/home/user/Desktop/result/"

files = glob.glob(data_path + "*.txt")

with open(output_path + "output.txt", "a") as fout:
    subprocess.run(["perl", "script.pl"] + files, stdout=fout)
Since this passes the whole list as command arguments, it assumes there aren't so many files (high thousands) that the command line exceeds its length limits.

How to send a single pipelined command to python using bash from a groovy script console (Jenkins)?

I am using the Groovy script console as offered by Jenkins.
I have this nicely working line for a Jenkins slave (Windows based):
println "cmd /c echo print(\"this is a sample text.\") | python".execute().text
Now I want the functional equivalent for a Jenkins slave (Linux based).
So I started on the Linux command line and got this core command working for me:
bash -c 'echo print\(\"this is a sample text.\"\) | python'
Then I wrapped all of this console command into some more escape codes and invocation decoration, but with that it stopped working:
println "bash -c \'echo print\\(\\\"this is a sample text.\\\"\\) | python\'".execute().txt
The result when running it is just this:
empty
I feel stuck at the moment, failing to untangle the multiple levels of escape characters involved.
What's wrong? How can I solve it? (And maybe: why?)
PS: if it's unclear - I want (if possible at all) to stick to a one-liner like the initial one.
If you don't need to pipe bash into python, maybe this suits your fancy?
['python','-c','print("this is a sample text")'].execute().text
If you do need it, try
['bash','-c', /echo print\(\"this is a sample text.\"\) | python/].execute().text
Using List's .execute() helps with clarifying what each argument is. The slashy-strings help by changing the escape-character.
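For comparison, the same argument-list idea expressed in Python rather than Groovy (a sketch, not part of the original answers): passing the program text as its own list element means no shell is involved, so no quoting gymnastics are needed.

```python
import subprocess
import sys

# The -c program is one argv element; quotes and parentheses need no escaping.
out = subprocess.run(
    [sys.executable, "-c", 'print("this is a sample text.")'],
    capture_output=True, text=True,
).stdout
print(out.strip())  # prints: this is a sample text.
```

This is the same design choice as Groovy's List.execute(): build argv yourself and skip the shell's escape-character layers entirely.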
print "bash -c 'echo \"print(\\\"this is a sample text.\\\")\" | python'"
Output:
bash -c 'echo "print(\"this is a sample text.\")" | python'
After digging around some more, I found a somewhat platform-independent solution that is aware of the error channel (stderr), handles execution failures, and even avoids OS-specific components like bash/cmd.exe:
try {
    def command = ['python', '-c', /print("this is a sample text.")/]
    if (System.properties['os.name'].toLowerCase().contains('windows')) {
        command[2] = command[2].replaceAll(/\"/, /\\\"/)
    }
    println "command=" + command
    def proc = command.execute()
    def rc = proc.waitFor()
    println "rc=" + rc
    def err = proc.err.text
    if (err != "") { print "stderr=" + err }
    def out = proc.text
    if (out != "") { print "stdout=" + out }
} catch (Exception e) {
    println "exception=" + e
}
println ""

How do I detect if my python code is running in PowerShell or the Command Prompt (cmd)

I have a python application that has a shell that needs to do some setup based on whether the shell it's running in is the Windows Command Prompt (cmd) or Powershell.
I haven't been able to figure out how to detect whether the application is running in PowerShell or cmd.
From my searches on stackoverflow and Google, it seems the only way to do this is to use psutil to find the name of the parent process.
Is there a nicer way?
Edit: I've decided to use psutil to find the name of the parent process. Thanks to everyone who helped in the comments.
@Matt A. is right. Use the psutil and os packages to get the parent process id and name.
import os
import psutil

parent_pid = os.getppid()
print(psutil.Process(parent_pid).name())
The following snippet computes the md5sum of args.file in bash/PowerShell. I usually use the first check to determine what we are running in, so I can use it later on; relying on shell=True in subprocess is not very portable.
import os, subprocess, sys

running_shell = None
mycheck = '/usr/bin/md5sum'  # full path is needed
if not os.path.isfile(mycheck):
    try:
        file_md5sum = subprocess.check_output(
            "powershell.exe Get-FileHash -Algorithm MD5 {} | Select -expand Hash".format(args.file).split()
        )
    except FileNotFoundError:
        log.fatal("unable to generate md5sum")
        sys.exit(-1)
    file_md5sum = file_md5sum.lower()
    running_shell = 'powershell'
else:
    file_md5sum = subprocess.check_output([mycheck, args.file])
    running_shell = 'bash'
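As an aside, if the md5sum itself is the goal (rather than the shell detection), the stdlib hashlib computes it portably without shelling out at all; a minimal sketch:

```python
import hashlib

def md5sum(path, chunk_size=65536):
    # Return a file's MD5 hex digest, reading in chunks to bound memory use.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

This gives the same lowercase hex string on every platform, so no case normalization or shell-specific branch is needed for the hash itself.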
Here is a sample powershell stub that can do the trick:
$SCR = "block_ips.py"
$proc = $null
$procs = Get-WmiObject Win32_Process -Filter "name = 'python3.exe' or name = 'python.exe'" | Select-Object Description,ProcessId,CommandLine,CreationDate
$procs | ForEach-Object {
    if ( $_.CommandLine.IndexOf($SCR) -ne -1 ) {
        if ( $null -eq $proc ) {
            $proc = $_
        }
    }
}
if ( $null -ne $proc ) {
    Write-Host "Process already running: $proc"
} else {
    Write-Host "$SCR is not running"
}
Based on this post, you should be able to run this CMD/PS command through the subprocess module in your Python script:
subprocess.call("(dir 2>&1 *`|echo CMD);&<# rem #>echo PowerShell", shell=True)
This will output CMD if you're in CMD, and PowerShell if you're in PS.
