Need help with Python subprocess to copy a file from the host to a container
Here is the Python code I have tried:
import subprocess

output = subprocess.check_output(['docker', 'ps'],
                                 universal_newlines=True)
x = output.split('\n')
for i in x:
    if i.__contains__("name_of_container"):
        container_id = i[:12]
subprocess.call(["docker cp", "some_file.py", container_id:"/tmp"])
subprocess.call(['docker', 'exec', '-it', container_id, 'bash'])
This should work:
import subprocess

output = subprocess.check_output(['docker', 'ps'],
                                 universal_newlines=True)
x = output.split('\n')
for i in x:
    if i.__contains__("inspiring_sinoussi"):
        container_id = i[:12]
        container_id_with_path = container_id + ":/tmp"
        subprocess.call(["docker", "cp", "/root/some_file.py", container_id_with_path])
        subprocess.call(['docker', 'exec', '-it', container_id, 'bash'])
In a subprocess call the arguments are separated by commas, and container_id:/tmp has to be a single argument because there is no space between the two parts. Since container_id is a variable, it cannot simply be written next to :/tmp, so I created a new variable container_id_with_path that appends the :/tmp path to the container id.
Running the script gives me the desired result.
$ python copy.py
/ # ls /tmp/
hsperfdata_root tomcat-docbase.1849924566121837123.9090
some_file.py
Some errors in your code:
container_id:"/tmp" is not valid Python syntax
"docker cp" is not a valid argument for subprocess; docker and cp must be separate list elements
the docker cp call is not inside the for loop
So, I guess this is your fix:
for i in x:
    if i.__contains__("name_of_container"):
        container_id = i[:12]
        subprocess.call(["docker", "cp", "some_file.py", container_id + ":/tmp"])
Related
This is a follow-up question of Use tkinter based PySimpleGUI as root user via pkexec.
I have a Python GUI application. It should be able to run as a normal user and as root. For the latter I know I have to set $DISPLAY and $XAUTHORITY to get a GUI application to work under root. I use pkexec to start the application as root.
I assume the problem is how I use os.execvp() to call pkexec with all its arguments, but I don't know how to fix it. In the linked previous question and answer it works when calling pkexec directly via bash.
For this example the full path of the script should be /home/user/x.py.
#!/usr/bin/env python3
# FILENAME need to be x.py !!!
import os
import sys
import getpass

import PySimpleGUI as sg


def main_as_root():
    # See: https://stackoverflow.com/q/74840452
    cmd = ['pkexec',
           'env',
           f'DISPLAY={os.environ["DISPLAY"]}',
           f'XAUTHORITY={os.environ["XAUTHORITY"]}',
           f'{sys.executable} /home/user/x.py']
    # output here is
    # ['pkexec', 'env', 'DISPLAY=:0.0', 'XAUTHORITY=/home/user/.Xauthority', '/usr/bin/python3 ./x.py']
    print(cmd)
    # replace the process
    os.execvp(cmd[0], cmd)


def main():
    main_window = sg.Window(title=f'Run as "{getpass.getuser()}".',
                            layout=[[]], margins=(100, 50))
    main_window.read()


if __name__ == '__main__':
    if len(sys.argv) == 2 and sys.argv[1] == 'root':
        main_as_root()  # no return because of os.execvp()
    # else
    main()
Calling the script as /home/user/x.py root means that the script will call itself again via pkexec. I got this output (error messages translated from German to English myself).
['pkexec', 'env', 'DISPLAY=:0.0', 'XAUTHORITY=/home/user/.Xauthority', '/usr/bin/python3 /home/user/x.py']
/usr/bin/env: '/usr/bin/python3 /home/user/x.py': No such file or directory
/usr/bin/env: use -[v]S to pass options in shebang lines
To me it looks like the python3 part of the command is interpreted by env and not by pkexec; something is not working as expected when cmd is executed via os.execvp().
But when I run this in the shell it works fine.
pkexec env DISPLAY=$DISPLAY XAUTHORITY=$XAUTHORITY python3 /home/user/x.py
Based on @TheLizzard's comment.
The approach itself is fine and has no problem. Only the last element in the command list cmd is wrong: it has to be split into two elements, because each list element is passed to execvp() as one argument, so env was looking for a program literally named "/usr/bin/python3 /home/user/x.py".
cmd = ['pkexec',
'env',
f'DISPLAY={os.environ["DISPLAY"]}',
f'XAUTHORITY={os.environ["XAUTHORITY"]}',
f'{sys.executable}',
'/home/user/x.py']
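If you would rather keep the shell-like one-liner, a small sketch (assuming DISPLAY and XAUTHORITY contain no spaces) is to let shlex.split build the same argument list:
import os
import shlex
import sys

# Sketch only: shlex.split turns the shell-style string into one list element
# per argument, which is what os.execvp() expects.
cmd = shlex.split(
    f'pkexec env DISPLAY={os.environ["DISPLAY"]} '
    f'XAUTHORITY={os.environ["XAUTHORITY"]} '
    f'{sys.executable} /home/user/x.py'
)
os.execvp(cmd[0], cmd)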
Python 3.10.6
Windows 10
I have a Python function that executes a DXL script using subprocess.run() or os.system() (whichever works best, I guess). The problem is that when I run a custom command from Python it does not work, but when I paste the same command into the command prompt, it works. I should also clarify that the command prompt is not the MS Store Windows Terminal (IBM DOORS commands cannot be run there for some reason); it is the original cmd.exe prompt.
I need to use both Python and IBM DOORS for the solution.
Here is a summarized version of my code (obviously, the access values are not real):
@staticmethod
def run_dxl_importRTF():
    dquotes = chr(0x22)  # ASCII --> "
    module_name = "TEST_TEMP"
    script_path = "importRTF.dxl"
    script_do_nothing_path = "doNothing.dxl"
    user = "user"
    password = "pass"
    database_config = "11111#11.11.1111.0"
    doors_path = dquotes + r"C:\Program Files\IBM\Rational\DOORS\9.7\bin\doors.exe" + dquotes
    file_name = "LIBC_String.rtf"

    # Based On:
    # "C:\Program Files\IBM\Rational\DOORS\9.7\\bin\doors.exe" -dxl "string pModuleName = \"%~1\";string pFilename = \"%~2\";#include <importRTF.dxl>" -f "%TEMP%" -b "doNothing.dxl" -d 11111#11.11.1111.0 -user USER -password PASSWORD
    script_arguments = f"{dquotes}string pModuleName=\{dquotes}{module_name}\{dquotes};string pFileName=\{dquotes}{file_name}\{dquotes};#include <{script_path}>{dquotes}"
    command = [doors_path, "-dxl", script_arguments, "-f", "%TEMP%", "-b", script_do_nothing_path, '-d', database_config, '-user', user, '-password', password]

    res = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)

    print(f"COMMAND:\n{' '.join(res.args)}")
    print(f"STDERR: {repr(res.stderr)}")
    print(f'STDOUT: {res.stdout}')
    print(f'RETURN CODE: {res.returncode}')
    return
PYTHON SCRIPT OUTPUT:
COMMAND:
"C:\Program Files\IBM\Rational\DOORS\9.7\bin\doors.exe" -dxl "string pModuleName=\"TEST_TEMP\";string pFileName=\"LIBC_String.rtf\";#include <importRTF.dxl>" -f %TEMP% -b doNothing.dxl -d 11111#11.11.1111.0 -user USER_TEMP -password PASS_TEMP
STDERR: 'The system cannot find the path specified.\n'
STDOUT:
RETURN CODE: 1
When I run the same command in the command prompt, it works (the DXL script is compiled).
I identified the problem: it is the script_arguments variable. When I just connect to the IBM DOORS server without compiling a DXL script, it works both from Python and from the command prompt.
The Python script needs to be dynamic, meaning that all of the initially declared variables can change value and may contain path strings. I am also trying to avoid .bat files; they did not work with dynamic path values either.
Thanks for your time.
I tried:
Changing the current directory (cwd) to the IBM DOORS folder
os.system()
Multiple workarounds
The IBM DOORS path without double quotes (it doesn't work because of the whitespace)
.bat files
When calling subprocess.run with a command list and shell=True, Python will join the command list into a single string, adding more quoting along the way. The details are OS dependent (on Windows, the list always has to be converted to a command string), but you can see the result via the subprocess.list2cmdline() function.
Your problem is these extra escapes. Instead of using a list, build a shell command string that already contains the quoting you want. You can also use ' for quoting the Python strings so that the internal " needed for shell quoting can be written literally.
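A tiny illustration of that extra quoting (made-up argument values, purely to show what list2cmdline does):
import subprocess

# The last argument already contains \" escapes, like script_arguments above.
args = ['doors.exe', '-dxl', r'string pModuleName=\"TEST_TEMP\";#include <importRTF.dxl>']
print(subprocess.list2cmdline(args))
# doors.exe -dxl "string pModuleName=\\\"TEST_TEMP\\\";#include <importRTF.dxl>"
# The backslash-quote pairs get escaped again, which is what breaks the DOORS call.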
Putting it all together (and likely messing something up here), you would get
@staticmethod
def run_dxl_importRTF():
    module_name = "TEST_TEMP"
    script_path = "importRTF.dxl"
    script_do_nothing_path = "doNothing.dxl"
    user = "user"
    password = "pass"
    database_config = "11111#11.11.1111.0"
    doors_path = r"C:\Program Files\IBM\Rational\DOORS\9.7\bin\doors.exe"
    file_name = "LIBC_String.rtf"

    script_arguments = (rf'string pModuleName=\"{module_name}\";'
                        rf'string pFileName=\"{file_name}\";'
                        rf'#include <{script_path}>')
    command = (f'"{doors_path}" -dxl "{script_arguments}" -f "%TEMP%"'
               f' -b "{script_do_nothing_path}" -d {database_config}'
               f' -user {user} -password {password}')

    res = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)

    print(f"COMMAND:\n{res.args}")
    print(f"STDERR: {repr(res.stderr)}")
    print(f'STDOUT: {res.stdout}')
    print(f'RETURN CODE: {res.returncode}')
I'm trying to spawn multiple tmux sessions with different environment variables from the same Python 3 script.
I have been passing {**os.environ, "CUDA_VISIBLE_DEVICES": str(device_id)} as the env keyword argument to subprocess.Popen.
for device_id in device_ids:
    new_env = {**os.environ, "CUDA_VISIBLE_DEVICES": str(device_id)}
    p = subprocess.Popen([
        'tmux', 'new', '-d', "-c", "./", '-s',
        sesh_name,
        "python3",
        path_to_script
    ], env=new_env)
I'm finding, however, that CUDA_VISIBLE_DEVICES is equal to the first device_id I pass, across all processes. What is the meaning of this!?
Is this an inherent issue with Popen and the subprocess module? If so, how do I fix it?
I've tried passing the device id as an argument to the script in the new process, but sadly torch won't let me update the environment variable after it has been imported, and reworking the code for that would be more trouble than it's worth.
EDIT: Providing minimal example
Save this script as test.py (or whatever else you fancy):
import subprocess
import os

def sesh(name):
    procs = []
    for device_id in [4, 5, 6]:
        proc_env = {**os.environ, "CUDA_VISIBLE_DEVICES": str(device_id)}
        p = subprocess.Popen(['tmux', 'new', '-d', "-c", "./", '-s', name + str(device_id), "python3", "deleteme.py"], env=proc_env)
        procs.append(p)
    return procs

if __name__ == "__main__":
    sesh("foo")
Save this script as deleteme.py within the same directory:
import time
import os

if __name__ == "__main__":
    print(os.environ)
    for i in range(11):
        print("running")
        if "CUDA_VISIBLE_DEVICES" in os.environ:
            print(os.environ["CUDA_VISIBLE_DEVICES"])
        else:
            print("CUDA_VISIBLE_DEVICES not found")
        time.sleep(5)
Then run test.py from the terminal.
$ python3 test.py
Then switch to the tmux sessions to figure out what environment is being created.
For anyone else running into this problem, you can use os.system instead of subprocess.Popen in the following way.
import os

def sesh(name, device_id, script):
    command = "tmux new -d -s \"{}{}\" \'export CUDA_VISIBLE_DEVICES={}; python3 {} \'"
    command = command.format(
        name,
        device_id,
        device_id,
        script
    )
    os.system(command)

if __name__ == "__main__":
    sesh("foo", 4, "deleteme.py")
I am attempting to run a PowerShell script from Python to convert .xls files to .xlsb by looping through a list of file names. I am encountering the PowerShell error "You cannot call a method on a null-valued expression" for command 3 (i.e. cmd3), and I am unsure why (this is my first time with Python and with running PowerShell scripts in general). The error occurs when trying to open the workbook, but when the command is run in PowerShell directly, it seems to work fine.
Code:
import logging, os, shutil, itertools, time, pyxlsb, subprocess

# convert .xls to .xlsb and transfer new terminology files
for i in itertools.islice(FileList, 0, 6, None):
    # define extension
    ext = '.xls'
    # define file path
    psPath = f'{downdir}' + f'\{i}'

    # define ps scripts
    def run(cmd):
        completed = subprocess.run(["powershell", "-Command", cmd], capture_output=True)
        return completed

    # ps script: open workbook
    cmd1 = "$xlExcel12 = 50"
    cmd2 = "$Excel = New-Object -Com Excel.Application"
    cmd3 = f"$WorkBook = $Excel.Workbooks.Open('{psPath}{ext}')"
    cmd4 = f"$WorkBook.SaveAs('{psPath}{ext}',$xlExcel12,[Type]::Missing,[Type]::Missing,$false,$false,2)"
    cmd5 = "$Excel.Quit()"
    # ps script: delete .xls files
    cmd6 = f"Remove-Item '{psPath}{ext}'"

    run(cmd1)
    run(cmd2)
    run(cmd3)
    # change extension
    ext = '.xlsb'
    run(cmd4)
    run(cmd5)
    run(cmd6)
    # copy .xlsb files to terminology folder
    shutil.copy(i + ext, termdir)
Error:
Out[79]: CompletedProcess(args=['powershell', '-Command', "$WorkBook = $Excel.Workbooks.Open('C:\Users\Username\Downloads\SEND Terminology.xls')"], returncode=1, stdout=b'', stderr=b"You cannot call a method on a null-valued expression.\r\nAt line:1 char:1\r\n+ $WorkBook = $Excel.Workbooks.Open('C:\Users\Username\Downloads\SEND Ter ...\r\n+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\r\n + CategoryInfo : InvalidOperation: (:) [], RuntimeException\r\n + FullyQualifiedErrorId : InvokeMethodOnNull\r\n \r\n")
Any input would be helpful.
Thank you!
Problem
As commenter vonPryz correctly stated, the PowerShell commands run in separate processes. The memory spaces of processes are isolated from each other and are cleared when a process ends.
When you run the commands in separate PowerShell processes, cmd3, cmd4 and cmd5 won't have the variable $Excel available. PowerShell defaults to a $null value for non-existing variables, hence the error message "You cannot call a method on a null-valued expression". The same happens for the variable $xlExcel12. These variables only exist as long as the process that created them is running and are only visible within that process, even if you managed to run two processes in parallel.
Solution
Commands cmd1..5 need to be run in the same Powershell process, so each command will be able to "see" the variables created by previous commands:
run( cmd1 + ';' + cmd2 + ';' + cmd3 + ';' + cmd4 + ';' + cmd5 )
You will need to change cmd4 to use another variable for the extension used when saving, e.g. extSave:
cmd4 = f"$WorkBook.SaveAs('{psPath}{extSave}',$xlExcel12,[Type]::Missing,[Type]::Missing,$false,$false,2)"
cmd6 is completely independent, because it does not depend on PowerShell variables. It only depends on Python variables, which are resolved before the process starts, so it can still be run in a separate process.
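Putting the pieces together, a rough sketch of the reworked loop could look like this (assuming FileList, downdir and termdir are defined as in the question; extSave is the new variable mentioned above):
import itertools, shutil, subprocess

ext = '.xls'       # extension of the source files
extSave = '.xlsb'  # extension used by cmd4 when saving

for i in itertools.islice(FileList, 0, 6, None):
    psPath = f'{downdir}\\{i}'
    cmds = [
        "$xlExcel12 = 50",
        "$Excel = New-Object -Com Excel.Application",
        f"$WorkBook = $Excel.Workbooks.Open('{psPath}{ext}')",
        f"$WorkBook.SaveAs('{psPath}{extSave}',$xlExcel12,[Type]::Missing,[Type]::Missing,$false,$false,2)",
        "$Excel.Quit()",
    ]
    # cmd1..cmd5 in one PowerShell process, so $Excel and $xlExcel12 stay visible
    subprocess.run(["powershell", "-Command", "; ".join(cmds)], capture_output=True)
    # cmd6 only uses Python variables, so a separate process is fine
    subprocess.run(["powershell", "-Command", f"Remove-Item '{psPath}{ext}'"], capture_output=True)
    # copy the converted file to the terminology folder
    shutil.copy(f'{psPath}{extSave}', termdir)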
I would like to pass the variable NUMBER_CAMS and its value from my Python script to a bash environment file, env_roadrunner.
The following is the code that I have written:
import subprocess
import os
import sys
import ConfigParser
os.chdir("/home/vasudev/LTECamOrchestrator_docker/tools/")
NUMBER_CAMS=sys.argv[2]
cmd = "xterm -hold -e sudo /home/vasudev/LTECamOrchestrator_docker/tools/create_pcap_replay_encoder " \
" /home/vasudev/LTECamOrchestrator_docker/tools/env_roadrunner"
p = subprocess.Popen([cmd] , shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
The following is my bash script, which takes environment variables:
#!/bin/bash
# the name or ip address of the orchestrator
ORCHESTRATOR_IP="192.168.212.131"
# the port of the orchestrator
ORCHESTRATOR_PORT=9000
# password for the admin user
ORCHESTRATOR_PASSWORD='.qoq~^c^%l^U#e~'
# number of cameras to create from this pcap file
NUMBER_CAMS="$N"
# three port numbers that are only used internally but need to be free
I wanted to pass the value of NUMBER_CAMS through my Python script, but I am getting the following error:
Traceback (most recent call last):
File "/home/vasudev/PycharmProjects/Test_Framework/Stream_provider.py", line 19, in <module>
NUMBER_CAMS=sys.argv[2]
IndexError: list index out of range
Any suggestions as to why I am getting the index out of range error?
You need to set the value of N in the environment so that your script can see that value and assign it to NUMBER_CAMS.
import subprocess
import os
import sys
import ConfigParser
os.environ["N"] = "2" # Must be a string, not an integer
cmd = ["xterm",
"-hold",
"-e",
"sudo",
"-E",
"./create_pcap_replay_encoder",
"env_roadrunner"]
p = subprocess.Popen(cmd,
                     cwd="/home/vasudev/LTECamOrchestrator_docker/tools/",
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
Note that sudo ignores the current environment by default when running a command; the -E option I added allows create_pcap_replay_encoder to see the inherited environment. However, you can only use -E if sudo is configured to allow the environment to be preserved.
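As an alternative to mutating os.environ, the same value can be handed only to the child process via the env argument of Popen (a minimal sketch, reusing the command list above):
import os
import subprocess

# Sketch: give the child a copy of the current environment plus N.
new_env = {**os.environ, "N": "2"}  # values must be strings
p = subprocess.Popen(cmd,
                     cwd="/home/vasudev/LTECamOrchestrator_docker/tools/",
                     env=new_env,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
The same sudo -E caveat applies here as well.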