Passing python variable to powershell script - python

I have the following small powershell script which creates a local account.
create_new_user.ps1 :
param([string]$username)
New-LocalUser -Name $username -Description "Test account" -NoPassword
I would like to pass the $username variable from the python script as follows:
script.py :
import subprocess, sys
def runscript():
    username = "testaccount"
    p = subprocess.Popen(['powershell', "-ExecutionPolicy", "Unrestricted", './create_new_user.ps1', username],
                         stdout=sys.stdout)
    p.communicate()

runscript()
When I run script.py, it prompts me to enter a Name for the account and then creates the account, but it does not pick up the variable from script.py.
How can I solve this issue?
Thanks in advance
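One approach worth trying (a sketch, not verified against this exact setup): invoke PowerShell with -File so the .ps1 is run as a script, and pass the parameter by name. Both the -File flag and the named -username argument are additions that are not in the original snippet.
import subprocess, sys

def runscript():
    username = "testaccount"
    # -File runs the .ps1 as a script; -username binds the value to the
    # script's param([string]$username) block by name.
    p = subprocess.Popen(
        ["powershell", "-ExecutionPolicy", "Unrestricted",
         "-File", "./create_new_user.ps1", "-username", username],
        stdout=sys.stdout,
    )
    p.communicate()

runscript()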

Related

Command works on Command Prompt but it does not work when called with subprocess.run() or os.system() in python

Python 3.10.6
Windows 10
I have a python function that executes a DXL script using subprocess.run() or os.system() (whichever works best, I guess). The problem is that when I run a custom command using python it does not work, but when I paste the same command into the command prompt, it works. I should also clarify that the command prompt is not the MS Store Windows Terminal (I cannot run IBM DOORS commands there for some reason); it is the OG cmd.exe prompt.
I need to use both python and IBM Doors for the solution.
Here is a summarized version of my code (obviously, the access values are not real):
@staticmethod
def run_dxl_importRTF():
    dquotes = chr(0x22)  # ASCII --> "
    module_name = "TEST_TEMP"
    script_path = "importRTF.dxl"
    script_do_nothing_path = "doNothing.dxl"
    user = "user"
    password = "pass"
    database_config = "11111#11.11.1111.0"
    doors_path = dquotes + r"C:\Program Files\IBM\Rational\DOORS\9.7\bin\doors.exe" + dquotes
    file_name = "LIBC_String.rtf"
    # Based On:
    # "C:\Program Files\IBM\Rational\DOORS\9.7\\bin\doors.exe" -dxl "string pModuleName = \"%~1\";string pFilename = \"%~2\";#include <importRTF.dxl>" -f "%TEMP%" -b "doNothing.dxl" -d 11111#11.11.1111.0 -user USER -password PASSWORD
    script_arguments = f"{dquotes}string pModuleName=\{dquotes}{module_name}\{dquotes};string pFileName=\{dquotes}{file_name}\{dquotes};#include <{script_path}>{dquotes}"
    command = [doors_path, "-dxl", script_arguments, "-f", "%TEMP%", "-b", script_do_nothing_path, '-d', database_config, '-user', user, '-password', password]
    res = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
    print(f"COMMAND:\n{' '.join(res.args)}")
    print(f"STDERR: {repr(res.stderr)}")
    print(f'STDOUT: {res.stdout}')
    print(f'RETURN CODE: {res.returncode}')
    return
PYTHON SCRIPT OUTPUT:
COMMAND:
"C:\Program Files\IBM\Rational\DOORS\9.7\bin\doors.exe" -dxl "string pModuleName=\"TEST_TEMP\";string pFileName=\"LIBC_String.rtf\";#include <importRTF.dxl>" -f %TEMP% -b doNothing.dxl -d 11111#11.11.1111.0 -user USER_TEMP -password PASS_TEMP
STDERR: 'The system cannot find the path specified.\n'
STDOUT:
RETURN CODE: 1
When I run the same command in the command prompt, it works (the DXL script is compiled).
I identified that the problem is the script_arguments variable: when I just connect to the IBM DOORS server without compiling a DXL script, it works from both python and the command prompt.
The python script needs to be dynamic, meaning that all of the initially declared variables can change value and may contain path strings. I am also trying to avoid .bat files; they also did not work with dynamic path values.
Thanks for your time
I tried:
Changing CurrentDirectory (cwd) to IBM Doors
os.system()
Multiple workarounds
Tried the IBM DOORS path without double quotes (it doesn't work because of the whitespace)
.bat files
When calling subprocess.run with a command list and shell=True, Python will expand the command list into a string, adding more quoting along the way. The details are OS dependent (on Windows, the list always has to be expanded into a command string), but you can see the result via the subprocess.list2cmdline() function.
Your problem is these extra escapes. Instead of using a list, build a shell command string that already contains the escaping you want. You can also use ' for the Python string literals so that the internal " characters needed for shell quoting can be written literally.
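As a quick illustration of the extra quoting (not part of the original code), you can feed the problematic argument to list2cmdline yourself:
import subprocess

# The second element already contains the \" escapes the DOORS command
# line needs; list2cmdline then layers its own quoting on top of them.
arg = r'string pModuleName=\"TEST_TEMP\";#include <importRTF.dxl>'
print(subprocess.list2cmdline(["-dxl", arg]))
# roughly: -dxl "string pModuleName=\\\"TEST_TEMP\\\";#include <importRTF.dxl>"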
Putting it all together (and likely messing something up here), you would get
@staticmethod
def run_dxl_importRTF():
    module_name = "TEST_TEMP"
    script_path = "importRTF.dxl"
    script_do_nothing_path = "doNothing.dxl"
    user = "user"
    password = "pass"
    database_config = "11111#11.11.1111.0"
    doors_path = r"C:\Program Files\IBM\Rational\DOORS\9.7\bin\doors.exe"
    file_name = "LIBC_String.rtf"
    # Build exactly the escaping the DOORS command line expects, with no
    # extra quoting added by subprocess.
    script_arguments = (rf'string pModuleName=\"{module_name}\";'
                        rf'string pFileName=\"{file_name}\";'
                        rf'#include <{script_path}>')
    command = (f'"{doors_path}" -dxl "{script_arguments}" -f "%TEMP%"'
               f' -b "{script_do_nothing_path}" -d {database_config}'
               f' -user {user} -password {password}')
    res = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
    print(f"COMMAND:\n{res.args}")
    print(f"STDERR: {repr(res.stderr)}")
    print(f'STDOUT: {res.stdout}')
    print(f'RETURN CODE: {res.returncode}')

Unable to access /var/tmp in subprocess via Django

I created a script that outputs the execution result of a shell script to a web page using Django and Python's subprocess module.
Specifically, the following two scripts were created.
test.py
#!/usr/bin/python
import sys,os
import subprocess
import syslog
command_list = ['/bin/sh', '/var/tmp/test.sh']
proc = subprocess.Popen(args=command_list,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        cwd=os.path.dirname(command_list[0]),
                        shell=False)
result = proc.communicate(input=None)
print str(result)
test.sh
#!/bin/bash
echo "begin"
cat /var/tmp/data.txt
data.txt
data1
data2
Unit tests were performed on the two scripts, and they were confirmed to work properly.
However, when I run test.py via Django, even though test.sh's cat command and data.txt both exist,
"cat: /var/tmp/data.txt: No such file or directory" is displayed.
What is the cause?
version
python 2.7.13
Django 1.11.20
When I set PrivateTmp=false in the httpd service unit, httpd can now access /var/tmp.
view /usr/lib/systemd/system/httpd.service
PrivateTmp=false
systemctl daemon-reload
service httpd restart
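To confirm that PrivateTmp is what hides the file, a quick check run from inside the Django view (a sketch; it only assumes the paths already used above) shows what the httpd worker actually sees:
import os
import subprocess

# Under PrivateTmp=true the httpd worker gets its own private, empty
# /var/tmp, so this will typically print False even though the file
# exists in the real /var/tmp on the host.
print(os.path.exists('/var/tmp/data.txt'))
print(subprocess.check_output(['ls', '/var/tmp']))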

Unable to pass variable to a bash command in python

I am trying to pass a python variable to a bash command like this:
subscriptionId = "xxxxx"
command = " az account show -s $subscriptionId"
subprocess.check_output(command)
I get the following error:
error : az account show: error: argument --subscription/-s: expected one argument
Assigning a Python variable like subscriptionId = "xxxxx" does not magically place it in your environment, much less pass it to a subprocess. You need to do that interpolation yourself:
command = f"az account show -s {subscriptionId}"
If you really want to use environment variables, add the variable you want and enable shell expansion:
subscriptionId = ...
env = os.environ.copy()
env['subscriptionId'] = subscriptionId
command = "az account show -s ${subscriptionId}"
subprocess.check_output(command, env=env, shell=True)
Alternatively, you can mess with your own process environment:
subscriptionId = ...
os.environ['subscriptionId'] = subscriptionId
command = "az account show -s ${subscriptionId}"
subprocess.check_output(command, shell=True)
These options are, in my opinion, not recommended, since they raise all the security issues that shell=True brings with it, while providing you with no real advantage.
Since the variable command is just a string, you could simply do this:
subscriptionId = "xxxxx"
command = "az account show -s " + subscriptionId
subprocess.check_output(command)
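Another option (my own sketch, not taken from the answers above) is to skip string building entirely and pass a list, which avoids shell expansion and quoting altogether; note that on Windows the az entry point is typically a .cmd wrapper, so you may need "az.cmd" or shell=True there:
import subprocess

subscriptionId = "xxxxx"
# No shell is involved, so there is no $-expansion or quoting to get wrong.
output = subprocess.check_output(["az", "account", "show", "-s", subscriptionId])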

How can I get the current username from a script run with pkexec?

I'm executing a python script as root with pkexec, and I'm using working_dir = os.getenv('HOME') to get the username, but it always returns root instead of test1, which is the current user.
How can I get the user that ran pkexec instead?
Script is located in /usr/bin if that information is any use.
The sudo man page describes the environment variables which hold the information of the user who invoked it.
import os
print(os.environ["SUDO_USER"])
Thanks to that other guy, in the end, below is what I did to get the current user:
import os
import pwd
user = pwd.getpwuid(int(os.environ["PKEXEC_UID"])).pw_name
working_dir = '/home/' + user
For those looking for a pure bash solution, this finds the user run with sudo or pkexec:
if test -z "$SUDO_USER"
then
    USER=$(getent passwd "$PKEXEC_UID" | cut -d: -f1)
else
    USER=$SUDO_USER
fi
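Combining the two answers above into a single Python helper (a sketch; it assumes the script is only ever launched via sudo or pkexec, so one of the two variables is set):
import os
import pwd

def invoking_user():
    # sudo sets SUDO_USER; pkexec sets PKEXEC_UID instead.
    if os.environ.get("SUDO_USER"):
        return os.environ["SUDO_USER"]
    return pwd.getpwuid(int(os.environ["PKEXEC_UID"])).pw_name

working_dir = os.path.expanduser("~" + invoking_user())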

Change output logfile name inside python subprocess to make it unique

I am using python subprocess to email a logfile to the user who runs the python script. However, each time the user runs the script, the logfile gets overwritten. Here is the unix subprocess command I am using inside the python code:
subprocess.Popen("mail -s 'logfile.log attached' -r az12@abc.com -a logfile.log $USER@abc.com &> /dev/null", shell=True)
How could I make the logfile name unique? Maybe increment the logfile name as logfile1.log, logfile2.log and so on?
The trick is: how do I achieve this inside subprocess?
You can also do this with the datetime module:
import datetime
filename = "logfile-%s.log" % datetime.datetime.today().isoformat()
command = "mail -s '{0} attached' -r az12@abc.com -a {0} $USER@abc.com &> /dev/null".format(filename)
subprocess.Popen(command, shell=True)
The name of the log file will look like logfile-2015-03-13T21:37:14.927095.log.
Try using a timestamp to generate the logfile name. As for using it in subprocess, the command is nothing but a string, so it is as simple as:
import time
fileName = "logfile." + str(time.time()) + ".log" # use your logic to generate logFile name.
command = "mail -s '%s attached' -r az12@abc.com -a %s $USER@abc.com &> /dev/null" % (fileName, fileName)
subprocess.Popen(command,shell=True)
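If you really want incrementing names (logfile1.log, logfile2.log, ...) as suggested in the question, a sketch along the same lines (it assumes the log is first written as logfile.log and that the script may rename it before mailing):
import os
import subprocess

# Find the first unused numbered name, move the fresh log there, then mail it.
n = 1
while os.path.exists("logfile%d.log" % n):
    n += 1
fileName = "logfile%d.log" % n
os.rename("logfile.log", fileName)
command = "mail -s '%s attached' -r az12@abc.com -a %s $USER@abc.com &> /dev/null" % (fileName, fileName)
subprocess.Popen(command, shell=True)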
