powershell: execute python code passed as arguments [duplicate]

In pwsh call the following:
Write-Host '{"drop_attr": "name"}'
Result ok:
{"drop_attr": "name"}
Now do the same via the pwsh CLI:
pwsh -Command Write-Host '{"drop_attr": "name"}'
Result is missing the quotation marks and curly braces:
drop_attr: name

Update:
PowerShell 7.3.0 mostly fixed the problem, with selective exceptions on Windows, and it seems that in some version after 7.3.1 the fix will require opt-in - see this answer for details.
For cross-version, cross-edition code, the Native module discussed at the bottom may still be of interest.
Unfortunately, PowerShell's handling of passing arguments with embedded " chars. to external programs - which includes PowerShell's own CLI (pwsh) - is fundamentally broken (and always has been), up to at least PowerShell 7.2.x:
You need to manually \-escape " instances embedded in your arguments in order for them to be correctly passed through to external programs (which happens to be PowerShell in this case as well):
# Note: The embedded '' sequences are the normal and expected
# way to escape ' chars. inside a PowerShell '...' string.
# What is *unexpected* is the need to escape " as \"
# even though " can normally be used *as-is* inside a '...' string.
pwsh -Command ' ''{\"drop_attr\": \"name\"}'' '
Note that I'm assuming your intent is to pass a JSON string, hence the inner '' ... '' quoting (escaped single quotes), which ensures that pwsh ultimately sees a single-quoted string ('...'). (No need for an explicit output command; PowerShell implicitly prints command and expression output).
Another way to demonstrate this on Windows is via the standard choice.exe utility, repurposed to simply print its /m (message) argument (followed by verbatim [Y,N]?Y):
# This *should* preserve the ", but doesn't as of v7.2
PS> choice /d Y /t 0 /m '{"drop_attr": "name"}'
{drop_attr: name} [Y,N]?Y # !! " were REMOVED
# Only the extra \-escaping preserves the "
PS> choice /d Y /t 0 /m '{\"drop_attr\": \"name\"}'
{"drop_attr": "name"} [Y,N]?Y # OK
Note that from inside PowerShell, you can avoid the need for \-escaping, if you call pwsh with a script block ({ ... }) - but that only works when calling PowerShell itself, not other external programs:
# NOTE: Works from PowerShell only.
pwsh -Command { '{"drop_attr": "name"}' }
Background info on PowerShell's broken handling of arguments with embedded " in external-program calls, as of PowerShell 7.2.1:
This GitHub docs issue contains background information.
GitHub issue #1995 discusses the problem, and the details of the broken behavior as well as manual workarounds are summarized in this comment; the state of the discussion as of PowerShell [Core] 7 seems to be:
A fix is being considered as an experimental feature, which may become an official feature, in v7.3 at the earliest. Whether it will become a regular feature - i.e. whether the default behavior will be fixed or whether the fix will require opt-in, or even whether the feature will become official at all - remains to be seen.
Fixing the default behavior would substantially break backward compatibility; as of this writing, this has never been allowed, but a discussion as to whether to allow breaking changes in the future and how to manage them has begun: see GitHub issue #13129.
See GitHub PR #14692 for the relevant experimental feature, which, however, as of this writing is missing vital accommodations for batch files and msiexec-style executables on Windows - see GitHub issue #15143.
In the meantime, you can use the PSv3+ ie helper function from the Native module (in PSv5+, install with Install-Module Native from the PowerShell Gallery), which internally compensates for all broken behavior and allows passing arguments as expected; e.g.,
ie pwsh -Command ' ''{"drop_attr": "name"}'' ' would then work properly.

Another way. Are you on Windows or Unix?
pwsh -c "[pscustomobject]@{drop_attr='name'} | convertto-json -compress"
{"drop_attr":"name"}

Another way is to use "encoded commands".
> $cmd1 = "Write-Host '{ ""description"": ""Test program"" }'"
> pwsh -encoded ([Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($cmd1)))
{ "description": "Test program" }

Taming shlex.split() behaviour

There are other questions on SO that get close to answering mine, but I have a very specific use case that I have trouble solving. Consider this:
from asyncio import create_subprocess_exec, run
from shlex import split

async def main():
    command = r'program.exe "C:\some folder" -o"\\server\share\some folder" "a \"quote\""'
    proc = await create_subprocess_exec(*split(command))
    await proc.wait()

run(main())
This causes trouble, because program.exe is called with these arguments:
['C:\\some folder', '-o\\server\\share\\some folder', 'a "quote"']
That is, the double backslash is no longer there, as shlex.split() removes it. Of course, I could instead (as other answers suggest) do this:
proc = await create_subprocess_exec(*split(command, posix=False))
But then program.exe is effectively called with these arguments:
['"C:\\some folder"', '-o"\\\\server\\share\\some folder"', '"a \\"', 'quote\\""']
That's also no good, because now the double quotes have become part of the content of the first parameter, where they don't belong, even though the second parameter is now fine. The third parameter has become a complete mess.
Replacing backslashes with forward slashes, or removing quotes with regular expressions all don't work for similar reasons.
Is there some way to get shlex.split() to leave double backslashes before server names alone? Or just at all? Why does it remove them in the first place?
Note that, by themselves these are perfectly valid commands (on Windows and Linux respectively anyway):
program.exe "C:\some folder" -o"\\server\share\some folder"
echo "hello \"world""
And even if I did detect the OS and used posix=True/False accordingly, I'd still be stuck with the double quotes included in the second argument, which they shouldn't be.
For now, I ended up with this (arguably a bit of a hack):
from os import name as os_name
from shlex import split

def arg_split(args, platform=os_name):
    """
    Like calling shlex.split, but sets `posix=` according to platform
    and unquotes previously quoted arguments on Windows
    :param args: a command line string consisting of a command with arguments,
                 e.g. r'dir "C:\Program Files"'
    :param platform: a value like os.name would return, e.g. 'nt'
    :return: a list of arguments like shlex.split(args) would have returned
    """
    return [a[1:-1].replace('""', '"') if a[0] == a[-1] == '"' else a
            for a in (split(args, posix=False) if platform == 'nt' else split(args))]
Using this instead of shlex.split() gets me what I need, while not breaking UNC paths. However, I'm sure there's some edge cases where correct escaping of double quotes isn't correctly handled, but it has worked for all my test cases and seems to be working for all practical cases so far. Use at your own risk.
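A quick usage sketch of the arg_split() helper above, reusing the docstring's own example (the output shown is what it returns on Windows, i.e. with platform='nt'):
print(arg_split(r'dir "C:\Program Files"', platform='nt'))
# ['dir', 'C:\\Program Files']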
@balmy made the excellent observation that most people should probably just use:
command = r'program.exe "C:\some folder" -o"\\server\share\some folder" "a \"quote\""'
proc = await create_subprocess_shell(command)
Instead of
command = r'program.exe "C:\some folder" -o"\\server\share\some folder" "a \"quote\""'
proc = await create_subprocess_exec(*split(command))
However, note that this means:
it's not easy to check or replace individual arguments
you have the problem that always comes with using create_subprocess_shell: if part of your command is based on external input, someone can inject code; in the words of the documentation (https://docs.python.org/3/library/asyncio-subprocess.html):
It is the application’s responsibility to ensure that all
whitespace and special characters are quoted appropriately to avoid
shell injection vulnerabilities. The shlex.quote() function can be
used to properly escape whitespace and special shell characters in
strings that are going to be used to construct shell commands.
And that's still a problem, as quote() also doesn't work correctly for Windows (by design).
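For illustration, quote() produces POSIX single-quoting, which neither cmd.exe nor the usual Windows argument-parsing rules treat as quoting at all:
from shlex import quote

print(quote(r'C:\some folder'))
# 'C:\some folder'   <- wrapped in single quotes, which Windows treats as literal characters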
I'll leave the question open for a bit, in case someone wishes to point out why the above is a really bad idea, or if someone has a better one.
As far as I can tell, the shlex module is the wrong tool if you are dealing with the Windows shell.
The first paragraph of the docs says (my italics):
The shlex class makes it easy to write lexical analyzers for simple syntaxes resembling that of the Unix shell.
Admittedly, that talks about just one class, not the entire module. Later, the docs for the quote function say (boldface in the original, this time):
Warning The shlex module is only designed for Unix shells.
To be honest, I'm not sure what the non-Posix mode is supposed to be compatible with. It could be, but this is just me guessing, that the original versions of shlex parsed a syntax of its own which was not quite compatible with anything else, and then Posix mode got added to actually be compatible with Posix shells. This mailing list thread, including this mail from ESR seems to support this.
For the -o parameter, put the leading " at the start of it, not in the middle, and double the backslashes.
Then use posix=True
import shlex
command = r'program.exe "C:\some folder" -o"\\server\share\some folder" "a \"quote\""'
print( "Original command Posix=True", shlex.split(command, posix=True) )
command = r'program.exe "C:\some folder" "-o\\\\server\\share\\some folder" "a \"quote\""'
print( "Updated command Posix=True", shlex.split(command, posix=True) )
result:
Original command Posix=True ['program.exe', 'C:\\some folder', '-o\\server\\share\\some folder', 'a "quote"']
Updated command Posix=True ['program.exe', 'C:\\some folder', '-o\\\\server\\share\\some folder', 'a "quote"']
The backslashes are still double in the result, but that's standard Python representation of a \ in a string.
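If in doubt, comparing repr() with a plain print of the parsed argument confirms that the value itself holds single backslashes (and a genuine double backslash for the UNC prefix):
import shlex

arg = shlex.split(r'"-o\\\\server\\share\\some folder"', posix=True)[0]
print(repr(arg))   # '-o\\\\server\\share\\some folder'
print(arg)         # -o\\server\share\some folder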

Execute bash-command with "at" (<<<) via python: syntax error, last token seen

I'm using a radio sender on my RPi to control some light-devices at home. I'm trying to implement a time control and had successfully used the program "at" in the past.
#!/usr/bin/python
import subprocess as sp
##### some code #####
sp.call(['at', varTime, '<<<', '\"sudo', './codesend', '111111\"'])
When I execute the program, I receive this error message:
syntax error. Last token seen: <
Garbled time
This code snippet works fine with every command by itself (as long as every parameter is of type string).
It's necessary to call "at" in this way: at 18:25 <<< "sudo ./codesend 111111" to hold the command in the queue (viewable in "atq"),
because sudo ./codesend 111111 | at 18:25 just executes the command directly and writes down the execution in "/var/mail/user".
My question is: how can I avoid the syntax error?
I'm using a lot of other packages in this program, so I have to stay with Python
I hope someone has a solution for this problem or can help to find my mistake.
Many thanks in advance
Preface: Shared Code
Consider the following context to be part of both branches of this answer.
import subprocess as sp

try:
    from shlex import quote  # Python 3
except ImportError:
    from pipes import quote  # Python 2

# given the command you want to schedule, as an array...
cmd = ['sudo', './codesend', '111111']

# ...generate a safely shell-escaped string.
cmd_str = ' '.join(quote(x) for x in cmd)
Solution A: Feed Stdin In Python
<<< is shell syntax. It has no meaning to at, and it's completely normal and expected for at to reject it if given as a literal argument.
You don't need to invoke a shell, though -- you can do the same thing directly from native Python:
p = sp.Popen(['at', vartime], stdin=sp.PIPE)
p.communicate(cmd_str.encode())  # encode: the stdin pipe expects bytes on Python 3
Solution B: Explicitly Invoke A Shell
Moreover, <<< isn't /bin/sh syntax -- it's an extension honored in bash, ksh, and others; so you can't reliably get it just by adding the shell=True flag (which uses /bin/sh and so guarantees only POSIX-baseline features). If you want it, you need to explicitly invoke a shell with the feature, like so:
bash_script = '''
at "$1" <<<"$2"
'''
sp.call(['bash', '-c', bash_script,
         '_',       # this is $0 for that script
         vartime,   # this is its $1
         cmd_str,   # this is its $2
         ])
In either case, note that we're using shlex.quote() or pipes.quote() (as appropriate for our Python release) when generating a shell command from an argument list; this is critical to avoid creating shell injection vulnerabilities in our software.
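As a small illustration of what that quoting buys you (the second argument list is hypothetical, just to show an argument that actually needs escaping):
from shlex import quote  # pipes.quote on Python 2

print(' '.join(quote(x) for x in ['sudo', './codesend', '111111']))
# sudo ./codesend 111111
print(' '.join(quote(x) for x in ['echo', "it's a test"]))
# echo 'it'"'"'s a test'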

Command copied from the command line not running when called with subprocss.Popen in Python

Scratching my head... this curl command will work fine from the command line when I copy it from here and paste it in my Windows 7 command line, but I can't get it to execute in my Python 2.7.9 script. Says the system cannot find the specified file. Popen using 'ping' or something like that works just fine, so I'm sure this is a goober typo that I'm just not seeing. I would appreciate a separate set of eyes and any comments as to what is wrong.
proc = subprocess.Popen("curl --ntlm -u : --upload-file c:\\temp\\test.xlsx http://site.domain.com/sites/site/SiteDirectory/folder/test.xlsx")
Have a look at these two paragraphs of the subprocess.Popen documentation, if you haven't already:
args should be a sequence of program arguments or else a single string. By default, the program to execute is the first item in args if args is a sequence. If args is a string, the interpretation is platform-dependent and described below. See the shell and executable arguments for additional differences from the default behavior. Unless otherwise stated, it is recommended to pass args as a sequence.
On Unix, if args is a string, the string is interpreted as the name or path of the program to execute. However, this can only be done if not passing arguments to the program. [emphasis mine]
Instead you should pass in a list in which each argument to the program (including the executable name itself) is given as a separate item in the list. This is generally going to be safer in a cross-platform context anyways.
Update: I see now that you're using Windows, in which case the advice about UNIX doesn't apply. On Windows, though, things are even more hairy. The best advice remains to use a list :)
Update 2: Another possible issue (and in fact the OP's issue as reported in the comments on this answer) is that because the full path to the curl executable was not given, it may not be found if the Python interpreter is running in an environment with a different PATH environment variable.
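Putting both updates together, a sketch of what the call might look like (the path to curl.exe here is an assumption; point it at wherever curl actually lives on your machine):
import subprocess

proc = subprocess.Popen([
    r"C:\tools\curl\curl.exe",   # full path, so the interpreter's PATH no longer matters
    "--ntlm", "-u", ":",
    "--upload-file", r"c:\temp\test.xlsx",
    "http://site.domain.com/sites/site/SiteDirectory/folder/test.xlsx",
])
proc.wait()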

Run mutiple gerrit queries in python

I am trying to run a gerrit cherry pick query in python
query_to_run='git fetch https://gerritserver.com/projectname refs/changes/51/1151/1 ' + '&&' + ' git cherry-pick FETCH_HEAD'
I am getting error:
fatal: Couldn't find remote ref &&
Unexpected end of command stream
My code works with other gerrit queries but not this one; is it the && which is causing the problem?
thanks
Pratibha
The && token has no meaning to Git or Gerrit but is interpreted by your shell. By default the subprocess module doesn't pass off commands to the shell but runs the process directly, so the string in query_to_run is sent as a single command. To force subprocess.Popen(), subprocess.check_call() or whatever you're using to pass the command to a shell, pass shell=True:
subprocess.check_call(query_to_run, shell=True)
However, the use of shell=True is discouraged and is unnecessary in this case. What && does is simply run one command and, if successful, run another command. It's basically equivalent to this sequence of Python statements:
subprocess.check_call(command1)
subprocess.check_call(command2)
Alternatively, if you prefer not to have exceptions thrown when either of the commands fails:
subprocess.call(command1) == 0 and subprocess.call(command2) == 0
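For instance, mirroring cmd1 && cmd2 with the commands from the question (the short-circuit means the cherry-pick only runs if the fetch exits with status 0):
import subprocess

if subprocess.call(['git', 'fetch',
                    'https://gerritserver.com/projectname',
                    'refs/changes/51/1151/1']) == 0:
    subprocess.call(['git', 'cherry-pick', 'FETCH_HEAD'])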
In addition to this, I strongly recommend making a good habit out of passing lists of arguments to process execution functions instead of strings. Passing strings works fine a lot of the time, but when arguments contain spaces you suddenly need to think about quoting.
Putting everything together, this is what I think your code should look like:
try:
    subprocess.check_call(['git', 'fetch',
                           'https://gerritserver.com/projectname',
                           'refs/changes/51/1151/1'])
    subprocess.check_call(['git', 'cherry-pick', 'FETCH_HEAD'])
except (EnvironmentError, subprocess.CalledProcessError):
    # Suitable error handling here. I'm not sure about
    # the possibility of EnvironmentError exceptions.
    pass
Also, a note on terminology: You're talking about Gerrit queries, but using that language might confuse people. By Gerrit query one usually means the Lucene query string entered into the search box in the UI (or the equivalent REST API).

Handling lines with quotes using python's readline

I've written a simple shell-like program that uses readline in order to provide smart completion of arguments. I would like the mechanism to support arguments that have spaces and are quoted to signify as one argument (as with providing the shell with such).
I've seen that shlex.split() knows how to parse quoted arguments, but in case a user wants to complete mid-typing it fails (for example: 'complete "Hello ' would cause an exception to be thrown when passed to shlex, because of unbalanced quotes).
Is there code for doing this?
Thanks!
I don't know of any existing code for the task, but if I were to do this I'd catch the exception, try adding a fake trailing quote, and see how shlex.split does with the string thus modified.
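A minimal sketch of that idea, assuming double quotes are the only quoting your mini-shell supports:
import shlex

def split_incomplete(line):
    """shlex.split(), but tolerant of an unbalanced trailing double quote."""
    try:
        return shlex.split(line)
    except ValueError:            # "No closing quotation"
        return shlex.split(line + '"')

print(split_incomplete('complete "Hello '))
# ['complete', 'Hello ']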
GNU Readline allows for that scenario with the variable rl_completer_quote_characters. Unfortunately, Python does not expose that option in the standard library's readline module (even on 3.7.1, the latest as of this writing).
I found a way of doing that with ctypes, though:
import ctypes
libreadline = ctypes.CDLL ("libreadline.so.6")
rl_completer_quote_characters = ctypes.c_char_p.in_dll (
libreadline,
"rl_completer_quote_characters"
)
rl_completer_quote_characters.value = b'"'  # note: must be bytes on Python 3
Note this is clearly not portable (possibly even between Linux distros, as the libreadline version is hardcoded, but I didn't have plain libreadline.so on my computer), so you may have to adapt it for your environment.
Also, in my case, I set only double quotes as special for the completion feature, as that was my use case.
References
https://robots.thoughtbot.com/tab-completion-in-gnu-readline#adding-quoting-support
@eryksun's comment on how to set data to a global variable in a shared library using Python
To make @caxcaxcoatl's answer a little bit more portable, the hardcoded readline library version can be replaced with readline.__file__, and it becomes:
import ctypes
import readline
libreadline = ctypes.CDLL (readline.__file__)
rl_completer_quote_characters = ctypes.c_char_p.in_dll (
libreadline,
"rl_completer_quote_characters"
)
rl_completer_quote_characters.value = b'"'  # bytes on Python 3
