How to run the Windows netsh command using Python? - python

I am trying to run the following netsh command on Windows 7; however, it fails with a syntax error:
Python 2.7.3 (default, Apr 10 2012, 23:31:26) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> os.system("netsh interface ipv4 set interface ""Conexão de Rede sem Fio"" metric=1")
The syntax of the file name, directory name or volume label is incorrect.
1
>>>
What's wrong?

os.system is an old interface and not really recommended.
Instead you should consider subprocess.call() or subprocess.Popen().
Here is how to use them:
If you don't care about the output, then:
import subprocess
...
subprocess.call('netsh interface ipv4 set interface "Wireless Network" metric=1', shell=True)
If you do care about the output, then:
netshcmd = subprocess.Popen('netsh interface ipv4 set interface "Wireless Network" metric=1', shell=True, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
output, errors = netshcmd.communicate()
if errors:
    print "WARNING: ", errors
else:
    print "SUCCESS ", output

Related

Self invocation of interactive shell through Python3 with bash

I am using python3 and subprocess.Popen to spawn a process of bash and invoking the Python3 interpreter again through the standard interpreter.
The bash manual states for the -s option:
-s If the -s option is present, or if no arguments remain after
option processing, then commands are read from the standard
input. This option allows the positional parameters to be
set when invoking an interactive shell.
This is a minimized example, but it mainly boils down to the following code:
import subprocess
import sys
p = subprocess.Popen(["bash", "-s"], stdin=subprocess.PIPE, stderr=sys.stderr, stdout=sys.stdout)
p.stdin.write(b"python3\n")
p.stdin.flush()
print("Done")
The output is simply "Done". Any suggestions on how I should handle the stdin pipes so that the interactive shell pops up inside the newly executed python3 interpreter?
Actual output
% python3 test.py
Done
Expected output:
% python3 test.py
Python 3.10.8 (main, Oct 13 2022, 10:17:43) [Clang 14.0.0 (clang-1400.0.29.102)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>
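One way to get there (a minimal sketch, not from the original post, assuming a POSIX system): python3 only starts its interactive REPL when its stdin is a terminal, so instead of a pipe you can give bash a pseudo-terminal via the standard-library pty module and relay data between it and your own terminal. The relay below is deliberately simple and does not adjust terminal modes, so input echo may appear doubled.
import os
import pty
import select
import sys

pid, fd = pty.fork()
if pid == 0:
    # Child: runs on the slave end of the pseudo-terminal, so bash and the
    # python3 it starts both see a tty and behave interactively.
    os.execvp("bash", ["bash", "-i"])
else:
    os.write(fd, b"python3\n")  # "type" the command into the shell
    while True:
        ready, _, _ = select.select([fd, sys.stdin], [], [])
        if fd in ready:
            try:
                data = os.read(fd, 1024)
            except OSError:
                break  # child exited and the pty was closed
            if not data:
                break
            os.write(sys.stdout.fileno(), data)
        if sys.stdin in ready:
            os.write(fd, os.read(sys.stdin.fileno(), 1024))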

Using pwntools process interactive mode to control python3

I am trying to use pwntools to control a python3 session. Here is my code:
from pwn import process
r = process(['python3'])
r.interactive()
However, after I enter r.interactive() and type into the terminal, the python3 sub-process reacts strangely. At least I do not see my commands echoed back most of the time.
I also tried to call python3 in a bash session, but the same thing happens.
$ python3
Python 3.8.5 (default, Jan 27 2021, 15:41:15)
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from pwn import process
>>> r = process(['bash'])
[x] Starting local process '/usr/bin/bash'
[+] Starting local process '/usr/bin/bash': pid 119080
>>> r.interactive()
[*] Switching to interactive mode
echo hello
hello
echo this is bash
this is bash
python3
print(1)
print(2)
print(3)
exit
echo hello
File "<stdin>", line 5
echo hello
^
SyntaxError: invalid syntax
Why is this happening? Is it a bug in pwntools, or is there some configuration I have overlooked?
You need to give the process a PTY for its stdin, like this:
$ python3
Python 3.9.2 (default, Feb 28 2021, 17:03:44)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from pwn import *; r = process(['python3'], stdin=PTY, raw=False); r.interactive()
[x] Starting local process '/usr/bin/python3'
[+] Starting local process '/usr/bin/python3': pid 2984281
[*] Switching to interactive mode
Python 3.9.2 (default, Feb 28 2021, 17:03:44)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> 1+1
1+1
2
>>>
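For reference, here is the same fix as a standalone script (a sketch assuming pwn exports PTY directly, as the star import above suggests). The point is that python3 only enters its interactive REPL when stdin looks like a terminal, which stdin=PTY provides; raw=False keeps your local terminal in cooked mode so your keystrokes are echoed normally.
from pwn import process, PTY

# Give the child a pseudo-terminal on stdin so python3 detects a tty and
# starts the REPL; raw=False leaves the local terminal echoing input.
r = process(['python3'], stdin=PTY, raw=False)
r.interactive()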

subprocess.Popen('docker', stdout=subprocess.PIPE) does not redirect output, only on OSX and only for 'docker'

When I run proc = subprocess.Popen('docker', stdout=subprocess.PIPE) on OSX 10.14.6 (18G84) with Python 3.7.4 (installed via Homebrew) and docker 18.09.2 (build 6247962), the stdout is printed in the console instead of being redirected (i.e. proc.stdout.readlines() == []).
This only happens with 'docker'; another subprocess, for example 'ls', returned something via proc.stdout.readlines() instead of printing its output to the console.
Also, I made sure this is an OSX-specific problem by running the same command on an EC2 instance (Amazon AMI 2, Docker 18.06.1-ce, Python 2.7.16 & 3.7.3), and proc.stdout.readlines() of the docker subprocess returned the expected result (no output in the console, proc.stdout.readlines() contains the docker output).
I googled and found an SO post, Problems capturing Python subprocess output on Mac OS X, but that seems to be a Python problem (a typo in the Python code), not an OSX problem.
Why is proc.stdout.readlines() returning nothing for 'docker'?
More details:
I was using the open source CLI tool thefuck, and one of its commands was not working (specifically, the one that fixes mistyped docker commands).
So I decided to fix it myself, and debugging it led me to this code:
...
import subprocess
...
def get_docker_commands():
    proc = subprocess.Popen('docker', stdout=subprocess.PIPE)
    lines = [line.decode('utf-8') for line in proc.stdout.readlines()]
These lines, where docker is run as a subprocess and its stdout is then read via proc.stdout.readlines(), were the source of the problem.
So I tried a similar command in the Python interactive console, and it went like this:
(on Mac OS)
Python 3.7.4 (default, Jul 9 2019, 18:13:23)
[Clang 10.0.1 (clang-1001.0.46.4)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import subprocess
>>> proc = subprocess.Popen('docker', stdout=subprocess.PIPE)
(docker help output in console)
>>> proc.stdout.readlines()
[]
>>> proc = subprocess.Popen('ls', stdout=subprocess.PIPE)
>>> proc.stdout.readlines()
(ls of directory '/' as python list)
(On EC2 Instance)
Python 3.7.3 (default, Jun 24 2019, 19:20:54)
[GCC 7.3.1 20180303 (Red Hat 7.3.1-5)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import subprocess
>>> proc = subprocess.Popen('docker', stdout=subprocess.PIPE)
>>> proc.stdout.readlines()
(docker output as python list)
I did not include the result but the python2's results are the same. (both Mac&EC2 are Python 2.7.16)
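One hedged way to narrow this down (a diagnostic sketch, not from the original post): capture stderr as well. If the macOS docker binary writes its usage text to stderr rather than stdout, it would appear in the console when only stdout is piped, and proc.stdout.readlines() would be empty, which matches the symptom above.
import subprocess

# Capture both streams so we can see which one docker actually writes to.
proc = subprocess.Popen(
    ['docker'],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
out, err = proc.communicate()
print('stdout bytes:', len(out))
print('stderr bytes:', len(err))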

eshell starts python IDLE instead of running script when editing remotely

With Emacs 24.3.1, I get this when editing through Tramp/ssh in eshell:
/<remotepath> $ bash
/<remotepath> $ python test.py
hello world!
/<remotepath> $ exit
exit
/<remotepath> $ python test.py
Python 2.6.6 (r266:84292, Oct 12 2012, 14:23:48)
[GCC 4.4.6 20120305 (Red Hat 4.4.6-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
The file test.py is:
print "hello world!"
Bash is version 4.1.2. Does anyone have any explanation for this behavior?
I don't know eshell, but my guess is you forgot to pass the positional parameters when creating your alias:
# don't forget the quotes
# ▼ ▼
~ $ alias python '/path/to/alternate/python $*'
# ▲▲
# don't forget positional parameters
See http://www.emacswiki.org/emacs/EshellAlias

Disable python auto-escaping for environment variables

I am having the following problem with Python, which I am using to generate files on Linux that are then read on Windows. Python seems to be auto-escaping the strings, so that when they are written to a file they are incorrect.
In my shell I have the environment variable set to a UNC path:
camd011> setenv python_error "\\\\a\\b\\c"
camd011> echo $python_error
\\a\b\c
I then retrieve this in Python, as it will be used to generate C code with a #include directive. However, when I retrieve the value in Python:
camd011> python
Python 1.6.1 (#1, Oct 17 2013, 15:08:20) [GCC 4.4.7 20120313 (Red Hat 4.4.7-3)] on linux2
Copyright (c) 1995-2001 Corporation for National Research Initiatives.
All Rights Reserved.
Copyright (c) 1991-1995 Stichting Mathematisch Centrum, Amsterdam.
All Rights Reserved.
>>> import os
>>> value = os.environ['python_error']
>>> value
'\\\\\\\\a\\\\b\\\\c'
As you can see above, it has been auto-escaped, so when I write it to a file:
>>> f = open("temp.txt", "w")
>>> f.write(value)
>>> f.close()
I end up with double slashes instead of a proper UNC path, and the code now fails to compile. The file contains:
\\\\a\\b\\c
i.e. the code includes a #include which now fails:
#include "\\\\a\\b\\c\file.h"
How do I stop python from auto-escaping my environment variable?
This appears to be a problem with quoting and dequoting in tcsh. It has nothing to do with Python -- Python gets the same variable that you can print out with the env command.
It appears that in tcsh, echo $FOO de-quotes the value of $FOO before printing. This seems to have misled you about what is really in your environment variable, so you've added an extra layer of quoting.
unaha-closp:~> setenv FOO "\\hello\world"
unaha-closp:~> echo $FOO
\hello\world
unaha-closp:~> env | grep FOO
FOO=\\hello\world
unaha-closp:~> python
Python 2.7.3 (default, Sep 26 2013, 20:03:06)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> print os.environ["FOO"]
\\hello\world
>>>
unaha-closp:~> bash
svk#unaha-closp:~$ echo $FOO
\\hello\world
The proper setenv command should simply be setenv python_error "\\a\b\c".
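As a quick check (a small sketch using the variable name from the question, after the corrected setenv above), Python can confirm that it adds no escaping of its own: repr() doubles the backslashes only for display, while print and f.write() emit the value exactly as it is stored in the environment.
import os

value = os.environ['python_error']
print(repr(value))  # display form, backslashes escaped: '\\\\a\\b\\c'
print(value)        # actual stored value: \\a\b\c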
