Using a Python script to send a GET request to a server with netcat

I'm running Ubuntu 16.04 and I'm trying to write a Python script that makes a GET request for a specified image file given its URL. As an example, in the code below:
host is www.google.com
port is 80
u.path is /images/srpr/logo3w.png
proc = Popen(["netcat {} {}".format(host, port)], shell= True)
proc = Popen(["GET {} HTTP/1.1".format(u.path)], shell= True)
proc = Popen(["Host: {}".format(host)], shell= True)
proc = Popen(["Connection: close"], shell= True)
proc = Popen(["\n"], shell= True)
My problem is that I can execute these commands normally in the terminal, but when I run the script it seems to send the GET request to www.google.com before it ever takes the u.path specification. I know it is doing this for two reasons. First, just before the server response comes in I get the following:
/bin/sh: 1: Host:: not found
/bin/sh: 1: Connection:: not found
Second, I know that the server response for the image data should be a bunch of binary rendered as weird symbols in the terminal, but instead I'm clearly getting the www.google.com HTML text in the server response.
I was thinking I may need to make it wait to do the HTTP request until the netcat STDIN is open, but I don't know how. Or maybe it's just completing the request because it's sending a \n somehow? I really don't know.
EDIT: It seems like it actually isn't sending the request to www.google.com. I saved the server response as a .html file and it looks like a CloudFront page.
EDIT2: After more research, it seems the problem is that netcat is interactive, so the subprocess 'deadlocks' or something like that. I tried proc.communicate(), but since I need to send multiple lines it doesn't work: communicate() only writes its initial input to STDIN and then closes it (sending EOF). That led me to proc.stdin.write(), but this is apparently also known to deadlock when the Popen call uses subprocess.PIPE for STDIN, STDOUT, and STDERR. It also requires the input to be encoded as a bytes-like object, which I have done, but when I send \r\n\r\n at the end to try to finish the request nothing happens and STDOUT just contains b'', which I understand to be an empty bytes string.

For anyone that has a similar problem, here is the solution that I found:
import os
import fcntl
from subprocess import Popen, PIPE

#begin the interactive shell of netcat
proc = Popen(['netcat -q -1 {} {}'.format(host, port)], shell=True, stdout=PIPE, stdin=PIPE, stderr=PIPE)
#set file status flags on stdout to non-blocking reads
fcntl.fcntl(proc.stdout.fileno(), fcntl.F_SETFL, os.O_NONBLOCK)
#each time we write a different line to the interactive shell
#we need to flush the buffer just to be safe
#credit to http://nigelarmstrong.me/2015/04/python-subprocess/
proc.stdin.write(str.encode('GET %s HTTP/1.1\n' % (path + filename)))
proc.stdin.flush()
proc.stdin.write(str.encode('Host: {}\n'.format(host)))
proc.stdin.flush()
proc.stdin.write(str.encode('Connection: close\n'))
proc.stdin.flush()
proc.stdin.write(str.encode('\r\n\r\n'))
proc.stdin.flush()
#wait for netcat to exit (the server closes the connection after responding)
proc.wait()
#store the server response (which is bytes-like)
#attempting to decode it results in an error since we're receiving a mix of text and image data
serv_response = proc.stdout.read()
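Side note: because the whole request is known before netcat starts, a single communicate() call is an alternative to the write/flush sequence above; it writes the request, closes netcat's stdin, and collects the full response, so the non-blocking stdout setup isn't needed. A minimal sketch, assuming the same host, port and u.path variables:
from subprocess import Popen, PIPE
#build the complete request up front
request = ('GET {} HTTP/1.1\r\n'
           'Host: {}\r\n'
           'Connection: close\r\n'
           '\r\n').format(u.path, host).encode()
proc = Popen(['netcat', host, str(port)], stdin=PIPE, stdout=PIPE, stderr=PIPE)
#communicate() sends the request, closes stdin and reads until netcat exits
serv_response, _ = proc.communicate(request)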

Related

error while using os.popen() to read command-line output

I wrote a Python program, script1.py. Its general logic flow is as follows:
while True:
    if hasTask:
        print('task flow info print')
Then I wrote another Python script called monitor.py that uses os.popen() to monitor the console output of script1.py and, after obtaining specific information, sends a message to a Redis channel:
import os
import re
import json
import redis

redis_key = "command"
cmd = "python script1.py"
command_info = {}
pool = redis.ConnectionPool(host="127.0.0.1")
r = redis.Redis(connection_pool=pool)
with os.popen(cmd, "r") as stream:
    while True:
        buf = stream.readline().strip()
        if re.match("target", buf) is not None:
            message = stream.readline().strip()
            command_info["command"] = message
            r.publish(redis_key, json.dumps(command_info))
At the beginning, the monitor correctly reads the script output and sends messages to Redis. The problem is that after some time this combination stops working and no messages are sent to Redis. Why does this happen?
Is the file object returned by popen too large? How can I deal with this? I need your help.
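One thing worth checking (an assumption, since the question doesn't show how script1.py writes its output): when a Python process's stdout is a pipe instead of a terminal, it switches to block buffering, so lines can sit in script1.py's buffer for a long time before monitor.py sees them, which looks exactly like the monitor going silent. A minimal sketch of the two usual workarounds:
# in script1.py: flush every line explicitly
print('task flow info print', flush=True)
# or, in monitor.py: run the child unbuffered
cmd = "python -u script1.py"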

How can I use Python to automate setting a password using the Unix pass command line program

I'm trying to automate setting new passwords using the Unix pass program.
I understand that there is a Python library, pexpect, that might help, but I would like to avoid using third-party libraries.
When using a terminal, the flow looks like this:
$ pass insert --force gmail
>> Enter password for gmail: <type in password using masked prompt>
>> Retype password for gmail: <reenter password>
What I would like my function to do:
Run the command pass insert --force {entry_name}
Capture the output (and echo it for testing)
Check output for the presence of 'password for gmail', and if True
write '{password}\n' to stdin
write '{password}\n' to stdin again
Echo any errors or messages for testing
Issues:
I'm stuck on step 2. The subprocess either hangs indefinitely, times out with an error, or produces no output.
Attempts:
I've tried configurations of Popen(), using both stdin.write() and communicate().
I've set wait() calls at various points.
I've tried both the shell=True and shell=False options (prefer False for security reasons)
Code:
def set_pass_password(entry_name, password):
    from subprocess import Popen, PIPE
    command = ['pass', 'insert', '--force', entry_name]
    sub = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE, universal_newlines=True)
    # At this point I assume that the command has run, and that there is an "Enter password..." message
    message = sub.stdout.read()  # also tried readline() and readlines()
    print(message)  # never happens, because process hangs on stdout.read()
    if 'password for {}'.format(entry_name) in message:
        err, msg = sub.communicate(input='{p}\n{p}\n'.format(p=password))
        print('errors: {}\nmessage: {}'.format(err, msg))
Edit: the original answer was about passwd, which is what's used to set Unix passwords. I noticed too late that you use pass, which is a keystore (it doesn't actually change the Unix password). The pass program works differently and will not print a prompt if stdin is not a tty. Therefore the following very simple program works:
def set_pass_password(entry_name, password):
    from subprocess import Popen, PIPE
    command = ['pass', 'insert', '--force', entry_name]
    sub = Popen(command, bufsize=0, stdin=PIPE, stdout=PIPE, stderr=PIPE)
    err, msg = sub.communicate(input='{p}\n{p}\n'.format(p=password))
    print('errors: {}\nmessage: {}'.format(err, msg))

if __name__ == "__main__":
    set_pass_password("ttt", "ttt123asdqwe")
(you will see that both stderr and stdout are empty, if the command succeeded).
For the passwd command:
FYI: the passwd command outputs the prompt to stderr, not stdout.
NOTE: rather than sending the password twice in the same 'write', you might need to wait for the second prompt before sending the password again.
For this simple case, code similar to yours should work, but in general you should use select on all the pipes and send/receive data when the other side is ready, so you don't get deadlocks.
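For reference, here is a minimal sketch of that select-style approach using the standard-library selectors module. The command and the prompt handling are placeholders, not part of the answer above:
import selectors
import subprocess

# passwd is used here only as an example of a prompting command
proc = subprocess.Popen(['passwd'], stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
sel = selectors.DefaultSelector()
sel.register(proc.stdout, selectors.EVENT_READ)
sel.register(proc.stderr, selectors.EVENT_READ)
while proc.poll() is None:
    # only touch the pipes that actually have data, so neither side blocks
    for key, _ in sel.select(timeout=1):
        chunk = key.fileobj.read1(1024)
        if b'assword' in chunk:  # crude prompt detection ("Password"/"password")
            proc.stdin.write(b'new-password\n')
            proc.stdin.flush()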

How to list current directory files through socket API?

So right now I have a simple FTP system to transfer files.
But I am confused about how I would run commands on the server machine from a client machine.
How would I open a terminal on the server machine from my client machine to use commands such as ls or mkdir or cd? Or can I do this straight from socket programming?
You could use the Python subprocess module (https://pymotw.com/2/subprocess/).
For example, assuming you have a client/server 'dialogue' set up using sockets, you could do something like this:
client.py
# assume 's' is your socket already connected to the server
# prompt the user for a command to send
cmd = raw_input("user > ")
s.send(cmd) # send your command to the server
# let's say you input 'ls -la'
You could put the above code inside a loop that only breaks when you enter 'quit' or something, to continually send and receive commands. You would need a loop or something similar on the server side too, to continually accept and return the output from your commands. You could also use threads.
server.py
# on the server side do this
# s is again your socket bound to a port
# but we're on the server side this time!
from subprocess import Popen, PIPE
cmd = s.recv(1024)
# cmd now has 'ls -la' assigned to it
# parse it a bit
cmd = cmd.split() # to get ['ls', '-la']
# now we execute the command on the server with subprocess
p = Popen(cmd, stdout=PIPE)
result = p.communicate()
# result is, in this case, the listing of files in the current directory
s.send(result[0]) # result[0] should be a str
# you now make sure to receive your result on the client
Note: I think a newer version is subprocess32, but all methods are the same as far as I remember.
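To make the client side concrete, here is a minimal sketch of the send/receive loop mentioned above (Python 2, to match the example; the 4096-byte recv is an assumption and large listings would need to be read in a loop):
# client-side loop; s is the connected socket from the client.py snippet
while True:
    cmd = raw_input("user > ")
    if cmd == "quit":
        break
    s.send(cmd)
    result = s.recv(4096)  # fine for short output; loop on recv for bigger results
    print(result)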

Calling subprocess.Popen from Django

I use the following Python code in Django to run Perl scripts:
def run_command(cmd, input_data=None):
    assert type(cmd) == list
    stdout = ''
    stderr = ''
    p = subprocess.Popen(cmd,
                         bufsize=1000*1000,
                         stdin=subprocess.PIPE,
                         stderr=subprocess.PIPE,
                         stdout=subprocess.PIPE)
    if (input_data):
        p.stdin.write(input_data)
    p.stdin.close()
    log("closed stdin")
    stdout = p.stdout.read()
    log("read stdout")
    stderr = p.stderr.read()
    return (stdout, stderr)
After moving the code from Ubuntu 10.04 to 12.04, the call to p.stdout.read() has started failing now and then. When it fails, the last thing I see in the log file is "closed stdin" and nginx displays a 502 Bad Gateway page.
One thing you can try to get a better look at the issue is to use Django's 'runserver' instead of Nginx temporarily. This will allow you to see the traceback on the page instead of the 502.
If you aren't sure how to do this, you can use the following example:
python manage.py runserver 1.1.1.1:8080
Where you can replace 1.1.1.1 with 'localhost' or an IP of your choosing, and 8080 can be replaced with whatever open non-privileged port you would like. You might need to modify your settings file to set DEBUG=True.
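Separately, the hang itself looks like the classic pipe deadlock: run_command reads stdout to the end before touching stderr, so if the Perl script fills the stderr pipe buffer first, both processes block. A minimal sketch of the same helper using communicate(), which drains both pipes concurrently (an alternative sketch, not part of the answer above):
import subprocess

def run_command(cmd, input_data=None):
    assert type(cmd) == list
    p = subprocess.Popen(cmd,
                         stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    # communicate() writes input_data, closes stdin, and reads stdout/stderr together
    stdout, stderr = p.communicate(input=input_data)
    return (stdout, stderr)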

Use subprocess to send a password

I'm attempting to use the Python subprocess module to log in to a secure FTP site and then grab a file. However, I keep getting hung up on just trying to send the password when it is requested. So far I have the following code:
from subprocess import Popen, PIPE
proc = Popen(['sftp','user@server', 'stop'], stdin=PIPE)
proc.communicate('password')
This still stops at the password prompt. If I enter the password manually it then goes to the FTP site and then enters the password on the command line. I've seen people suggest using pexpect, but long story short I need a standard-library solution. Is there any way with subprocess and/or any other stdlib? What am I forgetting above?
Try
proc.stdin.write('yourPassword\n')
proc.stdin.flush()
That should work.
What you describe sounds like stdin=None where the child process inherits the stdin of the parent (your Python program).
Perhaps you should use an expect-like library instead?
For instance Pexpect (example). There are other, similar python libraries as well.
from subprocess import Popen, PIPE
proc = Popen(['sftp','user@server', 'stop'], stdin=PIPE)
proc.communicate(input='password')
Try with input='password' in communicate(); that worked for me.
Use Paramiko for SFTP. For anything else, this works:
import subprocess
args = ['command-that-requires-password', '-user', 'me']
proc = subprocess.Popen(args,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
proc.stdin.write('mypassword\n')
proc.stdin.flush()
stdout, stderr = proc.communicate()
print stdout
print stderr
For some reason, I couldn't get any of the standard library answers here to work for me - getting very strange problems with all of them. Someone else here: unable to provide password to a process with subprocess [python] had the same problem, and concluded that ultimately you just have to go with pexpect to be able to send a password.
I wanted to add my final code here to just save the time of anyone having a similar problem, since I wasted so much time on this (Python 3, 2020):
import sys
import pexpect
from getpass import getpass

ssh_password = getpass("user's password: ")
ssh_password = (ssh_password + "\n").encode()
scp_command = 'scp xx.xx.xx.xx:/path/to/file.log /local/save/path/'
child = pexpect.spawn(scp_command)
# make output visible for debugging / progress watching
child.logfile = sys.stdout.buffer
i = child.expect([pexpect.TIMEOUT, "password:"])
if i == 0:
    print("Got unexpected output: {} {}".format(child.before, child.after))
    return  # this snippet lives inside a function in the original code
else:
    child.sendline(ssh_password)
child.read()
The above code runs an SCP command to pull a file from the remote server onto your local computer - alter the server IP and paths as necessary.
Key things to remember:
Have to have a pexpect.TIMEOUT in the child.expect call
Have to encode to bytes whatever strings you pass in, and have to use the default encoding
Write pexpect output to sys.stdout.buffer so that you can actually see what is going on
Have to have a child.read() at the end
I would recommend scrapping the subprocess approach and using the paramiko package for sftp access.
This same problem plagued me for a week. I had to submit a password from user input through subprocess securely because I was trying to avoid introducing a command injection vulnerability. Here is how I solved the problem with a little help from a colleague.
import subprocess
command = ['command', 'option1', '--password']
subprocess.Popen(command, stdin=subprocess.PIPE).wait(timeout=60)
The .wait(timeout=int) was the most important component because it allows the user to feed input to stdin. Otherwise, the timeout is defaulted to 0 and leaves the user no time to enter input, which consequently results in a None or null string. Took me FOREVER to figure this out.
For repeat use cases where you know you'll have to do this multiple times, you can wrap the Popen call in a private helper, which I was told by the same programmer is best practice if you anticipate that someone else will maintain the code later and you don't want them to mess with it.
import os
import subprocess

def _popen(cmd):
    proc_h = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    proc_h.wait(timeout=60)
    return proc_h.poll() == os.EX_OK
It is important to remove stdout=subprocess.PIPE if the user is going to be prompted for input. Otherwise, the process appears to hang for 60 seconds, and the user doesn't get a prompt, nor do they realize they are expected to give a password. The stdout will naturally go to the shell window and allow the user to pass input to popen().
Also, to explain why you return proc_h.poll() == os.EX_OK: the command returns 0 if it succeeded. This is just C-style best practice for when you want to return system error codes in the event the function fails, while accounting for the fact that return 0 would be treated as "false" by the interpreter.
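A hypothetical usage of the helper above (the command and entry name are placeholders; the password is typed interactively in the shell window):
# hypothetical call; returns True if the command exited with status 0
if _popen(['pass', 'insert', '--force', 'gmail']):
    print('entry updated')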
This is a pure Python solution using expect - not pexpect.
If on Ubuntu you first need to install expect with:
sudo apt install expect
Python 3.6 or later:
import subprocess

def sftp_rename(from_name, to_name):
    sftp_password = 'abigsecret'
    sftp_username = 'foo'
    destination_hostname = 'some_hostname'
    from_name = 'oldfilename.txt'
    to_name = 'newfilename.txt'
    commands = f"""
    spawn sftp -o "StrictHostKeyChecking no" {sftp_username}@{destination_hostname}
    expect "password:"
    send "{sftp_password}\r"
    expect "sftp>"
    send "rename {from_name} {to_name}\r"
    expect "sftp>"
    send "bye\r"
    expect "#"
    """
    sp = subprocess.Popen(['expect', '-c', commands], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
Since what you want is just to grab a file: I tried to use subprocess but it did not work for me, so now I am using paramiko. Here is my code.
Here is one tutorial I found online: Transfer a file from local server to remote server and vice versa using paramiko in Python:
"https://www.youtube.com/watch?v=dtvV2xKaVjw"
Below is my code for transferring all the files in one folder from Linux to Windows:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='11.11.11.1111', username='root', password='********', port=22)
sftp_client = ssh.open_sftp()
source_folder = '/var/ftp/file_pass'
local_folder = 'C:/temp/file_pass'
inbound_files = sftp_client.listdir(source_folder)
print(inbound_files)
for ele in inbound_files:
    try:
        path_from = source_folder + '/' + ele
        path_to = local_folder + '/' + ele
        sftp_client.get(path_from, path_to)
    except:
        print(ele)
sftp_client.close()
ssh.close()
Python has a built-in library called ftplib that can be used for FTP without any hassle (assuming the remote server has an FTP service running).
from ftplib import FTP
ftp = FTP('ftp.us.debian.org') # connect to host, default port
ftp.login() # user anonymous, passwd anonymous@
##'230 Login successful.'
ftp.cwd('debian') # change into "debian" directory
##'250 Directory successfully changed.'
ftp.retrlines('LIST')
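Since the goal is to grab a file, here is a minimal sketch of downloading one over the same ftplib connection (the filename is a placeholder):
# fetch a file in binary mode (hypothetical filename)
with open('somefile.png', 'wb') as f:
    ftp.retrbinary('RETR somefile.png', f.write)
ftp.quit()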
Otherwise, you can use the scp command, which is a command-line tool. The password problem can be avoided by creating a passwordless (key-based) login for the remote host.
import os
os.system('scp remoteuser#remotehost:/remote/location/remotefile.txt /client/location/')
To set up a passwordless (key-based) login on Linux systems, follow the steps below (see also this SO answer):
ssh-keyscan remotehost > known_hosts
ssh-keygen -t rsa    # press ENTER for every field (one time)
ssh-copy-id remoteuser@remotehost
The safest way to do this is to prompt for the password beforehand and then pipe it into the command. Prompting for the password will avoid having the password saved anywhere in your code. Here's an example:
from getpass import getpass
from subprocess import Popen, PIPE
password = getpass("Please enter your password: ")
proc = Popen("sftp user#server stop".split(), stdin=PIPE)
# Popen only accepts byte-arrays so you must encode the string
proc.communicate(password.encode())
import subprocess
args = ['command', 'arg1', 'arg2']
proc = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
proc.stdin.write(b'password') ##The b prefix is necessary because it needs a byte type
proc.stdin.flush()
stdout, stderr = proc.communicate()
print(stdout)
print(stderr)
You just forgot the newline (i.e. the user pressing Enter) in your password.
from subprocess import Popen, PIPE
proc = Popen(['sftp','user@server', 'stop'], stdin=PIPE)
proc.communicate('password\n'.encode())
Also .encode(), because by default proc.communicate() accepts bytes-like objects.
