I'm new at an IT company and very few people here know Python, so I can't ask them for help.
The problem: I need to write a Python script that connects via SSH from my VM to my client's server; once connected, the script needs to find a log file and search it for a few pieces of data.
I tested the script on my Windows machine with a copy of that file and it found everything I need. However, I don't know how to make the connection via SSH.
I tried this, but I don't know where to go from here:
from subprocess import Popen, PIPE
import sys
ssh = subprocess.check_output(['ssh', 'my_server', 'password'], shell = True)
ssh.stdin.write("cd /path/")
ssh.stdin.write("cat file | grep err|error")
This generates an error: "name 'subprocess' is not defined".
I don't understand how to use subprocess, nor how to begin developing the solution.
Note: I can't use Paramiko because I don't have permission to install packages via pip or download the package manually.
You imported Popen and PIPE from subprocess, but never imported subprocess itself, so you can't refer to it by that name.
check_output simply runs a process and waits for it to finish, so you can't use it for a process you want to interact with. But there is nothing interactive here, so let's use it after all.
The first argument to subprocess.Popen() and friends is either a string for the shell to parse, with shell=True; or a list of tokens passed directly to exec with no shell involved. (On some platforms, passing a list of tokens with shell=True happens to work, but this is coincidental, and could change in a future version of Python.)
ssh my_server password will try to run the command password on my_server, so that's not what you want. Probably you should simply set up passwordless (key-based) SSH in the first place.
... But you can use this syntax to run the commands in one go; just pass the shell commands to ssh as a single string.
from subprocess import check_output
#import sys  # Remove unused import

result = check_output(['ssh', 'my_server',
    # Fix quoting (grep needs -E for alternation), the Useless Use of
    # Cat, and the pointless cd
    "grep -E 'err|error' /path/file"])
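check_output returns the remote command's captured standard output as bytes, so the follow-up processing might be sketched like this (the function names are mine, the host and path are the question's placeholders, and key-based SSH auth is assumed):

```python
import subprocess

def build_remote_grep(host, path, pattern):
    # One local argv token per element; the remote command is a single
    # string that the remote shell parses. -E enables alternation (|).
    return ["ssh", host, "grep -E '%s' %s" % (pattern, path)]

def remote_grep(host, path, pattern):
    # check_output captures the remote command's stdout as bytes
    out = subprocess.check_output(build_remote_grep(host, path, pattern))
    return out.decode().splitlines()
```

Then something like remote_grep('my_server', '/path/file', 'err|error') gives you the matching lines as a Python list.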
Related
I'm trying to write my own shell script in Python for SSH to call (using the command= parameter in authorized_keys files). Currently I'm simply calling the original SSH command (it is set as an environment variable before the script is called by SSH). However, I always end up with a git error about the connection hanging up unexpectedly.
My Python code is literally:
#!/usr/bin/python
import os
import subprocess
if os.environ.get('SSH_ORIGINAL_COMMAND') is not None:
    subprocess.Popen(os.environ['SSH_ORIGINAL_COMMAND'], shell=True)
else:
    print 'who the *heck* do you think you are?'
Please let me know what is preventing the git command from successfully allowing the system to work. For reference, the command that is being called on the server when a client calls git push is git-receive-pack /path/to/repo.git.
Regarding the Python code shown above, I have tried using shell=True and shell=False (correctly passing the command as a list when False) and neither works correctly.
Thank you!
Found the solution!
You'll need to call the communicate() method of the subprocess object created by the Popen call.
proc = subprocess.Popen(args, shell=False)  # args: the command list, as before
proc.communicate()
I'm not entirely sure why, but I think it has to do with communicate() also allowing data to be passed via stdin, and with it waiting for the process to finish. I thought the process would automatically accept input since I didn't override the input stream anywhere, but apparently a manual call to communicate() is needed to kick things off... hopefully someone can weigh in here!
You also can't set stdout=subprocess.PIPE, as it will cause the command to hang (nothing reads the pipe, so it fills up). Again, not sure if this is because of how git works or something about the whole process. Hopefully this at least helps someone in the future!
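Putting the pieces together, a sketch of the whole wrapper with both fixes applied (os.environ is a mapping, so it is indexed rather than called, and communicate() pumps the child's streams and waits for it, so the ssh channel stays open until git is done):

```python
import subprocess

def run_forced_command(environ):
    """Sketch of the forced-command wrapper. Looks up SSH_ORIGINAL_COMMAND
    with .get() (os.environ is a mapping, not callable), runs it, and
    waits for the child with communicate() instead of returning
    immediately. Returns the exit status to hand back to sshd."""
    cmd = environ.get("SSH_ORIGINAL_COMMAND")
    if cmd is None:
        print("who the *heck* do you think you are?")
        return 1
    proc = subprocess.Popen(cmd, shell=True)
    proc.communicate()
    return proc.returncode
```

A real wrapper script would end with sys.exit(run_forced_command(os.environ)) so the client sees the command's real exit status.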
Is there any way to run a Python script on a remote machine without transferring the files via SCP, or a better method? I have searched a lot of libraries to solve this issue, but did not find a good solution.
I have found some libraries which perform SSH and SCP to a remote machine from a Python script. In their approach, they first copy the files to the remote system using SCP and then execute the command over SSH.
Thank you.
You could use fabric.
Obviously it depends on what exactly you want your remote Python script to do, but it has a lot of helper functions for interacting with the OS, including file upload and download, and it's all written in Python.
You can invoke python with a flag to read the script from its standard input, and then feed the script to the python instance through the ssh connection, for example:
cat /some/script.py | ssh user@host 'python -'
Running python - will cause the python interpreter to read the script from the process's standard input. In this case, the standard input of the ssh process is passed to the remote system as the standard input of the python instance.
You can supply additional command-line arguments to the python script, if desired:
cat /some/script.py | ssh user@host 'python - arg1 arg2...'
Note that any import statements in the python script will be resolved on the remote system, so the script has to limit itself to modules which are available to the remote python interpreter.
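The same mechanism can be driven from Python and exercised locally; for the remote case only the argv changes, to something like ["ssh", "user@host", "python3", "-"] (hypothetical host, key-based auth assumed):

```python
import subprocess

# `python3 -` reads the program from standard input; any tokens after
# the lone dash become sys.argv[1:] inside that program.
script = "import sys\nprint(sys.argv[1])\n"
result = subprocess.run(
    ["python3", "-", "hello"],
    input=script, text=True, capture_output=True)
print(result.stdout.strip())  # -> hello
```

input= feeds the script over the child's stdin, exactly as the cat | ssh pipeline does from the shell.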
Sure.
ssh remotehost python -c "'print(5)'"
A script need not be copied to the remote machine to execute it; the Python interpreter can take the program to run as a command-line argument (-c).
Double quoting is necessary (and will quickly become a hassle) because one level of quoting is removed by the ssh call and another is still necessary to group the script into one argument for the python call.
Multiline is also no problem:
ssh remotehost python -c "'
import random
print(random.random())'"
will print something like
0.998373816572
Just be aware of the quoting stuff when you use ' or " in your Python script as well.
Another option is to have the script in a variable and then use printf to quote the contents properly:
a='
import random
print(random.random())'
# or use something like a=$(cat myscript.py)
ssh ttiqadm@ttiq-hv02 python -c "$(printf "%q" "$a")"
Short version:
How do I execute a Linux command that requires input after it starts, using Python?
Long version:
I am building some fancy website stuff in Python to give my SVN server a way to be managed more easily. (I can't remember all the Linux commands.)
So I want to create, delete and edit repos and users from a webpage. I've just hit the problem that I don't know how to execute the following command using Python:
sudo htdigest /etc/apache2/dav_svn.htdigest "Subversion Repo" [username]
Well, I know how to execute the command with os.system() or subprocess.Popen(), but the problem is that once the command is executed it asks for a password twice before continuing. Making multiple calls to os.system() or subprocess.Popen() won't work, since each call creates a new shell.
Is there a way in Python to supply input to the command once it is required?
It all depends. You can either use Popen and handle bidirectional communication yourself, or, if you are just waiting for known prompts, use pexpect.
So, assuming you wanted to spawn a program called myprocess and wait for a password prompt ending in > (a greater-than sign):
import pexpect

child = pexpect.spawn('myprocess')
child.expect('>')             # wait for the first password prompt
child.sendline(password_var)  # type the password held in password_var
child.expect('>')             # wait for the confirmation prompt
child.sendline(password_var)
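If pexpect can't be installed (as in an earlier question here, where pip was unavailable), a minimal expect-style loop can be built on the standard-library pty module, which is roughly what pexpect does under the hood. A sketch, Unix-only and with deliberately naive prompt matching:

```python
import os
import pty

def run_with_password(argv, password, prompt=b":"):
    """Run argv on a pseudo-terminal and type `password` every time
    `prompt` appears in the output. Returns the captured output."""
    pid, fd = pty.fork()
    if pid == 0:                      # child: exec on the new terminal
        os.execvp(argv[0], argv)
    chunks = []
    try:
        while True:
            data = os.read(fd, 1024)
            if not data:              # EOF: child closed the terminal
                break
            chunks.append(data)
            if prompt in data:        # a prompt appeared: answer it
                os.write(fd, password.encode() + b"\n")
    except OSError:                   # EIO on Linux once the child exits
        pass
    os.waitpid(pid, 0)
    os.close(fd)
    return b"".join(chunks)
```

For the htdigest case that would be something like run_with_password(["sudo", "htdigest", ...], password), though note that sudo may ask for its own password first, and a real tool should match the specific prompt text rather than any colon.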
So, this is kind of confusing, but essentially I'm using Django and I want to spawn a subprocess to run a Perl script. I've read that this can be done with
arg = "/some/file/path/"
pipe = subprocess.Popen(["./uireplace", arg], stdin=subprocess.PIPE)
which works when I call it in the appropriate function in views.py, but the script requires sudo. I then call this
pipe = subprocess.Popen(["sudo","./uireplace", arg], stdin=subprocess.PIPE)
which works when I run it from Python in a terminal, but doesn't work when it's called by an arbitrary user on the web. Is there any way to automatically supply a username and password for that sudo? The issue is that this can't be done with a prompt, so it simply fails.
Solve this problem at the OS level. Giving any user from the web the right to use sudo does not sound right. Just make ./uireplace executable by the web server's user; there are lots of options for chmod to fine-tune this.
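If the script genuinely has to run as root, a narrowly scoped sudoers entry is the usual OS-level fix, rather than feeding a password from the web app. A sketch (user name and path are hypothetical; install it with visudo -f so a syntax error can't lock you out of sudo):

```
# /etc/sudoers.d/uireplace  (edit with: visudo -f /etc/sudoers.d/uireplace)
# Let the web server's user run exactly this one command as root,
# with no password prompt.
www-data ALL=(root) NOPASSWD: /some/file/path/uireplace
```

With that in place, the Popen(["sudo", "./uireplace", arg], ...) call succeeds without ever prompting.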
I want to run and control PSFTP from a Python script in order to get log files from a UNIX box onto my Windows machine.
I can start PSFTP and log in, but when I try to run a command remotely, such as cd, it isn't recognised by PSFTP and is just run in the terminal after I close PSFTP.
The code I am trying to run is as follows:
import os
os.system("<directory> -l <username> -pw <password>")
os.system("cd <anotherDirectory>")
I was just wondering if this is actually possible, or if there is a better way to do this in Python.
Thanks.
You'll need to run PSFTP as a subprocess and speak directly to the process. os.system spawns a separate subshell each time it's invoked, so it doesn't work like typing commands sequentially into a command prompt window. Take a look at the documentation for the standard Python subprocess module; you should be able to accomplish your goal from there. Alternatively, there are a few Python SSH packages available, such as paramiko and Twisted. If you're already happy with PSFTP, I'd definitely stick with trying to make it work first, though.
Subprocess module hint:
import subprocess

# The following line spawns the psftp process and binds its standard input
# to p.stdin and its standard output to p.stdout
p = subprocess.Popen('psftp -l testuser -pw testpass'.split(),
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
# Send the 'cd some_directory' command to the process as if a user were
# typing it at the command line (on Python 3 the pipe carries bytes,
# so it would be b'cd some_directory\n')
p.stdin.write('cd some_directory\n')
This has sort of been answered in: SFTP in Python? (platform independent)
http://www.lag.net/paramiko/
The advantage of the pure-Python approach is that you don't always need PSFTP installed.