Run a Python script on a remote machine without using SCP - python

Is there any way to run a Python script on a remote machine without first transferring the files using SCP, or is there a better method? I have searched a lot of libraries to solve this issue, but did not find a good solution.
I have found some libraries which perform SSH and SCP to a remote machine from a Python script. In their approach, the files are first copied to the remote system using SCP and then a command is executed over SSH.
Thank you

You could use Fabric.
It obviously depends on what exactly you want your remote Python script to do, but Fabric has a lot of helper functions for interacting with the OS, including file upload and download, and it is written entirely in Python.
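For instance, a minimal sketch using Fabric's Connection API (the 2.x interface; the user, host, and command are placeholders):

from fabric import Connection

# Minimal sketch assuming Fabric 2.x is installed; user, host, and the
# command/paths below are placeholders
conn = Connection("user@remotehost")
result = conn.run("python3 --version", hide=True)  # run a command remotely
print(result.stdout.strip())

# If you do need a file on the remote side, Fabric can transfer it too:
# conn.put("local_script.py", remote="/tmp/local_script.py")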

You can invoke python with a flag to read the script from its standard input, and then feed the script to the python instance through the ssh connection, for example:
cat /some/script.py | ssh user@host 'python -'
Running python - will cause the python interpreter to read the script from the process's standard input. In this case, the standard input of the ssh process is passed to the remote system as the standard input of the python instance.
You can supply additional command-line arguments to the python script, if desired:
cat /some/script.py | ssh user@host 'python - arg1 arg2...'
Note that any import statements in the python script will be resolved on the remote system, so the script has to limit itself to modules which are available to the remote python interpreter.
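If you want to drive the same trick from a local Python script rather than the shell, a rough sketch (the user, host, script path, and arguments are placeholders) could look like this:

# Sketch: feed a local script to the remote interpreter over ssh's stdin;
# user, host, script path, and arguments are placeholders
import subprocess

with open("/some/script.py", "rb") as script:
    result = subprocess.run(
        ["ssh", "user@host", "python", "-", "arg1", "arg2"],
        stdin=script,
        capture_output=True,
    )
print(result.stdout.decode())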

Sure.
ssh remotehost python -c "'print(5)'"
A script does not need to be copied to the remote machine in order to execute it. The Python interpreter can run a script which is given as an argument.
Double quoting is necessary (and will quickly become a hassle) because one level of quoting is removed by the ssh call and another is still necessary to group the script into one argument for the python call.
Multiline is also no problem:
ssh remotehost python -c "'
import random
print(random.random())'"
will print something like
0.998373816572
Just be aware of the quoting stuff when you use ' or " in your Python script as well.
Another option is to have the script in a variable and then use printf to quote the contents properly:
a='
import random
print(random.random())'
# or use something like a=$(cat myscript.py)
ssh ttiqadm@ttiq-hv02 python -c "$(printf "%q" "$a")"
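If you end up launching this from Python anyway, one way to sidestep the manual quoting is to let shlex.quote add the single level of quoting the remote shell needs (a sketch; the host name is a placeholder):

# Sketch: shlex.quote wraps the script body so the remote shell hands it to
# python -c as one argument; no local shell is involved because we pass a
# list to subprocess.run. The host name is a placeholder.
import shlex
import subprocess

script = """
import random
print(random.random())
"""
subprocess.run(["ssh", "remotehost", "python -c " + shlex.quote(script)])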

Related

Start two shell sudo scripts in two different terminals from python3

I have an embedded system on which I run code live. Every time I want to run code, I start two scripts in two different terminals: "run1.sh" and "run2.sh". I can see the output of those scripts in my terminals (and I want to keep seeing it).
Now I want to make a Python script that starts those two scripts in two different terminals. I still want to see their output. I also want to supply a password from the Python script to the terminals, since the scripts run with sudo. I've played a lot with subprocess and pipes, but I've never achieved all of the above requirements simultaneously. How can these requirements be met?
I'm using Ubuntu, by the way (so I have GNOME Terminal).
Update: I was probably not clear in my question, but this has to be inside a Python script. It is not for my convenience; it's part of an integration process. The code of the script will be part of a larger Python program, so the whole point of the question is how do I do it in Python.
Based on the new information you added, I've created a small Python script which will launch two terminals and show their output separately:
Main script:
mortiz@florida:~/Documents/projects/python/split_python_execution$ cat split_pythonstuff.py
#!/usr/bin/python3
import subprocess
subprocess.call(['gnome-terminal', '-x', 'python', '/home/mortiz/Documents/projects/python/split_python_execution/script1.py'])
subprocess.call(['gnome-terminal', '-x', 'python', '/home/mortiz/Documents/projects/python/split_python_execution/script2.py'])
Script 1:
mortiz@florida:~/Documents/projects/python/split_python_execution$ cat script1.py
#!/usr/bin/python3
while True:
    print('script 1')
Script 2:
mortiz@florida:~/Documents/projects/python/split_python_execution$ cat script2.py
#!/usr/bin/python3
while True:
    print('script 2')
From here I guess you can develop anything you want.
UPDATE: About sudo
Sudoers is a great way of controlling which commands can be executed by specific users, with or without requiring a password.
If you add this line to /etc/sudoers, there's no need for a password when you prefix your command with sudo:
<YOUR_USER> ALL = NOPASSWD : /usr/bin/python <SCRIPT.py>
In your question, as far as I understand, you have the password stored inside the script. There's no need to do that, and it's bad practice. Sudoers would be a better way.
Anyway, if you want to do it in an insecure way, then refer to this question and place it before the commands in the scripts provided in this answer.
The link provided works:
echo -e "mypassword\n" | sudo -S python test.py
You only need to add that to the previous code.
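For example, a minimal (and equally insecure) sketch of the same idea driven from Python, with the password and script name as placeholders:

# Insecure sketch: feed the password to sudo -S on stdin from Python.
# Prefer a sudoers NOPASSWD rule over hard-coding a password like this.
import subprocess

subprocess.run(
    ["sudo", "-S", "python3", "test.py"],
    input="mypassword\n",
    text=True,
)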
You could install Terminator and configure one profile per terminal to run any script you want.
I have a default template which loads 3 terminals and runs 3 different commands or scripts:
When I load that profile, the first terminal moves me to my projects directory and lists it, the next one runs df -h to show the space available, and the lower one shows my IP configuration.
This way saves you a lot of programming and is quite easy.
UPDATE: It will run any command (bash, zsh, python, etc.) available to your terminal. If the script is local on your machine:
python <your_script_1> # first terminal profile
python <your_script_2> # second terminal profile
both would be executed "at the same time".
If your scripts are on the remote target machine, simply create a bash script that uses ssh with a private key to connect to the remote machine and run the script; the result is the same in both scenarios.
EDIT: The best thing is setting colors and transparency for each terminal, so you can enjoy the penguin's selfie while you work.

automated python script via vm

I'm new to the IT department of a company, and very few people here know Python, so I can't ask them for help.
The problem: I need to create a script in Python that connects via SSH from my VM to my client's server; once connected, I need to find a log file and search for a few pieces of data.
I tested my script on my Windows machine with a copy of that file and it found everything that I need. However, I don't know how to make that connection via SSH.
I tried like this but I don't know where to start:
from subprocess import Popen, PIPE
import sys
ssh = subprocess.check_output(['ssh', 'my_server', 'password'], shell = True)
ssh.stdin.write("cd /path/")
ssh.stdin.write("cat file | grep err|error")
This generates an error: "name 'subprocess' is not defined".
I don't understand how to use the subprocess nor how to begin to develop the solution.
Note: I can't use Paramiko because I don't have permission to install packages via pip or download the package manually.
You didn't import subprocess itself so you can't refer to it.
check_output simply runs a process and waits for it to finish, so you can't use that to run a process you want to interact with. But there is nothing interactive here, so let's use that actually.
The first argument to subprocess.Popen() and friends is either a string for the shell to parse, with shell=True; or a list of tokens passed directly to exec with no shell involved. (On some platforms, passing a list of tokens with shell=True actually happens to work, but this is coincidental, and could change in a future version of Python.)
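To illustrate the difference with a toy sketch (not specific to your command):

import subprocess

# One string with shell=True: a shell parses the command line
subprocess.run("echo hello world", shell=True)

# A list of tokens with no shell involved: arguments are passed verbatim
subprocess.run(["echo", "hello world"])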
ssh myhost password will try to run the command password on myhost so that's not what you want. Probably you should simply set things up for passwordless SSH in the first place.
... But you can use this syntax to run the commands in one go; just pass the shell commands to ssh as a string.
from subprocess import check_output
# import sys  # Remove unused import

result = check_output(['ssh', 'my_server',
    # Fix quoting and Useless Use of Cat, and pointless cd;
    # -E is needed for the | alternation to work in grep
    "grep -E 'err|error' /path/file"])

Launching subprocesses on resource limited machine

Edit:
The original intent of this question was to find a way to launch an interactive ssh session via a Python script. I'd tried subprocess.call() before and had gotten a Killed response before anything was output onto the terminal. I just assumed this was an issue/limitation with the subprocess module instead of an issue somewhere else. This was found not to be the case when I ran the script on a non-resource-limited machine and it worked fine.
This then turned the question into: How can I run an interactive ssh session with whatever resource limitations were preventing it from running?
Shoutout to Charles Duffy, who was a huge help in trying to diagnose all of this.
Below is the original question:
Background:
So I have a script that is currently written in bash. It parses the output of a few console functions and then opens up an ssh session based on those parsed outputs.
It currently works fine, but I'd like to expand its capabilities a bit by adding some flag arguments to it. I've worked with argparse before and thoroughly enjoyed it. I tried to do some flag work in bash, and let's just say it leaves much to be desired.
The Actual Question:
Is it possible to have Python do stuff in a console and then put the user in that console?
Something like using subprocess to run a series of commands in the currently viewed console? This is in contrast to how subprocess normally runs, where it runs commands and then shuts the intermediate console down.
Specific Example because I'm not sure if what I'm describing makes sense:
So here's a basic run down of the functionality I was wanting:
Run a python script
Have that script run some console command and parse the output
Run the following command:
ssh -t $correctnode "cd /local_scratch/pbs.$jobid; bash -l"
This command will ssh to the $correctnode, change directory, and then leave a bash window in that node open.
I already know how to do parts 1 and 2. It's part three that I can't figure out. Any help would be appreciated.
Edit: Unlike this question, I am not simply trying to run a command. I'm trying to display a shell that is created by a command. Specifically, I want to display a bash shell created through an ssh command.
Context For Readers
The OP is operating on a very resource-constrained (particularly, it appears, process-constrained) jumphost box, where starting an ssh process as a subprocess of python goes over a relevant limit (on number of processes, perhaps?)
Approach A: Replacing The Python Interpreter With Your Interactive Process
Using the exec*() family of system calls causes your original process to no longer be in memory (unlike the fork()+exec*() combination used to start a subprocess while leaving the parent process running), so it doesn't count against the account's limits.
import argparse
import os
try:
    from shlex import quote
except ImportError:
    from pipes import quote

parser = argparse.ArgumentParser()
parser.add_argument('node')
parser.add_argument('jobid')
args = parser.parse_args()

remote_cmd_str = 'cd /local_scratch/pbs.%s && exec bash -i' % (quote(args.jobid))
local_cmd = [
    '/usr/bin/env', 'ssh', '-tt', args.node, remote_cmd_str
]
os.execv("/usr/bin/env", local_cmd)
Approach B: Generating Shell Commands From Python
If we use Python to generate a shell command, the shell can invoke that command only after the Python process has exited, such that we stay under our externally-enforced process limit.
First, a slightly more robust approach at generating eval-able output:
import argparse
try:
    from shlex import quote
except ImportError:
    from pipes import quote

parser = argparse.ArgumentParser()
parser.add_argument('node')
parser.add_argument('jobid')
args = parser.parse_args()

remoteCmd = ['cd', '/local_scratch/pbs.%s' % (args.jobid)]
remoteCmdStr = ' '.join(quote(x) for x in remoteCmd) + ' && bash -l'
cmd = ['ssh', '-t', args.node, remoteCmdStr]
print(' '.join(quote(x) for x in cmd))
To run this from a shell, if the above is named as genSshCmd:
#!/bin/sh
eval "$(genSshCmd "$@")"
Note that there are two separate layers of quoting here: One for the local shell running eval, and the second for the remote shell started by SSH. This is critical -- you don't want a jobid of $(rm -rf ~) to actually invoke rm.
This is in no way a real answer, just an illustration of my comment.
Let's say you have a Python script, test.py:
import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('myarg', nargs="*")
    args = parser.parse_args()
    print("echo Hello world! My arguments are: " + " ".join(args.myarg))
So, you create a bash wrapper around it, test.sh
set -e
$(python test.py $*)
and this is what you get:
$ bash test.sh
Hello world! My arguments are:
$ bash test.sh one two
Hello world! My arguments are: one two
What is going on here:
the Python script does not execute commands. Instead, it outputs the commands the bash script will run (echo in this example). In your case, the last command will be ssh blabla
bash executes the output of the Python script (the $(...) part), passing on all its arguments (the $* part)
you can use argparse inside the Python script; if anything is wrong with the arguments, the message will be put to stderr and will not be executed by bash; the bash script will stop because of the set -e flag

How to run a bash script from another computer?

I have developed a Python application which needs to call a bash script stored on another computer (a Raspberry Pi).
I don't need to get any return value or confirmation.
What are the feasible ways to do that?
Thanks!
From the shell you could do it like this:
ssh pi@theraspberrypi "./myscript"
To run a shell command from in Python:
import os
os.system("ssh pi@theraspberrypi ./myscript")
Or, as Eevee suggested below:
import subprocess
subprocess.call(['ssh pi@theraspberrypi ./myscript'], shell=True)
Of course, you will probably want to put your public key in the raspberry pi's authorized_keys file so it won't prompt for a password.

Can I control PSFTP from a Python script?

I want to run and control PSFTP from a Python script in order to get log files from a UNIX box onto my Windows machine.
I can start up PSFTP and log in, but when I try to run a command remotely, such as 'cd', it isn't recognised by PSFTP and is just run in the terminal when I close PSFTP.
The code which i am trying to run is as follows:
import os
os.system("<directory> -l <username> -pw <password>" )
os.system("cd <anotherDirectory>")
I was just wondering if this is actually possible, or if there is a better way to do this in Python.
Thanks.
You'll need to run PSFTP as a subprocess and speak directly with the process. os.system spawns a separate subshell each time it's invoked so it doesn't work like typing commands sequentially into a command prompt window. Take a look at the documentation for the standard Python subprocess module. You should be able to accomplish your goal from there. Alternatively, there are a few Python SSH packages available such as paramiko and Twisted. If you're already happy with PSFTP, I'd definitely stick with trying to make it work first though.
Subprocess module hint:
import subprocess

# The following line spawns the psftp process and binds its standard input
# to p.stdin and its standard output to p.stdout
p = subprocess.Popen('psftp -l testuser -pw testpass'.split(),
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)

# Send the 'cd some_directory' command to the process as if a user were
# typing it at the command line (on Python 3, write bytes: b'cd some_directory\n')
p.stdin.write('cd some_directory\n')
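If you only need to fire off a fixed batch of commands rather than interact step by step, a rough sketch using communicate() might look like this (the host, credentials, and psftp commands are placeholders):

# Sketch: drive psftp non-interactively in one shot; host, credentials, and
# the batch of commands are placeholders. text=True lets us pass strings.
import subprocess

p = subprocess.Popen(
    ["psftp", "unixbox.example.com", "-l", "testuser", "-pw", "testpass"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)
out, _ = p.communicate("cd some_directory\nget logfile.txt\nquit\n")
print(out)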
This has sort of been answered in: SFTP in Python? (platform independent)
http://www.lag.net/paramiko/
The advantage of the pure Python approach is that you don't always need psftp installed.
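As a rough sketch of the Paramiko route (assuming Paramiko is installed; the host, credentials, and paths are placeholders):

# Sketch using paramiko's SFTP client; host, credentials, and paths are
# placeholders, and host-key checking is relaxed for brevity
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("unixbox.example.com", username="user", password="secret")

sftp = client.open_sftp()
sftp.get("/var/log/app.log", "app.log")  # download a remote log file
sftp.close()
client.close()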
