I am using a Mac.
I have a Python file on a server. Let's say it's here: http://jake.com/python_file.py. I want to make a script that I can run in Terminal by double-clicking and that will run the Python file in Terminal on the local machine. Any ideas?
Would I have to use SSH to connect to the server, download the file to a temporary location on the hard drive (/tmp?), and then delete it when I'm done? There's another problem with that approach, though: it would have to download the Python file to a location in the user's home folder, because I don't think users have the necessary permissions to write to folders like /tmp or /var.
I was looking around for a solution to my problem and found this. It talks about how to execute a remote script over SSH on Unix, but I tried it with my Python file and it didn't work.
In case you didn't realize, the main reason I am looking to do this is so that a user can run the file locally but is unable to read or edit the actual Python file, which is stored on the server.
If anyone has any ideas on how to accomplish this (whether using the ideas mentioned above or not), please let me know; I'd really appreciate it!
Thanks,
Jake
How about a Python script that downloads and executes the code, like so?
import requests

# Fetch the remote script's source (requests is a third-party library)
py1 = requests.get('https://example.com/file.py').content
# exec accepts the fetched source and runs it in the current namespace
exec(py1, globals(), locals())
Note: I'm using the requests library, but you could just as easily use the built-in http.client's HTTPSConnection (httplib in Python 2); it's just more verbose.
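For comparison, here is a minimal standard-library-only sketch of the same fetch, assuming the same hypothetical URL as above:

import http.client

# Connect to the host serving the script (hypothetical host from the example)
conn = http.client.HTTPSConnection('example.com')
conn.request('GET', '/file.py')
response = conn.getresponse()
# http.client does not raise on HTTP errors, so check the status ourselves
if response.status == 200:
    exec(response.read(), globals(), locals())
conn.close()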
Note 2: When you say you don't want the user to be able to "read/edit the actual Python file", keep in mind that they will still be able to read it if they open the URL themselves and view the content; they are just less likely to edit the file locally and mess something up. On the plus side, you can deploy updates to your Python script at the URL rather than having to copy them to every machine. Executing remotely fetched code could be a security risk, depending on the scope of what you are using it for.
Assuming that the remote Python script is a single file with no dependencies other than those of the Python standard library, and that a compatible version of Python is installed on the user's local machine, there are a few obvious ways to do it.
Use ssh:
#!/usr/bin/env sh
ssh user@host cat /path/to/python/script.py | python
Or use scp:
#!/usr/bin/env sh
TMPFILE=/tmp/$$
scp user@host:/path/to/python/script.py $TMPFILE
python $TMPFILE
rm $TMPFILE
The first one has the obvious advantage of not requiring any mucking about with copying files and cleaning up afterwards.
If you wanted to execute the python script on the remote server, this can also be done with ssh:
ssh user@host python /path/to/python/script.py
Standard input and output will be a terminal on the user's local machine.
So there are variants of this question, but none quite hits the nail on the head.
I want to run Spyder and do interactive analysis on a server. I have two servers; neither has Spyder. They both have Python (they are Linux servers), but I don't have sudo rights to install the packages I need.
In short, the use case is: open Spyder on the local machine, do something (I need help here) to use the server's computation power, and then return the results to the local machine.
Update:
I have updated Python with my packages on one server. Now I need to figure out the kernel name and link it to Spyder.
Leaving previous version of question up, as that is still useful.
The Docker process is a little intimidating, as is paramiko. What are my options?
(Spyder maintainer here) What you need to do is create a Spyder kernel on your remote server and connect to it through SSH. That's the only facility we provide to do what you want.
You can find the precise instructions to do that in our docs.
I did a long search for something like this in my past job, when we wanted to quickly iterate on code which had to run across many workers in a cluster. All the commercial and open source task-queue projects that I found were based on running fixed code with arbitrary inputs, rather than running arbitrary code.
I'd also be interested to see if there's something out there that I missed. But in my case, I ended up building my own solution (unfortunately not open source).
My solution was:
1) I made a Redis queue where each task consisted of a zip file containing a bash setup script (for pip installs, etc.), a "payload" Python script to run, and a pickle file with input data.
2) The "payload" Python script would read in the pickle file or other files contained in the zip file. It would output a file named output.zip.
3) The task worker was a Python script (running on the remote machine, listening to the Redis queue) that would unzip the file, run the bash setup script, then run the Python script. When the script exited, the worker would upload output.zip.
There were various optimizations; for example, the worker wouldn't run the same bash setup script twice in a row (it remembered the SHA1 hash of the most recent setup script). So, anyway, in the worst case you could do that, roughly as sketched below. It was a week or two of work to set up.
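For illustration only, a minimal sketch of that kind of worker loop. The queue name, zip layout, and file names here are hypothetical (this is not the original code), and it assumes a reachable Redis server plus the third-party redis package:

import hashlib
import subprocess
import zipfile

import redis  # third-party: pip install redis

r = redis.Redis()  # assumes a reachable Redis server
last_setup_hash = None

while True:
    # Block until a task (here, a path to a task zip) lands on the queue
    _, zip_path = r.blpop('task_queue')
    with zipfile.ZipFile(zip_path.decode()) as zf:
        zf.extractall('work')

    # Skip the setup script if it is identical to the last one we ran
    setup_hash = hashlib.sha1(open('work/setup.sh', 'rb').read()).hexdigest()
    if setup_hash != last_setup_hash:
        subprocess.run(['bash', 'setup.sh'], cwd='work', check=True)
        last_setup_hash = setup_hash

    # Run the payload; it is expected to write output.zip in its directory
    subprocess.run(['python', 'payload.py'], cwd='work', check=True)
    # ...upload work/output.zip back to wherever results are collected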
Edit:
A second (much more manual) option, if you just need to run on one remote machine, is to use sshfs to mount the remote filesystem locally, so you can quickly edit the files in Spyder. Then keep an ssh window open to the remote machine, and run Python from the command line to test-run the scripts on that machine. (That's my standard setup for developing Raspberry Pi programs.)
I normally use a bash script to grab all the files onto the local machine and use glob to process them. I'm just wondering: what would be the best way to use Python (instead of another bash script) to SSH into each server and process those files?
My current program runs as
import glob

for filename in glob.glob('*-err.txt'):
    with open(filename, 'rb') as input_open:
        for line in input_open:
            ...  # do something
My files all end with -err.txt, and the directories where they reside on the remote server all have the same name, /documents/err/. I am not able to install third-party libraries, as I don't have permission.
UPDATE
I am trying not to scp the files from the server but to read them on the remote server instead.
I want to use a local Python script LOCALLY to read in files on the remote server.
The simplest way to do it is to use paramiko's SCP support to copy files from the remote server over SSH (see How to scp in python?).
If you are not allowed to download any libraries, you can create an SSH key pair so that connecting to the server does not require a password (https://www.debian.org/devel/passwordlessssh). You can then, for each file, do:
import os
os.system('scp user@host:/path/to/file/on/remote/machine /path/to/local/file')
Note that using os.system is usually considered less portable than using libraries. If you give the script that uses system('scp ...') to someone who does not have an SSH key pair set up, they will run into problems.
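Given your update that you want to read the files in place rather than copy them, here is a standard-library-only sketch along the same lines; the host is a placeholder, the path is the /documents/err from your question, and a passwordless key pair is assumed:

import subprocess

host = 'user@host'  # hypothetical host; key-based SSH login assumed
remote_dir = '/documents/err'

# List the matching files on the remote side
listing = subprocess.check_output(['ssh', host, 'ls %s/*-err.txt' % remote_dir])

for filename in listing.decode().split():
    # Stream each file's contents over SSH without copying it locally
    contents = subprocess.check_output(['ssh', host, 'cat %s' % filename])
    for line in contents.splitlines():
        ...  # do something with each line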
Looks like you want to use a local Python script remotely. This has been answered here.
I am executing a Python script on one server and need to read the contents of the passwd file from a remote machine. Does anyone know of a way to do this? Normally I would do:
import pwd
pwlist = pwd.getpwall()
# perform operations
This only works for the current system of course, and I'm needing a way to access another machine (like you would via ssh). Any help is appreciated.
The pwd module will only work for the current machine. It uses the C library functions defined in <pwd.h>, which do not provide any parameters for a remote machine. However, this does not prevent you from using ssh tools to run a script on the remote machine.
You can use a utility such as scp to copy the passwd file locally. It's easy to split the lines and get the information.
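A minimal sketch of that approach, assuming a hypothetical host and passwordless (key-based) SSH; the colon-separated field layout of /etc/passwd is standard:

import subprocess

# Copy the remote passwd file locally (hypothetical host; SSH keys assumed)
subprocess.run(['scp', 'user@host:/etc/passwd', '/tmp/remote_passwd'], check=True)

entries = []
with open('/tmp/remote_passwd') as f:
    for line in f:
        # /etc/passwd fields: name:passwd:uid:gid:gecos:dir:shell
        name, pw, uid, gid, gecos, home, shell = line.rstrip('\n').split(':')
        entries.append((name, pw, int(uid), int(gid), gecos, home, shell))

# entries now mirrors the tuples pwd.getpwall() would return locally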
I would like to achieve the following things:
A given file contains a job list, which I need to execute one by one on a remote server using SSH APIs, storing the results.
When I call the following command directly on the remote server using PuTTY, it executes successfully, but when I try to execute it through Python SSH programming, it says it can't find autosys.ksh.
autosys.ksh autorep -J JOB_NAME
Any ideas? Please help. Thanks in advance.
Fabric is a good bet. From the home page,
Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
A quick example,
>>> from fabric.api import run, env, cd, settings, hide, show
>>> env.host_string='xxx.xxx.com'
>>> env.user='user'
>>> env.password='password'
>>> run('ls -lart')
After reading your comment on the first answer: you might want to create a bash script with the bash path as its interpreter (shebang) line, followed by the autosys commands.
This will start a bash shell and run the commands from the script in that shell.
Again, if you are using autosys commands in the shell, you should set up the autosys environment for the user before running any autosys commands.
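The usual reason a command works in PuTTY but not over programmatic SSH is that a non-interactive session doesn't load the login environment. Here is a minimal paramiko sketch that sources a profile before running the command; the host, credentials, and ~/.profile path are placeholders, so substitute whatever actually sets up autosys on your server:

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('host', username='user', password='password')

# Source the login environment so autosys.ksh is on PATH (path is a guess)
cmd = '. ~/.profile && autosys.ksh autorep -J JOB_NAME'
stdin, stdout, stderr = client.exec_command(cmd)
print(stdout.read().decode())
print(stderr.read().decode())
client.close()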
With Python, I need to read a file into a script, similar to open(file, "rb"). However, the file is on a server that I can access through SSH. Any suggestions on how I can easily do this? I am trying to avoid paramiko and am using pexpect to log into the SSH server, so a method using pexpect would be ideal.
Thanks,
Eric
You can mount the remote file system locally using sshfs; then you can use the file like a normal local file. sshfs requires the FUSE module.
If it's a short file, you can get the output of an ssh command using subprocess.Popen:
ssh root@ip_address_of_the_server 'cat /path/to/your/file'
Note: passwordless login using keys should be configured in order for this to work.
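A minimal sketch of that idea from Python, assuming key-based login and the same hypothetical host and path as the command above:

import subprocess

# Run cat on the remote side and capture its output locally
proc = subprocess.Popen(
    ['ssh', 'root@ip_address_of_the_server', 'cat /path/to/your/file'],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
contents, errors = proc.communicate()

# contents is bytes, like open(file, "rb").read() would give you
print(contents.decode())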