Running local python files on remote server - python

In order to run a local Python script on a remote server, I use the following command:
cat hello.py | ssh user@192.168.1.101 python -
But when I want to run a script which imports some local modules:
import _init_paths
from fast_rcnn.config import cfg
from fast_rcnn.test import im_detect
from fast_rcnn.nms_wrapper import nms
the above command doesn't solve my problem. How can I run such a Python file on the remote server?
Thanks,
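One way around this (a rough sketch, assuming key-based ssh auth and that `_init_paths.py` and the `fast_rcnn` package directory sit next to `hello.py`; adjust the paths to your layout) is to bundle the script together with its local modules, stream the bundle over ssh, and run it on the remote side:
#!/usr/bin/python3
# sketch: ship the script plus its local imports and run them remotely;
# the remote path /tmp/app and the file layout are assumptions
import subprocess

remote = "user@192.168.1.101"
pipeline = (
    "tar czf - hello.py _init_paths.py fast_rcnn | "
    f"ssh {remote} 'rm -rf /tmp/app && mkdir -p /tmp/app && "
    "tar xzf - -C /tmp/app && cd /tmp/app && python hello.py'"
)
subprocess.run(pipeline, shell=True, check=True)
Anything the script imports from site-packages (for example the fast_rcnn dependencies themselves) still has to be installed for the remote interpreter.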

Related

Create a tar.gz file and transfer it to network server

I have a Raspberry Pi running on my home network along with a Macintosh that acts as a backup server. I am attempting to craft a Python 3 script that will run on the Pi to create a tar.gz backup of the Pi's home folder (along with all of its contents) and transfer it to the Macintosh using SSH. I have the SSH connection (using keys) running but am stumped when I try to create the backup file and transfer it.
My code so far:
#!/usr/bin/python3
import paramiko
import tarfile
import os
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname='192.168.1.151', username='usernamehere')
print("I'm connected!")
#The following statement is the problem as I see it
tar czvf - ./home/PiOne/ | ssh_client -w:gz "cat > ./Volumes/PiBackups/PiOne/PiOneClone.tar.gz"
print("File transferred!")
ssh_client.close()
I would appreciate any help in creating the script!
I think there is an easier way to transfer a file from your Pi to your local machine, since you are already using SSH:
scp /path/to/your/targz/file user@ip:/path/on/remote/machine
Check the permissions while doing that using
ls -lh
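If you want to keep everything inside the Python script instead of shelling out, here is a rough sketch of one way to do it with tarfile plus Paramiko's SFTP client (not necessarily what the answer above had in mind; the paths are taken from the question's snippets and are assumptions to adjust):
#!/usr/bin/python3
# sketch: build the archive locally with tarfile, then push it over SFTP
import os
import tarfile
import paramiko

archive = '/tmp/PiOneClone.tar.gz'
with tarfile.open(archive, 'w:gz') as tar:
    tar.add('/home/PiOne', arcname='PiOne')   # recurses into the folder

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname='192.168.1.151', username='usernamehere')
try:
    sftp = ssh_client.open_sftp()
    sftp.put(archive, '/Volumes/PiBackups/PiOne/PiOneClone.tar.gz')
    sftp.close()
    print("File transferred!")
finally:
    ssh_client.close()
    os.remove(archive)   # clean up the local temporary archive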

Airflow Remote PythonOperator

I have a Python script in a local file, and I don't want to SCP it to the remote machine and run it there with SSHOperator, triggered remotely by Airflow. How can I run a local .py file on a remote machine and get the results?
I need SSHOperator with python_callable, not bash_command.
Can anyone show me a sample of a remote custom operator like SSHPYTHONOperator?
I solved the problem as follows:
gettime = """
import os
import datetime
def gettimes():
    print(True)
gettimes()
"""
remote_python_get_delta_times = SSHOperator(
    task_id="get_delta_times",
    do_xcom_push=True,
    command="MYVAR=`python -c" + ' "%s"`;echo $MYVAR' % gettime,
    ssh_hook=remote,
    dag=dag,
)
I see an SSH operator in the Airflow docs: https://airflow.apache.org/docs/apache-airflow/1.10.13/_api/airflow/contrib/operators/ssh_operator/index.html
If that doesn't work out for you, you'll have to create a custom operator using an SSH library like Paramiko,
and then use it to pull code from GitHub/S3, or SCP your file to the server and execute it there.
You would need to make sure all your dependencies are also installed on the remote server.
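For reference, a very rough sketch of what such a custom operator could look like with Paramiko; the class name SSHPythonOperator, the connection details, and the remote path are all made up for illustration, and on Airflow 1.10 you may additionally need the apply_defaults decorator on __init__:
import paramiko
from airflow.models import BaseOperator


class SSHPythonOperator(BaseOperator):
    """Copy a local .py file to a remote host and run it there (sketch)."""

    def __init__(self, host, username, local_script,
                 remote_path="/tmp/remote_script.py", **kwargs):
        super().__init__(**kwargs)
        self.host = host
        self.username = username
        self.local_script = local_script
        self.remote_path = remote_path

    def execute(self, context):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname=self.host, username=self.username)
        try:
            # upload the local script, then run it with the remote interpreter
            sftp = client.open_sftp()
            sftp.put(self.local_script, self.remote_path)
            sftp.close()
            stdin, stdout, stderr = client.exec_command(f"python {self.remote_path}")
            output = stdout.read().decode()
            self.log.info("remote stdout: %s", output)
            # the returned value is pushed to XCom when do_xcom_push is enabled
            return output
        finally:
            client.close()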

Python Fabric 2.4 no environment variables

I have a problem with Fabric (2.4). I have no access to environment variables on the remote server (I'm using FreeBSD).
In my ~/.profile file I have the variable:
export MY_KEY="123456789"
In my fabfile.py I have a simple task:
from fabric import task
@task(hosts=['user@myhost.com'])
def deploy(context):
    context.run('echo 123')
    context.run('echo $MY_KEY')
When I run the fab deploy command, I see only 123, but after connecting via ssh the variable is visible.
And what about using Connection.prefix as a context manager?
with context.prefix('MY_KEY="123456789"'):
    context.run('echo 123')
    context.run('echo $MY_KEY')
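The underlying issue is usually that ~/.profile is only read by login shells, and Fabric's run() does not start one. A minimal sketch of two workarounds, assuming bash is available on the FreeBSD host:
from fabric import task

@task(hosts=['user@myhost.com'])
def deploy(context):
    # force a login shell so ~/.profile (and MY_KEY) gets sourced
    context.run("bash -l -c 'echo $MY_KEY'")
    # or source the profile explicitly in front of each command
    with context.prefix('. ~/.profile'):
        context.run('echo $MY_KEY')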

Python process on remote server need to ssh to other and continue running

Is it possible for a Python process running on a remote server to ssh to another server and then continue running there?
run on 192.168.100.1
|
ssh from there to 192.168.100.2
|
keep running on 192.168.100.2 and do the other functions from the .py
I tried with a subprocess call, but when the script calls the ssh command and connects to the other machine, it stops and waits there.
You need to use RPC to call a Python function on the remote server.
There are many libraries around. Let's take the RPyC library for example:
>>> import rpyc
>>> c = rpyc.classic.connect("localhost")
>>> c.execute("print 'hi there'") # this will print on the host
>>> import sys
>>> c.modules.sys.stdout = sys.stdout
>>> c.execute("print 'hi here'") # now this will be redirected here
hi here
Please note that you need to install the RPyC server on your remote host and deploy your Python modules there as well.
Please read the tutorials to learn how to start the RPyC server and deploy your code on the remote machine.
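To make the scenario above concrete, a small sketch of the client side: the process stays alive on 192.168.100.1 and drives code on 192.168.100.2 through an RPyC classic server that is already running there; mytasks is a hypothetical module you have deployed on the second machine.
import rpyc

# connect from 192.168.100.1 to the classic RPyC server on 192.168.100.2
conn = rpyc.classic.connect("192.168.100.2")

# "mytasks" is a hypothetical module deployed on 192.168.100.2;
# the call executes remotely while this script keeps running locally
result = conn.modules.mytasks.do_other_functions()
print("remote result:", result)

conn.close()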

How do I run a Python script that is part of an application I uploaded in an AWS SSH session?

I'm trying to run a Python script I've uploaded as part of my AWS Elastic Beanstalk application from my development machine, but can't figure out how to. I believe I've located the script correctly, but when I attempt to run it under SSH, I get an import error.
For example, I have a Flask-Migrate migration script as part of my application (pretty much the same as the example in the documentation), but after successfully SSHing to my EB instance with
> eb ssh
and locating the script with
$ sudo find / -name migrate.py
when I run it from the directory (/opt/python/current) where I found it with
$ python migrate.py db upgrade
at the SSH prompt I get
Traceback (most recent call last):
File "db_migrate.py", line 15, in <module>
from flask.ext.script import Manager
ImportError: No module named flask.ext.script
even though my requirements.txt (present along with the rest of my files in the same directory) has flask-script==2.0.5.
On Heroku I can accomplish all of this in two steps with
> heroku run bash
$ python migrate.py db upgrade
Is there equivalent functionality on AWS? How do I run a Python script that is part of an application I uploaded in an AWS SSH session? Perhaps I'm missing a step to set up the environment in which the code runs?
To migrate your database, the best approach is to use container_commands; these are commands that will run every time you deploy your application. There is a good example in the Elastic Beanstalk documentation (Step 6):
container_commands:
  01_syncdb:
    command: "django-admin.py syncdb --noinput"
    leader_only: true
The reason why you're getting an ImportError is that Elastic Beanstalk installs your packages in a virtualenv. Before running arbitrary scripts from your application over SSH, first change to the directory containing your (latest) code with
cd /opt/python/current
and then activate the virtualenv
source /opt/python/run/venv/bin/activate
and set the environment variables (that your script probably expects)
source /opt/python/current/env
