I have a Python script in a local file, and I don't want to SCP it to the remote machine and run it there with SSHOperator triggered by Airflow. How can I run a local .py file on a remote machine and get the results back?
I need SSHOperator with python_callable, not bash_command.
Can anyone show me a sample of a custom remote operator, something like an SSHPYTHONOperator?
I solved the problem as follows:
gettime="""
import os
import datetime
def gettimes():
print(True)
gettimes()
"""
remote_python_get_delta_times=SSHOperator(task_id= "get_delta_times",do_xcom_push=True,
command="MYVAR=`python -c" + ' "%s"`;echo $MYVAR' % gettime ,dag=dag,ssh_hook=remote)
I see an SSH operator in the Airflow docs: https://airflow.apache.org/docs/apache-airflow/1.10.13/_api/airflow/contrib/operators/ssh_operator/index.html
If that doesn't work out for you, then you'll have to create a custom operator using an SSH library like Paramiko,
and use it to either pull the code from GitHub/S3 or SCP your file to the server and then execute it there.
You would also need to make sure all of your dependencies are installed on the remote server.
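For illustration, here is a rough sketch of such a custom operator built on Paramiko. The class name SSHPythonOperator, its constructor arguments, and the paths are placeholders of mine, not an existing Airflow operator; it copies the local script over SFTP and runs it with the remote interpreter:

import paramiko
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults

class SSHPythonOperator(BaseOperator):
    @apply_defaults
    def __init__(self, host, username, key_file, local_script, remote_path, **kwargs):
        super(SSHPythonOperator, self).__init__(**kwargs)
        self.host = host
        self.username = username
        self.key_file = key_file
        self.local_script = local_script
        self.remote_path = remote_path

    def execute(self, context):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(self.host, username=self.username, key_filename=self.key_file)
        # Copy the local .py file to the remote machine over SFTP.
        sftp = client.open_sftp()
        sftp.put(self.local_script, self.remote_path)
        sftp.close()
        # Run it remotely and collect stdout; the return value is pushed to XCom by Airflow.
        stdin, stdout, stderr = client.exec_command('python %s' % self.remote_path)
        output = stdout.read().decode()
        client.close()
        return output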
I have a problem with Fabric (2.4): I have no access to environment variables on the remote server (which runs FreeBSD).
In my ~/.profile file I have the variable:
export MY_KEY="123456789"
In my fabfile.py I have a simple task:
from fabric import task

@task(hosts=['user@myhost.com'])
def deploy(context):
    context.run('echo 123')
    context.run('echo $MY_KEY')
When I run the fab deploy command, I see only 123, but when I connect via ssh the variable is visible.
And what about using Connection.prefix as a context manager?
with context.prefix('MY_KEY="123456789"'):
    context.run('echo 123')
    context.run('echo $MY_KEY')
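For completeness, a self-contained sketch of that approach (the host string is a placeholder): prefix() makes every run() inside the block execute as 'MY_KEY="123456789" && <command>', so the variable is visible to that command's shell. Prefixing with something like 'source ~/.profile' should pick up the definition from the remote profile itself, although that is my assumption and not tested on FreeBSD.

from fabric import Connection

conn = Connection('user@myhost.com')
# Every run() inside the block is prefixed with the assignment and '&& '.
with conn.prefix('MY_KEY="123456789"'):
    conn.run('echo 123')
    conn.run('echo $MY_KEY')   # now prints 123456789
conn.close()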
Is there a way to SSH to a server and run a PythonOperator with Airflow? I am looking for something like SSHExecuteOperator, but one that executes a Python callable instead of a bash command.
It's an SSH key issue.
Go to your SSH host server, run ssh-keygen -t rsa, and press Enter all the way through.
You will get 2 RSA key files. Copy them to the Airflow environment and note the full path.
Then just add the below in your connection from the Airflow UI:
{"key_file": "/usr/local/airflow/.ssh/id_rsa.pub", "no_host_key_check": true}
Reload the DAG and run it.
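Once that connection exists, a minimal sketch of using it from a DAG might look like the following (the connection id, remote script path and dag object are placeholders for your own values); SSHOperator only runs a shell command, so the remote script is still invoked through the interpreter:

from airflow.contrib.operators.ssh_operator import SSHOperator

run_remote_script = SSHOperator(
    task_id='run_remote_script',
    ssh_conn_id='my_ssh_conn',   # the connection configured in the UI above
    command='python /path/on/remote/script.py',
    do_xcom_push=True,           # the command's output is pushed to XCom
    dag=dag,
)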
Is it possible for a running Python process on one server to SSH to another server and then continue running there?
run on 192.168.100.1
|
ssh from there to 192.168.100.2
|
keep running on 192.168.100.2 and execute the other functions from the .py
I tried a subprocess call, but when the script calls the ssh command and connects to the other machine, it just stops and waits there.
You need to use RPC to call a Python function on the remote server.
There are many libraries around. Let's take the RPyC library as an example:
>>> import rpyc
>>> c = rpyc.classic.connect("localhost")
>>> c.execute("print('hi there')")   # this will print on the remote host
>>> import sys
>>> c.modules.sys.stdout = sys.stdout
>>> c.execute("print('hi here')")    # now this will be redirected here
hi here
Please note that you need to install an RPyC server on your remote host and deploy your Python modules there as well.
Please read the tutorials to learn how to start an RPyC server and deploy your code on the remote machine.
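As an example, a minimal sketch of starting a classic (slave) RPyC server on the remote host, assuming RPyC is installed there; the package also ships an equivalent rpyc_classic.py launcher script:

from rpyc.core.service import SlaveService
from rpyc.utils.server import ThreadedServer

# Listens on the default classic port (18812) that rpyc.classic.connect() uses.
server = ThreadedServer(SlaveService, port=18812)
server.start()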
In order to run a local Python script on a remote server, I use the following command:
cat hello.py | ssh user@192.168.1.101 python -
But when I want to run a script which imports some other modules:
import _init_paths
from fast_rcnn.config import cfg
from fast_rcnn.test import im_detect
from fast_rcnn.nms_wrapper import nms
the above command doesn't work. How can I run such a Python file on the remote server?
Thanks,
I have a NAT instance and behind it various servers.
So from my local machine I want to go to the NAT, and then from the NAT I have to SSH to the other machines:
Local --> NAT (abcuser@publicIP with key 1) --> server1 (xyzuser@localIP with key 2)
The NAT has a different SSH key,
and each of the servers has its own SSH key.
How can I accomplish this type of multi-hop SSH using Fabric?
I tried using the env.roledefs feature but it doesn't seem to be working.
Also, I am not sure how to define two SSH keys. I know we can define a list of keys with env.key_filename, but the issue is: will it check each key against each server? How can I be more specific and match a key with one server only?
I have tried running this command from my local machine:
fab deploy -g 'ec2-user@54.251.151.39' -i '/home/aman/Downloads/aws_oms.pem'
and my script is:
from __future__ import with_statement
from fabric.api import local, run, cd, env, execute

env.hosts = ['ubuntu@10.0.0.77']
env.key_filename = ['/home/ec2-user/varnish_cache.pem']

def deploy():
    run("uname -a")
It's possible: double hop to 10.0.0.2 (and list files there) via the gateway host 10.0.0.1. Basically, you simply nest the connections with the gateway parameter.
# coding: utf-8
from fabric import Connection
path = '/'
conn1 = Connection(host='user1@10.0.0.1', connect_kwargs={'password': '***'})
conn2 = Connection(host='user2@10.0.0.2', connect_kwargs={'password': '***'}, gateway=conn1)
result = conn2.run(f'''cd {path} && ls -al''', hide=True)
conn2.close()
conn1.close()
msg = "Ran {0.command!r} on {0.connection.host}, got stdout:\n{0.stdout}"
print(msg.format(result))
Please remember to run the SSH connection manually once to introduce the servers to each other!
Install via
pip3 install --upgrade fabric
pip3 install cryptography==2.4.2 # optional to hide some annoying warnings
http://docs.fabfile.org/en/latest/concepts/networking.html
Requires Python 3.6+.
In order to connect to remote hosts via an intermediate server, you can use the --gateway command-line option:
http://docs.fabfile.org/en/latest/usage/fab.html#cmdoption-g
Or, alternatively, set the env.gateway variable inside your fabfile:
http://docs.fabfile.org/en/latest/usage/env.html#gateway
For more detailed information, see:
http://docs.fabfile.org/en/stable/concepts/networking.html#ssh-gateways
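A minimal Fabric 1.x sketch of that setup, reusing the addresses and key paths from the question (adjust so that both key files exist on the machine where you run fab); each key listed in env.key_filename is tried against each host until one is accepted:

from fabric.api import env, run, task

# First hop: the NAT box reachable from the local machine.
env.gateway = 'ec2-user@54.251.151.39'
# Final target(s) behind the NAT.
env.hosts = ['ubuntu@10.0.0.77']
# Both private keys; each one is offered to each server in turn.
env.key_filename = ['/home/aman/Downloads/aws_oms.pem',
                    '/home/ec2-user/varnish_cache.pem']

@task
def deploy():
    run("uname -a")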