I have a Raspberry Pi running on my home network along with a Mac that acts as a backup server. I am attempting to write a Python 3 script that will run on the Pi to create a tar.gz backup of the Pi's home folder (with all of its contents) and transfer it to the Mac over SSH. I have the SSH connection (using keys) working, but I am stumped when I try to create the backup file and transfer it.
My code so far:
#!/usr/bin/python3
import paramiko
import tarfile
import os
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname='192.168.1.151', username='usernamehere')
print("I'm connected!")
#The following statement is the problem as I see it
tar czvf - ./home/PiOne/ | ssh_client -w:gz "cat > ./Volumes/PiBackups/PiOne/PiOneClone.tar.gz"
print("File transferred!")
ssh_client.close()
I would appreciate any help in creating the script!
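One approach that avoids mixing shell syntax into Python entirely: build the archive with the standard tarfile module and push it over SFTP on the same paramiko connection. A minimal sketch, reusing the host and paths from the question (the temporary archive location is an assumption):
#!/usr/bin/python3
import tarfile
import paramiko

# Build the tar.gz locally with the standard library
archive = "/tmp/PiOneClone.tar.gz"  # assumed scratch location
with tarfile.open(archive, "w:gz") as tar:
    tar.add("/home/PiOne", arcname="PiOne")

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname='192.168.1.151', username='usernamehere')

# SFTP rides on the existing SSH connection, so the key-based login still applies
sftp = ssh_client.open_sftp()
sftp.put(archive, "/Volumes/PiBackups/PiOne/PiOneClone.tar.gz")
sftp.close()
ssh_client.close()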
I think there is an easier way to transfer a file between two machines, given that you already have SSH set up: scp.
scp /path/to/your/targz/file user@ip:/path/on/local/machine
Check the permissions while doing that using
ls -lh
I am using this code
tar zcf - somefolder/ | ssh user@server "cd /path/to/remote && tar zxf -"
to copy files between two systems.
I want to do it in Python.
I tried
import subprocess
p = subprocess.Popen('tar zcf - somefolder/ | ssh user@server "cd /path/to/remote && tar zxf -"')
I also tried
p = subprocess.Popen(["tar", "-zcf somefolder | ssh ubuntu@192.168.100.110 /path/to/remote && tar -zxf"])
but neither works.
I also tried run instead of Popen, but it still doesn't work.
However,
stream = os.popen("cmd")
works fine; the problem there is that I don't get the status.
With the first methods I can use
os.waitpid(p.pid, 0)
to get the live status of the process.
What I want is to transfer files between the remote and local machines without using external libraries, and with live status. How can I achieve this?
I would keep it simple and use the os module (note that in Python 3, os.popen is itself a thin wrapper around subprocess.Popen, so this is about simplicity, not speed):
result = os.popen("command").read()
Update: I overlooked the "no external module" requirement, sorry. But maybe it's useful for others searching.
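That said, the pipeline from the question can run with only the standard library if the string is handed to the shell, since the | pipe and && are shell syntax. A minimal sketch with live status (host and paths copied from the question):
import subprocess
import time

p = subprocess.Popen(
    'tar zcf - somefolder/ | ssh user@server "cd /path/to/remote && tar zxf -"',
    shell=True,  # the shell must parse the | pipe and &&
)
while p.poll() is None:  # poll() returns None while the pipeline is still running
    time.sleep(1)        # report progress here if desired
print("exit status:", p.returncode)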
There is a module for that :)
from paramiko import SSHClient
from scp import SCPClient
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect('example.com')
# SCPClient takes a paramiko transport as an argument
scp = SCPClient(ssh.get_transport())
scp.put('test.txt', 'test2.txt')
scp.get('test2.txt')
# Uploading the 'test' directory with its content in the
# '/home/user/dump' remote directory
scp.put('test', recursive=True, remote_path='/home/user/dump')
scp.close()
Or using with statements:
from paramiko import SSHClient
from scp import SCPClient
with SSHClient() as ssh:
    ssh.load_system_host_keys()
    ssh.connect('example.com')

    with SCPClient(ssh.get_transport()) as scp:
        scp.put('test.txt', 'test2.txt')
        scp.get('test2.txt')
See: https://pypi.org/project/scp/
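Since the original question also asked for live status: SCPClient accepts a progress callback (documented on the PyPI page above), so transfer progress can be printed as it happens. A small sketch:
from paramiko import SSHClient
from scp import SCPClient

def progress(filename, size, sent):
    # called repeatedly by SCPClient while the file is transferring
    print(filename, sent, "of", size, "bytes")

with SSHClient() as ssh:
    ssh.load_system_host_keys()
    ssh.connect('example.com')
    with SCPClient(ssh.get_transport(), progress=progress) as scp:
        scp.put('test.txt', 'test2.txt')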
From my experience, when using subprocess.run() to run an external program/process, I've had to pass each command parameter as a separate list entry, like so:
subprocess.run(['pip3', 'install', 'someModule'])
So try putting every space-separated argument into its own list element.
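For example, a single tar command in list form (the archive name is illustrative). Note that a | pipe cannot appear in this form, because no shell is involved; for a pipeline, use shell=True as in the sketch further up:
import subprocess

# Each space-separated token becomes its own list element
result = subprocess.run(['tar', '-zcf', 'somefolder.tar.gz', 'somefolder/'])
print(result.returncode)  # 0 on success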
I have a Python script in my local files, and I don't want to SCP it to the remote machine to run it there with SSHOperator, triggered remotely by Airflow. How can I run a local .py file on a remote machine and get the results?
I need an SSHOperator with python_callable, not bash_command.
Can anyone show me a sample remote custom operator, something like an SSHPythonOperator?
I solved the problem as follows:
gettime = """
import os
import datetime

def gettimes():
    print(True)

gettimes()
"""
remote_python_get_delta_times = SSHOperator(
    task_id="get_delta_times",
    do_xcom_push=True,
    command="MYVAR=`python -c" + ' "%s"`;echo $MYVAR' % gettime,
    dag=dag,
    ssh_hook=remote,
)
I see an SSH operator in the Airflow docs: https://airflow.apache.org/docs/apache-airflow/1.10.13/_api/airflow/contrib/operators/ssh_operator/index.html
If that doesn't work out for you, you'll have to create a custom operator using an SSH library like Paramiko,
and then use it to pull code from either GitHub/S3, or SCP your file to the server and then execute it there, as in the sketch below.
You would need to make sure all your dependencies are also installed on the remote server.
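For illustration, a rough sketch of what such a custom operator could look like, using Paramiko to copy the local script over SFTP and run it remotely. The class name, parameters, and remote interpreter path are all made up for this example, and the code is untested:
from airflow.models import BaseOperator
import paramiko

class SSHPythonOperator(BaseOperator):
    """Copy a local .py file to a remote host and execute it there."""

    def __init__(self, host, username, local_script, remote_script, **kwargs):
        super().__init__(**kwargs)
        self.host = host
        self.username = username
        self.local_script = local_script
        self.remote_script = remote_script

    def execute(self, context):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(self.host, username=self.username)
        try:
            # Upload the local script, then run it with the remote interpreter
            sftp = client.open_sftp()
            sftp.put(self.local_script, self.remote_script)
            sftp.close()
            stdin, stdout, stderr = client.exec_command(
                "python3 %s" % self.remote_script)
            output = stdout.read().decode()
            if stdout.channel.recv_exit_status() != 0:
                raise RuntimeError(stderr.read().decode())
            # the return value is pushed to XCom when do_xcom_push=True
            return output
        finally:
            client.close()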
I am using Python and Paramiko to copy a 5 GB file from server A to server B. The script is executed from server X, which opens an SSH session to server B and runs a command there to copy the file from server A using sshpass. The script works, but it does not copy the complete 5 GB file; it copies only half, and sometimes less than half.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(serverb, username=user, password=password)
try:
    stdin, stdout, stderr = client.exec_command(
        "sshpass -p password scp -v -r root@serverA:/tmp/file_to_copy_name /tmp/",
        timeout=None)
except Exception as err:
    print("copy between server error")
    raise
You may want to use rsync over SSH instead of scp (secure remote file copy) with sshpass (a noninteractive SSH password provider). rsync supports fast incremental file transfer (it can resume an unfinished upload), and using an SSH key is much more secure than passing the raw password via sshpass.
Something like:
rsync -az /root/bigfile.txt 198.211.117.129:/root/
-a for archive mode
-z to compress file data during the transfer
The manual: https://download.samba.org/pub/rsync/rsync.html
Moreover, it can resume the copy started with scp.
Here is the instruction on how to use it over SSH:
https://www.digitalocean.com/community/tutorials/how-to-copy-files-with-rsync-over-ssh
Also, as already pointed out by @pynexj, client.exec_command() will not wait until the command execution has finished. So you may want to have some alternative way to check whether the file was successfully copied and has the same data as the source. One of the options could be checking the MD5 hash: https://stackoverflow.com/search?q=Python+md5+hash
And you may want to check the: What is the fastest hash algorithm to check if two files are equal?
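The truncation itself most likely happens because the script closes the connection while the remote scp is still running. A small sketch of how to block until the remote command finishes, using the channel's recv_exit_status() on the question's own code:
stdin, stdout, stderr = client.exec_command(
    "sshpass -p password scp -v -r root@serverA:/tmp/file_to_copy_name /tmp/")

# Block until the remote scp exits; without this, closing the client
# kills the copy midway, which matches the truncation described above
exit_status = stdout.channel.recv_exit_status()
print("scp exit status:", exit_status)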
I guess you can use
rsync -avP --partial source target
where source and target can each be a remote server path or a local path, in whichever order you need.
I am creating a tar file on a remote server and I want to be able to get it to my machine. For security reasons, I can't use FTP on that server.
So as I see it, I have two options:
get the file (as a file) in some other way and then use the tarfile library; if so, I need help getting the file without FTP.
get the content of the file and then extract it.
If there is another way, I would like to hear it.
import spur

# creating the connection
shell = spur.SshShell(
    hostname=unix_host,
    username=unix_user,
    password=unix_password,
    missing_host_key=spur.ssh.MissingHostKey.accept
)

# running an ssh command that creates a tar file on the remote server
with shell:
    command = "tar -czvf test.gz test"
    shell.run(
        ["sh", "-c", command],
        cwd=unix_path
    )

    # getting the content of the tar file into gz_file_content
    # (spur's run returns an ExecutionResult; .output holds the file bytes)
    command = "cat test.gz"
    gz_file_content = shell.run(
        ["sh", "-c", command],
        cwd=unix_path
    ).output
More info:
My project is running in a virtualenv. I am using Python 3.4.
If you have SSH access, then 99% of the time you have SFTP access too.
So you can use SFTP to download the file. See Download files over SSH using Python.
Or once you are using spur, see its SshShell.open method:
For instance, to copy a binary file over SSH, assuming you already have an instance of SshShell:
import shutil

with ssh_shell.open("/path/to/remote", "rb") as remote_file:
    with open("/path/to/local", "wb") as local_file:
        shutil.copyfileobj(remote_file, local_file)
The SshShell.open method uses SFTP under the hood (via Paramiko library).
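For comparison, a minimal sketch of the same download done directly with Paramiko's SFTP client (reusing the credential variables from the question; the remote and local paths are placeholders):
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(unix_host, username=unix_user, password=unix_password)

# SFTP runs over the SSH connection; get() downloads remote -> local
sftp = ssh.open_sftp()
sftp.get("/path/to/remote/test.gz", "/path/to/local/test.gz")
sftp.close()
ssh.close()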
The remote machine has Cygwin installed and I have done
$echo "PATH=\$PATH:/cygdrive/c/Python27" >> .bash_profile
then source .bash_profile (after doing this I am able to run a Python script from the Cygwin terminal).
Now, from PyScripter installed on my laptop, I am trying to run hello_world on the remote machine through paramiko:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('135.24.237.167',username = 'cyg_server',password = 'force')
stdin,stdout,stderr = ssh.exec_command("/cygdrive/c/Python27/python /cygdrive/c/cygwin64/home/hello_world.py")
But I get the following error:
stderr.readlines()
[u"C:\\Python27\\python.exe: can't open file '/cygdrive/c/cygwin64/home/hello_world.py': [Errno 2] No such file or directory\r\n"]
Please help.
Paramiko is rather low-level, I believe. Try using Fabric.
Sample code would be:
from fabric.api import *

env.key_filename = '/path/to/your/pem/file'

def mem_usage():
    run('free -m')

execute(mem_usage, host="user@IP_or_hostname")
Or, if you do not have a pem file, you can leave out that line and just enter the password when prompted.