I am trying to just do a simple echo $MYVAR on a remote server with Fabric.
The environment variable is in my ~/.bashrc file on the remote host.
I have tried:
run("source /home/<myusername>/.bashrc && echo $MYVAR")
This just prints an empty string. Running it when logged in on the remote machine prints "5", the value in the bashrc file. Does anyone know why this would be?
I need the environment variable to be set from a file on the remote host, and not decided by Fabric.
I am running the ssh commands as <myusername>, so the username seems correct. This seems like a duplicate of this question, except that maybe they were running things as the wrong user. I also tried the shell argument to run, with no luck.
I haven't tried the various context manager-ish things, as they just seem to be a shorthand for command1 && command2.
First, as explained in the answer you link to, Fabric uses a login shell, not an interactive shell, meaning your ~/.bashrc is not going to be sourced on the remote. You should export your variable in your ~/.bash_profile or ~/.profile instead.
Without the relevant content of your remote ~/.bashrc, it's impossible to tell why your command fails.
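For illustration, a minimal sketch of how that can look with Fabric 1.x (the host string and the task name show_var are hypothetical; it assumes MYVAR is exported from ~/.bash_profile on the remote host):
# fabfile.py -- minimal sketch, assuming Fabric 1.x
from fabric.api import env, run

env.hosts = ['myusername@remotehost']  # hypothetical host

def show_var():
    # run() wraps the command in "/bin/bash -l -c ...", i.e. a login shell,
    # so ~/.bash_profile is sourced but ~/.bashrc is not.
    run("echo $MYVAR")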
Related
I am trying to run the sesu command on a Unix server from Python with the help of Paramiko's exec_command. However, when I run exec_command('sesu test'), I get
sh: sesu: not found
When I run a simple ls command it gives me the desired output. Only the sesu command does not work.
This is what my code looks like:
import paramiko

host = host          # connection details redacted by the poster
username = username
password = password
port = port

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host, port, username, password)
stdin, stdout, stderr = ssh.exec_command('sesu test')
stdin.write('Password\n')   # trailing newline is needed to actually submit the password
stdin.flush()
outlines = stdout.readlines()
resp = ''.join(outlines)
print(resp)
The SSHClient.exec_command by default does not run the shell in "login" mode and does not allocate a pseudo terminal for the session. As a consequence, a different set of startup scripts is (might be) sourced than in your regular interactive SSH session (in particular, .bash_profile is not sourced for non-interactive sessions), and/or different branches in the scripts are taken, based on the absence or presence of the TERM environment variable.
Possible solutions (in preference order):
Fix the command not to rely on a specific environment. Use a full path to sesu in the command. E.g.:
/bin/sesu test
If you do not know the full path, on common *nix systems you can run which sesu in your interactive SSH session to find it.
Fix your startup scripts to set the PATH the same for both interactive and non-interactive sessions.
Try running the command explicitly via a login shell (use the --login switch with common *nix shells); a Paramiko sketch of this approach appears below, after the remaining options:
bash --login -c "sesu test"
If the command itself relies on a specific environment setup and you cannot fix the startup scripts, you can change the environment in the command itself. Syntax for that depends on the remote system and/or the shell. In common *nix systems, this works:
PATH="$PATH;/path/to/sesu" && sesu test
Another (not recommended) approach is to force the pseudo terminal allocation for the "exec" channel using the get_pty parameter:
stdin,stdout,stderr = ssh.exec_command('sesu test', get_pty=True)
Using the pseudo terminal to automate a command execution can bring you nasty side effects. See for example Is there a simple way to get rid of junk values that come when you SSH using Python's Paramiko library and fetch output from CLI of a remote machine?
You may have a similar problem with LD_LIBRARY_PATH and locating shared objects.
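To illustrate the login-shell option above, a minimal Paramiko sketch (the host, credentials and the sesu password are placeholders, and it assumes the remote shell is bash):
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('host', port=22, username='username', password='password')

# Wrapping the command in "bash --login -c" sources the login startup
# scripts, so PATH is set up much like in an interactive SSH session.
stdin, stdout, stderr = ssh.exec_command('bash --login -c "sesu test"')
stdin.write('Password\n')   # sesu's password, if it prompts on stdin
stdin.flush()
print(stdout.read().decode())
print(stderr.read().decode())
ssh.close()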
See also:
Environment variable differences when using Paramiko
Certain Unix commands fail with "... not found", when executed through Java using JSch
I am using Python to create a directory remotely from one server to the next. My code is:
executedString = "sudo ssh -i mykey.pem server_ip %s" % (name_of_directory)
os.popen(executedString)
I also tried os.system() but that didn't work. The funny thing is that when I run this through the terminal, it works. However, when I execute it from my Python script, it doesn't.
I ensured that all the files were owned by the same user group and that didn't help.
Please note that I am also running this via CGI, which is where the code does not get executed, even though the rest of the code works otherwise.
Please advise.
I realized it was an authentication key host issue when I printed out all the logs of the ssh activity. The solution to this problem was that I copied ~/.ssh to the /root directory.
I can only presume that either the key was not present in /root, or ssh was looking for the auth key in /root, since the script runs as root while the files are owned by the www-data group/user.
It appears you're executing a shell to a UNIX host via a pipe. I'm thinking your command line should be the following:
executedString = "sudo ssh -i mykey.pem server_ip mkdir %s" % (directory_name)
I'm running a script that's determining login information for me, and in the end is outputting the login information that I need to use.
I am running the script in a terminal, and now I want it to SSH me with the credentials it has, exit the Python script on my computer and connect my current terminal to the new server.
Say I already have my sshHost, sshUser and sshPass as variables in the script. How do I run an SSH command in the current terminal and connect to that server?
I tried subprocess and spur, however I didn't really manage to get that going.
I would really appreciate your help and thanks in advance.
Assuming the Python script prints the settings to stdout:
#!/bin/sh
export $(credentials.py)
exec ssh hostname
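Alternatively, the hand-over can be done from the Python script itself. A minimal sketch, assuming sshHost, sshUser and sshPass are already set; sshpass is an external tool that must be installed, and is only one (not particularly secure) way to feed a password non-interactively:
import os

sshHost = "example.com"   # placeholders; the real script computes these
sshUser = "deploy"
sshPass = "secret"

# os.execvp replaces the current Python process with ssh, so the
# terminal ends up connected directly to the remote server.
os.execvp("sshpass", ["sshpass", "-p", sshPass,
                      "ssh", "%s@%s" % (sshUser, sshHost)])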
I've got some code which needs to grab code from github periodically (on a Windows machine).
When I do pulls manually, I use Git Bash, and I've got ssh keys set up for the repos I check, so everything is fine. However, when I try to run the same actions in a Python subprocess, I don't have the ssh services which Git Bash provides and I'm unable to authenticate to the repo.
How should I proceed from here. I can think of a couple of different options:
I could revert to using https:// fetches. This is problematic because the repos I'm fetching use 2-factor authentication and are going to be running unattended. Is there a way to access an https repo that has 2fa from a command line?
I've tried calling sh.exe with arguments that will fire off ssh-agent and then issuing my commands so that everything is running more or less the way it does in Git Bash, but that doesn't seem to work:
"C:\Program Files (x86)\Git\bin\sh.exe" -c "C:/Program\ Files\ \(x86\)/Git/bin/ssh-agent.exe; C:/Program\ Files\ \(x86\)/Git/bin/ssh.exe -t git#github.com"
produces
SSH_AUTH_SOCK=/tmp/ssh-SiVYsy3660/agent.3660; export SSH_AUTH_SOCK;
SSH_AGENT_PID=8292; export SSH_AGENT_PID;
echo Agent pid 8292;
Could not create directory '/.ssh'.
The authenticity of host 'github.com (192.30.252.129)' can't be established.
RSA key fingerprint is XXXXXXXXXXX
Are you sure you want to continue connecting (yes/no)? yes
Failed to add the host to the list of known hosts (/.ssh/known_hosts).
Permission denied (publickey).
Could I use an ssh module in python like paramiko to establish a connection? It looks to me like that's only for ssh'ing into a remote terminal. Is there a way to make it provide an ssh connection that git.exe can use?
So I'd be grateful if anybody has done this before or has a better alternative.
Git Bash sets the HOME environment variable, which allows git to find the ssh keys (in %HOME%/.ssh).
You need to make sure the Python process has HOME defined, or define it to the same path.
As explained in "Python os.environ[“HOME”] works on idle but not in a script", you need to set HOME to %USERPROFILE% (or, in Python, to os.path.expanduser("~")).
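A minimal sketch of what that can look like from Python (the repository path is a placeholder):
import os
import subprocess

env = dict(os.environ)
# git looks for the ssh keys under %HOME%/.ssh; define HOME for the
# child process the same way Git Bash does, if it is not already set.
env.setdefault("HOME", os.path.expanduser("~"))

subprocess.check_call(["git", "pull"], cwd=r"C:\path\to\repo", env=env)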
I would like to write a script that will tell another server to svn export an SVN repository.
This is my python script:
import os

# svn export to crawlers
for s in ['work1.main', 'work2.main']:
    cmd = 'ssh %s "cd /home/zes/ ; svn --force export svn+ssh://174.113.224.177/home/svn/dragon-repos"' % s
    print(cmd)
    os.system(cmd)
Very simple. It will ssh into work1.main, then cd to the correct directory, then call the svn export command.
However, when I run this script...
$ python export_to_crawlers.py
ssh work1.main "cd /home/zes/ ; svn --force export svn+ssh://174.113.224.177/home/svn/dragon-repos"
Permission denied, please try again.
Permission denied, please try again.
Permission denied (publickey,gssapi-with-mic,password).
svn: Connection closed unexpectedly
ssh work2.main "cd /home/zes/ ; svn --force export svn+ssh://174.113.224.177/home/svn/dragon-repos"
Host key verification failed.
svn: Connection closed unexpectedly
Why do I get this error and cannot export the directory? I can manually type the commands in the command line and it will work. Why can't it work in the script?
If I change the command to this, it will not work either; instead, nothing will happen:
cmd = 'ssh %s "cd /home/zes/ ;"' % s
This is a problem with SSH.
Permission denied, please try again.
This means that ssh can't log in. Either your ssh agent doesn't have the correct key loaded, you're running the script as a different user, or the environment isn't passed on correctly. Check that the variables SSH_AUTH_SOCK and SSH_AGENT_PID are passed to the subprocess of your Python script.
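A quick way to check that from inside the script, as a sketch (os.system inherits the parent's environment, so if these print None here, ssh cannot reach your agent):
import os

# ssh finds a running agent through these variables; if they are missing
# in the script's environment, key-based login falls back to a password
# prompt, which fails for a non-interactive script.
print(os.environ.get("SSH_AUTH_SOCK"))
print(os.environ.get("SSH_AGENT_PID"))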
Host key verification failed.
This error means that the remote host isn't known to ssh. This means that the host key is not found in the file $HOME/.ssh/known_hosts. Again, make sure that you're checking the home directory of the effective user of the script.
[EDIT] When you run the script, Python becomes the "input" of ssh: ssh is no longer connected to a console and will ask Python for the password to log in. Since Python has no idea what ssh wants, it ignores the request. ssh tries three times and dies.
To solve it, run these commands before you run the Python script:
eval $(ssh-agent)
ssh-add path-to-your-private-key
Replace path-to-your-private-key with the path to your private key (the one you use to log in). ssh-add will ask for the key's passphrase, and the ssh-agent will keep the unlocked key in memory. It will also modify your environment, so when ssh runs the next time, it will notice that an agent is running and ask it first. Since the ssh-agent holds the key, ssh will log in without bothering Python.
To solve the second issue, run the second ssh command manually once. ssh will then add the second host to its known_hosts file and won't ask again.
[EDIT2] See this howto for a detailed explanation of how to log in to a remote server via ssh with your private key.
I guess that it is related to ssh. Are you using a public key to connect automatically? I think that your shell knows this key, but that is not the case for Python.
I am not sure, it's just an idea. I hope it helps.
Check out the pxssh module that is part of the pexpect project:
https://pexpect.readthedocs.org/en/latest/api/pxssh.html
It simplifies dealing with automating ssh-ing into machines.
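A minimal pxssh sketch of the same svn export (the username and password are placeholders):
from pexpect import pxssh

s = pxssh.pxssh()
# login() drives the interactive password prompt and then synchronizes
# on a known shell prompt.
s.login('work1.main', 'username', 'password')
s.sendline('cd /home/zes/ && svn --force export svn+ssh://174.113.224.177/home/svn/dragon-repos')
s.prompt()          # wait until the command finishes and the prompt returns
print(s.before)     # everything the command printed
s.logout()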