Python: Perforce client error while remotely executing command [duplicate]

This question already has answers here:
Environment variable differences when using Paramiko
(2 answers)
Closed 3 years ago.
On Linux Machine A, I have a script running which is supposed to SSH into Linux Machine B, sync the P4 workspace on B, and then execute a piece of code.
I've set up Perforce on B, and I'm able to manually execute p4 client and other commands on B; they work.
However, when I attempt the same thing over SSH from Machine A, I get the following error:
Perforce client error:
Connect to server failed; check $P4PORT.
TCP connect to perforce:1666 failed.
Temporary failure in name resolution
Here's my piece of code:
import paramiko

def ssh_connect(ip, user, pwd):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(ip, username=user, password=pwd)
    return ssh

def execute_command(device_details, command):
    try:
        ip = device_details.get('ip')
        username = device_details.get('username')
        password = device_details.get('password')
        ssh_ns_obj = ssh_connect(ip, username, password)
        ssh_stdin, ssh_stdout, ssh_stderr = ssh_ns_obj.exec_command(command)
        output = ssh_stdout.read()  # read() consumes the stream, so read once and keep it
        print(output)
        print(ssh_stderr.read())
        return output
    except Exception as e:
        print_exception_details()
perforce_sync_command = "cd /root/hello/;chmod -R 444 *;p4 sync ...;chmod -R 755 *"
output = execute_command(device_details, perforce_sync_command)
What am I missing?

Perforce needs a set of environment variables to run; over SSH, those variables were not visible in the SSH session.
To solve this, I edited the file /etc/environment and added the P4PORT, P4USER, P4CLIENT and P4PASSWD variables there.
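If editing /etc/environment is not an option, another way is to set the variables inline in the remote command itself, since a Paramiko "exec" session does not load the login shell's environment. A minimal sketch; the server address, user and client name below are hypothetical placeholders:

```python
def with_p4_env(command, p4port, p4user, p4client):
    """Prefix a remote command with Perforce environment variables so it
    works inside a bare SSH "exec" session that has no login environment."""
    exports = "export P4PORT=%s P4USER=%s P4CLIENT=%s" % (p4port, p4user, p4client)
    return exports + "; " + command

# Hypothetical values for illustration:
cmd = with_p4_env("cd /root/hello/ && p4 sync ...", "perforce:1666", "someuser", "someclient")
# ... then run it through the execute_command() function from the question:
# output = execute_command(device_details, cmd)
```

This keeps the Perforce settings local to the one command instead of changing the system-wide environment on Machine B.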


Paramiko sftp upload with ppk file [duplicate]

This question already has answers here:
How to ssh connect through Python Paramiko with ppk public key
(4 answers)
Closed 2 years ago.
I am trying to create a Python script with the Paramiko library to upload a file to an SFTP server which uses a "ppk" file and a passphrase to connect.
Unfortunately I can't crack it or find anything that connects to SFTP with .ppk files.
Additional details:
I can connect to the SFTP server manually with FileZilla; WinSCP is not allowing it.
Here is as far as I can get. Please help!
k = paramiko.RSAKey.from_private_key_file("/key.ppk")
c = paramiko.SSHClient()
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
c.connect(hostname="ftp.example.com", username="user", pkey=k, passphrase="somephrase")
Well that's the least of the problems, I need to upload afterwards when it gets connected.
I suggest that you convert the .ppk file to .pem.
See:
Convert ppk to pem
Then like this:
import paramiko

k = paramiko.RSAKey.from_private_key_file("mykey.pem")
c = paramiko.SSHClient()
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
print("connecting")
c.connect(hostname="www.host.com", username="ubuntu", pkey=k)
print("connected")
commands = ["/home/ubuntu/firstscript.sh", "/home/ubuntu/secondscript.sh"]
for command in commands:
    print("Executing {}".format(command))
    stdin, stdout, stderr = c.exec_command(command)
    print(stdout.read())
    print("Errors")
    print(stderr.read())
c.close()

How to pass a command-line ssh parameter with Paramiko?

I'm trying to migrate from using Popen to directly run a ssh command to using Paramiko instead, because my code is moving to an environment where the ssh command won't be available.
The current invocation of the command is passed a parameter that is then used by the remote server. In other words:
process = subprocess.Popen(['ssh', '-T', '-i<path to PEM file>', 'user@host', 'parameter'])
and authorized_keys on the remote server has:
command="/home/user/run_this_script.sh $SSH_ORIGINAL_COMMAND", ssh-rsa AAAA...
So I've written the following code to try and emulate that behaviour:
def ssh(host, user, key, timeout):
    """ Connect to the defined SSH host. """
    # Start by converting the (private) key into an RSAKey object.
    # Use StringIO to fake a file ...
    keyfile = io.StringIO(key)
    ssh_key = paramiko.RSAKey.from_private_key(keyfile)
    host_key = paramiko.RSAKey(data=base64.b64decode(HOST_KEYS[host]))
    client = paramiko.SSHClient()
    client.get_host_keys().add(host, "ssh-rsa", host_key)
    print("Connecting to %s" % host)
    client.connect(host, username=user, pkey=ssh_key, allow_agent=False, look_for_keys=False)
    channel = client.invoke_shell()
    # ... code here to receive the data back from the remote host. Removed for relevancy.
    client.close()
What do I need to change in order to pass a parameter to the remote host so that it uses it as $SSH_ORIGINAL_COMMAND?
From an SSH perspective, what you are doing is not passing a parameter but simply executing a command. That the "command" ends up injected as a parameter into some script on the server is irrelevant from the client's perspective.
So use a standard Paramiko code for executing commands:
Python Paramiko - Run command
(stdin, stdout, stderr) = s.exec_command('parameter')
# ... read/process the command output/results
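If useful, the one-liner above can be wrapped in a small helper (a sketch: the client argument is an already-connected SSHClient, and the helper name is made up):

```python
def run_forced_command(client, parameter):
    """Send 'parameter' as the SSH command; the forced command configured in
    authorized_keys receives it on the server as $SSH_ORIGINAL_COMMAND."""
    stdin, stdout, stderr = client.exec_command(parameter)
    return stdout.read().decode(), stderr.read().decode()
```

Unlike invoke_shell, this uses a plain "exec" channel, which is exactly what the `command=` restriction in authorized_keys intercepts.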

Find files in directory ignoring subdirectories - linux cmd via python [duplicate]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I need to get the list of files from a remote host directory, running the code on my local machine.
I want something like os.listdir() on the remote host machine, NOT os.listdir() on the local machine that runs the Python code.
In bash, this command works:
ssh user@host "find /remote/path/ -name 'pattern*' -mmin -15" > /local/path/last_files.txt
Your best option for running commands on a remote machine is over SSH with Paramiko.
A couple of examples of how to use the library and issue a command to the remote system:
import base64
import paramiko
# Let's assign an RSA SSH key to the 'key' variable
key = paramiko.RSAKey(data=base64.b64decode(b'AAA...'))
# And create a client instance.
client = paramiko.SSHClient()
# Create an object to store our key
host_keys = client.get_host_keys()
# Add our key to 'host_keys'
host_keys.add('ssh.example.com', 'ssh-rsa', key)
# Connect to the remote host; for the remote
# account you will need:
#
# - the IP/hostname of the target
# - a username
# - a password
client.connect('IP_HOSTNAME', username='THE_USER', password='THE_PASSWORD')
# Run our command on the remote system, capturing
# its input, output and error streams
stdin, stdout, stderr = client.exec_command(
    'find /path/data/ -name "pattern*" -mmin -15'
)
# We iterate over stdout
for line in stdout:
print('... ' + line.strip('\n'))
# And finally we close the connection to our client
client.close()
As pointed out by the OP, if we already have a known hosts file locally, we can do things slightly differently:
import base64
import paramiko
# And create a client instance.
client = paramiko.SSHClient()
# Load our local known hosts into the
# client's host-key store
client.load_system_host_keys()
# Connect to the remote host; for the remote
# account you will need:
#
# - the IP/hostname of the target
# - a username
# - a password
client.connect('IP_HOSTNAME', username='THE_USER', password='THE_PASSWORD')
# Run our command on the remote system, capturing
# its input, output and error streams
stdin, stdout, stderr = client.exec_command(
    'find /path/data/ -name "pattern*" -mmin -15'
)
# We iterate over stdout
for line in stdout:
print('... ' + line.strip('\n'))
# And finally we close the connection to our client
client.close()

Paramiko: calling "cd" command with exec_command does nothing

I have the following program using Paramiko:
#!/usr/bin/env python
import paramiko
hostname = '192.168.1.12'
port = 22
username = 'root'
password = 'whatl0ol'
if __name__ == "__main__":
    paramiko.util.log_to_file('paramiko.log')
    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()
    ssh.connect(hostname, port, username, password)
    while True:
        pick = input("sshpy: ")
        stdin, stdout, stderr = ssh.exec_command(pick)
        print(stdout.readlines())
But when I connect and try to use cd, it doesn't work. How can I fix this?
It looks like you are implementing some kind of interactive program that allows executing a sequence of commands on the server.
The SSHClient.exec_command executes each command in a separate "exec" channel. The individual commands run in their own environment, so if you execute a cd command, it has no effect at all on subsequent commands; they will again start in the user's home directory.
If you want to implement an interactive shell session, use SSHClient.invoke_shell.
For an example, see How to interact with Paramiko's interactive shell session?
See also Execute multiple commands in Paramiko so that commands are affected by their predecessors.
Paramiko's SSHClient opens a new session, executes the command in that session, and closes the session channel once the command completes.
The cd command runs in the first session; the next command starts again from the home directory.
If you want to hold the session, use invoke_shell for an interactive session.
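If an interactive shell is overkill, the usual workaround is to concatenate the directory change and the command into one line so they share a single "exec" session. A small sketch (the directory and command are placeholders):

```python
def in_directory(directory, command):
    """Build one shell line so the directory change and the command
    run inside the same SSH "exec" session."""
    return "cd %s && %s" % (directory, command)

cmd = in_directory("/var/log", "ls -l")
# stdin, stdout, stderr = ssh.exec_command(cmd)  # stdout now reflects /var/log
```

Using `&&` (rather than `;`) means the command only runs if the cd succeeded.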
I needed to change directories and run an executable, all in one command. The client unit was a Windows 10 machine.
The cmd shell in Windows is so problematic! Commands are different: ';' between commands doesn't work, you need '&'. cd d:/someDirectory doesn't work, you need the '/d' switch. 'pwd' doesn't work, and echo %cd% doesn't work reliably either; 'cd' with no parameters does work reliably as a pwd substitute. I hope the list of what doesn't work saves you time. This is where it landed:
cmd = 'cd /d D:\someDirectory & SomeExecutable.exe someParameter'
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command(cmd)
To check that the directory change worked, use the following:
cmd = 'cd /d D:\someDirectory & cd'
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command(cmd)
output = ssh_stdout.readline()
error = ssh_stderr.readline()
print("output: " + output)
print("error: " + error)

Python run multiple ssh commands in the same session

My goal is to connect over SSH with Python and authenticate, which I can do with Paramiko or Fabric. But I would like to keep the session open after each execution and read the input/output. With Paramiko I can only run one command before the session is closed, I am asked to authenticate again, and the session hangs. And since Fabric uses the Paramiko library, it gives me the same issue. For example, if my directory structure is like this:
-home
--myfolder1
--myfolder2
I would like to execute the below commands without having to re-authenticate, because the session closes.
(make connection)
run cmd: 'pwd'
output: /home
run cmd: 'cd myfolder2'
run cmd: 'pwd'
output: /home/myfolder2
Is this possible with any module that is out there right now? Could it be made from scratch with native python? And also is this just not possible...?
Edit: Added code. Without the new open_session it closes and I cannot run any command. After running the first command with this, I am prompted to authenticate again and it creates an infinite loop.
Edit 2: If it closes after each command, then there is no way this will work at all, correct?
Edit 3: If I run this on a different server and exec_command with paramiko.SSHClient, it won't ask me to reauthenticate, but if I 'cd somedir' and then 'pwd', the output shows that I am back in the home directory where the session was created.
class connect:
    newconnection = ''
    def __init__(self, username, password):
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        try:
            ssh.connect('someserver', username=username, password=password, port=22, timeout=5)
        except:
            print("Could not connect")
            sys.exit()
        self.newconnection = ssh
    def con(self):
        return self.newconnection

# This will create the connection
sshconnection = connect('someuser', 'somepassword').con()

while True:
    cmd = input("Command to run: ")
    if cmd == "":
        break
    try:
        transport = sshconnection.get_transport()
        transport.set_keepalive(999999)
        chan = transport.open_session()
        chan.settimeout(3)
        chan.setblocking(0)
    except:
        print("Failed to open a channel")
        chan.get_exception()
        sys.exit()
    print("running '%s'" % cmd)
    stdout_data = []
    stderr_data = []
    pprint.pprint(chan)
    nbytes = 4096
    chan.settimeout(5)
    chan.get_pty()
    chan.exec_command(cmd)
    while True:
        print("Inside loop ", chan.exit_status_ready())
        time.sleep(1.2)
        if chan.recv_ready():
            print("First if")
            stdout_data.append(chan.recv(nbytes))
        if chan.recv_stderr_ready():
            print("Recv Ready")
            stderr_data.append(chan.recv_stderr(nbytes))
        if chan.exit_status_ready():
            print("Breaking")
            break
    print('exit status: ', chan.recv_exit_status())
    print(b''.join(stdout_data).decode())
This is possible with the normal modules when you can concatenate the commands into one. Try
pwd ; cd myfolder2 ; pwd
as the command. This works, but quickly becomes tedious when you have more complex commands that need arguments, and horrible when the arguments contain spaces. The next step is to copy a script with all the commands to the remote side and tell ssh to execute that script.
Another problem with this approach is that SSH doesn't return until all the commands have executed.
Alternatively, you could build a "command server": a simple TCP server that listens for incoming connections and executes commands sent to it. It's pretty simple to write but also pretty insecure. Again, the solution is to turn the server into a (Python) script that reads commands from stdin, start that script remotely via SSH, and then send it commands.
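Another option is Paramiko's invoke_shell, which keeps one interactive shell open so state such as the working directory persists between commands. A very rough sketch (the timing-based read is fragile; real code should poll the channel properly, and the client argument is an already-connected SSHClient):

```python
import time

def run_in_shell(client, commands, wait=1.0):
    """Send several commands through one interactive shell so state
    such as the working directory persists between them."""
    chan = client.invoke_shell()
    output = b""
    for cmd in commands:
        chan.send(cmd + "\n")
        time.sleep(wait)  # crude; poll chan.recv_ready() in real code
        while chan.recv_ready():
            output += chan.recv(4096)
    chan.close()
    return output.decode()

# client = paramiko.SSHClient(); client.connect(...)  # as in the question
# print(run_in_shell(client, ["pwd", "cd myfolder2", "pwd"]))
```

The trade-off is that you get raw shell output (including prompts) back, so you have to parse it yourself, whereas exec_command gives you clean per-command streams.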
