How to call a function in paramiko - Python

I want to transfer an entire folder to a server using SFTP with Paramiko. I copied this code from Stack Overflow,
but my doubt is how to call that function. I tried this:
sftp = paramiko.SFTPClient.from_transport(t)
M = MySFTPClient()
M.put_dir()
M.mkdir()
but it's throwing this error:
*** Caught exception: <type 'exceptions.TypeError'>: __init__() takes exactly 2 arguments (1 given)

The error message means the constructor you are calling expects an argument (an open channel) that you are not supplying; Python counts the implicit self, hence "2 arguments (1 given)". Try doing something like this instead:
t = paramiko.Transport(("ftpexample.com", 22))
t.connect(username = myusername, password = mypassword)
sftp = paramiko.SFTPClient.from_transport(t)
Use the sftp client to upload your file at localpath (e.g. "/usr/tmp/test.png") to your remote path:
sftp.put("localpath","remotepath")

I haven't used Paramiko, but reading the source code it seems that you can already use the sftp object returned from the from_transport method, so there is no need to create another MySFTPClient().
In a Python console, try reading help(paramiko.SFTPClient) and help(paramiko.SFTPClient.from_transport). Browsing sftp.py is also helpful, since the list of available commands is at the beginning (put_dir is not one of them; see the sketch below).
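If you do want the put_dir helper, the usual pattern in the snippets floating around Stack Overflow is to subclass SFTPClient and instantiate it through from_transport rather than the bare constructor, which is exactly what triggered the TypeError above. A minimal sketch of that pattern; the method bodies are an assumption (the copied code isn't shown in the question) and the paths are placeholders:

import os
import paramiko

class MySFTPClient(paramiko.SFTPClient):
    def put_dir(self, source, target):
        """Recursively upload the contents of source to target."""
        for item in os.listdir(source):
            src = os.path.join(source, item)
            dst = target + '/' + item
            if os.path.isfile(src):
                self.put(src, dst)
            else:
                self.mkdir(dst, ignore_existing=True)
                self.put_dir(src, dst)

    def mkdir(self, path, mode=511, ignore_existing=False):
        """Like SFTPClient.mkdir, but tolerant of existing directories."""
        try:
            super(MySFTPClient, self).mkdir(path, mode)
        except IOError:
            if not ignore_existing:
                raise

# Instantiate through the factory method, not the bare constructor:
# __init__ expects an open channel, which is what the TypeError is about.
sftp = MySFTPClient.from_transport(t)
sftp.mkdir('/remote/target', ignore_existing=True)
sftp.put_dir('/local/source', '/remote/target')
sftp.close()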

Related

Python OPC UA call method without arguments gives error

I am trying to call a method with no input arguments, as follows:
(screenshot of the method node: https://i.stack.imgur.com/tGFe9.png)
So far I have tried this :
method=client.get_node("ns=5;s=Demo.StateMachines.Program01.Reset")
parent=client.get_node("ns=5;s=Demo.StateMachines.Program01")
output=parent.call_method(method)
but it gives me this BadNotExecutable error:
"The executable attribute does not allow the execution of the method."(BadNotExecutable)
The server is telling you this method cannot be executed.
There doesn't appear to be anything wrong with your client; check the server configuration.
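To confirm what the server reports before calling, you can read the method node's Executable and UserExecutable attributes. A minimal sketch using python-opcua (the endpoint URL is a placeholder; the node id comes from the question):

from opcua import Client, ua

client = Client('opc.tcp://localhost:4840')  # placeholder endpoint
client.connect()
try:
    method = client.get_node('ns=5;s=Demo.StateMachines.Program01.Reset')
    # BadNotExecutable means the server has one of these set to False:
    executable = method.get_attribute(ua.AttributeIds.Executable).Value.Value
    user_executable = method.get_attribute(ua.AttributeIds.UserExecutable).Value.Value
    print('Executable:', executable, 'UserExecutable:', user_executable)
finally:
    client.disconnect()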

python values to bash line on a remote server

So I have a Python script that connects to the client servers and then gets some data that I need.
It works like this: the bash script on the client side needs input like the line below, and called this way it works.
client.exec_command('/apps./tempo.sh 2016 10 01 02 03')
Now I'm trying to take the user input from my Python script and pass it to the remotely called bash script, and that's where I hit my problem. Below is what I tried, with no luck:
import sys
client.exec_command('/apps./tempo.sh', str(sys.argv))
I believe you are using Paramiko; you should tag your question or include that info in it.
The basic problem I think you're having is that you need to include those arguments inside the string, i.e.
client.exec_command('/apps./tempo.sh %s' % str(sys.argv))
otherwise they get applied to the other parameters of exec_command. (I suspect your original example is not written quite the way it actually works.)
Just out of interest, have you looked at fabric (http://www.fabfile.org)? It has lots of very handy functions like run, which will run a command on a remote server (or lots of remote servers!) and return the response.
It also gives you a lot of protection by wrapping popen and paramiko for the ssh login etc., so it can be much more secure than trying to build web services or other things yourself.
You should always be wary of injection attacks. I'm unclear how you are injecting your variables, but if a user calls your script with something like python runscript "; rm -rf /", that could cause very bad problems for you. It would be better to have 'options' on the command, which are programmed in, drastically limiting the user's input, or at least a lot of protection around the input variables (a quoting sketch follows below). Of course, if this is only for you (or trained people), then it's a little easier.
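For the paramiko route specifically, quoting each argument before it is interpolated gives a good part of that protection. A minimal sketch, assuming the client object from the question and untrusted command line input:

import sys
try:
    from shlex import quote  # Python 3.3+
except ImportError:
    from pipes import quote  # Python 2

command = '/apps./tempo.sh'
# quote() wraps each argument so shell metacharacters are passed literally
safe_args = ' '.join(quote(arg) for arg in sys.argv[1:])
client.exec_command('{} {}'.format(command, safe_args))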
I recommend using paramiko for the ssh connection.
import paramiko
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(server, username=user,password=password)
...
ssh_client.close()
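Once connected, a one-off command can be run and its streams read back. A minimal sketch (the script path is taken from the question; exec_command returns the three standard streams):

stdin, stdout, stderr = ssh_client.exec_command('/apps./tempo.sh 2016 10 01 02 03')
print(stdout.read())
print(stderr.read())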
And if you want to simulate a terminal, as if a user were typing:

import time

chan = ssh_client.invoke_shell()
chan.send('PS1="python-ssh:"\n')

def exec_command(cmd):
    """Send ssh command(s), execute them, and return the output"""
    prompt = 'python-ssh:'  # the command line prompt in the ssh terminal
    buff = ''
    chan.send(str(cmd) + '\n')
    while not chan.recv_ready():
        time.sleep(1)
    while not buff.endswith(prompt):
        buff += chan.recv(1024)  # chan, not ssh_client.chan
    return buff[:-len(prompt)]  # drop the trailing prompt from the output
Example usage: exec_command('pwd')
The command's output is returned to you over the ssh channel.
Assuming that you are using paramiko, you need to send the command as a string. It seems that you want to pass the command line arguments given to your Python script on to the remote command, so try this:
import sys
command = '/apps./tempo.sh'
args = ' '.join(sys.argv[1:]) # all args except the script's name!
client.exec_command('{} {}'.format(command, args))
This collects all the command line arguments passed to the Python script, except the first argument, which is the script's file name, and builds a space-separated string. This argument string is then concatenated with the bash script command and executed remotely.
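Called as, for example, python tempo_wrapper.py 2016 10 01 02 03 (the wrapper script name here is hypothetical), this runs /apps./tempo.sh 2016 10 01 02 03 on the remote host. Note that the arguments are interpolated verbatim; for untrusted input, combine this with quoting as in the shlex sketch earlier.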

boto throwing a Content-MD5 error sometimes

So I basically wrote a function to upload a file to S3 using the key.set_contents_from_file() function, but I'm finding it sometimes throws this error:
<Error>
<Code>BadDigest</Code>
<Message>The Content-MD5 you specified did not match what we received.</Message>
<ExpectedDigest>TPCms2v7Hu43d+yoJHbBIw==</ExpectedDigest>
<CalculatedDigest>QSdeCsURt0oOlL3NxxGwbA==</CalculatedDigest>
<RequestId>2F0D40F29AA6DC94</RequestId>
<HostId>k0AC6vaV+Ip8K6kD0F4fkbdS13UdxoJ3X1M76zFUR/ZQgnIxlGJrAJ8BeQlKQ4m6</HostId>
</Error>
The function:
def uploadToS3(filepath, keyPath, version):
    bucket = connectToS3()  # simply gets my bucket and returns it
    key = Key(bucket)
    f = open(filepath, 'r')
    key.name = keyPath
    key.set_metadata('version', version)
    key.set_contents_from_file(f)  # offending line
    key.make_public()
    key.close()
If I open a Python shell and call it manually, it works without a hitch. However, the way I have to handle it (in which it doesn't work) involves calling it from a subprocess. This is because the caller is a Python 3 script; 2to3 didn't work, and I didn't want to deal with the various years-old branches for Python 3 support.
Anyway, that does seem to run it correctly: it gets into the function and the inputs are what's expected (I had them printed out), but the # offending line keeps throwing this error. I have no idea what the cause is.
Is it possible the bucket isn't being set properly? I feel like if that were the case, calling Key(bucket) would have thrown an error.
So essentially I run the script below, once as a subprocess called from a Python 3 script, and once from the console:
sudo -u www-data python botoUtilities.py uploadToS3 /path/to/file /key/path
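From the Python 3 side, the subprocess call looks roughly like this (a sketch reconstructed from the console command above; the exact call is an assumption):

import subprocess

subprocess.check_call([
    'sudo', '-u', 'www-data', 'python', 'botoUtilities.py',
    'uploadToS3', '/path/to/file', '/key/path',
])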
I have this logic inside to dispatch to the correct function:
func = None
args = []
for arg in sys.argv[1:]:
    if not func:
        g = globals()
        func = g[arg]
    else:
        if arg == 'True':
            args.append(True)
        elif arg == 'False':
            args.append(False)
        else:
            args.append(arg)
if func:
    wrapper(func, args)
It runs in both cases (I write the args out to a file to check them), but only the console invocation avoids the error. This is incredibly frustrating. I can't figure out what is done differently; all I know is that I can't get boto to send data to S3 when it runs from a subprocess.

Webbrowser TypeError in Python

I am trying to use webbrowser.open to send an email using the default mail client. My code looks like this:
mailto = "mailto:me#bla.com?subject=blabla&body=blabla"
webbrowser.open(mailto)
Although the mail client (Outlook) opens normally, I keep getting the following TypeError:
TypeError: open() takes at least 1 argument (0 given)
I tried to use something like webbrowser.open(mailto,1) but the result is still the same.
Why could this happen?
It's not webbrowser.open: mailto counts as one argument no matter what value it holds. So you need to check again where the troublesome open() invocation actually comes from.
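To illustrate the point: in Python 2, any callable with a signature like webbrowser.open's raises exactly this message when invoked with no arguments. A hypothetical reproduction (the shadowing function below is an assumption, not code from the question):

def open(url, new=0, autoraise=True):  # hypothetical: shadows another open
    pass

open()  # TypeError: open() takes at least 1 argument (0 given)

Since webbrowser.open(mailto) supplies one argument, the open() in the traceback must be some other call site.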

paramiko sftp.get

I am trying to use paramiko to download a file via SFTP. I create the SFTP object like this:
transport = paramiko.Transport((sftp_server, sftp_port))
transport.connect(username = sftp_login, password = sftp_password)
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get("file_name", '.', None)
and I get this exception:
Exception python : Folder not found: \\$IP_ADDRESS\folder_1/folder_2\file_name.
I'm running paramiko to connect to a client's chrooted SFTP server. The file, 'file_name', is located at the root of my client's chroot.
I don't get why the error shows what is apparently the full path (outside the chroot) on my client's server.
I don't know why my dummy file won't download. :O
I will provide any necessary information.
The following code worked for me in Ubuntu 11.10:
sftp.get("file_name", "file_name")
I just made a couple of changes that shouldn't affect your problem:
localpath: Used full path to the local file name instead of just '.' (directories aren't allowed)
callback: Removed it since None is already the default value and that's not really needed
Since I'm not getting the same error you're getting regarding the remotepath parameter, I guess you might be using a different sftp server that has a different behaviour.
My advice would be to:
Verify with another client, for example the sftp command, that the file you're looking for is really where you are trying to get it.
Use sftp.chdir just to make sure that the default directory being used is the one you expect (see the sketch below).
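A few lines run in the same session make that check concrete (a sketch using only standard SFTPClient calls; the local path is a placeholder):

sftp.chdir('.')                           # normalize the starting directory
print(sftp.getcwd())                      # where the server actually drops you
print(sftp.listdir('.'))                  # is file_name really listed here?
sftp.get('file_name', '/tmp/file_name')   # full local path instead of '.'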
I hope this helps.
