How to change SSL ciphers in an ldaps request? - python

Dear all,
My environment is:
OS: Ubuntu 12.04.4 LTS
Python: 2.7.3
When I use ldap to connect to an AD server over SSL, I get this error:
"A TLS packet with unexpected length was received"
I captured the packets with tcpdump and saw the TLS Client Hello fail (screenshot omitted).
However, a Perl script in the same environment works fine, and the Python script also connects successfully on Ubuntu 16 (only Python on Ubuntu 12 fails).
On a successful connection, the Client Hello offers more ciphers than it does on Ubuntu 12.
When it fails, the AD server logs an error (screenshot omitted).
My test script is:
import ldap
TIMEOUT = 30
DEBUG_LEVEL = 8191
TRACE_LEVEL = 10
AD_HOST = "10.29.137.100"
USERNAME = "username"
PASSWORD = "password"
ldap.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_ALLOW)
ldap.set_option(ldap.OPT_DEBUG_LEVEL, DEBUG_LEVEL)
ldapConn = ldap.initialize("ldaps://" + AD_HOST + ":636",
                           trace_level=TRACE_LEVEL)
ldapConn.set_option(ldap.OPT_PROTOCOL_VERSION, 3)
ldapConn.set_option(ldap.OPT_X_TLS_CIPHER_SUITE,'TLSv1:!NULL')
ldapConn.set_option(ldap.OPT_REFERRALS, 0)
ldapConn.set_option(ldap.OPT_NETWORK_TIMEOUT , TIMEOUT)
ldapConn.set_option(ldap.OPT_TIMEOUT , TIMEOUT)
ldapConn.simple_bind_s(USERNAME, PASSWORD)
My question is: how do I change the ciphers in a Python script?
I found that
ldapConn.set_option(ldap.OPT_X_TLS_CIPHER_SUITE, 'TLSv1:!NULL')
does not work for me, and I have no idea where to set these cipher values, or which third-party dependency I could upgrade to support more ciphers.
Thanks!

You've just hit the Python 2/3 wall.
Your script is Python 3 code that you are trying to run in a Python 2.7 environment, which is not backward compatible. The only option is to install Python 3 on Ubuntu 12 and run the script with python3.x.
An example is shown here.

Like me today, you're probably in the situation explained here: https://github.com/python-ldap/python-ldap/issues/55 (and here: https://github.com/pyldap/pyldap/issues/53):
Several, perhaps all, set_option(OPT_X_TLS_*, ...) calls require a final set_option(ldap.OPT_X_TLS_NEWCTX, 0) call to apply all the previous set_option() calls. Without OPT_X_TLS_NEWCTX, the settings are effectively ignored.
=> You can either add ldap.set_option(ldap.OPT_X_TLS_CIPHER_SUITE, 'TLSv1:!NULL') before the initialize() call, or add ldapConn.set_option(ldap.OPT_X_TLS_NEWCTX, 0) before the bind.
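Applied to the script from the question, a minimal sketch might look like this (the host and credentials are placeholders; this needs a reachable LDAPS server to actually run):

```python
import ldap

# Placeholder connection details -- substitute your own.
AD_HOST = "10.29.137.100"

ldapConn = ldap.initialize("ldaps://" + AD_HOST + ":636")
ldapConn.set_option(ldap.OPT_PROTOCOL_VERSION, 3)
ldapConn.set_option(ldap.OPT_X_TLS_CIPHER_SUITE, 'TLSv1:!NULL')
# Crucial: create a new TLS context so the TLS options above take effect.
ldapConn.set_option(ldap.OPT_X_TLS_NEWCTX, 0)
ldapConn.simple_bind_s("username", "password")
```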

Related

Fabric 2.x SSH connection using identity file fails to work

Trying to connect to a host described in ~/.ssh/config using Fabric 2 and an identity file.
con = Connection('my_host')

@task
def tt(c):
    con.run('uname -a')
~/.ssh/config:
Host my_host
    HostName 123.144.76.84
    User ubuntu
    IdentityFile ~/.keys/somekey
It fails with
paramiko.ssh_exception.AuthenticationException: Authentication failed.
while $ ssh my_host from the terminal works.
I've also tried fab -i ~/.keys/somekey tt, with the same result.
Fabric accepts a hosts iterable as a parameter in tasks. Per the documentation:
An iterable of host-connection specifiers appropriate for eventually instantiating a Connection. The existence of this argument will trigger automatic parameterization of the task when invoked from the CLI, similar to the behavior of --hosts.
One of the members of which could be:
A string appropriate for being the first positional argument to Connection - see its docs for details, but these are typically shorthand-only convenience strings like hostname.example.com or user@host:port.
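For illustration, such a shorthand string decomposes roughly like this (a toy parser for the sketch only; Fabric/invoke has its own, more complete parsing):

```python
def parse_host_string(s):
    """Split a 'user@host:port' shorthand string into its parts."""
    user, _, rest = s.rpartition("@")   # the user part is optional
    host, _, port = rest.partition(":")  # the port part is optional
    return {
        "user": user or None,
        "host": host,
        "port": int(port) if port else 22,  # assume the default SSH port
    }

print(parse_host_string("ubuntu@123.144.76.84:2222"))
# {'user': 'ubuntu', 'host': '123.144.76.84', 'port': 2222}
```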
As for your example, please try this for fabfile.py:
host_list = ["my_host"]

@task(hosts=host_list)
def tt(c):
    c.run('uname -a')
Alternatively, you can omit the host declaration from the fabfile altogether. If you don't specify the host in fabfile.py, you can simply specify it as a host when invoking the fab CLI utility. If your fabfile.py is this:
@task
def tt(c):
    c.run('uname -a')
You would now run fab -H my_host tt to run the task tt against the my_host alias from your SSH client config.
Hope this helps.
There seems to be something afoot with paramiko. Without digging deeper I don't know whether it's a bug or not. In any case, I had the same issue, and even a plain paramiko call gave me the same error.
Following another SO question, I was able to make it work by disabling rsa-sha2-256 and rsa-sha2-512 as mentioned there.
Luckily, Fabric exposes access to the paramiko arguments like so:
con = Connection(
    'my_host',
    connect_kwargs={
        "disabled_algorithms": {"pubkeys": ["rsa-sha2-256", "rsa-sha2-512"]}
    }
)
I find it unfortunate that this is required in the fabfile. If someone has a better/cleaner solution, feel free to comment.
Same problem here.
You can add -d for more detail when running Fabric:
fab2 -d tt
I found the exception paramiko.ssh_exception.SSHException: Invalid key, regenerated the key on the server, and the problem was solved.

Python script keeps running when using pyRserve

I am trying to learn how to send a list of lists from Python to an R script which runs statistical methods and returns two or three data frames back to Python.
I stumbled across the pyRserve package. I was able to follow the manual in its documentation, and everything works great on the interactive command line (>>>). But when I run it as a script, it does not stop. I have installed the Rserve package and started its service in RStudio. Below is the code:
import pyRserve
print "here1"  # prints this line...
conn = pyRserve.connect(host='localhost', port=6311)
print "here2"
a = conn.eval('3+5')
print a
Can anyone please help?
The docs suggest:
$ python
>>> import pyRserve
>>> conn = pyRserve.connect()
and then go on with:
To connect to a different location, host and port can be specified explicitly:
pyRserve.connect(host='localhost', port=6311)
This is not meant to indicate that both lines should be run. The second line should be viewed as a potential modifier of the first. So if you need an alternate address or port, it should look like:
$ python
>>> import pyRserve
>>> conn = pyRserve.connect(host='localhost', port=6311)
Also note this caveat for Windows users:
Note: on some Windows versions it might be necessary to always provide 'localhost' for connecting to a locally running Rserve instance.
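A complete script along those lines might look like this (a sketch assuming a local Rserve instance on the default port 6311; explicitly closing the connection when you are done releases the socket and helps the script terminate cleanly):

```python
import pyRserve

# Assumes an Rserve instance is listening on localhost:6311.
conn = pyRserve.connect(host='localhost', port=6311)
try:
    result = conn.eval('3+5')  # evaluated by R; returns a numeric
    print(result)
finally:
    conn.close()  # release the connection when done
```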

python values to bash line on a remote server

So I have a Python script that connects to client servers and gets some data that I need.
My bash script on the client side needs input like the line below, and called this way it works:
client.exec_command('/apps./tempo.sh 2016 10 01 02 03')
Now I'm trying to get user input in my Python script and pass it to the remotely called bash script, and that's where my problem is. Below is what I tried, with no luck:
import sys
client.exec_command('/apps./tempo.sh', str(sys.argv))
I believe you are using Paramiko - you should tag or include that info in your question.
The basic problem I think you're having is that you need to include those arguments inside the string, i.e.
client.exec_command('/apps./tempo.sh %s' % str(sys.argv))
otherwise they get applied to the other arguments of exec_command. I think your original example is not quite accurate in how it works.
Just out of interest, have you looked at "fabric" (http://www.fabfile.org)? It has lots of very handy functions like "run", which will run a command on a remote server (or lots of remote servers!) and return the response.
It also gives you lots of protection by wrapping around popen and paramiko for the SSH login etc., so it can be much more secure than trying to build web services or other things yourself.
You should always be wary of injection attacks - I'm unclear how you are injecting your variables, but if a user calls your script with something like python runscript "; rm -rf /", that would cause very bad problems for you. It would be better to have 'options' on the command, which are programmed in, limiting the user's input drastically, or at least to put a lot of protection around the input variables. Of course, if this is only for you (or trained people), then it's a little easier.
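One standard mitigation (a sketch; shlex.quote is in the Python 3 standard library, pipes.quote on Python 2) is to quote every user-supplied argument before splicing it into the remote command line:

```python
import shlex

def build_remote_command(script, args):
    """Build a shell command line with each user argument safely quoted."""
    quoted = ' '.join(shlex.quote(str(a)) for a in args)
    return '{} {}'.format(script, quoted)

# A malicious argument is neutralized into a harmless quoted string:
print(build_remote_command('/apps./tempo.sh', ['2016', '10', '; rm -rf /']))
# /apps./tempo.sh 2016 10 '; rm -rf /'
```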
I recommend using paramiko for the SSH connection.
import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(server, username=user, password=password)
...
ssh_client.close()
And if you want to simulate a terminal, as if a user were typing:
import time

chan = ssh_client.invoke_shell()
chan.send('PS1="python-ssh:"\n')

def exec_command(cmd):
    """Take an ssh command, execute it, and return the output"""
    prompt = 'python-ssh:'  # the command-line prompt in the ssh terminal
    buff = ''
    chan.send(str(cmd) + '\n')
    while not chan.recv_ready():
        time.sleep(1)
    while not buff.endswith(prompt):
        buff += chan.recv(1024)
    return buff[:-len(prompt)]  # strip the trailing prompt
Example usage: exec_command('pwd')
The result is returned to you over SSH.
Assuming that you are using paramiko, you need to send the command as a single string. It seems that you want to pass the command-line arguments of your Python script as arguments for the remote command, so try this:
import sys

command = '/apps./tempo.sh'
args = ' '.join(sys.argv[1:])  # all args except the script's name!
client.exec_command('{} {}'.format(command, args))
This collects all the command-line arguments passed to the Python script, except the first one (the script's file name), and builds a space-separated string. That argument string is then concatenated with the bash script command and executed remotely.

PopenSpawn and ftp command

Question - I am trying to make a minimal example with the latest pexpect, but I can't get a basic example (source below) working in the following environment:
Windows 10, 64-bit, Python 3.4
The command is ftp localhost (the server is FileZilla, running on localhost).
I am aware that Windows support in pexpect is marked as experimental, but still, it would be useful to get it working.
Problem:
If in the code below I use
co.expect("", timeout=30)
then the script works, because it is not waiting for the prompt after login... but since I need to interact and send more complex queries, I need to use
co.expect("ftp>", timeout=30)
but at that moment pexpect waits until the timeout. Looking in popen_spawn.py, I found that nothing is coming in. Is it possible that self.proc.stdout.fileno() is buffering, waiting indefinitely until the buffer is filled?
import pexpect
from pexpect import popen_spawn
import sys

try:
    hostname = "127.0.0.1"
    co = pexpect.popen_spawn.PopenSpawn('ftp localhost', encoding="utf-8")
    co.logfile = sys.stdout
    co.timeout = 4
    co.expect(":")
    co.sendline("test")
    co.expect(".*word:.*")
    co.sendline("test123")
    co.expect("", timeout=30)
    co.sendline('dir')
    co.expect('ftp>')
    co.close()
except Exception as e:
    print(co)

CA SSL parameter for Python MySQLdb not working, but key does?

I'm trying to connect to a MySQL DB that requires SSL (only doing server authentication, not mutual). I have the server's CA saved as a .pem file in the same directory I'm running the script from. My connection code looks like this:
ssl_settings = {'ca': 'ca.pem'}
conn = MySQLdb.connect(host=HOST, user=USER, passwd=PASS, db=DB, ssl=ssl_settings)
This results in "Error 2026: SSL connection error". However, if I change ssl_settings to:
ssl_settings = {'key': 'ca.pem'}
the database connects just fine and the script executes. From my understanding of the SSL parameters, 'cert' and 'key' should only be for client authentication to the server, so is there any reason the latter SSL settings seem to work while specifying the CA file does not?
Python 2.4.3 (old, I know)
MySQL-python 1.2.1
Note: this bug has since been fixed. Per the bug report:
Noted in the 5.1.66, 5.5.28, 5.6.7, and 5.7.0 changelogs.
The argument to the --ssl-key option was not verified to exist and be a valid key. The resulting connection used SSL, but the key was not used.
Old answer
For a much better description than I can give, see http://bugs.mysql.com/bug.php?id=62743 and http://www.chriscalender.com/?p=325.
From my (admittedly uneducated) understanding, it is a MySQL bug. As long as you specify only a key (as you're doing in the example that works), MySQL sets up the SSL connection and you're granted access. The other interesting part is that you can change the key value to be anything at all, so in your example you could do:
ssl_settings = {'key': 'randomstuff'}
and it should still connect.
you can change the key value to be anything at all
I see the same behavior with MySQLdb version 1.3.12. When setting up an SSL connection using MySQLdb, setting the ssl argument to anything still works (I'm using Python 3):
$ python
Python 3.6.8 (default, Dec 26 2018, 09:19:39)
>>> import MySQLdb
>>> MySQLdb.__version__
'1.3.12'
>>> db = MySQLdb.connect(host='10.105.136.101', user='my-user', passwd='myPassword', ssl={'ssl': {'ca': '/junk/file'}})
>>> db
<_mysql.connection open to '10.105.136.101' at 561aaa994f98>
Setting ssl above to a non-existent certificate /junk/file still works fine without any error.
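Given this behavior, it is worth verifying whether the connection actually negotiated SSL at all. One way (a sketch; the host and credentials are placeholders and a reachable server is required) is to ask the server for the active cipher:

```python
import MySQLdb  # MySQL-python / mysqlclient

# Placeholder credentials -- substitute your own.
db = MySQLdb.connect(host='10.105.136.101', user='my-user',
                     passwd='myPassword', ssl={'ca': 'ca.pem'})
cur = db.cursor()
# The Ssl_cipher status variable is empty when the connection is NOT encrypted.
cur.execute("SHOW STATUS LIKE 'Ssl_cipher'")
name, cipher = cur.fetchone()
print('SSL in use:', bool(cipher))
db.close()
```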
