Is there a way to add an SSH connection to Apache Airflow from the UI, either via the Connections or Variables tab, that allows connecting using a PEM key rather than a username and password?
DISCLAIMER: The following answer is purely speculative.
I think the key_file param of SSHHook is meant for this purpose.
And the idiomatic way to supply it is to pass its name via the extra field of the Airflow Connection entry (web UI).
Of course, when neither key_file nor credentials are provided, SSHHook falls back to the identityfile to initialize the paramiko client.
Also have a look at how SFTPHook handles this.
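A minimal sketch of that idea, assuming the extra field is stored as JSON and that key_file is the field SSHHook reads (verify against your Airflow version; the connection id, host, and key path below are placeholders):

import json

from airflow import settings
from airflow.models import Connection

# Hypothetical connection: the PEM key path is passed via the "extra" JSON
# instead of a password, so SSHHook can pick it up as key_file.
conn = Connection(
    conn_id="ssh_pem_example",
    conn_type="ssh",
    host="remote.example.com",
    login="ubuntu",
    extra=json.dumps({"key_file": "/opt/airflow/keys/my_key.pem"}),
)

session = settings.Session()
session.add(conn)
session.commit()

The same extra JSON could of course be pasted directly into the Extra field of the connection form in the web UI.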
Related
I'm currently trying to use the docker-py SDK to connect to my remote Ubuntu server so I can manage my Docker containers via Python.
I am getting a few issues when attempting to do this.
When doing the following I am getting the error:
docker.APIClient(base_url="ssh://user@ip")
paramiko.ssh_exception.PasswordRequiredException: private key file is encrypted
I can resolve this issue by adding the kwarg use_ssh_client, but then I am forced to input a password, which limits the potential for automation.
docker.APIClient(base_url="ssh://user@ip", use_ssh_client=True)
When using the above code, I have also tried to enter my SSH key password into the base_url, such as:
docker.APIClient(base_url="ssh://user:pass@ip", use_ssh_client=True)
However, this then greets me with the following error:
docker.errors.DockerException: Invalid bind address format: ssh://root:pass@ip
I have run out of ideas and am confused as to how I am supposed to get around this.
Many thanks in advance...
It's possible to make a connection as Mr. Piere answered here, even though that question is about docker.client.DockerClient, which uses docker.api.client.APIClient under the hood.
You are trying to establish a connection using password authentication, which is why you are prompted for a password.
I guess you need to configure key-based SSH login as described in Docker's docs.
Steps to fix:
configure SSH login on the remote server and fill in ~/.ssh/config on your local machine
connect from the local terminal using the ssh command to ensure a connection is established without asking for a password: ssh user@ip
connect using the library: client = docker.APIClient(base_url="ssh://user@ip", use_ssh_client=True)
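As a rough sketch of that last step, assuming ~/.ssh/config already maps a host alias to the remote machine, its user, and the right IdentityFile (the alias below is a placeholder, not a real host):

import docker

# "my-docker-host" is a hypothetical alias defined in ~/.ssh/config;
# key-based login must already work for a plain `ssh my-docker-host`.
client = docker.APIClient(base_url="ssh://my-docker-host", use_ssh_client=True)
print(client.version())  # sanity check that the remote daemon answers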
I had a similar problem. Your problem is that your key is encrypted. The Docker client doesn't have a passphrase option by default. I wrote some code based on this post. It works for me :)
import os

from docker import APIClient
from docker.transport import SSHHTTPAdapter


class MySSHHTTPAdapter(SSHHTTPAdapter):
    def _connect(self):
        if self.ssh_client:
            # Supply the key file and its passphrase from environment variables
            # before the underlying paramiko client connects.
            self.ssh_params["key_filename"] = os.environ.get("SSH_KEY_FILENAME")
            self.ssh_params["passphrase"] = os.environ.get("SSH_PASSPHRASE")
            self.ssh_client.connect(**self.ssh_params)


client = APIClient('ssh://ip:22', use_ssh_client=True, version='1.41')
ssh_adapter = MySSHHTTPAdapter('ssh://user@ip:22')
client.mount('http+docker://ssh', ssh_adapter)
print(client.version())
Hello fellow AWS contributors, I'm currently working on a project to set up an example of connecting a Lambda function to our PostgreSQL database hosted on RDS. I tested my Python + SQL code locally (in VS Code and DBeaver) and it works perfectly fine with only basic credentials (host, dbname, username, password). However, when I paste the code into a Lambda function, it gives me all sorts of errors. I followed this template and modified my code to retrieve the credentials from Secrets Manager instead.
I'm currently using boto3, psycopg2, and Secrets Manager to get credentials and connect to the database.
List of errors I'm getting:
server closed the connection unexpectedly. This probably means the server terminated abnormally before or while processing the request
could not connect to server: Connection timed out. Is the server running on host “db endpoint” and accepting TCP/IP connections on port 5432?
FATAL: no pg_hba.conf entry for host “ip:xxx”, user "userXXX", database "dbXXX", SSL off
Things I tried:
RDS and Lambda are in the same VPC, same subnet, same security group.
IP address is included in the inbound rule
The Lambda function is set to run for up to 15 min, and it always stops before it even hits 15 min.
I tried both the database endpoint and the database proxy endpoint; neither works.
It doesn't really make sense to me that when I run the code locally I only need to provide the host, dbname, username, and password, and I'm able to write all the queries and functions I want. But when I put the code in a Lambda function, it requires all of this: Secrets Manager, VPC security groups, SSL, a proxy, TCP/IP rules, etc. Can someone explain why there is a requirement difference between running it locally and on Lambda?
Finally, does anyone know what could be wrong in my setup? I'm happy to provide any information related to this; any general direction to look into would be really helpful. Thanks!
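For reference, a hedged sketch of the pattern described above: fetch the DB credentials from AWS Secrets Manager with boto3, then connect with psycopg2. The secret name, secret key names, and query are placeholders, not the asker's actual values.

import json

import boto3
import psycopg2


def lambda_handler(event, context):
    # Assumes the Lambda role may call secretsmanager:GetSecretValue and the
    # secret stores a JSON blob with host/dbname/username/password fields.
    sm = boto3.client("secretsmanager")
    secret = json.loads(
        sm.get_secret_value(SecretId="my-rds-credentials")["SecretString"]
    )

    conn = psycopg2.connect(
        host=secret["host"],
        dbname=secret["dbname"],
        user=secret["username"],
        password=secret["password"],
        port=secret.get("port", 5432),
        connect_timeout=5,  # fail fast instead of hanging until the Lambda timeout
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT version();")
        return {"version": cur.fetchone()[0]}

Note that none of this changes the networking requirements: the function still has to run in subnets that can reach the RDS instance, which is usually where the "Connection timed out" errors come from.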
Following the directions at the link below to build a specific psycopg2 package and also verifying the VPC subnets and security groups were configured correctly solved this issue for me.
I built a package for PostgreSQL 10.20 using psycopg2 v2.9.3 for Python 3.7.10 running on an Amazon Linux 2 AMI instance. The only change to the directions I had to make was to put the psycopg2 directory inside a python directory (i.e. "python/psycopg2/") before zipping it -- the import psycopg2 statement in the Lambda function failed until I did that.
https://kalyanv.com/2019/06/10/using-postgresql-with-python-on-aws-lambda.html
This is the VPC scenario I'm using. The Lambda function is executing inside the Public Subnet and associated Security Group. Inbound rules for the Private Subnet Security Group only allow TCP connections to 5432 from the Public Subnet Security Group.
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_VPC.Scenarios.html#USER_VPC.Scenario1
I am creating a Python AWS Lambda function that connects to a DB to extract data as CSV and then SFTPs that CSV to an SFTP server (abc.example.com). I am using pysftp and Paramiko. It looks like pysftp needs a private key file for a password-less connection to the SFTP host. How do I get this private key file?
Do we need to create a public/private key pair (ssh-keygen) at the destination SFTP host, and then use the public part of that key within the Lambda function?
Thanks
Yes, if you don't have it already, then you have to create keys using ssh-keygen on the SFTP host and use them.
import pysftp

# Connect using the private key instead of a password.
with pysftp.Connection('hostname', username='me', private_key='/path/to/keyfile') as sftp:
    # ... do sftp operations
    pass
Reference: https://pysftp.readthedocs.io/en/release_0.2.8/cookbook.html
Just set up public key authentication the same way you would do it for a normal (GUI/command-line) SFTP or SSH client. There's nothing pysftp/Python/Lambda-specific about that.
There are zillions of guides on the Internet showing how to do that.
For example my article Set up SSH public key authentication.
And then use the private key in your Python/pysftp code:
Connect to SFTP with key file using Python pysftp
As pysftp requires the key in a physical file, which can be complicated to arrange in AWS Lambda, you can also hard-code the key in the Python code if you switch to Paramiko (a rough sketch follows after the links below):
SSH/SCP through Paramiko with key in string
(see pysftp vs. Paramiko)
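A hedged sketch of that Paramiko alternative: load the private key from an in-memory string (for example from an environment variable or a secrets store) rather than from a file. The host, username, paths, and the environment variable name are placeholders.

import io
import os

import paramiko

# Assumes SFTP_PRIVATE_KEY holds the PEM text of an (unencrypted) RSA key.
key_text = os.environ["SFTP_PRIVATE_KEY"]
pkey = paramiko.RSAKey.from_private_key(io.StringIO(key_text))

transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="me", pkey=pkey)
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("/tmp/report.csv", "/upload/report.csv")  # upload the generated CSV
sftp.close()
transport.close()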
I'm working on coding an application in Python 3 for users to send and retrieve data with other users of this application. The process would be a client inputting an encoded string and using a server as a middleman to then send the data to another client. I'm well versed in what would be used for the client application, but this server knowledge is new to me. I have a VPS server up and running, and I researched and found that the module pysftp would be good for transferring data back and forth. However, I'm concerned about the security of the server when using the application. This module requires the authentication details of the server when making a connection, and I don't think having my server's host, username, and password in the application code is very safe. What would be the safe way to go about this?
Thanks,
Gunner
You might want to use pre-generated authentication keys. If you are familiar with the process of using the ssh-keygen tool to create SSH key pairs, it's the same thing. You just generate the key pair, place the private key on the client machine, and put the public key on the target server. Then you can use pysftp like this:
with pysftp.Connection('hostname', username='me', private_key='/path/to/keyfile') as sftp:
    pass  # <do some stuff>
The authentication is handled using the key pair and no password is required. This isn't to say that your security issue is solved: the private key is still a sensitive credential that needs to be treated like a password. The advantage is that you don't have a plaintext password stored in a file anywhere, and you are using a well-established and secure process to manage authentication. The private key should be set with permission 0600 to prevent anyone but the owner from accessing it.
I am trying to connect to SAP HANA data source via Python code.
I did manage to establish a connection. I have a raw data string in my code as follows:
db = pyodbc.connect(driver = '{HDBODBC}', UID='username', PWD='password', SERVERNODE='server:<port_no>')
However, I do not want the UID and PWD fields in my string.
I did set up a DSN connection using the ODBC manager on Windows. But I still need to enter my username and password, as follows:
db = pyodbc.connect(DSN="MyDSN", UID='username', PWD='password')
How can I set up a connection without my UID and PWD being displayed in the python code?
I have been looking for the same option, to use an hdbuserstore key with Python for connecting to SAP HANA. It looks like the HDB client (hdbcli) now has that option.
The user running the script needs to have PYTHONPATH set to the location of the HDB client, or you can set the path inside the script.
from hdbcli import dbapi
conn = dbapi.connect(key='hdbuserstore key',CONNECTTIMEOUT=5)
conn.isconnected() will return True if the connection is successful.
Hope this is helpful for someone!
Be careful with the parameter CONNECTTIMEOUT=5.
from hdbcli import dbapi
conn = dbapi.connect(key='hdbuserstore key',CONNECTTIMEOUT=5)
This does NOT mean 5 seconds, because the value is in milliseconds. It took me a long time to track down this problem.
From the documentation: connectTimeout is the timeout in milliseconds; the default 0 uses the system's TCP/IP socket connection timeout; otherwise it aborts connection attempts after the specified timeout.
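So, for an actual five-second timeout, the value presumably has to be given in milliseconds (same placeholder key name as above):

from hdbcli import dbapi

# CONNECTTIMEOUT is interpreted in milliseconds, so 5000 means 5 seconds.
conn = dbapi.connect(key='hdbuserstore key', CONNECTTIMEOUT=5000)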
For example, create a file in a secure place and load the connection settings (UID and the PWD as an encrypted password / hash) from this file.
This requirement is relatively easy to fulfill.
The SAP HANA client software (the package that also contains the ODBC driver) provides a program to set up a secure store for logon data: hdbuserstore.
In my blog I explained how that works in detail.
The core steps are
create the hdbuserstore entries for the operating system user that should use the application.
Syntax: hdbuserstore SET <KEY> <ENV> <USERNAME> <PASSWORD>
Example: hdbuserstore SET millerj "localhost:30115" JohnMiller 2wsx$RFV
The hdbuserstore key needs to be referred to in the ODBC connection.
To do that, fill the SERVERNODE parameter with @<KEYNAME> instead of the actual server address.
For the example above, the value would be @millerj.
And that's really all. The ODBC driver will try to look up the hdbuserstore entry provided upon connection and use that to connect to the database.
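Tying this back to the pyodbc call from the question, a sketch of what the connection might look like once the millerj key exists (assuming the hdbuserstore key is referenced via the SERVERNODE value as described above):

import pyodbc

# No UID/PWD in the code; the ODBC driver resolves the hdbuserstore key.
db = pyodbc.connect(driver='{HDBODBC}', SERVERNODE='@millerj')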
Check the documentation for more information on this.