Cannot connect to remote Oracle database from within Docker container - python

I am trying to connect to a remote Oracle database from inside a docker container, but fail with the following error:
oracledb.exceptions.DatabaseError: ORA-12154: TNS:could not resolve the connect identifier specified
The connection works from my host machine, and I've added port mapping and network to my docker-compose:
ports:
  - "80:80"
network_mode: "host"
I am using Python and the oracledb library for the connection; this is the code:
import oracledb

io = '/usr/lib/oracle/21/client64/lib'
oracledb.init_oracle_client(lib_dir=io)
dsn = 'DBNAME:1521/XXX'
connection = oracledb.connect(
    user="my_user",
    password="my_pwd",
    dsn=dsn
)
At this point, I don't fully understand if there is even something that I could do, or if this error originates from the database settings, and I should contact the DB admins.
Thanks!
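ORA-12154 means the client could not resolve the connect identifier. Inside the container there is typically no tnsnames.ora, so if 'DBNAME' is a TNS alias rather than a resolvable hostname, resolution fails even though the host machine (which has the alias configured) connects fine. A minimal sketch of building a full Easy Connect string, which needs no alias resolution; the host and service name below are hypothetical placeholders:

```python
def easy_connect_dsn(host: str, port: int, service_name: str) -> str:
    """Build an Easy Connect string of the form host:port/service_name."""
    return f"{host}:{port}/{service_name}"

# Hypothetical host and service name; substitute the real ones.
dsn = easy_connect_dsn("dbhost.example.com", 1521, "XEPDB1")
print(dsn)  # dbhost.example.com:1521/XEPDB1
```

Alternatively, if the alias must stay, mount the host's tnsnames.ora into the container and point the client at it (for example via the TNS_ADMIN environment variable); which of the two applies depends on what 'DBNAME' actually is.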

Related

"Unable to connect: Adaptive Server is unavailable or does not exist" in SQL Server Docker Image

I am trying to connect to a Docker container running SQL Server using pymssql, and I am getting the following error:
Can't connect to db: (20009, b'DB-Lib error message 20009, severity 9:
Unable to connect: Adaptive Server is unavailable or does not exist (localhost)
Net-Lib error during Connection refused (111)
DB-Lib error message 20009, severity 9:
Unable to connect: Adaptive Server is unavailable or does not exist (localhost)
Net-Lib error during Connection refused (111)\n')
Now, I know similar questions have been asked before, such as the one given below,
Database connection failed for local MSSQL server with pymssql
Most of the solutions given revolve around opening up the SQL Server Configuration Manager and making changes through the UI, or allowing network traffic to pass through. However, I am not entirely certain how to do this with a Docker container.
How can I resolve this? Is there a different SQL Server I should run? Or a different Python package I should use to establish the connection?
I have spun up my container based on the instructions given here:
https://learn.microsoft.com/en-us/sql/linux/quickstart-install-connect-docker?view=sql-server-ver16&pivots=cs1-bash
Given below are the parameters I am using to establish the connection,
{
"host": "localhost",
"port": "1433",
"user": "SA",
"password": "<my-password>",
"database": "TestDB"
}
Update: I am actually trying to connect to the SQL Server Docker instance from within another container. It is the MindsDB container. This MindsDB container is, however, using pymysql to establish the connection.
Update: Given below are the two commands I am using to run my containers.
SQL Server:
sudo docker run -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=<my-password>" \
-p 1433:1433 --name sql1 --hostname sql1 \
-d \
mcr.microsoft.com/mssql/server:2022-latest
MindsDB:
docker run -p 47334:47334 -p 47335:47335 mindsdb/mindsdb
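Since the connection is made from one container to another, "localhost" inside the MindsDB container refers to the MindsDB container itself, not to the SQL Server container. A sketch of the connection parameters under the assumption that both containers are attached to the same user-defined Docker network (created with something like `docker network create mynet` and `--network mynet` added to both `docker run` commands; `mynet` is a made-up name):

```python
# With both containers on one user-defined network, the SQL Server
# container is reachable by its container name ("sql1" from --name sql1)
# on its internal port 1433, rather than via localhost or a host mapping.
params = {
    "host": "sql1",
    "port": "1433",
    "user": "SA",
    "password": "<my-password>",
    "database": "TestDB",
}
```

The `-p 1433:1433` mapping only matters for clients on the host machine; container-to-container traffic on a shared network uses the container name and internal port.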

FastAPI and PostgreSQL in docker-compose file connection error

This question has been asked already
for example
Docker: Is the server running on host "localhost" (::1) and accepting TCP/IP connections on port 5432?
but I still can't figure out how to properly connect the application to the database.
Files:
Dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY . .
RUN pip install --upgrade pip
RUN pip install "fastapi[all]" sqlalchemy psycopg2-binary
docker-compose.yml
version: '3.8'
services:
  ylab:
    container_name: ylab
    build:
      context: .
    entrypoint: >
      sh -c "uvicorn main:app --reload --host 0.0.0.0"
    ports:
      - "8000:8000"
  postgres:
    container_name: postgr
    image: postgres:15.1-alpine
    environment:
      POSTGRES_DB: "fastapi_database"
      POSTGRES_PASSWORD: "password"
    ports:
      - "5433:5432"
main.py
import fastapi as _fastapi
import sqlalchemy as _sql
import sqlalchemy.ext.declarative as _declarative
import sqlalchemy.orm as _orm
DATABASE_URL = "postgresql://postgres:password@localhost:5433/fastapi_database"
engine = _sql.create_engine(DATABASE_URL)
SessionLocal = _orm.sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = _declarative.declarative_base()
class Menu(Base):
    __tablename__ = "menu"
    id = _sql.Column(_sql.Integer, primary_key=True, index=True)
    title = _sql.Column(_sql.String, index=True)
    description = _sql.Column(_sql.String, index=True)
app = _fastapi.FastAPI()
# Create table 'menu'
Base.metadata.create_all(bind=engine)
This works if I host only the postgres database in the container and my application is running locally, but if the database and application are in their own containers, no matter how I try to change the settings, the error always comes up:
"sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (127.0.0.1), port 5433 failed: Connection refused
ylab | Is the server running on that host and accepting TCP/IP connections?
ylab | connection to server at "localhost" (::1), port 5433 failed: Cannot assign requested address
ylab | Is the server running on that host and accepting TCP/IP connections?"
The error comes up in
Base.metadata.create_all(bind=engine)
I also tried
DATABASE_URL = "postgresql://postgres:password@postgres:5433/fastapi_database"
but still error:
"sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "postgres" (172.23.0.2), port 5433 failed: Connection refused
ylab | Is the server running on that host and accepting TCP/IP connections?"
There is some kind of config file or something mentioned in the answer above but I can't figure out how to manage that config.
You should update your config to reference the service name of postgres and the port the database runs on inside the container:
DATABASE_URL = "postgresql://postgres:password@postgres:5432/fastapi_database"
When your app was running locally on your machine with the database in a container, localhost:5433 worked, since port 5433 on the host was mapped to 5432 inside the db container.
When you then put the app in its own container but still refer to localhost, it looks for the postgres database inside the same container the app is in, which is not right.
When you use the right service name but with port 5433, you also get an error, since port 5433 is only mapped on the host running the containers, not inside the containers themselves.
So what you want to do in the app container is target the database service on port 5432, as that's the port postgres is running on inside its container.
You probably also want to look at a depends_on condition (with a healthcheck) so the FastAPI app does not start until the database is up and ready.
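The "wait until the db is up and ready" idea from the last paragraph can also be sketched in the app itself with a plain TCP readiness poll (stdlib only; the service name "postgres" and port 5432 come from the compose file above):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> None:
    """Poll until a TCP connection to host:port succeeds, or raise TimeoutError."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return  # port is accepting connections
        except OSError:
            if time.monotonic() >= deadline:
                raise TimeoutError(f"{host}:{port} not reachable after {timeout}s")
            time.sleep(0.5)

# Inside the ylab container this would be called before create_all, e.g.:
# wait_for_port("postgres", 5432)
```

Note this only proves the TCP port is open; postgres may still be initializing, so a retry around the first real connection is a reasonable extra safety net.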

Access on local PostgreSQL Database from the inside of a docker container with a flask-webapp using a psycopg2 connection

I am relatively new to this topic. I am trying to build a webapp using Flask. The webapp uses data from a PostgreSQL database which is running locally (macOS Monterey 12.2.1).
My application uses Python code which accesses data from the database by connecting with psycopg2:
con = psycopg2.connect(
    host="192.168.178.43",
    database=self.database,
    port="5432",
    user="user",
    password="password")
I have already added the relevant entries to the "pg_hba.conf" and "postgresql.conf" files for access from my home network, but I still get an error when starting the container. The app runs perfectly outside the container. I think I am missing some important step for a successful connection.
This error is the following
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection refused
Is the server running on host "127.0.0.1" and accepting
TCP/IP connections on port 5432?
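One detail worth checking here: inside a container, "localhost"/127.0.0.1 is the container itself, so the host's PostgreSQL must be addressed by a host-reachable name. On Docker Desktop for Mac that can be `host.docker.internal`. A sketch that picks the database host from an environment variable with that fallback; `DB_HOST` and the fallback value are assumptions, not part of the original app:

```python
import os

def db_params(database: str) -> dict:
    """Build psycopg2 connection kwargs; DB_HOST overrides the default.

    "host.docker.internal" is Docker Desktop's alias for the host machine
    and is an assumption here, as is the DB_HOST variable name.
    """
    host = os.environ.get("DB_HOST", "host.docker.internal")
    return {
        "host": host,
        "database": database,
        "port": "5432",
        "user": "user",
        "password": "password",
    }
```

The returned dict could then be used as `psycopg2.connect(**db_params("mydb"))`, so the same code works inside and outside the container by setting `DB_HOST`.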

Is it possible to make a pymysql connection to a MySQL docker container?

I am having trouble making a connection to a MySQL container using pymysql from outside a container on Mac OSX, though I am able to connect to the docker MySQL with MySQL Workbench.
I used the following docker-compose.yaml:
version: '3'
services:
  db:
    image: mysql:latest
    volumes:
      - ./data/db:/var/lib/mysql
    ports:
      - 6603:3306
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: password
      MYSQL_USER: testuser
      MYSQL_PASSWORD: password
And the following pymysql code:
import pymysql
conn = pymysql.connect(
    host='0.0.0.0',
    port=6603,
    user='testuser',
    password='password',
    charset='utf8mb4',
    cursorclass=pymysql.cursors.DictCursor
)
And received the following error:
OperationalError: (1045, "Access denied for user 'testuser'@'172.18.0.1' (using password: NO)")
The same applies when trying to login as root.
I have tried:
1) Granting privileges in many permutations, also setting up HOST as '%'
2) Removing and recreating the mounted volume
Mainly I want to be able to access the MySQL database from an interactive shell during development.
Any guidance would be great. Thanks.
You should use the IP address 127.0.0.1, or localhost, or your external IP address instead of 0.0.0.0. The address 0.0.0.0 is a special IP address that only makes sense when creating a listening socket, but it makes no sense for creating a connecting socket.
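The listen-versus-connect distinction in this answer can be demonstrated with a plain stdlib socket: 0.0.0.0 is a wildcard address you bind a listener to ("all interfaces"), while a client connects to a concrete address such as 127.0.0.1:

```python
import socket

# Bind a listener to the wildcard address on an ephemeral port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", 0))
srv.listen(1)
port = srv.getsockname()[1]

# To reach that listener, the client targets a concrete address instead.
cli = socket.create_connection(("127.0.0.1", port), timeout=5.0)
conn, _addr = srv.accept()
print("connected on port", port)

conn.close()
cli.close()
srv.close()
```

In the question's setup, replacing `host='0.0.0.0'` with `host='127.0.0.1'` (keeping `port=6603`) targets the mapped MySQL port on the host.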

docker container connect local postgresql

I am running a Python container and I want to connect to PostgreSQL on localhost. I have tried several approaches, but none of them work. How can I do this? Thanks.
I have PostgreSQL running on port 5432, and have created a database and granted a user access.
Docker run command:
docker run --name=python3 -v ${pwd}:/code -w /code python
python code
import psycopg2

def main():
    # Define our connection string
    conn_string = "host='localhost' dbname='testDB' user='test' password='test'"
    # print the connection string we will use to connect
    print("Connecting to database\n ->{}".format(conn_string))
    # get a connection; if a connection cannot be made an exception will be raised here
    conn = psycopg2.connect(conn_string)
    # conn.cursor will return a cursor object; you can use this cursor to perform queries
    cursor = conn.cursor()
    print("Connected!\n")

if __name__ == "__main__":
    main()
error message
Is the server running on host "localhost" (::1) and accepting
TCP/IP connections on port 5432?
could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
It depends on your host OS and your docker version.
I assume here your database is not running in a container itself, but rather on the host itself (localhost).
For instance, as mentioned in "From inside of a Docker container, how do I connect to the localhost of the machine?", with Docker for Mac v 17.06 and above (June 2017), you can connect to the special Mac-only DNS name docker.for.mac.localhost which will resolve to the internal IP address used by the host.
On Linux directly, you would use the host mode, with an image having ifconfig installed:
docker run --network="host" -id <Docker image ID>
Reading that you're on Windows 10 and running postgresql on the host, I advise you to run postgresql in a container instead. It makes this much easier.
To connect the python container to the postgres container you'll need a docker network though. Let's call it postgres_backend.
docker network create postgres_backend
You can create the postgresql container with the following command. Just change the /path/to/store/data to a local directory in which you'd like to store the postgres data:
docker run --name postgres \
-e POSTGRES_PASSWORD=test \
-e POSTGRES_USER=test \
-d --restart always \
-v /path/to/store/data:/var/lib/postgresql/data \
--net postgres_backend \
postgres:9.6
Now your postgresql container should be up and running :)
To connect your python container to it, you'll have to add a --net postgres_backend to your docker run command and change the host in your script to "postgres" (it's the name we gave the postgres container with --name).
If the python container can't find the host "postgres", try it with the IP shown when entering the command docker exec -ti postgres ip addr show.
To fix this bug:
First, it is normal that it does not work, because PostgreSQL does not run in the same container as the application, so the host localhost:5432 does not exist.
To fix it:
In the properties file, instead of localhost:5432, use your IP address, like IP:5432, and in pg_hba.conf add this:
host all all 0.0.0.0/0 md5
