This question has been asked already, for example:
Docker: Is the server running on host "localhost" (::1) and accepting TCP/IP connections on port 5432?
but I still can't figure out how to properly connect the application to the database.
Files:
Dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY . .
RUN pip install --upgrade pip
RUN pip install "fastapi[all]" sqlalchemy psycopg2-binary
docker-compose.yml
version: '3.8'
services:
  ylab:
    container_name: ylab
    build:
      context: .
    entrypoint: >
      sh -c "uvicorn main:app --reload --host 0.0.0.0"
    ports:
      - "8000:8000"
  postgres:
    container_name: postgr
    image: postgres:15.1-alpine
    environment:
      POSTGRES_DB: "fastapi_database"
      POSTGRES_PASSWORD: "password"
    ports:
      - "5433:5432"
main.py
import fastapi as _fastapi
import sqlalchemy as _sql
import sqlalchemy.ext.declarative as _declarative
import sqlalchemy.orm as _orm
DATABASE_URL = "postgresql://postgres:password@localhost:5433/fastapi_database"
engine = _sql.create_engine(DATABASE_URL)
SessionLocal = _orm.sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = _declarative.declarative_base()
class Menu(Base):
    __tablename__ = "menu"
    id = _sql.Column(_sql.Integer, primary_key=True, index=True)
    title = _sql.Column(_sql.String, index=True)
    description = _sql.Column(_sql.String, index=True)
app = _fastapi.FastAPI()
# Create table 'menu'
Base.metadata.create_all(bind=engine)
This works if only the Postgres database runs in a container and my application runs locally. But when the database and the application each run in their own container, no matter how I change the settings, this error always comes up:
"sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (127.0.0.1), port 5433 failed: Connection refused
ylab | Is the server running on that host and accepting TCP/IP connections?
ylab | connection to server at "localhost" (::1), port 5433 failed: Cannot assign requested address
ylab | Is the server running on that host and accepting TCP/IP connections?"
The error comes up in
Base.metadata.create_all(bind=engine)
I also tried
DATABASE_URL = "postgresql://postgres:password@postgres:5433/fastapi_database"
but still error:
"sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "postgres" (172.23.0.2), port 5433 failed: Connection refused
ylab | Is the server running on that host and accepting TCP/IP connections?"
There is some kind of config file or something mentioned in the answer above but I can't figure out how to manage that config.
You should update your config to reference the service name of the Postgres service and the port the database runs on inside the container:
DATABASE_URL = "postgresql://postgres:password@postgres:5432/fastapi_database"
When your app was running locally on your machine with the database in a container, localhost:5433 worked because port 5433 on the host was mapped to 5432 inside the db container.
Once you put the app in its own container but still refer to localhost, it looks for the Postgres database inside the same container the app is in, which is not right.
If you use the right service name but keep port 5433, you also get an error, because port 5433 is only mapped on the host running the containers, not inside the containers themselves.
So what you want to do in the app container is target the database service on port 5432, as that's the port Postgres listens on inside its container.
You probably also want a depends_on condition (or a wait script) so the FastAPI app doesn't start until the database is up and ready.
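As a sketch (service names and credentials taken from the question; the healthcheck interval/retry values are illustrative), Compose can hold the app back until Postgres actually accepts connections:

```yaml
services:
  ylab:
    build:
      context: .
    entrypoint: sh -c "uvicorn main:app --reload --host 0.0.0.0"
    ports:
      - "8000:8000"
    depends_on:
      postgres:
        condition: service_healthy   # wait until the healthcheck below passes
  postgres:
    image: postgres:15.1-alpine
    environment:
      POSTGRES_DB: "fastapi_database"
      POSTGRES_PASSWORD: "password"
    ports:
      - "5433:5432"
    healthcheck:
      # pg_isready ships with the postgres image and exits 0 once the server accepts connections
      test: ["CMD-SHELL", "pg_isready -U postgres -d fastapi_database"]
      interval: 2s
      timeout: 3s
      retries: 10
```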
Related
I am trying to connect to a remote Oracle database from inside a docker container, but fail with the following error:
oracledb.exceptions.DatabaseError: ORA-12154: TNS:could not resolve the connect identifier specified
The connection works from my host machine, and I've added port mapping and network to my docker-compose:
ports:
  - "80:80"
network_mode: "host"
I am using python and oracledb library for connection, and these are the commands:
import oracledb

io = '/usr/lib/oracle/21/client64/lib'
oracledb.init_oracle_client(lib_dir=io)

dsn = 'DBNAME:1521/XXX'
connection = oracledb.connect(
    user="my_user",
    password="my_pwd",
    dsn=dsn
)
At this point, I don't fully understand if there is even something that I could do, or if this error originates from the database settings, and I should contact the DB admins.
Thanks!
I have the same issue while trying to connect to a MySQL (Percona) container brought up with docker-compose.
I have this simple code:
import sqlalchemy

engine = sqlalchemy.create_engine(
    'mysql+pymysql://root:admin@127.0.0.1:3306/DB_MYAPP',
    encoding='utf8'
)
connection = engine.connect()
connection.close()
Here is the part of docker-compose.yml
mysql:
  image: 'percona:latest'
  container_name: 'mysql'
  environment:
    MYSQL_ROOT_PASSWORD: admin
    MYSQL_DATABASE: DB_MYAPP
    MYSQL_USER: test_qa
    MYSQL_PASSWORD: qa_test
  ports:
    - '3306:3306'
  volumes:
    - '/home/rolf/PycharmProject/Endshpiel/mysql/myapp_db:/docker-entrypoint-initdb.d'
The container is running:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
8f1556e66ccb percona:latest "/docker-entrypoint.…" 55 minutes ago Up 9 minutes 0.0.0.0:3306->3306/tcp mysql
I also tried increasing max_allowed_packet.
I can easily connect to the Docker container from a Linux terminal (host: 127.0.0.1, port: 3306) and work with the database. But when I try to connect to the container with Python, I get this error:
mysql | 2022-05-24T23:17:10.428251Z 6 [Note] Aborted connection 6
to db: 'DB_MYAPP' user: 'root' host: '172.20.0.1' (Got an error
reading communication packets)
This question already has an answer here:
Access Docker postgres container from another container
(1 answer)
Closed 3 years ago.
I got an error while doing docker-compose up.
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5433?
could not connect to server: Cannot assign requested address
Is the server running on host "localhost" (::1) and accepting
TCP/IP connections on port 5433?
docker-compose.yml:
version: '3'
services:
  dcs_web:
    build: .
    depends_on:
      - db
    ports:
      - "5000:5000"
  db:
    image: postgres:latest
    volumes:
      - db-data:/var/lib/postgresql/data
    ports:
      - "5433:5433"
    environment:
      - 'POSTGRES_DB:dcmDB'
      - 'POSTGRES_USER:postgres'
      - 'POSTGRES_PASSWORD:admin'
volumes:
  db-data:
In the app's config.ini:
[DEFAULT]
DB_NAME = user
DB_PASSWORD = admin
DB_USER = postgres
DB_HOST = localhost
DB_PORT = 5433
DEBUG = True
I have looked in '/var/lib/postgresql/data'; 'listen_addresses = *' is set there. I don't know how to deal with this.
Each container in Docker is a separate host, which means you can't reach Postgres from dcs_web using localhost. You have to use the hostname of the Postgres container, which by default is the name of the service defined in the Docker Compose file, in your case: db.
Replace localhost with db in your config.ini and it should work.
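A sketch of the corrected config.ini, keeping the credentials from the question. Note that when connecting via the service name, the relevant port is the one Postgres listens on inside its container (5432 by default), not the host-mapped port:

```ini
[DEFAULT]
DB_NAME = user
DB_PASSWORD = admin
DB_USER = postgres
DB_HOST = db
DB_PORT = 5432
DEBUG = True
```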
I am having trouble connecting to a MySQL container using pymysql from outside the container on Mac OS X, though I am able to connect to the Docker MySQL with MySQL Workbench.
I used the following docker-compose.yaml:
version: '3'
services:
  db:
    image: mysql:latest
    volumes:
      - ./data/db:/var/lib/mysql
    ports:
      - 6603:3306
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: password
      MYSQL_USER: testuser
      MYSQL_PASSWORD: password
And the following pymysql code:
import pymysql

conn = pymysql.connect(
    host='0.0.0.0',
    port=6603,
    user='testuser',
    password='password',
    charset='utf8mb4',
    cursorclass=pymysql.cursors.DictCursor
)
And received the following error:
OperationalError: (1045, "Access denied for user 'testuser'@'172.18.0.1' (using password: NO)")
The same applies when trying to log in as root.
I have tried:
1) Granting privileges in many permutations, including setting HOST to '%'
2) Removing and recreating the mounted volume
Mainly I want to be able to access the MySQL database from an interactive shell during development.
Any guidance would be great. Thanks.
You should use the IP address 127.0.0.1, or localhost, or your external IP address instead of 0.0.0.0. The address 0.0.0.0 is a special IP address that only makes sense when creating a listening socket, but it makes no sense for creating a connecting socket.
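The listen-versus-connect distinction can be demonstrated with plain sockets (a minimal sketch, no MySQL involved): 0.0.0.0 is accepted as a bind address for a listening socket, meaning "all local interfaces", while a client must target a concrete address such as 127.0.0.1.

```python
import socket

# 0.0.0.0 is a wildcard address: valid for *binding* a listening socket,
# where it means "accept connections on every local interface".
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 0))          # port 0 = let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

# To *connect*, use a concrete address such as 127.0.0.1 instead.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))  # reaches the socket bound to 0.0.0.0
print("connected to 127.0.0.1 on port", port)

client.close()
server.close()
```

The same logic applies to the pymysql call above: the container's MySQL listens on all interfaces, but the client must dial 127.0.0.1 (or localhost) with the mapped port 6603.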
I'm running a Python container and want to connect to PostgreSQL on localhost. I've tried several approaches, but none of them work. How can I do this? Thanks.
I have PostgreSQL running on port 5432, and I've created a database and granted access to a user.
The docker command I run:
docker run --name=python3 -v ${pwd}:/code -w /code python
The Python code:
import psycopg2

def main():
    # Define our connection string
    conn_string = "host='localhost' dbname='testDB' user='test' password='test'"

    # print the connection string we will use to connect
    print("Connecting to database\n ->{}".format(conn_string))

    # get a connection; if a connection cannot be made an exception will be raised here
    conn = psycopg2.connect(conn_string)

    # conn.cursor will return a cursor object; you can use this cursor to perform queries
    cursor = conn.cursor()
    print("Connected!\n")

if __name__ == "__main__":
    main()
The error message:
Is the server running on host "localhost" (::1) and accepting
TCP/IP connections on port 5432?
could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
It depends on your host OS and your docker version.
I assume here your database is not running in a container itself, but rather on the host itself (localhost).
For instance, as mentioned in "From inside of a Docker container, how do I connect to the localhost of the machine?", with Docker for Mac v 17.06 and above (June 2017), you can connect to the special Mac-only DNS name docker.for.mac.localhost which will resolve to the internal IP address used by the host.
On Linux directly, you would use the host mode, with an image having ifconfig installed:
docker run --network="host" -id <Docker image ID>
Since you're on Windows 10 and running PostgreSQL on the host, I advise you to run PostgreSQL in a container instead. It makes this much easier.
To connect the python container to the postgres container you'll need a docker network though. Let's call it postgres_backend.
docker network create postgres_backend
You can create the postgresql container with the following command. Just change the /path/to/store/data to a local directory in which you'd like to store the postgres data:
docker run --name postgres \
-e POSTGRES_PASSWORD=test \
-e POSTGRES_USER=test \
-d --restart always \
-v /path/to/store/data:/var/lib/postgresql/data \
--net postgres_backend \
postgres:9.6
Now your PostgreSQL container should be up and running :)
To connect your python container to it, you'll have to add a --net postgres_backend to your docker run command and change the host in your script to "postgres" (it's the name we gave the postgres container with --name).
If the python container can't find the host "postgres", try it with the IP shown when entering the command docker exec -ti postgres ip addr show.
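Putting the pieces together (the network and container names are the ones created above; the script name is hypothetical, and ${pwd} is kept from the question's command):

```shell
# run the Python container on the same network as the postgres container
docker run --name=python3 --net postgres_backend \
    -v ${pwd}:/code -w /code python python your_script.py

# inside the script, the host becomes the postgres container's name:
#   conn_string = "host='postgres' dbname='testDB' user='test' password='test'"
```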
To fix this bug:
First, it is normal that it does not work: PostgreSQL does not run in the same container as the application, so the host localhost:5432 does not exist there.
To fix it, in the properties file use your machine's IP address instead of localhost:5432 (i.e. IP:5432), and in pg_hba.conf add this line:
host all all 0.0.0.0/0 md5
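This also assumes PostgreSQL accepts connections on all interfaces, as mentioned in an earlier question; in postgresql.conf that is (a configuration sketch):

```
listen_addresses = '*'
```

Restart PostgreSQL after changing either file so the settings take effect.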