Python code running in Docker not working with ngrok

I'm rather new to coding in general and especially new to Docker and Python, so please excuse any blatant mistakes. I have been struggling with a specific issue for weeks now and maybe someone here can help me.
A webservice we'd like to use needs to be connected to a PostgreSQL database via a Python connector. The connector basically compares data input on the webservice with the data in the DB and matches variables.
https://github.com/rossumai/elis-connector-example-python
Right now I am running the database in AWS and I have successfully uploaded sample data to it. I have also run the connector.py code in Docker via the localhost:5000 port. Now I want to connect all the parts: I am running ngrok to tunnel my localhost:5000 and telling the webservice to connect to the https address ngrok gives me.
Unfortunately I am receiving the error "Failed to complete tunnel connection: The connection was successfully tunneled but the client failed to establish a connection to localhost:5000".
Is this a port issue?
This is my Dockerfile:
FROM python:3.7
WORKDIR /src/connector.py
COPY requirements.txt ./
RUN apt-get update && apt-get install -y libpq-dev gcc
RUN pip3 install psycopg2~=2.7.7
RUN apt-get autoremove -y gcc
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD [ "python", "./connector.py" ]
Any help? Or what else can I add to make my problem more comprehensible?
BR & thanks for any help
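That error usually means ngrok reached the host machine but nothing was listening on localhost:5000 there. A likely cause, though the docker run command isn't shown in the question: the container's port was never published with docker run -p 5000:5000 (EXPOSE alone doesn't publish it), or the server inside the container binds only 127.0.0.1. A minimal sketch of the binding fix, assuming the connector is served by a Flask app as in the linked example (the route name here is illustrative, not the connector's real endpoint):

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/validate", methods=["POST"])  # hypothetical endpoint name
def validate():
    return jsonify(request.get_json())

if __name__ == "__main__":
    # 0.0.0.0 accepts connections from outside the container;
    # Flask's default of 127.0.0.1 is unreachable through the published port.
    app.run(host="0.0.0.0", port=5000)

With that in place, start the container with docker run -p 5000:5000 so the host's localhost:5000, which ngrok forwards to, actually reaches the container.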

Related

Docker Max retries exceeded with url: /assets

I'm a bit new to Docker and I'm messing around with it. I currently have a server running on port 5000 in another container. The server is run with Express and uses JavaScript. I'm trying to send requests to that server with Python. I tried using both localhost:5000 and 127.0.0.1:5000, but neither seems to work. What can I do? I noticed that if I run the Python code without Docker, it works perfectly fine.
Python Docker File:
FROM ubuntu:latest
RUN apt update
RUN apt install python3 -y
RUN apt-get install -y python3-pip
WORKDIR /usr/app/src
COPY . .
RUN pip install discum
RUN pip install python-dotenv
EXPOSE 5000
CMD ["python3", "./src/index.py"]
JavaScript Docker File:
FROM node:latest
WORKDIR /usr/src/app
COPY package.json ./
RUN npm install
COPY . .
CMD ["npm", "start"]
You could create a network between the two containers using --net; see this answer: How to get Docker containers to talk to each other while running on my local host?
Another way, and my preferred way, is to use docker-compose and create networks between your containers.
Using the service name and port is always best.
So if you have a Compose file like the one below, you could use the URL http://client:5000:
version: "3.8"
services:
  client:
    image: blah
    ports:
      - "5000:5000"
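Inside that Compose network, Docker's embedded DNS resolves the service name to the container's IP, so the Python container can reach the Express server by name. A minimal sketch, assuming the requests library is installed and a hypothetical root endpoint:

import requests

# "client" is the Compose service name from the file above; Docker's
# embedded DNS resolves it on the shared network, unlike localhost,
# which points at the calling container itself.
response = requests.get("http://client:5000/")
print(response.status_code, response.text)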

Can't connect to SQL server from Container

My Python container is throwing the error below when trying to connect to my SQL DB hosted on a server:
mariadb.OperationalError: Can't connect to server on 'xxxxxxxxxxx.jcloud-ver-jpc.ik-server.com' (115)
I am trying to run my container from my server as well. If I run the exact same container from my machine, I can connect to the SQL DB.
I am new to Docker, so just for info, here is my Dockerfile:
FROM python:3.10-slim-buster
WORKDIR /app
COPY alpha_gen alpha_gen
COPY poetry.lock .
COPY pyproject.toml .
# install basic utils
RUN apt-get update
RUN apt-get install curl -y
RUN apt-get install gcc -y
# install MariaDB connector
RUN apt install wget -y
RUN wget https://downloads.mariadb.com/MariaDB/mariadb_repo_setup
RUN chmod +x mariadb_repo_setup
RUN ./mariadb_repo_setup --mariadb-server-version="mariadb-10.6"
RUN apt install libmariadb3 libmariadb-dev -y
# install poetry
RUN curl -sSL https://install.python-poetry.org | python3 -
ENV PATH="${PATH}:/root/.local/bin"
RUN poetry config virtualenvs.in-project true
# install dependencies
RUN poetry install
CMD poetry run python alpha_gen/main.py --load_pre_process
Any ideas?
Problem solved. Apparently there is a private port to use for communication from the server itself and a public port for communication from outside the server. I was using the public one, so it was not working.
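For anyone hitting the same thing, the only change is the port passed to the connector. A hedged sketch with placeholder credentials and port, assuming the mariadb connector used above:

import mariadb

# Placeholder values; the point is that "port" must be the server's
# private/internal port, not the publicly exposed one, when the code
# runs on the server itself.
conn = mariadb.connect(
    host="xxxxxxxxxxx.jcloud-ver-jpc.ik-server.com",
    port=3306,  # private port here; the public port fails from inside the server
    user="user",
    password="secret",
    database="mydb",
)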

Python Docker error: [FreeTDS][SQL Server]Unable to connect to data source (0) (SQLDriverConnect) on SQL Server

I'm trying to make a simple MS SQL Server call from Python by using Docker. The SQL connection can be established if I run the Python script directly, but it won't work from Docker.
My code is below
Dockerfile
FROM python:3
WORKDIR /code
COPY requirements.txt .
RUN apt-get update \
&& apt-get install unixodbc -y \
&& apt-get install unixodbc-dev -y \
&& apt-get install freetds-dev -y \
&& apt-get install freetds-bin -y \
&& apt-get install tdsodbc -y \
&& apt-get install --reinstall build-essential -y
RUN echo "[FreeTDS]\n\
Description = FreeTDS Driver\n\
Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so\n\
Setup = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so" >> /etc/odbcinst.ini
#Pip command without proxy setting
RUN pip install -r requirements.txt
COPY src/ .
CMD ["python", "./producer.py"]
producer.py
import pyodbc
connP = pyodbc.connect('driver={FreeTDS};'
                       'server={MYSERV01\SQLEXPRESS};'
                       'database=ABCD;'
                       'uid=****;'
                       'pwd=*****')
requirements.txt
kafka-python
pyodbc==4.0.28
The full error message is the one in the title. I followed the steps in this article and searched online for resolutions, trying several things, but nothing helped. I'm pretty new to Docker and have no experience with Python, so any help would be really good. Thanks in advance!
In your pyodbc.connect, try giving the server as '0.0.0.0' instead of any other value.
If you want to debug it from inside the container, comment out the last CMD line of your Dockerfile.
Build your Docker container
docker build -f Dockerfile -t achu-docker-container .
Run your Docker Container
docker run -it achu-docker-container /bin/bash
This will place you inside the container. It is like SSHing into a different machine.
Go to your WORKDIR
cd /code
python ./producer.py
What do you get from the above? (If you install an editor with apt-get install vim, you will be able to interactively edit producer.py and fix your problem from inside the running container.)
Then you can move your changes to your source Dockerfile and build a new image and container with it.
I was trying to connect to the local SQL Server database. I referred to many articles and figured out that the following works:
The server should be host.docker.internal,<port_no> (that was the catch). With a dedicated database, where SQL Server runs on a machine different from the Docker host, the server name is provided directly; but when both the image and the database are on the same machine, the host.docker.internal form works. Check the port number under the TCP/IP settings (IPAll) in SQL Server Configuration Manager.
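A hedged sketch of that connection (credentials are placeholders; 1433 is SQL Server's default TCP port, but check Configuration Manager; FreeTDS takes the port as a separate keyword, whereas the comma form above is the Microsoft driver convention):

import pyodbc

# host.docker.internal resolves to the Docker host from inside the
# container, so it reaches a SQL Server instance on the same machine.
connP = pyodbc.connect('driver={FreeTDS};'
                       'server=host.docker.internal;'
                       'port=1433;'
                       'database=ABCD;'
                       'uid=****;'
                       'pwd=*****;'
                       'TDS_Version=8.0;')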

Correct way for deploying dbt with docker and cloud run

I'm trying to deploy dbt on a Google Cloud Run service with a Docker container, following David Vasquez and the dbt Docker images. However, when trying to deploy the built image to Cloud Run, I'm getting an error.
ERROR: (gcloud.run.deploy) Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information.
This is my dockerfile
FROM python:3.8.1-slim-buster
RUN apt-get update && apt-get dist-upgrade -y && apt-get install -y --no-install-recommends git software-properties-common make build-essential ca-certificates libpq-dev && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
COPY requirements/requirements.0.17.0rc4.txt ./requirements.0.17.0rc4.txt
RUN pip install --upgrade pip setuptools
RUN pip install -U pip
RUN pip install dbt==0.17.0
RUN pip install --requirement ./requirements.0.17.0rc4.txt
RUN useradd -mU dbt_user
ENV PYTHONIOENCODING=utf-8
ENV LANG C.UTF-8
ENV PORT = 8080
ENV HOST = 0.0.0.0
WORKDIR /usr/app
VOLUME /usr/app
USER dbt_user
CMD ['dbt', 'run']
I understand the health check fails because it can't find a port to listen on, even though I specify one in my ENV.
Can anyone help me with a solution? Thanks in advance!
According to the documentation, one of the requirements for deploying an application on Cloud Run is that it listen for requests on 0.0.0.0 and expose a port:
The container must listen for requests on 0.0.0.0 on the port to which requests are sent. By default, requests are sent to 8080, but you can configure Cloud Run to send requests to the port of your choice.
dbt is a command-line tool, which means it doesn't listen on any port. So when Cloud Run verifies that the deployed container is listening, the check fails with the mentioned error.
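If it must run on Cloud Run anyway, one workaround (a sketch under assumptions, not an official dbt pattern; it assumes Flask is added to the image's requirements) is to wrap the CLI in a minimal web server so something listens on the port, and shell out to dbt run when a request arrives:

import os
import subprocess

from flask import Flask  # assumed to be added to requirements

app = Flask(__name__)

@app.route("/", methods=["POST"])
def run_dbt():
    # Shell out to the dbt CLI; the HTTP layer exists only so Cloud Run
    # sees a process listening on $PORT.
    result = subprocess.run(["dbt", "run"], capture_output=True, text=True)
    status = 200 if result.returncode == 0 else 500
    return result.stdout + result.stderr, status

if __name__ == "__main__":
    # Cloud Run injects the port to listen on via the PORT variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", "8080")))

For scheduled dbt runs, Cloud Run Jobs or a plain VM avoids the port requirement altogether.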

Install mysql in dockerfile?

I want to write a simple Python application and put it in a Docker container with a Dockerfile. My Dockerfile is:
FROM ubuntu:saucy
# Install required packages
RUN apt-get update
RUN DEBIAN_FRONTEND=noninteractive apt-get -y install python
RUN DEBIAN_FRONTEND=noninteractive apt-get -y install mysql-server python-mysqldb
# Add our python app code to the image
RUN mkdir -p /app
ADD . /app
WORKDIR /app
# Set the default command to execute
CMD ["python", "main.py"]
In my Python application I only want to connect to the database. main.py looks something like this:
import MySQLdb as db
connection = db.connect(
    host='localhost',
    port=3306,
    user='root',
    passwd='password',
)
When I built the Docker image with:
docker build -t myapp .
and ran it with:
docker run -i myapp
I got this error:
_mysql_exceptions.OperationalError: (2002, "Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)")
What is the problem?
The problem is that you've never started the database - you need to explicitly start services in most Docker images. But if you want to run two processes in Docker (the DB and your python program), things get a little more complex. You either have to use a process manager like supervisor, or be a bit cleverer in your start-up script.
To see what I mean, create the following script, and call it cmd.sh:
#!/bin/bash
mysqld &
python main.py
Add it to the Dockerfile:
FROM ubuntu:saucy
# Install required packages
RUN apt-get update
RUN DEBIAN_FRONTEND=noninteractive apt-get -y install python
RUN DEBIAN_FRONTEND=noninteractive apt-get -y install mysql-server python-mysqldb
# Add our python app code to the image
RUN mkdir -p /app
ADD . /app
WORKDIR /app
# Set the default command to execute
COPY cmd.sh /cmd.sh
RUN chmod +x /cmd.sh
CMD /cmd.sh
Now build and try again. (Apologies if this doesn't work, it's off the top of my head and I haven't tested it).
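One more wrinkle with the script above: mysqld is backgrounded, so it may still be initializing when main.py connects. A hedged sketch of a retry loop on the Python side to absorb that race:

import time

import MySQLdb as db

# mysqld may still be starting when this script runs, so retry briefly
# instead of failing on the first refused connection.
connection = None
for attempt in range(10):
    try:
        connection = db.connect(host='localhost', port=3306,
                                user='root', passwd='password')
        break  # connected successfully
    except db.OperationalError:
        time.sleep(2)  # not ready yet; wait and try again
if connection is None:
    raise RuntimeError("MySQL did not become available in time")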
Note that this is not a good solution; mysql will not be getting signals proxied to it, so it probably won't shut down properly when the container stops. You could fix this by using a process manager like supervisor, but the easiest and best solution is to use separate containers. You can find stock containers for mysql and also for python, which would save you a lot of trouble. To do this:
Take the mysql installation stuff out of the Dockerfile
Change localhost in your Python code to mysql, or whatever you want to call your MySQL container (see the sketch after this list).
Start a MySQL container with something like docker run -d --name mysql mysql
Start your container linked to the mysql container, e.g.: docker run --link mysql:mysql myapp
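For step 2, the only change in main.py is the host name; a sketch assuming the MySQL container is named mysql as in step 3:

import MySQLdb as db

# "mysql" is the linked container's name; Docker makes it resolvable from
# this container, while localhost would point back at this container itself.
connection = db.connect(
    host='mysql',
    port=3306,
    user='root',
    passwd='password',
)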
