Just to say that I am new to docker...
I have a docker-compose application (I like/need the docker-compose way of running multiple containers) that runs a MySQL container and a Python app, and I would like the Python app to run every minute. I tried an infinite loop in the Python code, but Docker seems to prevent it from launching, so I tried cron instead, without any success. (It might not matter, but I run Docker on Windows, which seems less straightforward than on Linux :(, and I use VS Code for development.) So I would like to know whether there is a way, with docker-compose, to run the Python app every minute, and if yes, how to configure cron.
My setup is the following:
Docker installed on Windows.
I run everything from VS Code's terminal, without any Docker extension.
A pythonmain.py file (as an example I used emails, because I could not see any logs or prints once I ran the code from cron; most likely because cron never launches the Python app):
`
import mysql.connector as mysql
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
print("Starting Python Docker")
# EMAIL
mail_content = "Hello"
#The mail addresses and password
sender_address = 'email1@mail.com'
sender_pass = 'pwd'
receiver_address = 'email2@mail.com'
#Setup the MIME
message = MIMEMultipart()
message['From'] = sender_address
message['To'] = receiver_address
message['Subject'] = 'New data email.' #The subject line
#The body and the attachments for the mail
message.attach(MIMEText(mail_content, 'plain'))
#Create SMTP session for sending the mail
session = smtplib.SMTP('smtp.office365.com', 587) # Office 365 SMTP server and port
session.starttls() #enable security
session.login(sender_address, sender_pass) #login with mail_id and password
text = message.as_string()
session.sendmail(sender_address, receiver_address, text)
session.quit()
print('Mail Sent')
#Database
db = mysql.connect(host = 'mydb', user = 'root', password = 'root', port = 3306)
print("db connection passed")
cursor = db.cursor()
cursor.execute("CREATE DATABASE IF NOT EXISTS emails")
cursor.execute("SHOW DATABASES")
#connect to the right database
db = mysql.connect(
host = "mydb",
user = "root",
passwd = "root",
database = "emails"
)
cursor = db.cursor()
## creating a table in database
cursor.execute("CREATE TABLE IF NOT EXISTS emailscount (id VARCHAR(255), count INT)")
## defining the Query for charger data
query = "INSERT INTO emailscount (id, count) VALUES (%s, %s)"
values = ("123",1)
cursor.execute(query, values)
db.commit()
print(cursor.rowcount, "data inserted")
`
A Dockerfile:
`
FROM python:3.9
COPY . .
RUN pip install mysql-connector-python requests
# for implementation without cron:
# CMD ["python", "./pythonmain.py"]
# trial with cron, but python app does not launch:
RUN apt-get update
RUN apt-get install -y cron
RUN chmod +x pythonmain.py
RUN crontab crontab
CMD ["cron", "-f"]
`
A crontab file:
`
*/1 * * * * python ./pythonmain.py
`
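For what it's worth, from what I have read, cron jobs run with a minimal environment (PATH is typically just /usr/bin:/bin) and not from the directory the files were copied to, while the python:3.9 image installs the interpreter under /usr/local/bin, so my relative paths may simply not resolve. An untested sketch of a crontab using absolute paths (assuming COPY . . placed pythonmain.py at the image root /) would be:
`
* * * * * /usr/local/bin/python /pythonmain.py >> /var/log/cron.log 2>&1
`
Redirecting to a log file would at least let me see the prints.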
A docker-compose file:
`
version: "3"
services:
db:
container_name: mydb
image: mysql:5.7
ports:
- "33061:3306"
environment:
MYSQL_ROOT_PASSWORD: root
pythonapp:
container_name: pythonapp
depends_on:
- "db"
links:
- "db"
build: ./
ports:
- "5001:5000"
`
I would like the pythonapp service to run every minute after I run the "docker-compose up" command in the terminal. But it does not work (the Python app never runs, although all containers seem to be created), and I am not sure it can work as it is with the docker-compose command.
What works perfectly is when the Dockerfile has CMD ["python", "./pythonmain.py"] without everything related to cron; then the Python app runs as it should, but of course only once.
I also tried to run just the Python container from cron using docker build and docker run (not docker-compose up), but without any success either; the code does not run (i.e. I was not able to replicate most of the examples/tutorials on cron for Docker). Furthermore, using Docker on Windows, I could not find any error log, even in the AppData\Local\Docker\wsl\data folder (which is empty).
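One alternative I considered is skipping cron entirely and looping inside the Python process itself, so that docker-compose just keeps one long-running container. A rough, untested sketch of what I mean (the job() function standing in for my actual script):
`
import time

def job():
    # placeholder for the real work: send the email and write to MySQL
    print("Running scheduled job")

while True:
    job()
    time.sleep(60)  # wait one minute between runs
`
But as said above, my attempts at a loop did not launch either, so I may be missing something.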
Obviously, I am completely new to docker, so any help would be amazing!
Thanks a lot in advance !!
Related
Using Python 3.10.10 on Windows 10, I am trying to connect to a Mongo database, ideally via SSH. On the command line I just do
ssh myuser#111.222.333.444
mongo
and I can query the Mongo DB. With the following Python code
from pymongo import MongoClient
from pymongo.errors import ConnectionFailure
HOST = "111.222.333.444"
USER = "myuser"
class Mongo:
    def __init__(self):
        self.host = HOST
        self.user = USER
        self.uri = f"mongodb://{self.user}@{self.host}"

    def connection(self):
        try:
            client = MongoClient(self.uri)
            client.server_info()
            print('Connection Established')
        except ConnectionFailure as err:
            raise(err)
        return client

mongo = Mongo()
mongo.connection()
however I get an error
pymongo.errors.ConfigurationError: A password is required.
But as I am able to just log in via SSH using my public key, I do not require a password. How can this be solved in Python?
I also tried to run a command on the command line using ssh alone like
ssh myuser@111.222.333.444 "mongo;use mydb; show collections"
but this does not work like that either.
You are doing two different things. In the first command you connect via SSH (port 22) to the remote server, and on the remote server you start the mongo shell. In the second command, you try to connect directly to the mongod server (default port 27017).
In your case myuser is a user on the remote server's operating system, not a user on the MongoDB.
You can (almost) always connect to a MongoDB without username/password; however, when you provide a username, you also need a password. Try
self.uri = f"mongodb://{self.host}"
It is not fully clear what you are trying to achieve. You can configure MongoDB to log on with an x.509 certificate instead of username/password, see Use x.509 Certificates to Authenticate Clients. These connections are also encrypted via TLS/SSL.
Or are you looking to configure an SSH tunnel? See https://serverfault.com/questions/597765/how-to-connect-to-mongodb-server-via-ssh-tunnel
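If an SSH tunnel is what you are after, a rough sketch (key-based login and the default mongod port 27017 are assumptions here) is to forward a local port over SSH:
`
ssh -N -L 27017:localhost:27017 myuser@111.222.333.444
`
and then point pymongo at the forwarded port, without any username in the URI:
`
from pymongo import MongoClient

# the tunnel above makes the remote mongod reachable on localhost:27017
client = MongoClient("mongodb://localhost:27017")
print(client.server_info())
`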
Here is the solution that I found in the end, as simple as possible; it can be run from within Python, without any special module to install, from a Windows PowerShell:
import json
import subprocess

# HOST and USER are the constants defined in the question above;
# json.dumps adds the quoting needed to survive the nested shells
cmd_mongo = json.dumps('db.units.find({"UnitId": "971201065"})')
cmd_host = json.dumps(f"mongo mydb --eval {cmd_mongo}")
cmd_local = f"ssh {USER}@{HOST} \"{cmd_host}\""
output = subprocess.check_output(cmd_local, shell=True)
print(output)
I am trying to perform a table creation using pyodbc on a SQL Server 2017 database hosted using Docker. I'm also using a network so that I can connect to it later from another Docker image. However, I get the following error
pyodbc.OperationalError: ('HYT00', '[HYT00] [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0) (SQLDriverConnect)')
This is how I went about creating the connection.
To create and run the DB server,
docker run --name mssqldocker -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<password>' -e 'MSSQL_PID=Express' -p 7000:7000 --network=lambda-local-mssql -v <my_path> -d mcr.microsoft.com/mssql/server:2017-latest-ubuntu
I also tried adding
-h "mssqldocker"
to the command for running the Docker image and then using "mssqldocker" instead of localhost, but to no avail, since mismatched hostnames seem to be the recurring theme when using DBs and Docker together. I also tried adding \sqlexpress, again without effect. The Python code is as follows
import pyodbc
import sql_clauses
from settings import ENDPOINT, PORT, USERNAME, PASSWORD
cnxn = pyodbc.connect(
'DRIVER={ODBC Driver 17 for SQL Server}' +
';SERVER=' + ENDPOINT + ';UID=' + USERNAME +
';PWD=' + PASSWORD)
cursor = db.cursor()
cursor.execute(create_database(dbname))
cnxn.commit()
cnxn.close()
print("Database created")
The settings file is as follows
ENDPOINT="localhost"
PORT = 7000
USERNAME="SA"
PASSWORD=<password>
In your docker run command you specify -p 7000:7000. This translates to "map the host port 7000 (the first 7000, published) to the container port 7000 (the second 7000, exposed)". If MSSQL is running on a different port inside your container (which it probably is), then you have to change that second 7000 to the correct port.
Once you do that, you should be able to connect to MSSQL from the host using "localhost:7000". This applies if your Python application runs directly on the host.
If your Python project also runs in a container, you need to make sure it runs on the same network as the mssql container (--network=lambda-local-mssql) and then connect using "mssqldocker:mssql_exposed_port". In this case localhost and 7000 (the first part of -p 7000:...) are not valid anymore, since you are on a Docker-managed network.
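As an illustration (assuming SQL Server listens on its default port 1433 inside the container, which is the usual case), the run command and connection string might look like:
`
docker run --name mssqldocker -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=<password>" -e "MSSQL_PID=Express" -p 7000:1433 --network=lambda-local-mssql -d mcr.microsoft.com/mssql/server:2017-latest-ubuntu
`
and then, from the host:
`
import pyodbc

# for the SQL Server ODBC driver the port goes after a comma in SERVER
cnxn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=localhost,7000;UID=SA;PWD=<password>'
)
`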
The sample code in the question is incomplete and uses variables that are not defined.
A very simple working example is:
# docker compose file
version: "3.9"
services:
  <Some name>:
    image: mcr.microsoft.com/mssql/server:2019-latest # Or whatever version you want
    container_name: <Some name>
    restart: unless-stopped
    ports:
      - "1433:1433"
    environment:
      - ACCEPT_EULA=Y
      - SA_PASSWORD=<Some Password>
      - MSSQL_PID=Developer
      - MSSQL_AGENT_ENABLED=True
import pandas as pd
import pyodbc
cnxn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server}' +
    ';SERVER=' + 'localhost,1433' + ';UID=' + 'sa' +
    ';PWD=' + '<Some password>' +
    ';database=<some DB name>')  # database here is optional if you specify it below in the query
df = pd.read_sql('some query like select * from table', cnxn)
cnxn.commit()
cnxn.close()
print(df)
By default, Duo Sync runs once daily; due to business demand, this needs to be done every 2 hours. Looking at the Duo API, there is a command for user sync:
python -m duo_client.client --ikey <> --skey <> --host api-<>.duosecurity.com --method POST --path /admin/v1/users username=<> /directorysync/<DIR SYNC>/syncuser
However, I don't see an API for a general overall sync with Active Directory. So to work around that, I was hoping to get all the users from the 2FA group and sync each one by username in a loop, using the following:
import sys
import os
import duo_client
from ldap3 import Server, Connection, ALL, NTLM, ALL_ATTRIBUTES, ALL_OPERATIONAL_ATTRIBUTES, AUTO_BIND_NO_TLS, SUBTREE
from ldap3.core.exceptions import LDAPCursorError
server_name = ''
domain_name = ''
user_name = ''
password = '!'
admin_api = duo_client.Admin(
ikey= "",
skey= "",
host= "api-.duosecurity.com",)
format_string = '{:40}'
print(format_string.format('samaccountname'))
server = Server(server_name, get_info=ALL)
conn = Connection(server, user='{}\\{}'.format(domain_name, user_name), password=password, authentication=NTLM,
                  auto_bind=True)
conn.search('dc={},dc=int'.format(domain_name), '(&(objectCategory=user)(memberOf=CN=2FA,OU=,OU=,OU=,OU=,DC=,DC=int))',
            attributes=[ALL_ATTRIBUTES, ALL_OPERATIONAL_ATTRIBUTES])
for e in sorted(conn.entries):
    print(e.samaccountname)
    os.system("python -m duo_client.client --ikey --skey --host api-.duosecurity.com --method POST --path /admin/v1/users username={}/directorysync//syncuser".format(e.samaccountname))
The above code somewhat works, but for some users it also re-creates them with User_IDs such as "username/Dir/DIRAPI/usersync", as shown in the images below (Duo API, Syncing User).
It seemed the username={} parameter was in the wrong place.
The call below is what creates a new user, hence why I was seeing username/..../....:
POST /admin/v1/users username={}
Below is the right way to use the API call:
os.system("python -m duo_client.client --ikey --skey --host api-.duosecurity.com --method POST --path /admin/v1/users/directorysync/syncuser username={}".format(e.samaccountname))"
I need to SSH to a remote Ubuntu server to do a routine job, in the following steps:
ssh in as userA
sudo su - userB
run the daliy_python.py script, which uses psycopg2 to read some info from the database (via a local, non-TCP/IP connection)
scp readings to my local machine
The question is: How to do that automatically?
I've tried to use Fabric, but I ran into a problem with psycopg2; after I run the Fabric script below, I receive the following error from my daliy_python.py:
psycopg2.OperationalError: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/xxx/.s.xxxx"?
My fabfile.py code is as below:
from fabric.api import *
import os
import socket
import pwd
# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False
def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with cd('/home/%s/script/script_folder' % user):
            sudo('whoami')
            sudo('pwd', user=user)
            sudo('ls', user=user)
            sudo('python daliy_python.py', user=user)
Any suggestions? My database can only be accessed by userB locally, but only userA can SSH to the server. That might be a limitation. Both the local and remote machines are running Ubuntu 14.04.
This is what I do to read my root-accessible logfiles without an extra login:
ssh usera@12.34.56.78 "echo hunter2 | sudo -S tail -f /var/log/nginx/access.log"
That is: ssh usera@12.34.56.78 "..run this code on the remote.."
Then, on the remote, you pipe the sudo password into sudo -S: echo hunter2 | sudo -S ...
Add a -u userb to sudo to switch to a particular user (I am using root in my case). Then, as the sudo'ed user, run your script; in my case tail -f /var/log/nginx/access.log.
But, reading your post, I would probably simply set up a cronjob on the remote, so it runs automatically. I actually do that for all my databases. A cronjob dumps them once a day to a certain directory, with the date as filename. Then I download them to my local PC with rsync an hour later.
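For illustration only (paths, database name and schedule are made up), such a pair of cron entries could look like this, one on the remote host dumping the database and one on the local PC fetching the dumps an hour later:
`
# remote crontab: dump the database once a day at 02:00
0 2 * * * pg_dump mydb > /home/userb/backups/$(date +\%F).sql

# local crontab: pull the backup directory at 03:00
0 3 * * * rsync -az usera@12.34.56.78:/home/userb/backups/ /home/me/backups/
`
(Note that % is special in crontab entries and has to be escaped as \%.)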
I finally found out where my problem was.
Thanks @chishake and @C14L, I looked at the problem in another way.
Inspired by these posts (link1, link2), I started to think the problem was related to environment variables.
Thus I added a with statement to alter $HOME, and it worked.
fabfile.py is as below:
from fabric.api import *
import os
import socket
import pwd
# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False
def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with shell_env(HOME='/home/%s' % user):
            sudo('echo $HOME', user=user)
            with cd('/home/%s/script/script_folder' % user):
                sudo('whoami')
                sudo('pwd', user=user)
                sudo('ls', user=user)
                sudo('python daliy_python.py', user=user)
I have a remote Ubuntu server, say 172.123.342.12. I want to take a backup of a PostgreSQL database onto my local machine via a Python script.
My script is:
def backUp(self):
    Pass = 'fb2024d4'
    os.putenv("PGPASSWORD", Pass)
    dt = datetime.now()
    format = "%Y_%b_%d"
    cur_time = dt.now()
    form_time = cur_time.strftime(format)
    backup_str = "C:\\Bitnami\\odoo-8.0-7\\postgresql\\bin\\pg_dump.exe --format=c -h 172.123.342.12 -p 5432 -d new_db -U bn_openerp > C:\\Users\\n\\Desktop\\Odoo_Backups\\%s.dump" % form_time
    os.system(backup_str)
    print("Backup Created in Desktop")
    box.showinfo("Information", "Backup Created")

backup()
It does nothing. Some help will be appreciated.
EDIT: The script works on a database on Windows, as I am using the admin account, so it does not ask for a password. But when I try to back up a database from the remote Ubuntu server, it asks for a password.
I have tried the following solutions:
1.) SET PGPASSPASSWORD = C:\foo\bar..\pgpass.conf.
2.) os.putenv("PGPASSWORD","password")
3.) PGPASSWORD='password' pg_dump.exe -h localhost.....
None of them worked for me.
I was able to use a Python script to create a dump file using pg_dump.exe:
import subprocess

filename = 'C:/Path/To/File/mydb_dump.sql'
pgDump = 'C:/Program Files/PostgreSQL/9.5/bin/pg_dump'
subprocess.Popen('"{}" -h 127.0.0.1 dbname > "{}"'.format(pgDump, filename), shell=True)
A few things to note:
I STRONGLY CAUTION AGAINST USING shell=True !!!
There is a huge security hazard with possible shell injections as per the documentation.
I'm not sure if it will work with a remote Ubuntu server, but I can't see why not if all permissions and sharing are set up properly.
I know this is pretty old, but I hope it helps.
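For what it's worth, a rough sketch of the same idea without shell=True, passing the password through the environment and handling the output redirection in Python (paths, host and database name are placeholders taken from the question, not something I have tested):
`
import os
import subprocess

pg_dump = r"C:\Bitnami\odoo-8.0-7\postgresql\bin\pg_dump.exe"
# placeholder password; pg_dump reads PGPASSWORD, so there is no prompt
env = dict(os.environ, PGPASSWORD="secret")

with open(r"C:\Users\n\Desktop\Odoo_Backups\new_db.dump", "wb") as out:
    subprocess.run(
        [pg_dump, "--format=c", "-h", "172.123.342.12", "-p", "5432",
         "-d", "new_db", "-U", "bn_openerp"],
        env=env, stdout=out, check=True,
    )
`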