Cron not working on Amazon Elastic Beanstalk with Python and PostgreSQL - python

I have built a simple Django app and successfully deployed it to Beanstalk. The app uses a PostgreSQL backend on an RDS instance. From a browser, I can access the admin and create and delete models in it. However, I'm also trying to run a cron job that updates the database. I installed the cron job on the server, but it didn't work. So I shelled in, ran the commands manually, and got the following error:
Is the server running on host "127.0.0.1" and accepting TCP/IP connections on port 5432?
From my Googling, I'm guessing this has something to do with either security groups, allowed hosts, or JDBC, perhaps something that lets the Beanstalk EC2 instance and the RDS instance interact with each other. But I'm lost. I tried the instructions from this AWS tutorial.
For the record, the script that the cron runs works perfectly when run locally as python manage.py runscript scrape.
Other stuff:
The tutorial I followed for deploying my app.
The tutorial I followed for the cron
Cron
* * * * * /opt/python/run/venv/bin/python3.4 /opt/python/current/app/manage.py runscript scrape
Database part of settings.py
if 'RDS_DB_NAME' in os.environ:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': os.environ['RDS_DB_NAME'],
            'USER': os.environ['RDS_USERNAME'],
            'PASSWORD': os.environ['RDS_PASSWORD'],
            'HOST': os.environ['RDS_HOSTNAME'],
            'PORT': os.environ['RDS_PORT'],
        }
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'db',
            'USER': 'user',
            'PASSWORD': 'password',
            'HOST': '127.0.0.1',
            'PORT': '',
        }
    }

This is how you run a Python script from cron; it is really a question about cron.
First, add a shebang line at the top of your Python script:
#!/usr/bin/env python3
Make your script executable with chmod +x.
Then do crontab -e and add:
0 0 */2 * * /path/to/your/pythonscript.py
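Since the scrape job touches Django models, a standalone script run this way would also need to configure Django before importing them. A minimal sketch, assuming django-extensions' runscript layout (a scripts/scrape.py exposing a run() function) and a settings module called myproject.settings; both names are assumptions, so adjust them to your project:
#!/usr/bin/env python3
# Hypothetical standalone wrapper for the scrape job; module names are illustrative.
import os

import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django.setup()

from scripts.scrape import run  # django-extensions runscript convention

if __name__ == '__main__':
    run()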

I'm dumb. I forgot to actually set the environment variables in the AWS console. What can I say, it has been a long day.
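Because the settings fall back to 127.0.0.1 whenever RDS_DB_NAME is absent, this kind of misconfiguration fails silently. A small guard like the following would surface it immediately; this is my own sketch, not part of the original settings.py, and it only makes sense if the deployed app should always talk to RDS:
# Illustrative guard, not part of the original settings.py.
import os

from django.core.exceptions import ImproperlyConfigured

REQUIRED_RDS_VARS = ('RDS_DB_NAME', 'RDS_USERNAME', 'RDS_PASSWORD', 'RDS_HOSTNAME', 'RDS_PORT')
missing = [name for name in REQUIRED_RDS_VARS if name not in os.environ]
if missing:
    raise ImproperlyConfigured('Missing RDS environment variables: %s' % ', '.join(missing))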

Related

Connect to remote mysql with django on Windows 10

I've got a Django settings file which works on Mac and Linux and lets me use my my.cnf file to connect to a remote MySQL database on AWS. However, on Windows 10 this doesn't seem to be the case. It keeps saying it can't connect to the local database, as if the my.cnf file doesn't exist.
I've installed MySQL Connector and the Python MySQL connector on Windows 10, and ran pip install mysqlclient as I would need to on Linux, but the problem still persists.
from pathlib import Path

db_conf = Path("Project/my.cnf")

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'OPTIONS': {
            'read_default_file': '{}'.format(db_conf),
        },
    }
}
With the same settings file I can run makemigrations, migrate, and run the server on Mac/Linux. On Windows the same thing crashes. my.cnf is the same on Windows/Mac/Linux; any help would be appreciated.
I suspect it's a hang-up with the way Windows handles paths, but I'm unsure how to resolve this, as I usually code on Linux.
I've resolved the issue, but not in the way I wanted. I don't know why, but Django on Windows doesn't seem to like using os.path or Path to locate the file when connecting to a MySQL database.
So I've taken a longer-winded route using configparser: instead of having Django read the config file, I parse it myself and give Django the details directly in the settings file.
import configparser
from pathlib import Path

db_conf = Path("Project/my.cnf")

conf = configparser.ConfigParser()
conf.read(db_conf)

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': conf.get("client", "database"),
        'USER': conf.get("client", "user"),
        'PASSWORD': conf.get("client", "password"),
        'HOST': conf.get("client", "host"),
        'PORT': '3306',
    }
}
What I find funny about this whole situation is that Path works normally everywhere except when Django reads it on Windows. It's not the solution I wanted, but it does resolve the issue. I hope this helps others who may be struggling with the same thing when connecting to a remote MySQL server on Windows.
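For what it's worth, an untested alternative (my own assumption, not what the answer above used) is to hand read_default_file an absolute string rather than a Path object, since that value is passed straight through to the MySQL client library:
from pathlib import Path

# Hypothetical layout: my.cnf sits next to settings.py; adjust the path to your project.
db_conf = Path(__file__).resolve().parent / "my.cnf"

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'OPTIONS': {
            'read_default_file': str(db_conf),  # plain absolute string, not a Path object
        },
    }
}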

django settings.py won't read env variable for unit tests

I have database environment variables specified for my django app:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.getenv("POSTGRES_NAME"),
        'USER': os.getenv("POSTGRES_USER"),
        'PASSWORD': os.getenv("POSTGRES_PW"),
        'HOST': os.getenv("POSTGRES_HOST"),
        'PORT': os.getenv("POSTGRES_PORT"),
    }
}
The variables are read successfully when I run python manage.py runserver, during the build on Circle CI, and also in the production environment. But I don't understand why they aren't read when I run unit tests.
Thanks for the help!
It turns out the problem was that I hadn't restarted my IDE in some time. I had to restart it so it would pick up the env vars from the virtual environment.
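One way to avoid depending on what the shell or IDE happens to export is to load the variables from a file at the top of settings.py. This is an aside of mine, assuming the python-dotenv package, which was not part of the original setup:
from pathlib import Path

from dotenv import load_dotenv  # pip install python-dotenv

# Load POSTGRES_* from a .env file next to settings.py so runserver,
# CI, and the test runner all see the same values.
load_dotenv(Path(__file__).resolve().parent / ".env")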

Set up a Django Project with Mamp?

I just downloaded a newer version of MAMP (3.2.1) and noticed that this version has Python installed and also seems to handle SQLite databases.
Shouldn't I be able to manage Django projects with it?
Where and how would I install it?
I found some posts on the web (predating my MAMP release) where people were already trying to get MAMP + Django working with MySQL, but those seemed more complicated to me than the usual setup with virtualenv + SQLite/Postgres.
I'm pretty new to Django, but starting a project the usual way seems quite simple to me.
If Django worked together with MAMP, what would the advantages be?
Does anyone have experience or useful links?
OK, I guess working with MAMP MySQL has the advantage that I can easily import/export the database with the phpMyAdmin tool.
Anyway, based on tanorix's answer here, this is how Django worked for me with a MAMP MySQL database:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'projectdb',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': '/Applications/MAMP/tmp/mysql/mysql.sock',
        'PORT': '8888',
    }
}
Then
python manage.py migrate
I don't know MAMP, but I can give you some pointers on setting up a Django database with WAMP, and I think the procedure is the same:
First, in MAMP, you need to create a database; call it projectdb.
Then, in your settings.py, update the DATABASES variable like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',  # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
        'NAME': 'projectdb',  # Or path to database file if using sqlite3.
        # The following settings are not used with sqlite3:
        'USER': 'root',
        'PASSWORD': '',
        'HOST': '127.0.0.1',  # Empty for localhost through domain sockets or '127.0.0.1' for localhost through TCP.
        'PORT': '',  # Set to empty string for default.
    }
}
Then, if you are using South, run this in your shell:
python manage.py schemamigration <name of your app> --init
python manage.py syncdb  # => creates your tables in MAMP
python manage.py migrate

Setting up Django with MySQL, syncdb giving segmentation fault?

I'm currently building a Django app, following the "tangowithdjango" tutorial, which uses Django 1.5.4. The tutorial uses SQLite, but I'm planning to build this app for more robust purposes, which is why I'm attempting to connect to MySQL instead.
Needless to say, it's been a nightmare. I can't get MySQL to connect for the life of me.
Here's what my settings.py looks like:
DATABASE_PATH = os.path.join(PROJECT_PATH, 'app.db')

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'rideb',
        'USER': 'root',
        'PASSWORD': 'nantucket',
        # 'HOST': 'localhost',  # Empty for localhost through domain sockets or '127.0.0.1' for localhost through TCP.
        # 'PORT': '',  # Set to empty string for default.
    }
}
And here's the output I'm getting:
(rideb)grantmcgovern#gMAC:~/Dropbox/Developer/Projects/RideB/master$ python manage.py syncdb
Segmentation fault: 11
I've installed python-mysqldb, and now I'm simply getting this; I'm very perplexed, to say the least. Is it some Django compatibility issue?
Everything works fine with SQLite, as the tutorial suggests, but I'm not looking to use that.
OS:
Mac OSX 10.10 Yosemite
MySQL (installed via .dmg on Oracle's site):
Server version: 5.6.19 MySQL Community Server (GPL)
I had a similar problem and it turns out it was related to incorrect mysql implementation on my Mac (OS X 10.10 Yosemite). I had "mysql-connector-c" installed instead of "mysql". Uninstalling "mysql-connector-c" (brew uninstall mysql-connector-c) and installing "mysql" (brew install mysql) resolved the problem right away.
To find out if you are facing this exact problem as I did, open the Console app on your Mac, go to User Diagnostic Reports and look for the report for your crash (should start with Python_). Under queue, if it shows you "0 libmysqlclient.18.dylib", then you have the same problem as I had.
As already mentioned, you do not need DATABASE_PATH.
My working configuration looks like:
DATABASES = {
    'default': {
        'ENGINE': 'mysql.connector.django',
        'NAME': 'db_name',
        'USER': 'db_user',
        'PASSWORD': 'db_pass',
        'HOST': '127.0.0.1',
    }
}
I am using a different engine because it is required for Python 3 (see the documentation).
After that, you have to create the database and the user. Don't forget to grant the user all the rights it needs.
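For the create-the-database-and-user step, here is one possible sketch of my own (not from the answer; the names and passwords are placeholders) using mysql-connector-python, the same driver the engine above relies on. Run it once with an administrative account:
# Illustrative one-off script; db_name, db_user and both passwords are placeholders.
import mysql.connector

admin = mysql.connector.connect(host='127.0.0.1', user='root', password='your_root_password')
cursor = admin.cursor()
cursor.execute("CREATE DATABASE IF NOT EXISTS db_name CHARACTER SET utf8")
cursor.execute("CREATE USER 'db_user'@'localhost' IDENTIFIED BY 'db_pass'")
cursor.execute("GRANT ALL PRIVILEGES ON db_name.* TO 'db_user'@'localhost'")
cursor.execute("FLUSH PRIVILEGES")
cursor.close()
admin.close()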
With python manage.py migrate, your database is populated with your models.
syncdb, the predecessor to migrate, will be removed in Django 1.9.

Linking Django and Postgresql with Docker

I have two Docker containers. The first one is a PostgreSQL container, which I run using the following command:
sudo docker run -v /home/mpmsp/project/ezdict/postgresql/data:/var/lib/postgresql/data -p 127.0.0.1:5432:5432 -name my-postgres -d postgres
It is based on the official image and works perfectly; I can connect to PostgreSQL from the host.
The second container holds my Django application. The image is built from the following Dockerfile (based on this image):
FROM python:3-onbuild
EXPOSE 8000 5432
CMD ["/bin/bash"]
And I run this container with the following command
sudo docker run --link my-postgres:my-postgres -v /home/mpmsp/project/ezdict/ezbkend:/usr/src/app -name my-app -i -t my-app
The docker ps output shows that the containers are linked:
NAMES
my-app/my-postgres, my-postgres
However, when I go to localhost:8000, I see an error page from Django, with the following output
OperationalError at /api-auth/login/
could not connect to server: Connection refused
Is the server running on host "127.0.0.1" and accepting
TCP/IP connections on port 5432?
Request Method: GET
Request URL: http://127.0.0.1:8000/api-auth/login/
Django Version: 1.6.4
Exception Type: OperationalError
Exception Value:
could not connect to server: Connection refused
Is the server running on host "127.0.0.1" and accepting
TCP/IP connections on port 5432?
Exception Location: /usr/local/lib/python3.4/site-packages/psycopg2/__init__.py in connect, line 164
Python Executable: /usr/local/bin/python
Python Version: 3.4.1
Python Path:
['/usr/src/app',
'/usr/local/lib/python34.zip',
'/usr/local/lib/python3.4',
'/usr/local/lib/python3.4/plat-linux',
'/usr/local/lib/python3.4/lib-dynload',
'/root/.local/lib/python3.4/site-packages',
'/usr/local/lib/python3.4/site-packages']
Server time: Fri, 10 Oct 2014 12:07:07 +0400
Application's settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'postgres',
        'PASSWORD': '',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
How do I make the linking work? Thanks in advance.
The Dockerfile for your Django image should not expose port 5432, as no PostgreSQL server will be running in any container created from that image:
FROM python:3-onbuild
EXPOSE 8000
CMD ["/bin/bash"]
Then, as you are running the Django container linked with
--link my-postgres:my-postgres
your database settings are incorrect: inside the Django container, 127.0.0.1 refers to the Django container itself, which isn't running any service listening on port 5432.
So your settings.py file should be:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'postgres',
        'PASSWORD': '',
        'HOST': 'my-postgres',
        'PORT': '5432',
    }
}
If you instead ran your Django container with:
sudo docker run --link my-postgres:db -v /home/mpmsp/project/ezdict/ezbkend:/usr/src/app -name my-app -i -t my-app
then your settings.py file would have to be:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'postgres',
        'PASSWORD': '',
        'HOST': 'db',
        'PORT': '5432',
    }
}
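As an aside (my own sketch, not part of the answer above): legacy --link also injects environment variables named after the alias into the linked container, so the host and port can be read from the environment instead of being hard-coded. With --link my-postgres:db the variables are DB_PORT_5432_TCP_ADDR and DB_PORT_5432_TCP_PORT:
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'postgres',
        'PASSWORD': '',
        # Fall back to the link alias if the variables are absent.
        'HOST': os.environ.get('DB_PORT_5432_TCP_ADDR', 'db'),
        'PORT': os.environ.get('DB_PORT_5432_TCP_PORT', '5432'),
    }
}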
The syncdb only works AFTER both the db and django containers are built and started; then you can manually run the syncdb command with fig/docker-compose/docker.
I am thinking of creating an at job and letting the container run syncdb itself (and creating an admin user after the syncdb) to create the necessary tables.
