Django Error: cannot import name autodiscover_modules

I get this error when I deploy my Django project on another VPS. The same code runs successfully on my MacBook and on a staging VPS.
My website is based on Django 1.4.20 and imports some third-party Python libraries and Django apps, for example redis-py, requests, django-import-export, django-kronos, and django-cors-headers. I installed these with pip install.
I'm really confused about how this happened. Maybe it's a library dependency problem, but I can't find a detailed error log or stack trace. Thanks for your time.

You should have a requirements.txt with your webapp. Then run pip install -r requirements.txt when you deploy.
If you did not make such a file, you can create one later by running pip freeze > requirements.txt. But beware that it may list packages that are not actually needed, if you installed other things on the side, so be prepared to screen the file manually.
If you work with multiple webapps you may also need to isolate each app's requirements. Two options: Docker or virtualenv. If you don't know what Docker is and don't have time to learn it, I suggest you go with virtualenv for now.
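The freeze-and-screen workflow above can be sketched as follows; run it on the machine where the site already works (the MacBook or staging VPS), assuming pip is on the PATH:

```shell
# Capture the working environment's packages as pinned requirements.
pip freeze > requirements.txt
# Count the pinned entries, then open the file and remove anything
# you know the project does not actually need before committing it.
wc -l requirements.txt
```

On the new VPS, `pip install -r requirements.txt` then reproduces the exact versions that are known to work together.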

Related

Installing requirements.txt on heroku runs error

I am deploying an application using Heroku. In the Python code I am using an NLP library called spaCy.
I already deployed the app without any installed libraries, and it worked perfectly.
I added the library to the app's requirements.txt:
In my cmd, at the time of deployment I am using:
pip install -r requirements.txt
However, it raises an error:
(I intentionally avoided pasting the full error output because of its size.) I also installed Microsoft C++ Build Tools and restarted cmd, but now it raises additional errors.
This is a screenshot of the files I am uploading:
Is there a way I can install the libraries remotely? I would like to avoid countless installations. Given the errors shown in red, am I doing something wrong, or am I just missing packages?
That is a local error on your Windows machine, that should not occur on Heroku. The Heroku build pack should automatically install dependencies if you put requirements.txt at root. See https://devcenter.heroku.com/articles/python-pip
To fix your local error, you need to install the C++ build tools as the error message says.

Is this the right method for distributing a python app with git and pip?

I write python applications for internal use at my company. Everyone has python installed on their machines, but they are not developers, so cloning a repo or creating a venv is outside of their skillset. I was hoping to change my apps so they may update themselves when I push an update to the git repo on a shared network drive. I would like the app code to update, but also the dependencies installed in each of their virtual environments.
I was thinking of wrapping my application in some code which, before importing any of the dependencies:
Checks if a venv exists
if it doesn't, creates one using the local requirements.txt
if it does, skips this step
Checks if the remote git repo's version tag is more recent than the local repo's version tag
if a more recent version is available, it will pull from the master repo. The repo will only contain the app code and the requirements.txt
It will then pip install --upgrade the new or updated packages using the new requirements.txt file
if a newer version isn't available, it just runs the app
This all seems doable, and not terribly complicated, but I can't help but feel like I'm reinventing the wheel here. This seems a bit too manual and too common of a problem for there not to be a better way of doing this. What am I missing? What's the best way to handle this? Thanks for your help, everyone!
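The bootstrap wrapper described above might look something like the sketch below. All paths, the venv layout, and the dotted version-tag scheme are hypothetical; adapt them to your repo.

```python
import os
import subprocess
import sys

VENV_DIR = "venv"
REQUIREMENTS = "requirements.txt"

def venv_python(venv_dir=VENV_DIR):
    """Path to the interpreter inside the venv, per platform."""
    subdir = "Scripts" if os.name == "nt" else "bin"
    return os.path.join(venv_dir, subdir, "python")

def ensure_venv(venv_dir=VENV_DIR):
    """Create the venv and install requirements if it doesn't exist yet."""
    if not os.path.isdir(venv_dir):
        subprocess.check_call([sys.executable, "-m", "venv", venv_dir])
        subprocess.check_call([venv_python(venv_dir), "-m", "pip",
                               "install", "-r", REQUIREMENTS])

def newer_version_available(local_tag, remote_tag):
    """Compare dotted version tags like '1.4.2' or 'v1.4.2' numerically."""
    parse = lambda tag: tuple(int(p) for p in tag.lstrip("v").split("."))
    return parse(remote_tag) > parse(local_tag)
```

On startup the wrapper would call `ensure_venv()`, compare tags with `newer_version_available()`, and on an update run `git pull` plus `pip install --upgrade -r requirements.txt` through `subprocess` before re-executing the app inside the venv interpreter.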

Install Python Flask without using pip

How do I install Python Flask without using pip?
I do not have pip, virtualenv nor easy_install.
The context of this question is that I am on a tightly controlled AIX computer. I cannot install any compiled code without going through several layers of management. However, I can install python modules.
Python 2.7 is installed.
I have some existing python code that generates a report.
I want to make that report available on a web service using Flask.
I am currently using Bottle, but I am going to want to use HTTPS, and HTTPS support under Flask seems much more straightforward.
I would like to put the Flask library (and its dependencies) into my project much like Bottle is placed into the project.
What I tried: I downloaded the Flask tarball and looked at it. It had quite a bit of stuff that I did not know what to do with; for instance, there was a Makefile.
Yes, you can, but it will be a little difficult.
Get the Flask source code from the link below and extract it.
https://pypi.python.org/packages/source/F/Flask/Flask-0.10.1.tar.gz
There will be a file named setup.py in which you can see the dependencies; I have listed them here. Download those packages and install them first.
'Werkzeug>=0.7', https://pypi.python.org/packages/source/W/Werkzeug/Werkzeug-0.10.4.tar.gz
'Jinja2>=2.4', https://pypi.python.org/packages/source/J/Jinja2/Jinja2-2.7.3.tar.gz
'itsdangerous>=0.21', https://pypi.python.org/packages/source/i/itsdangerous/itsdangerous-0.24.tar.gz
'MarkupSafe==0.23', https://pypi.python.org/packages/source/M/MarkupSafe/MarkupSafe-0.23.tar.gz
Download all of those from PyPI and install each one by running python setup.py install in its extracted folder.
Now you can install Flask by running python setup.py install in the Flask source folder.
Now your system is acquainted with Flask. :-)
On Debian-based systems you can install with Apt.
For Python 3.x use:
sudo apt-get install python3-flask
For Python 2.x use:
sudo apt-get install python-flask
One way around the problem is to use another machine with pip installed, on which you can download all the dependencies.
On this first machine, run the commands below:
$ mkdir myapp
$ pip install flask --target ./myapp
Then transfer the myapp folder to the AIX machine.
Then develop your program inside the myapp folder, as this is the only place Flask will be accessible, unless you set up the Python path accordingly.
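The "set up the Python path" alternative can be sketched as a few lines at the top of your program, so the vendored copy is importable from anywhere. The folder name matches the `pip install flask --target ./myapp` step above; the project root argument is illustrative.

```python
import os
import sys

def add_vendor_dir(project_root, name="myapp"):
    """Prepend the vendored package directory (created with
    `pip install flask --target ./myapp`) to sys.path, so that
    `import flask` resolves against the vendored copy even when
    the program is launched from outside that folder."""
    vendor = os.path.join(project_root, name)
    if vendor not in sys.path:
        sys.path.insert(0, vendor)
    return vendor
```

Call it before importing Flask, e.g. `add_vendor_dir(os.path.dirname(os.path.abspath(__file__)))`.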
You can also try wheel packages.

installing python modules that require gcc on shared hosting with no gcc or root access

I'm using Hostgator shared hosting as a production environment, and I had a problem installing some Python modules. Running:
pip install MySQL-python
pip install pillow
results in:
unable to execute gcc: Permission denied
error: command 'gcc' failed with exit status 1
server limitations
no root access
sudo doesn't work (sudo: effective uid is not 0, is sudo installed setuid root?)
no gcc
questions
Is there an alternative package for Pillow? I want to use it for Django's ImageField (just like pymysql is an equally capable alternative to MySQL-python).
I have modules like MySQL-python and PIL installed at the system level, i.e. pip freeze without any virtualenv lists these modules, but I cannot install my other required modules in that system environment, and in my virtualenv I cannot install MySQL-python and PIL. Can something be done? Can we import/use packages installed system-wide from inside a virtualenv?
Is Hostgator shared hosting only good for PHP and not for Python/Django webapps? We have limited traffic, so we are using Hostgator shared. Should we avoid Hostgator, or shared hosting in general? Aren't they good enough for Python/Django (I have had no problems hosting static/PHP sites)? Are there too many problems, limitations, or performance issues (FCGI)? If so, what are the alternatives?
You can try building wheels on a similar host where gcc is available, copy them to your server, and install them there. But I do not know how similar the hosts need to be.
on "similar" host with gcc:
mkdir /tmp/wheels
mkdir /tmp/pip-cache
pip wheel --download-cache /tmp/pip-cache -w /tmp/wheels -r requirements.pip
copy wheels to your hosting (I assume that you copy to /tmp/wheels)
install from wheels ignoring index and using wheels dir:
pip install --download-cache /tmp/pip-cache --find-links=/tmp/wheels --no-index -r requirements.pip
P.S. Maybe you should also copy the download cache to your hosting; I do not remember whether this is needed. If it is not, you can skip the --download-cache /tmp/pip-cache option.
you can try to use PIL instead of Pillow (try it, but I suspect you will run into the same compile problem)
when you setup your virtualenv, you can pass it a --system-site-packages flag. See here
there are definitely a lot of alternative services out there- heroku, digital ocean, webfaction etc. Quick plug for PythonAnywhere (I work here)-- we are a PAAS specifically geared towards python frameworks such as Django and come with PIL, mysql-python, and many other python packages preinstalled.
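The --system-site-packages flag from point 2 can be sketched as below, using the stdlib venv module (the modern equivalent of virtualenv); the path is illustrative, and --without-pip just keeps creation fast for the demo:

```shell
# Create an environment that can ALSO see packages installed system-wide,
# so the root-installed MySQL-python/PIL stay importable inside it.
python3 -m venv --system-site-packages --without-pip /tmp/demo-venv
# The environment now has its own interpreter...
ls /tmp/demo-venv/bin/python
# ...but unlike a default venv, system site-packages remain on its sys.path.
```

With the classic virtualenv tool the equivalent is `virtualenv --system-site-packages <dir>`.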
You need root access to install the necessary packages to run your python application.
PaaS options like Heroku are another choice, but Heroku's free package is only good for developing your application; it is not intended for hosting it once you get traffic and users.
I strongly suggest you get a VPS at DigitalOcean.com. For $5 per month you get root access and more power, and you control your full stack. I use Nginx+Gunicorn to host about 10 Django projects on DigitalOcean right now.

how to package a django project?

How can I package a project so that I can just call some function that runs it?
I know how to package a Django app, but my question is how to package a Django project.
Currently I have my project on an internal PyPI server and can pull it down using:
pip install [project]
but then to use it I have to go into site-packages and then into the package just so I can run
./manage.py ....
Or am I just better off checking out the project and pip-installing the apps?
One way is to create a package using your distro's package management system. At my shop we use Ubuntu's aptitude, so we package our software as a .deb using CMake.
It's probably not the best way to do it, but you can use distribute to generate wrapper scripts for you: http://packages.python.org/distribute/setuptools.html#automatic-script-creation
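One concrete shape for the wrapper-script idea is to expose the project's manage.py as an installable command via a console_scripts entry point. Everything below (module layout, command name) is illustrative, not the asker's actual project:

```python
# myproject/manage.py (hypothetical layout)
import os
import sys

def main(argv=None):
    """Entry point that a setup.py console_scripts line such as
    'myproject-manage = myproject.manage:main' would install onto PATH,
    so users can run `myproject-manage runserver` instead of hunting
    through site-packages for manage.py."""
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    # Deferred import so this module loads even before Django is installed.
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv if argv is None else argv)
```

After `pip install [project]`, the `myproject-manage` command would then replace the trip into site-packages to find ./manage.py.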
I had this same question with my own project. The solution, as the community around the project arrived at, was to package it in a number of ways for different platforms, but not as a PyPi module.
In order of popularity, my project is typically installed via:
Docker instance: docker pull <project-name> && docker-compose up
Cloning the git repo: git clone <url> && ./project/scripts/do-the-thing
Vagrant: vagrant up
If there was more support in the group, I suppose it would make sense to roll out a .deb, .ebuild, and .rpm, but at this stage, people seem more-or-less happy with the above.
