PyPI package does not include the templates directory of a Django app - python

What is my goal
I want to publish a PyPI package whose code is a Django application (migrations, templates folder, views, models, etc.).
I created the project with a setup.py that includes the necessary metadata so that Python's build module can build the PyPI package smoothly, with version management.
This is the repository for further look: github-repo
What is the problem
The python -m build command does not include the templates directory in the package.
This results in a broken Django application package: when you install it, the templates are missing. You can verify this with pip install rlocker-expiryaddon.
What did I try
I know that listing files in MANIFEST.in is what tells the build process to pick them up (right?).
So I tried adding the following line:
recursive-include templates *.html schema.js
Yet, no luck.
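For comparison, a setup that usually does ship the templates looks roughly like the sketch below. The app directory name expiryaddon is only a guess based on the pip name above, so adjust it to the real package directory. MANIFEST.in paths are given relative to the project root (the directory containing setup.py), and setup.py has to opt in with include_package_data:

# MANIFEST.in
recursive-include expiryaddon/templates *.html
recursive-include expiryaddon/static *.js

# setup.py (relevant part only)
from setuptools import setup, find_packages

setup(
    name="rlocker-expiryaddon",
    packages=find_packages(),
    include_package_data=True,  # pull MANIFEST.in files that live inside packages into the build
)

Note that package data is only collected for directories that setuptools recognizes as packages, so the app directory itself needs an __init__.py and must be picked up by find_packages(); that is another common reason templates silently disappear from the built distribution.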
I know there are plenty of open-source libraries, like djangorestframework, that are easily installable via pip and that bring their templates directory into the interpreter's site-packages as expected. But in my case, I am not sure what is wrong.
Thanks for your help!

Related

Python Django project - What to include in code repository?

I'd like to know whether I should add the files below to the code repository:
manage.py
requirements.txt
Django also created the core of the project, which includes settings.py. Should that be added to the repo?
And the last one: after creating the project, a whole .idea/ folder appeared. It was not covered by the .gitignore template, so what about that?
manage.py
This is a thin wrapper generated for you by django-admin startproject; anyone can regenerate it, so strictly speaking you do not need to include it, though committing it is common and harmless.
requirements.txt
This is how you tell people WHAT to install to run your project. At the very least, for a Django project, you will want Django in that file. So yes, this file should be included in your git repo. Anyone who pulls your code can then simply run pip install -r requirements.txt and have the requirements installed.
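For example, a minimal requirements.txt for a Django project might contain little more than the following (the version pin is a placeholder; freeze whatever your project actually uses):

Django>=3.2,<4.0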
settings.py
This is where things get slightly more into personal preference, but in general, yes it (or something like it) should be included. I like to follow the "Two Scoops of Django" pattern (https://startcodingnow.com/two-scoops-django-config/) of having different settings for different environments.
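A minimal sketch of that pattern, assuming a settings package with one module per environment (all module names here are illustrative):

# settings/base.py - defaults shared by every environment
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    # ... your own apps ...
]

# settings/dev.py - local development overrides
from .base import *  # reuse everything from base

DEBUG = True
ALLOWED_HOSTS = ["localhost", "127.0.0.1"]

You then select the module per environment, e.g. DJANGO_SETTINGS_MODULE=myproject.settings.dev (the exact dotted path depends on your layout).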
.idea/
This is actually IDE-specific information. JetBrains publishes a sample file showing what they recommend ignoring and what they think you should keep in that folder
(https://github.com/github/gitignore/blob/master/Global/JetBrains.gitignore), but I think it is far more common to just ignore the folder altogether.
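If you do decide to ignore it wholesale, a single line in .gitignore is enough:

.idea/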

How can we create a plug-and-play architecture in Django?

I want to create a plug-and-play architecture in Django. I have different kinds of scrapers and I am writing more. Whenever I build a new scraper I have to publish my repo to production again. What I need is to be able to plug in the new scraper without redeploying my code. I already use GitHub for versioning, but I need a cleaner way. Thanks in advance.
What you should be doing is generating a setup.py file, which sets the code up as a package: when the user runs setup.py, it installs as a package. In other words, you can turn a Django app into a package with the tooling Python already provides.
Packaging a Python Script:
https://the-hitchhikers-guide-to-packaging.readthedocs.io/en/latest/quickstart.html
https://pythonhosted.org/an_example_pypi_project/setuptools.html
For Django, follow these steps (a minimal setup.py along these lines is sketched below):
Initialize it with a README file
Initialize it with a MANIFEST.in to include the text files and static files in the package
Run python setup.py build
Reference: https://www.pythoncentral.io/package-python-django-application-reusable-component/
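A minimal setup.py for a reusable Django app, roughly along the lines of that tutorial (the project name, README filename, version, and Django pin below are placeholders):

import os
from setuptools import setup, find_packages

# Read the long description from the README created in the first step.
with open(os.path.join(os.path.dirname(__file__), "README.rst")) as readme:
    long_description = readme.read()

setup(
    name="django-myscrapers",      # placeholder package name
    version="0.1.0",
    packages=find_packages(),
    include_package_data=True,     # honor MANIFEST.in for templates/static files
    long_description=long_description,
    install_requires=["Django>=3.2"],
)

Running python setup.py sdist afterwards produces a tarball under dist/ that other projects can pip install.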
Now you can simply use it anywhere you wish, making it a reusable app!
Reference for an example set up as a package:
https://github.com/smfai200/rebound

GAE - Including external python modules without adding them to the repository?

I'm currently working on a Python-based Google App Engine project. Specifically, I'm using Flask for the application. I'm wondering what the accepted method of including external Python modules is, specifically when it comes to the repository. From what I can tell, including other people's code in my repository is bad form for several reasons. However, other people will be working on the same repository, so we should be using the same external modules to ensure the same results.
Specifically, I need to include Flask (and its dependencies) to my application. The easiest way to do this with Google App Engine is just to throw them into the root level:
MyProject
    app.yaml
    main.py
    MyApp
    Flask
    ...
What is the proper way to bring in these external modules in such a project? Both a generalized answer and one specific to my case would be useful. Also, any other related recommendations would be appreciated. Thank you much.
While it is indeed possible to include third party libraries as submodules or symlinks from external repositories, in practice it's not a good idea. Here are two scenarios on what could go wrong:
If the third-party library releases a new version that breaks functionality, you will have to either make all the necessary changes to meet the new requirements, or pin the previous version to keep things working and cut the external connection. Usually this happens when you are very close to a deadline.
If the third-party library releases a new version and one of your colleagues has upgraded and made all the necessary changes to support it, the code on your side will be broken until you upgrade as well.
The above problems are much more visible in big projects with lots of dependencies, and as more people join the project it becomes a huge problem in the long run! I could come up with more examples, but I think you can see the point.
Your best option is to include the external libraries in your repository, which also has the advantage that you can get the whole project up and running on a new machine without chasing dependencies. There are many ways to organize your third-party libraries, and all of them need to sit at the same level as your app.yaml file or deeper. As @dragonx mentioned, include only the core library code.
Also, do not be afraid of putting things into your repository: space is not an issue these days, and these libraries are usually not updated that often, so your repository does not grow much over time.
Since you mentioned Flask on Google App Engine, you can check out my gae-init project, where you can see in practice how the external libraries are organised.
You're actually asking two questions here.
How do I include the external library in my GAE project?
You've got the right idea. Whatever way you go about it, you must somehow include Flask and its dependencies in the root of your GAE project. One way is to put a copy directly in there.
The second way is to use a symbolic link to the folder that contains the external library. I'm not sure about Flask, but external repos often keep the actual library code in a subdirectory, so you usually don't want the root of the repo in your GAE app, just the root of the actual source. In this case, it's easier to put in a symlink that points to the source folder.
How do I manage external libraries in my source repo?
This is a harder question to answer since it depends what source control tool you're using. Yes, you do want to have everyone use the same versions of external libraries, and they should be included in your source control somehow.
If you're using git, git submodule is the way to go. It's a bit confusing to start with but it'll get the job done.
I'd recommend a repo structure that looks something like this
repo/
    thirdparty/
        flask/
        other_dependency/
        another_dependency/
            README.TXT
            setup.py
            src/
    app/
        app.yaml
        your_source.py
        softlink_to_flask
        softlink_to_other_dependency
        softlink_to_another_dependency_src
In this example you keep the source to your external libraries in the thirdparty folder. These may be git submodules. In the app folder you have your source, and softlinks to the appropriate files that are actually needed for your app to run. In this case, the actual code for another_dependency may be in the another_dependency/src folder rather than the actual root of another dependency. This way you don't need to include the unnecessary files in your deployment folder, but you can still keep the entire library in your repo.
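Creating those softlinks from inside the app folder could look roughly like this; the exact relative paths depend on where each library keeps its importable code, and the link must be named after the importable package, so the descriptive softlink names in the tree above are placeholders:

ln -s ../thirdparty/flask/flask flask
ln -s ../thirdparty/another_dependency/src another_dependency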
You can't just create a requirements.txt and push it to GAE. Your code must include all the pure-Python libraries that your project uses and that aren't provided by GAE (https://developers.google.com/appengine/docs/python/tools/libraries27).
If you look at the Flask deployment examples for GAE (http://flask.pocoo.org/docs/quickstart/#deploying-to-a-web-server and https://github.com/kamalgill/flask-appengine-template) you will find dependencies like flask, werkzeug, etc., and all of these dependencies must be pushed to the GAE server.
So I see three solutions:
Use local requirements for local development and write a custom build step that downloads all dependencies, bundles them with your application, and uploads everything to the GAE server.
Add tooling that, when you start the project locally, puts the required libraries next to your application (don't forget about .gitignore).
Use something like git submodules pointing at the dependency repositories (see the example below).
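For the git submodule route, adding a dependency and fetching it after a fresh clone looks roughly like this (the URL and target path are just examples):

git submodule add https://github.com/pallets/flask.git thirdparty/flask
git submodule update --init --recursive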
There are two cases for using third-party Python packages in a Google App Engine project:
If your library is one of the runtime-provided third-party libraries supported by GAE,
just add it to your app.yaml file under libraries:
libraries:
- name: package_name
  version: latest
Then add the import to your code:
import pack_name
Sometimes you need to install the package with
pip install package_name
Make sure you're using the right interpreter; by running
pip freeze
you can verify that the package was installed to the right path.
Otherwise, if GAE does not support your library, you need to download it manually and save it locally under a root/Lib directory:
either through Git
or through pip (pip install package_name -t path/to/your/Lib/dir)
After that, declare the Lib directory as a source directory in PyCharm:
PyCharm -> Preferences -> Project Structure
Choose the Lib directory and mark it as a source folder.
Then, import it.
import pack_name
Pay attention that when you do the import, it should resolve to the local path and not to your global Python path.
In general, it's recommended to have a requirements.txt file that lists all the packages you use; PyCharm will then recognize any uninstalled packages and offer to install them.
Good Luck

Django: Which approach is better, [virtualenv + pip] vs [manually carrying packages in svn]?

I have a Django project that uses a lot of third-party apps, so I want to decide between two approaches for managing my dependencies:
I can use [virtualenv + pip] along with pip freeze as a requirements file to manage my project dependencies.
I don't have to worry about the apps, but I can't have them committed with my code to svn.
I can have a lib folder in my svn structure, have my apps sit there, and add that folder to sys.path.
This way, my dependencies can be committed to svn, but I have to manage sys.path.
Which way should I proceed?
What are the pros and cons of each approach?
Update:
Disadvantage of method 1: difficult to work with App Engine.
This has been an unanswered question (at least for me) so far. There has been some discussion on this recently:
https://plus.google.com/u/0/104537541227697934010/posts/a4kJ9e1UUqE
Ian Bicking said this in the comments:
I do think we can do something that incorporates both systems. I
posted a recipe for handling this earlier, for instance (I suppose I
should clean it up and repost it). You can handle libraries in a very
similar way in Python, while still using the tools we have to manage
those libraries. You can do that, but it's not at all obvious how to
do that, so people tend to rely on things like reinstalling packages
at deploy time.
http://tarekziade.wordpress.com/2012/02/10/defining-a-wsgi-app-deployment-standard/
The first approach seems the most common among Python devs. When I first started doing Django development it felt a bit weird, since in PHP it is quite common to check third-party libraries into the project repo, but as Ian Bicking said in the linked post, PHP-style deployment leaves out things such as non-portable libraries. You don't want to package things like mysqldb or PIL into your project; those are better handled by tools like pip or distribute.
So this is what I'm using currently.
All projects have a virtualenv directory at the project root. We name it .env and ignore it in VCS. The first thing a developer does before starting development is to initialize this virtualenv and install all the requirements specified in the requirements.txt file. I prefer having the virtualenv inside the project dir so that it is obvious to developers, rather than having it somewhere else such as $HOME/.virtualenv and then doing source $HOME/.virtualenv/project_name/bin/activate to activate the environment. Instead, developers interact with the virtualenv by invoking its executables directly from the project root, such as:
.env/bin/python
.env/bin/python manage.py runserver
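Bootstrapping that environment in the first place is just a couple of commands (assuming the classic virtualenv tool; python -m venv .env works the same way on newer Pythons):

virtualenv .env
.env/bin/pip install -r requirements.txt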
To deploy, we have a Fabric script that first exports our project directory together with the .env directory into a tarball, then copies the tarball to the live server, untars it into the deployment dir, and does some other tasks like restarting the server. When we untar the tarball on the live server, the Fabric script makes sure to run virtualenv once again so that all the shebang paths in .env/bin get fixed. This means we don't have to reinstall dependencies on the live server. The Fabric workflow for deployment looks like:
fab create_release:1.1 # create release-1.1.tar.gz
fab deploy:1.1 # copy release-1.1.tar.gz to live server and do the deployment tasks
fab deploy:1.1,reset_env=1 # same as above but recreate virtualenv and re-install all dependencies
fab deploy:1.1,update_pkg=1 # only reinstall deps but do not destroy previous virtualenv like above
We also do not install the project src into the virtualenv using setup.py, but instead add its path to sys.path. So when deploying under mod_wsgi, we have to specify two paths in our vhost config for mod_wsgi, something like:
WSGIDaemonProcess project1 user=joe group=joe processes=1 threads=25 python-path=/path/to/project1/.env/lib/python2.6/site-packages:/path/to/project1/src
In short:
We still use pip+virtualenv to manage dependencies.
We don't have to reinstall requirements when deploying.
We have to do a bit of sys.path maintenance.
Virtualenv and pip are fantastic for working on multiple django projects on one machine. However, if you only have one project that you are editing, it is not necessary to use virtualenv.

Installing a Python library to a custom location?

I want to use a couple of third-party Django packages for my application, so I want to install each package locally for that application (or my whole project).
The Python custom-installation instructions look a bit scary - how do I do this as simply as possible?
You can just put the library module into your project folder: the project folder is on the PYTHONPATH automatically when running via manage.py runserver, and your WSGI script will point to it when running in production.
Usually all Python packages are laid out like this:
package directory/
    module directory/
    ... other files/dirs like README, MANIFEST and so on
What needs to be in your Django project folder is only the module directory from the example above, not the rest of the package.
Use virtualenv.
That is by far the best solution for installing custom packages for each project.
Just use --home - nothing scary about it.
Don't forget to make your PYTHONPATH point there, though.
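Roughly, that looks like this (the target directory is just an example; with --home, pure-Python modules end up under <home>/lib/python):

python setup.py install --home=$HOME/python-libs
export PYTHONPATH=$HOME/python-libs/lib/python:$PYTHONPATH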
