I'd like to know whether I should add the files below to the code repository:
manage.py
requirements.txt
I also created the very core of the application, which includes settings.py. Should that be added to the repo?
And the last one: after creating the project, a whole .idea/ folder was created. It was not included in the .gitignore template, so what about that?
manage.py
This file is generated automatically by django-admin startproject rather than installed as part of the Django package, so you can always recreate it. It is just a thin wrapper that points at your settings module, and since it is project-specific boilerplate, many projects commit it anyway for convenience.
requirements.txt
This is how you tell people WHAT to install to run your project. At the very least, for a Django project, you will want Django to be in that file. So yes, this file should be included in your git repo. Anyone who pulls your code can then simply run pip install -r requirements.txt and have the requirements installed.
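For example, a minimal requirements.txt for a bare Django project could be a single pinned line (the version shown is only a placeholder):
# requirements.txt
Django==1.11.3
Pinning an exact version like this makes sure every developer and every deploy gets the same Django release.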
settings.py
This is where things get slightly more into personal preference, but in general, yes, it (or something like it) should be included. I like to follow the "Two Scoops of Django" pattern (https://startcodingnow.com/two-scoops-django-config/) of having different settings for different environments.
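A minimal sketch of that pattern, assuming a settings package with one module per environment (the module names, and myproject below, are illustrative):
# settings/base.py -- settings shared by every environment
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    # ... the rest of your apps ...
]

# settings/dev.py -- development-only overrides
from .base import *  # noqa: F401,F403

DEBUG = True
You then select an environment with manage.py runserver --settings=myproject.settings.dev, or by setting the DJANGO_SETTINGS_MODULE environment variable.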
.idea/
This is actually IDE-specific information. JetBrains has a sample file showing what they recommend ignoring and what they think you should keep in that folder (https://github.com/github/gitignore/blob/master/Global/JetBrains.gitignore), but I think it is far more common to just ignore that folder altogether.
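If you do decide to ignore the folder wholesale, a single line in your .gitignore is enough:
# .gitignore
.idea/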
Related
I have a Django project for which I have created a virtual environment with the packages it requires.
Now, for development, I was looking for a tool where I could try some code before including it in my project. I had a good experience with Jupyter a little while ago, and I thought it would be nice to work with this tool again.
To avoid cluttering the minimal virtual environment with Jupyter's dependencies, I duplicated it and installed jupyter alongside django-extensions.
In my settings.py, I have:
import os  # at the top of settings.py

if os.environ.get("VENV_NAME") == "jupyter-sandbox":
    INSTALLED_APPS += ['django_extensions']
so that I can still use the minimal virtual environment without django_extensions.
It works fairly well for now, apart from the fact that I cannot run the server from my Jupyter-enabled virtual environment. This is because my project uses django-images, and Django can't find a migration file in this environment (in site-packages/django_images/migrations). The error message is below:
raise NodeNotFoundError(self.error_message, self.key, origin=self.origin)
django.db.migrations.exceptions.NodeNotFoundError: Migration core.0001_initial dependencies reference nonexistent parent node ('django_images', '0002_auto_20170710_2103')
Would it be a good idea to create a symlink so that both virtual environments share the same django-images migrations folder, or would that completely mess up my project?
I am not utterly confident with migrations yet and would appreciate some advice on this.
I think the confusion here is about what you should do. In any standard project, you should have a hand-cultivated list of project dependencies. Most Django users put that list into requirements.txt (usually recommended) or setup.py. You can also have multiple requirements files: requirements-test.txt, requirements-dev.txt, etc. Use -r requirements.txt at the top of the other files to "import" their requirements:
# requirements.txt
django==1.11.3
and then...
# requirements-test.txt
-r requirements.txt
pytest
tox
and finally...
# requirements-dev.txt
-r requirements-test.txt
ipython
pdbpp
jupyter
Your goal is to have everything your project needs to run listed in that first file. It doesn't matter if your virtual environment contains more than that. Additionally, you should use something like tox to test whether your requirements.txt actually contains exactly what your project needs. Hope that helps.
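A minimal tox.ini for that kind of check might look like this (a sketch; the Python version and test command are assumptions):
# tox.ini
[tox]
envlist = py36
skipsdist = true

[testenv]
deps = -r requirements-test.txt
commands = pytest
Because tox builds a fresh virtualenv from the declared requirements alone, the test run fails fast whenever requirements.txt is missing something your code imports.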
I have to find a solution for sharing code between two big Django projects. The main things to share are models, serializers, and template tags. I've come up with 3 different solutions, and I need help weighing the pros and cons to be able to make a choice.
I'll list the solutions I found:
git submodules
Create a repository to store my *.py files and include them as a Django app such as 'common_deps'.
Even if this is the purpose of git submodules, they are a bit hard to use and it's easy to fall into traps.
python package
Create a python package to store my *.py files.
It seems to be the best option to me, even if it means I'll need to update the requirements.txt file in my projects on each new release.
Simple git repository
Create a new repository to store my *.py files and include them as a Django app such as 'common_deps', then add it to my PYTHONPATH.
I need some advice; I haven't chosen yet. I'm just telling myself that git submodules seem like a bad idea.
What do you think, guys?
I would definitely go with the second option you listed: packaging your app. If you follow the steps in the Packaging your app part of the official Django tutorial, you'll get a tar.gz file that will allow you to include your app in any project you want by simply installing it (e.g. with pip) into the virtual env connected with the project, or globally.
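A minimal setup.py for such a package might look like this (a sketch; the name, version, and Django pin are placeholders):
# setup.py
from setuptools import setup, find_packages

setup(
    name='common-deps',
    version='0.1.0',
    packages=find_packages(),
    install_requires=['Django>=1.11'],
)
Running python setup.py sdist produces dist/common-deps-0.1.0.tar.gz, which each project can install directly with pip or list in its requirements.txt.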
I would go with a Python package; after all, that is what they are for.
I just started learning Pyramid using the official documentation and I found it very cool so far.
Unfortunately, while the basic one-file app is really simple and straightforward, I'm having a hard time trying to understand how a "serious app", generated using the pcreate scaffolding command (alchemy in my case), is supposed to be handled.
For example:
Is setup.py mandatory, or can I just use requirements.txt as I'm used to doing with Django in order to install dependencies?
If I have to rely on setup.py, am I supposed to execute python setup.py develop each time I create/delete a file (since I saw them listed in SOURCES.txt)?
In settings.ini, how does "use" (under [app:main]) work? (Can I "bypass" the egg-info it points to and "bootstrap" the app in an alternative way?)
There are several tutorials that address all these topics and provide references to further relevant reading for each step along the way. I suggest starting with the Quick Tutorial.
To answer your bullet points in order:
setup.py is the standard way to install Python packages, declare dependencies, and test your app.
If you need to install more packages as a result of your changes, then yes. EDIT: It is now recommended to use pip install -e . instead of python setup.py develop. Also, while developing you can use pserve development.ini --reload, which will monitor file changes and restart the server for you.
For more information about the meaning of use = egg:MyProject, see Entry Points and PasteDeploy .ini Files. There are many ways to configure Pyramid apps, including Application Configuration and Advanced Configuration.
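For reference, the relevant part of a generated ini file looks roughly like this (a sketch; MyProject stands in for your actual project name):
# development.ini
[app:main]
use = egg:MyProject
pyramid.reload_templates = true

[server:main]
use = egg:waitress#main
listen = localhost:6543
The use = egg:MyProject line tells PasteDeploy to look up the main entry point declared in your package's setup.py and call it to build the WSGI application.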
You chose Pyramid, the best Python web microframework, so a very good choice! Here are some pointers for further insight. Actually, your question is not specific to Pyramid, but about how Python packages and applications work in general.
Is setup.py mandatory, or can I just use requirements.txt as I'm used to doing with Django in order to install dependencies?
It is not. You can use requirements.txt. setup.py's install_requires is mainly for libraries. For more information, read the blog post Declaring dependencies in Python.
If I have to rely on setup.py, am I supposed to execute python setup.py develop each time I create/delete a file (since I saw them listed in SOURCES.txt)?
It's not necessary.
In settings.ini, how does "use" (under [app:main]) work? (Can I "bypass" the egg-info it points to and "bootstrap" the app in an alternative way?)
Please see the other answer regarding Paster and Entry points.
I downloaded a Pinterest-clone project from Djangosites. Now I want to see it live in my browser.
What changes do I have to make?
Is everything that is required inside that source folder?
And the size of the directory is only 800KB. Is it always like that?
Is there something missing?
Do I have to install any additional programs?
A step-by-step guide would be extremely helpful for a novice like me, or you could also provide hyperlinks to sites that already explain these steps.
I started learning Python just 3 months ago and am now diving into Django, but I'm already surprised that a project can be that small in size (while having the potential to become so big that a whole data center might one day be required for it)! I understand that it's mostly code (text) and some layout images (icons, banners, buttons), and that the data (images, profiles, comments) provided by users is the reason the site will soon need more space, but kindly offer me some insight into how this works.
Answers to these questions might clear up some of the mysteries that beginners like me face, which make Django seem magical and miraculous. Well, I think it is, but it would be more satisfying if I could conjure those tricks myself.
Assuming you already have a running Django project, this is the short way:
Install the packages from requirements.txt
Install django-pin (pip install django-pin)
Add the apps listed on the installation page (https://pypi.python.org/pypi/django-pin) to INSTALLED_APPS in your settings.py
Adjust your urls.py
Start your project: python manage.py runserver
Navigate your browser to http://127.0.0.1:8000/pin/
Have you checked https://pypi.python.org/pypi/django-pin? They give full instructions there, but just to repeat them:
Clone repo
Install requirements.txt with pip install -r requirements.txt
Install django-pin with pip install django-pin
Add pin to your INSTALLED_APPS in settings.py
Add url(r'pin/', include('pin.urls')) to your urls.py
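Putting those steps together, the relevant edits look roughly like this (a sketch using the old-style url() routing from the instructions above, with the conventional ^ anchor added; adapt the app list to your project):
# settings.py
INSTALLED_APPS = [
    # ... your existing apps ...
    'pin',
]

# urls.py
from django.conf.urls import url, include

urlpatterns = [
    url(r'^pin/', include('pin.urls')),
]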
I have a django project that uses a lot of 3rd party apps, so wanted to decide out of the two approaches to manage my situation :
I can use [ virtualenv + pip ], along with pip freeze to generate a requirements file, to manage my project dependencies.
This way I don't have to worry about the apps themselves, but I can't have them committed with my code to svn.
I can have a lib folder in my svn structure and have my apps sit there and add that to sys.path
This way, my dependencies can be committed to svn, but I have to manage sys.path myself.
Which way should I proceed ?
What are the pros and cons of each approach ?
Update:
Method 1 disadvantage: difficult to make work with App Engine.
This has been an unanswered question (at least for me) so far. There has been some discussion on this recently:
https://plus.google.com/u/0/104537541227697934010/posts/a4kJ9e1UUqE
Ian Bicking said this in the comments:
I do think we can do something that incorporates both systems. I
posted a recipe for handling this earlier, for instance (I suppose I
should clean it up and repost it). You can handle libraries in a very
similar way in Python, while still using the tools we have to manage
those libraries. You can do that, but it's not at all obvious how to
do that, so people tend to rely on things like reinstalling packages
at deploy time.
http://tarekziade.wordpress.com/2012/02/10/defining-a-wsgi-app-deployment-standard/
The first approach seems the most common among Python devs. When I first started doing development in Django, it felt a bit weird, since in PHP it is quite common to check third-party libs into the project repo; but as Ian Bicking said in the linked post, PHP-style deployment leaves out things such as non-portable libraries. You don't want to package things like mysqldb or PIL into your project; those are better handled by tools like pip or distribute.
So this is what I'm using currently.
All projects have a virtualenv directory at the project root. We name it .env and ignore it in vcs. The first thing a developer does when starting development is to initialize this virtualenv and install all the requirements specified in the requirements.txt file. I prefer having the virtualenv inside the project dir so that it is obvious to developers, rather than keeping it somewhere else such as $HOME/.virtualenv and then having to do source $HOME/.virtualenv/project_name/bin/activate to activate the environment. Instead, developers interact with the virtualenv by invoking its executables directly from the project root, such as:
.env/bin/python
.env/bin/python manage.py runserver
To deploy, we have a fabric script that first exports our project directory together with the .env directory into a tarball, then copies the tarball to the live server, untars it into the deployment dir, and performs some other tasks like restarting the server. When we untar the tarball on the live server, the fabric script makes sure to run virtualenv once again so that all the shebang paths in .env/bin get fixed. This means we don't have to reinstall dependencies on the live server. The fabric workflow for deployment looks like:
fab create_release:1.1 # create release-1.1.tar.gz
fab deploy:1.1 # copy release-1.1.tar.gz to live server and do the deployment tasks
fab deploy:1.1,reset_env=1 # same as above but recreate virtualenv and re-install all dependencies
fab deploy:1.1,update_pkg=1 # only reinstall deps but do not destroy previous virtualenv like above
We also do not install the project src into the virtualenv using setup.py, but instead add its path to sys.path. So when deploying under mod_wsgi, we have to specify two paths in our vhost config for mod_wsgi, something like:
WSGIDaemonProcess project1 user=joe group=joe processes=1 threads=25 python-path=/path/to/project1/.env/lib/python2.6/site-packages:/path/to/project1/src
In short:
We still use pip+virtualenv to manage dependencies.
We don't have to reinstall requirements when deploying.
We have to maintain the extra sys.path entries a bit.
Virtualenv and pip are fantastic for working on multiple Django projects on one machine. However, if you only have one project that you are editing, it is not strictly necessary to use virtualenv.