I've been working on a Python project for a while that uses quite a few third-party libraries. I want to deploy the project to another server, but I don't know off-hand which packages I use, and digging through every line of source code would be too much work.
Is there a way to generate a list of the third-party modules my project uses, so I can feed it to pip's installer? Thank you for your help.
pip install -r dependencies.txt # <-- I need to generate this file from a project
Provided that you're using a virtual environment to keep your dependencies separate from the globally installed pip packages, you should be able to use pip's freeze command, like so:
pip freeze > dependencies.txt
If you haven't been using a virtual environment, then you probably need to peruse the source code to find the modules. A virtual environment is a means of keeping your Python project isolated from the global environment, meaning that you can only import modules that are installed within that environment, and it should only contain modules relevant to its corresponding Python project. I recommend that you read up on virtual environments; they are very useful for larger projects.
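For future projects, a typical workflow looks like this (the environment name is just an example):

virtualenv myproject-env
source myproject-env/bin/activate
pip install <the packages your project needs>
pip freeze > dependencies.txt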
I ended up writing a Python module that does this myself, as I couldn't find an existing one. The source code is available on GitHub. You can install it like so:
$ pip install pip-module-scanner
Using it is pretty simple; for full usage examples, check the GitHub repo.
$ pip-module-scanner
foo==0.0.1
bar==2.0.0
I just wanted to set up my new MacBook Pro M1. As I want to keep it organized this time, I want to start using virtualenv.
So, what I've done so far:
installed brew
installed virtualenv
set up a directory and created my first env in there, called sec_env
installed some packages for testing
Now I want to use my virtualenv:
I started it: source sec_env/bin/activate
AND now here we go: I want to code something in this env. So I start up my VS Code Insiders and try to import the package I already installed... it does not work ;( (EDIT 1: Maybe I failed to configure it inside VS Code?)
Do I misunderstand the use of virtualenv? I thought of it kind of like a virtual machine: I can install the packages needed for one project and code it, and if I work on another, I would just switch environments, start up my VS Code again, and keep writing the other project.
Or is the problem just that all the projects I want to code have to be inside the directory of the virtualenv (sec_env)? At the moment, I have a directory virtualenvs where I store all my environments; I start one up and change to the desktop to work, and all the projects are on my desktop.
It would be awesome if someone could give me some tips on this, or another way to separate my different projects. I am super new to this topic, since I used different VirtualBox images before... now I am forced to use something else... M1 :D!
Your understanding is generally correct: virtualenvs are a way to keep projects' dependencies separated from each other, much like a VM would.
Your code doesn't need to be in the same directory as your virtual environment, but many people organize it that way for convenience. That way you never have to remember which venv you coded a project with, since it's right there in the project directory.
With your steps, I think you installed a package before activating the environment. Doing it in that order installs the package in your system site-packages, not your virtual environment packages. Before you install a package, you need to activate your environment. Also, it appears from How to tell Homebrew to install inside virtualenv? that homebrew doesn't support installing a package into a virtual env. So in order to install packages into a virtualenv, I would suggest using pip as your package manager.
So the sequence of commands would be...
source <path to virtualenv>/bin/activate
pip install <modules you want to install>
# Now you can run your code that references those installed modules.
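Once activated, it's worth a quick sanity check that the interpreter actually points into the venv (the paths will vary on your machine):

which python                                 # should print <path to virtualenv>/bin/python
python -c "import sys; print(sys.prefix)"    # should print the virtualenv directory

In VS Code, you also need to pick that same interpreter via the "Python: Select Interpreter" command, so the editor resolves the packages you installed into the venv.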
I'm an experienced developer, but not very familiar with Python and Pyramid.
I'm trying to follow some (a bit old and unclear) instructions on deploying a Pyramid web application. My OS is Ubuntu 16.04.
The instructions say to install virtualenv and Pyramid - I do so with apt install virtualenv and apt install python-pyramid. Then they say I should run the app in a virtual environment, so I build that with virtualenv . -ppython3, and activate it with source bin/activate. I install the application from a ready-to-run buildout from GitHub. The buildout includes a "production.ini" file with parameters to pserve.
But Pyramid is not included in the virtual environment built with virtualenv (there is no "pserve" in the bin directory, for example). So I can't run the application with bin/pserve etc/production.ini, as the instructions say. And if I try with only "pserve", I get errors when trying to access files like "var/waitress-%(process_num)s.sock", files that the app expects to find in the virtual environment.
I've looked for flags to tell virtualenv to include Pyramid, but couldn't find any. Am I overlooking something? I'd be most grateful for some help! :-)
/Anders from Sweden
Perhaps you might want to try installing Pyramid in your virtual environment using pip, since apt-installed libraries go into the system Python's site-packages rather than being visible inside the virtual environment. It seems like you want Pyramid installed inside the virtual environment so that it can be used by your program, so pip is the better fit here than apt-get. Once you've entered the virtual environment, all you'd have to do is run pip install pyramid. This way, it will be available within the virtual environment, as well!
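Putting it together, the sequence might look like this (assuming you create the environment in the project directory, as in the question):

virtualenv . -ppython3
source bin/activate
pip install pyramid
pserve etc/production.ini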
You mentioned it's using buildout; I assume this is zc.buildout. Buildout usually manages its own virtualenv and handles installing all of the necessary dependencies. It really depends on how that buildout is configured, as there's no standard for what to do or how to run your app. I would normally expect pserve to be exposed in the bin folder, but maybe another app-specific script is exposed instead.
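If it is a classic zc.buildout layout, the usual sequence is something along these lines (assuming the repo ships a bootstrap.py and a buildout.cfg, which older buildouts typically do):

python bootstrap.py
bin/buildout
bin/pserve etc/production.ini    # if the buildout exposes pserve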
I'm working on a script in python that relies on several different packages and libraries. When this script is transferred to another machine, the packages it needs in order to run are sometimes not present or are older versions that do not have the same functionality and cause the script to fail.
I was considering using a virtual environment, but I can't find a way to have the script use the specific environment I design as its default; in order to use the environment, a user must manually activate it from the command line.
I've also looked into trying to check the versions of the packages installed on the machine, and if they are not sufficient then updating them from the script as described here:
Installing python module within code
Is there any easier/surefire way to make sure that the needed packages will always be available regardless of where it's run?
The normal approach is to create an installation script and have that manage your dependencies. Then when you move your project to a new environment your installer will check that all dependencies are present.
I recommend you check out setuptools: https://setuptools.readthedocs.io/en/latest/
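A minimal setup.py along those lines might look like this (the project name and version pins are placeholders):

from setuptools import setup, find_packages

setup(
    name="myscript",              # placeholder project name
    version="0.1",
    packages=find_packages(),
    install_requires=[
        "requests>=2.0",          # example pins; list your real dependencies here
        "numpy>=1.10",
    ],
)

Running pip install . in a fresh environment will then pull in everything listed under install_requires.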
If you don't want to install dependencies whenever you need to use your script somewhere new, then you could package your script into a Docker container.
If the problem is ensuring the required packages are available in a new environment or virtual environment, you could use pip to generate a requirements.txt and check it into version control, or use a tool that does it for you, like pipenv (see the short example after the steps below).
If you would prefer to generate a requirements.txt by hand, you should:
Install your dependencies using pip
Type pip freeze > requirements.txt to generate a requirements.txt file
Check requirements.txt into your source management software
When you need to set up a new environment, use pip install -r requirements.txt
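If you'd rather have a tool manage this for you, a minimal pipenv workflow looks something like this (the package name is just an example):

pipenv install requests    # records the dependency in a Pipfile
pipenv lock                # pins exact versions in Pipfile.lock
pipenv sync                # on the new machine, installs from the lock file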
The solution I've been using is to include a custom library (a folder with all of my desired packages) in the same folder as my script, and I simply import them from there:
from Customlib import pkg1, pkg2,...
As long as the custom library and script stay together in the same folder, it will always have access to the right packages and the correct versions of those packages.
I'm not sure how robust this solution actually is or what possible bugs may arise from this if it is passed from machine to machine, but for now this seems to work.
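One small hardening step, in case the script is ever imported from elsewhere or launched through a symlink (when run directly, Python already puts the script's own directory on sys.path): a sketch, keeping the Customlib folder name from above; pkg1 and pkg2 are placeholders:

import os
import sys

# Resolve imports relative to this file's location, not the current working directory.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

from Customlib import pkg1, pkg2  # placeholder package names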
I would like to know if any of you has implemented an autoupdate feature for a Python app. My idea is to download the first 100-200 bytes (using requests?) from a GitHub URL containing the version tag. Ex.:
#!/usr/bin/env python
#######################
__author__ = 'xxx'
__program__ = 'theprogram'
__package__ = ''
__description__ = '''This program does things'''
__version__ = '0.0.0'
...
So if the remote version tag is greater than the one in the local module, the updater would download the whole file, replace it, and then (either way) run it.
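A rough sketch of the version-check part of my idea (the URL is made up, and I'm assuming requests is available; the server may also ignore the Range header and send the whole file, which would still work):

import re
import requests

RAW_URL = "https://raw.githubusercontent.com/xxx/theprogram/master/theprogram.py"  # made-up URL

def remote_version():
    # Ask for just the first few hundred bytes, enough to cover the header block.
    resp = requests.get(RAW_URL, headers={"Range": "bytes=0-299"}, timeout=10)
    match = re.search(r"__version__\s*=\s*'([\d.]+)'", resp.text)
    return match.group(1) if match else None

def is_newer(remote, local):
    # Compare dotted version strings numerically, e.g. "0.10.0" > "0.9.1".
    return tuple(map(int, remote.split("."))) > tuple(map(int, local.split(".")))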
What is the best way to do this?
Thanks!
You can run pip from a cron job to schedule updates for your modules, so you won't need to check the version yourself: pip will only update when necessary.
pip install --upgrade yourpackage
or
pip install --upgrade git+https://github.com/youracc/yourepo.git
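For example, a crontab entry along these lines would check for updates nightly (the paths and package name are placeholders):

0 3 * * * /path/to/venv/bin/pip install --upgrade yourpackage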
Also, as @riotburn pointed out, you should be using a virtualenv to isolate your environment; it also lets you roll back to a previous one if necessary. In that last scenario, you may find a virtualenv wrapper very helpful.
You should look into using virtualenv or conda for managing the dependencies used in your package. Both allow you to create isolated environments in which to install specific versions of packages, as well as to create environments from a predefined list of dependencies. Conda also has the benefit of being a package manager like pip. If you don't specify versions in that requirements file, it will install the latest ones. Then you could just write a short bash script to automate the couple of commands needed for your use case (see the sketch after the links below).
Try reading up on python environments:
http://conda.pydata.org/docs/using/envs.html
https://virtualenv.pypa.io/en/latest/
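A minimal sketch of such a script, using virtualenv and pip (the environment and file names are placeholders):

#!/bin/bash
# Create an isolated environment and install pinned dependencies into it.
virtualenv myenv
source myenv/bin/activate
pip install -r requirements.txt
python myscript.py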
Being new to the Python game, I seem to have missed out on some knowledge of how you can develop on a program but also keep it in your live environment.
Programs like gpodder can be run directly from the source checkout, which is really handy; however, others want to be "installed" to run.
A lot of programs are distributed with a setup.py with instructions to run "python ./setup.py install" as root which will put stuff somewhere in your file-system. There are even install commands like "develop" which seem to hold the promise of what I want. So I tried:
export PYTHONPATH=/home/alex/python
python ./setup.py develop --install-dir=/home/alex/python
This downloaded a bunch of stuff locally and seems to magically ensure the application I'm hacking on is still being run out of the src tree. So I guess my roundabout question is: is this the correct way of developing Python code? How do things like easy_install and pip factor into this?
So I tried the following:
python /usr/share/pyshared/virtualenv.py /home/alex/src/goobook
cd /home/alex/src/goobook/goobook.git
/home/alex/src/goobook/bin/python ./setup.py develop
And finally linked the program in question to my ~/bin
cd /home/alex/src/goobook
linkbin.pl bin/goobook
However, invocation throws up a load of extra chatter, which seems to imply it's wrong:
17:17 alex#socrates/i686 [goobook] >goobook --help
/home/alex/bin/goobook:5: UserWarning: Module pkg_resources was already imported from /home/alex/src/goobook/lib/python2.5/site-packages/setuptools-0.6c8-py2.5.egg/pkg_resources.py, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
/home/alex/bin/goobook:5: UserWarning: Module site was already imported from /home/alex/src/goobook/lib/python2.5/site.pyc, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
Install virtualenv (http://pypi.python.org/pypi/virtualenv) to set up a localized virtual environment for your libraries, and setuptools (http://pypi.python.org/pypi/setuptools), i.e. "easy_install", to install new things.
Virtualenv allows you to work in completely independent and isolated Python environments. It will let you easily create multiple environments which have different Python packages installed or different versions of a same package. Virtualenv also lets you easily switch between your different environments.
As of 2012, the de facto preferred tool for package management in Python is pip rather than setuptools. Pip is able to handle dependencies and to install/uninstall packages globally or inside a virtual environment. Pip even comes bundled with virtualenv out of the box.
Python 3
Also worth mentioning is the fact that virtual environments are becoming a part of Python itself in release 3.3, with the implementation of PEP 405.
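For example, since Python 3.3 you can create and activate one with the built-in venv module (the directory name is just an example):

python3 -m venv myenv
source myenv/bin/activate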
The Python Packaging User Guide, which "aims to be the authoritative resource on how to package, publish and install Python distributions using current tools", recommends using pip to install in "development mode":
pip install -e <path>
Thus in the root directory of your package you can simply
pip install -e .
See installing from a local source tree.
The best way to develop Python apps with dependencies is to:
Download the desired version of the Python interpreter.
Install and use buildout (http://www.buildout.org/).
Buildout is something like Maven for Java (it will fetch all needed packages automatically).
This way your Python interpreter will not be polluted by third-party packages (this is important if you will be running the developed application on other machines). Additionally, you can integrate buildout with the virtualenv package (this allows you to create virtual Python interpreters for each project).
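For illustration, a minimal buildout.cfg might look something like this (the part and egg names are placeholders, and it assumes your project has a setup.py in the same directory):

[buildout]
develop = .
parts = app

[app]
recipe = zc.recipe.egg
eggs = myproject

Running bin/buildout then fetches the dependencies declared by myproject and generates the console scripts in bin/.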