Is there a way to convert a Python package, i.e. a folder of Python files, into a single file that can be copied and then directly imported into a Python script without needing to run any extra shell commands? I know it is possible to zip all of the files and then unzip them from Python when they are needed, but I'm hoping there is a more elegant solution.
It's not totally clear what the question is. I could interpret it two ways.
If you are looking to manage the symbols from many modules in a more organized way:
You'll want to put an __init__.py file in your directory and make it a package. In it you can define the symbols for your package, and create a graceful import packagename behavior. Details on packages.
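As a minimal sketch of that graceful-import behavior (the package and function names here are made up), an __init__.py re-exports the package's public symbols so a plain import packagename is enough. The snippet builds a throwaway package on disk purely to demonstrate:

```python
import os
import sys
import tempfile

# Build a tiny throwaway package to show what __init__.py does.
# "packagename" and "foo" are illustrative names only.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "packagename")
os.makedirs(pkg)

with open(os.path.join(pkg, "core.py"), "w") as f:
    f.write("def foo():\n    return 'foo from core'\n")

# __init__.py re-exports the package's public symbols, so users can write
# `import packagename` instead of reaching into individual modules.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from .core import foo\n__all__ = ['foo']\n")

sys.path.insert(0, root)
import packagename
print(packagename.foo())  # foo from core
```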
If you are looking to make your code portable to another environment:
One way or the other, the package needs to be accessible in whatever environment it is run in. That means it either needs to be installed in the python environment (likely using pip), copied into a location that is in a subdirectory relative to the running code, or in a directory that is listed in the PYTHONPATH environment variable.
The most straightforward way to package up code and make it portable is to use setuptools to create a portable package that can be installed into any Python environment. The manual page for Packaging Projects gives the details of how to go about building a package archive, and optionally uploading it to PyPI for public distribution. If it is for private use, the resulting archive can be passed around without uploading it to the public repository.
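A minimal setup.py for this might look like the following sketch (the name, version, and package discovery are placeholders to adapt to your project):

```python
# setup.py, a minimal sketch; name and version are placeholders.
from setuptools import setup, find_packages

setup(
    name="mypackage",           # hypothetical distribution name
    version="0.1.0",
    packages=find_packages(),   # finds any directories with an __init__.py
)
```

Running python setup.py sdist (or, with newer tooling, python -m build) then produces an archive under dist/ that can be installed in another environment with pip install.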
Related
I'm working on a Python package (built as a wheel), which requires some external files – i.e. ones that aren't in my package/repository. At runtime, it currently gets the path to these from a config file.
I'd like to instead bundle these files inside the wheel itself, so the package is all you need to install. Normally you'd use package_data to do this kind of thing, but I don't think that works for files outside of the package tree. I've wondered about making a build script, which first copies these into a local temporary directory. Would that work, or is there a more elegant way to do this?
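The "copy first, then build" idea can be sketched like this. All paths below are stand-ins created in a temporary directory purely for illustration (the real external location would come from your config file):

```python
import pathlib
import shutil
import tempfile

# Stand-in directories; in a real build these would be your repo and the
# external location from the config file.
work = pathlib.Path(tempfile.mkdtemp())
external = work / "external"            # files living outside the package tree
external.mkdir()
(external / "model.dat").write_text("external payload")

pkg_data = work / "mypackage" / "data"  # destination inside the package tree
pkg_data.mkdir(parents=True)

# The "build script" step: copy the external files in before calling setup(),
# so a package_data pattern like {"mypackage": ["data/*.dat"]} can see them.
for src in external.glob("*.dat"):
    shutil.copy(src, pkg_data)

print(sorted(p.name for p in pkg_data.iterdir()))  # ['model.dat']
```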
I created a command-line python tool for Unix systems.
I want my script to be executed from anywhere just like any Unix command. One solution is to make my script executable and move it to /usr/bin.
But the script works with external files, and I guess moving all these files with my script in /usr/bin is a bad habit, as it will be hard to delete them one by one in the future.
Where should I put my application's directory? How do I add the main script to the PATH, to execute it from anywhere?
I would like then to create a package that could move my files in the right place, and delete them in the case the user wants to uninstall my application.
I don't know how to do this.
Thank you.
There are many ways of distributing a Linux application.
It will depend on the distribution you are using, since they do not all use the same package manager. For example, you would create a .deb package for Debian, Ubuntu and their derivatives, an Arch package for Arch Linux, etc.
You could then share the package with anyone to let them install your tool.
However, since your tool is written in Python, you can also make a Python package. You could then upload it to the Python Package Index to let anyone install it using Python's pip package manager.
To create a Python package, you will need to create a file called setup.py and, from it, call the setup function from the setuptools package.
You will probably want to read the python documentation about writing such a script: https://setuptools.readthedocs.io/en/latest/setuptools.html
You may be especially interested in these sections:
Including Data Files
Automatic Script Creation
If you do things correctly, setuptools will take care of installing your script and its files somewhere in the PATH so it can be executed from the command line.
You should use setuptools to distribute your application, creating a setup.py file to configure its setup and installation.
Executables can be delivered using the console_scripts option, while data files can be included using package_data.
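Putting those two options together, a sketch of such a setup.py might look like this ("mytool" and the paths are placeholders, not a real project):

```python
# setup.py, a sketch only; "mytool" and all paths are placeholders.
from setuptools import setup, find_packages

setup(
    name="mytool",
    version="0.1.0",
    packages=find_packages(),
    # Ship non-code files that live inside the mytool/ package directory.
    package_data={"mytool": ["data/*.txt"]},
    # Generate a "mytool" command on the user's PATH that calls mytool.cli:main,
    # so the script can be run from anywhere after installation.
    entry_points={
        "console_scripts": ["mytool = mytool.cli:main"],
    },
)
```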
I'm new to Python, so I think my question is very fundamental and has been asked a few times before, but I cannot really find anything (maybe because I do not really know how to search for this problem).
I installed a module in Python (reportlab). Now I wanted to modify a Python script in that module, but it seems that the Python interpreter does not notice the updates to the script. Ironically, the import is successful although Python actually should not find the package, because I deleted it beforehand. Does Python use something like a cache or any other storage for modules? How can I edit modules and use those updated scripts?
From what you are saying, you downloaded a package and installed it using either a local pip or setup.py. When you do so, it copies all the files into your Python package directory. So after an install, you can delete the source folder, because Python is not looking there.
If you want to be able to modify or edit something and see the changes, you have to install it in editable mode. Inside the main folder, do:
python setup.py develop
or
pip install -e .
This will create a symbolic link to your package in the Python package repository. You will be able to modify the sources.
Careful: for the changes to be effective, you have to restart your Python interpreter. You cannot just import the module again.
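For completeness: restarting is the safest option, but for a single simple module importlib.reload can pick up changes without a restart (it has caveats when other modules have already imported names from it). A self-contained demonstration with a made-up module name:

```python
import importlib
import os
import sys
import tempfile

# Demonstration that importlib.reload re-executes a module's source.
# Built in a temp dir; "mymod" is a made-up module name.
sys.dont_write_bytecode = True   # keep stale bytecode caches out of the demo
root = tempfile.mkdtemp()
mod_path = os.path.join(root, "mymod.py")

with open(mod_path, "w") as f:
    f.write("VALUE = 1\n")

sys.path.insert(0, root)
import mymod
print(mymod.VALUE)               # 1

with open(mod_path, "w") as f:   # simulate editing the installed source
    f.write("VALUE = 2\n")

importlib.reload(mymod)          # re-runs the module body in place
print(mymod.VALUE)               # 2
```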
I'm working on a python project that contains a number of routines I use repeatedly. Instead of rewriting code all the time, I just want to update my package and import it; however, it's nowhere near done and is constantly changing. I host the package on a repo so that colleagues on various machines (UNIX + Windows) can pull it into their local repos and use it.
It sounds like I have two options, either I can keeping installing the package after every change or I can just add the folder directory to my system's path. If I change the package, does it need to be reinstalled? I'm using this blog post as inspiration, but the author there doesn't stress the issue of a continuously changing package structure, so I'm not sure how to deal with this.
Also, if I wanted to split the project into multiple files and bundle it as a package, at what level in the directory structure does the PYTHONPATH need to point? To the main project directory, or the sample/ directory?
README.rst
LICENSE
setup.py
requirements.txt
sample/__init__.py
sample/core.py
sample/helpers.py
docs/conf.py
docs/index.rst
tests/test_basic.py
tests/test_advanced.py
In this example, I want to be able to just import the package itself and call the modules within it like this:
import sample
arg = sample.helpers.foo()
out = sample.core.bar(arg)
Where helpers contains a function called foo and core contains a function called bar.
PYTHONPATH is a valid way of doing this, but in my (personal) opinion it's more useful if you have a dedicated place where you keep your Python packages, like /opt/pythonpkgs or so.
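To answer the layout question concretely: PYTHONPATH must contain the directory that contains sample/ (the main project directory), not sample/ itself. The snippet below mirrors that layout in a temporary directory and uses sys.path, which is what PYTHONPATH feeds into (the foo return value is made up):

```python
import os
import sys
import tempfile

# Recreate the project layout: <root>/sample/{__init__.py, helpers.py}
root = tempfile.mkdtemp()
pkg = os.path.join(root, "sample")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "helpers.py"), "w") as f:
    f.write("def foo():\n    return 'arg'\n")

# The entry that matters is the *project root*, i.e. the parent of sample/.
# This line is the in-process equivalent of PYTHONPATH=<project root>.
sys.path.insert(0, root)

import sample.helpers
print(sample.helpers.foo())  # arg
```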
For projects where I want it to be installed and also I have to keep developing, I use develop instead of install in setup.py:
When installing the package, don't do:
python setup.py install
Rather, do:
python setup.py develop
What this does is create a symlink/shortcut (I believe it's called an egg-link in Python) in the Python libs (where the packages are installed) that points to your module's directory. Hence, since it's a shortcut/symlink/egg-link, whenever you change a Python file, the change will immediately be reflected the next time you import that file.
Note: using this, if you delete the repository/directory you ran this command from, the package will cease to exist (as it's only a shortcut).
The equivalent in pip is -e (for editable):
pip install -e .
Instead of:
pip install .
I am writing a program in Python to be sent to other people, who are running the same Python version; however, there are some 3rd-party modules that need to be installed to use it.
Is there a way to compile it into a .pyc (I only say .pyc because it's a Python compiled file) that has all the dependent modules inside it as well?
So they can run the program without needing to install the modules separately?
Edit:
Sorry if it wasn't clear, but I am aware of things such as cx_Freeze etc.; what I'm trying to make is just a single Python file.
So they can just type "python myapp.py" and it will run. No installation of anything, as if all the module code were in my .py file.
If you are on Python 2.3 or later and your dependencies are pure Python:
If you don't want to go the setuptools or distutils route, you can provide a zip file with the .pyc files for your code and all of its dependencies. You will have to do a little work to make any complex pathing inside the zip file available (if the dependencies are just lying around at the root of the zip, this is not necessary). Then just add the zip location to your path and it should work just as if the dependency files had been installed.
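A self-contained sketch of the zip approach (a .py source file is used for simplicity; compiled .pyc files placed in the zip work the same way, and "mydep" is a made-up dependency name):

```python
import os
import sys
import tempfile
import zipfile

# Build a zip containing a pure-Python module at its root, as a stand-in
# for the real dependencies you would bundle.
zip_path = os.path.join(tempfile.mkdtemp(), "deps.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("mydep.py", "def greet():\n    return 'hello'\n")

# Adding the zip itself to sys.path lets Python import straight from it
# (the stdlib zipimport machinery handles this transparently).
sys.path.insert(0, zip_path)
import mydep
print(mydep.greet())  # hello
```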
If your dependencies include .pyds or other binary dependencies you'll probably have to fall back on distutils.
You can simply include .pyc files for the libraries required, but no: a .pyc cannot work as a container for multiple files (unless you collect all the source into one .py file and then compile it).
It sounds like what you're after is the ability for your end users to run one command, e.g. install my_custom_package_and_all_required_dependencies, and have it assemble everything it needs.
This is a perfect use case for distutils, with which you can make manifests for your own code that link out to external dependencies. If your 3rd party modules are available publicly in a standard format (they should be, and if they're not, it's pretty easy to package them yourself), then this approach has the benefit of allowing you to very easily change what versions of 3rd party libraries your code runs against (see this section of the above linked doc). If you're dead set on packaging others' code with your own, you can always include the required files in the .egg you create with distutils.
Two options:
Build a package that will install the dependencies for them (I don't recommend this if the only dependencies are Python packages that can be installed with pip).
Use virtual environments. You use an existing Python on their system, but the Python modules are installed into the virtualenv.
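A sketch of the virtualenv option on a Unix system; requirements.txt is assumed to list the 3rd-party modules, and the install line is commented out so the sketch also works offline:

```shell
# Create an isolated environment in ./venv; nothing outside it is touched.
python3 -m venv venv

# The venv's own interpreter and pip live under venv/bin.
venv/bin/python -c "import sys; print(sys.prefix)"   # a path inside ./venv

# venv/bin/pip install -r requirements.txt           # installs into the venv only
```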
Or I suppose you could just punt: create a shell script that installs them, and tell them to run it once before they run your stuff.