Install static files in Python egg sdist install directory

I am creating a Python3 application that depends on a directory of static files in the project.
Project structure:
myBlanky
|-- blankys
|   |-- bootstrap
|   `-- google_app_engine
|-- my_blanky
|   |-- __init__.py
|   |-- my_blanky.py
|   `-- file_system_utils.py
|-- setup.py
|-- MANIFEST.in
|-- README.md
|-- LICENSE
|-- .travis.yml
`-- setup.cfg
So, in my root directory I have one Python package (my_blanky), various project files (MANIFEST.in, setup.py, etc.) and one directory full of static data (blankys).
I was having trouble creating a .tar.gz package for my application earlier, but after creating a MANIFEST.in file and editing my setup.py file, I was able to create a .tar.gz successfully with the command python3 setup.py sdist.
My setup.py:
from setuptools import setup
from my_blanky import version

setup(name="myBlanky",
      # ...url, author, etc. info here...
      packages=["my_blanky"],
      include_package_data=True,
      package_data={'blankys': ['blankys/*']},
      zip_safe=False,
      entry_points={"console_scripts": ["myblanky = my_blanky.my_blanky:main"]},
      install_requires=['docopt==0.6.1'])
My MANIFEST.in:
recursive-include blankys *
include *.md
include LICENSE
include setup.cfg
After I run python3 setup.py sdist, I open the created .tar.gz file and it includes all the data I need (the blankys directory, the my_blanky module, and all the root-dir static files), so it contains every static file I need. Great.
Then I move on to installing the .tar.gz with pip: find ./dist -iname "*.tar.gz" -print0 | xargs -0 pip install, and it runs successfully with no errors. (The install output does not mention the static files, but I didn't expect it to.)
Now here is the issue. My Python project should now be installed on my machine. I can run myblanky or myblanky -v and it all works great. But when I try to run any command like myblanky list that tries to access the static files directory blankys, it breaks, saying there is no such directory in /usr/local/lib/python3.4/dist-packages/my_blanky/. I browsed to that location manually and the only files there are my my_blanky module's source files. The parent directory /usr/local/lib/python3.4/dist-packages/ does not include any static files either.
I expect the static directory blankys to be copied over to the dist-packages location when I pip install my project, living alongside my my_blanky module directory so my program can find the static files by relative-path searching.
So if the static files land in my .tar.gz package just fine and installation seems to succeed on my machine, where are my static files in the installed dist-packages? I am installing the project from the .tar.gz, which contains the static files and directory, so why are they not copied over during installation? Only the Python module my_blanky gets copied.
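(A note for later readers: the likely culprit is the package_data key. In setuptools, package_data keys are expected to be package names listed in packages, with patterns relative to that package's directory, so a 'blankys' key that names no package is quietly ignored at install time. A hedged sketch of the conventional form, assuming the data were moved inside the package:)

```python
# Hypothetical sketch, not the asker's original: package data must live
# inside (or be mapped to) a package that setuptools knows about.
from setuptools import setup

setup(name="myBlanky",
      packages=["my_blanky"],
      include_package_data=True,
      # key = package name; patterns are relative to my_blanky/
      package_data={"my_blanky": ["blankys/*", "blankys/*/*"]})
```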
UPDATE (2020): For future readers of this question, some people have commented that the original question and answer I gave in 2014 does not make sense or help solve their problem. Sorry, but I have not touched this code since 2014 and I am not able to improve this question or answer. I am leaving it here in case it's helpful, but I am not able to help any further on this.

I feel I have solved this issue.
Essentially my problem is that I have a settings.config file that I want to be accessible system-wide on the user's machine. Instead of having the settings.config file located in some random directory that my program cannot find, I wanted one single settings.config file on the system that it can find and manipulate, so I was asking in my question how you go about doing this with distutils.
While I was cleaning out my machine's home directory today, I realized that I had been going about the idea all wrong. Applications do this all the time: they keep one config file somewhere in the system that they can access, located in ~/.config or /etc, or on Windows in %APPDATA%. I was asking above how to install a settings.config file inside /usr/local/lib/python3, but that is wrong. That is not what I should be doing; I should just do what every other program does.
So to answer my question: you can't install a data file in /usr/local/lib/python3 and for good reason. Instead use ~/.config or /etc or %APPDATA%.
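To make that concrete, here is a minimal sketch (the function name and fallbacks are my own, not from the original answer) of resolving a per-user config location following those conventions:

```python
import os
import sys

def default_config_dir(app_name):
    """Per-user config directory: %APPDATA% on Windows, XDG on Unix."""
    if sys.platform.startswith("win"):
        base = os.environ.get("APPDATA", os.path.expanduser("~"))
    else:
        base = os.environ.get("XDG_CONFIG_HOME",
                              os.path.expanduser(os.path.join("~", ".config")))
    return os.path.join(base, app_name)

# e.g. the settings.config path for the app in the question:
settings_path = os.path.join(default_config_dir("myblanky"), "settings.config")
```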

Related

clean up .pyc files in virtualenv stored in souce repository after the fact?

I've created a virtualenv for my project and checked it into source control. I've installed a few projects into the virtualenv with pip: django, south, and pymysql. After the fact I realized that I had not set up source control for ignoring .pyc files. Could there be any subtle problems in simply removing all .pyc files from my project's repository and then putting in place the appropriate file ignore rules? Or is removing a .pyc file always a safe thing to do?
That is fine, just remove them!
Python auto-generates them from the corresponding .py file any time it wants to, so you needn't worry about simply deleting them all from your repository.
A couple of related tips: if you don't want them generated at all on your local dev machine, set the environment variable PYTHONDONTWRITEBYTECODE=1. Python 3.2 also fixed the annoyance of source folders cluttered with .pyc files by introducing the __pycache__ subfolder.
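As a concrete sketch of the cleanup (assuming a POSIX shell; adjust the ignore rules for your VCS):

```shell
# Delete stray compiled files; Python regenerates them on demand.
find . -name "*.pyc" -delete
find . -type d -name "__pycache__" -prune -exec rm -rf {} +

# Then ignore them going forward, e.g. in .gitignore:
#   *.pyc
#   __pycache__/
```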

Why Does setup.py Remain in my Distribution Package?

I am somewhat new to Python and even newer to distutils.
I wanted to create a distribution of my program for other users in my office, so I used setup from distutils.core. I set up my directory structure, created a manifest.in file and a setup.py file. Everything seemed to go as planned. The result is that I have a .zip file containing the directory structure that I intended.
The manifest file was not contained in the .zip file (I assume it was only needed by distutils), but the setup.py file remained in the .zip file. Why is that? Is setup.py needed by the end user?
Thank you,
-RS
In the normal case, users install your app by running python setup.py install, or something that effectively does the same thing (like pip install foo).
Of course there are cases where they don't need setup.py—e.g., because they're installing a pre-packaged binary egg or Windows installer or whatever—but most packages have to work for the normal case. So, the default packaging commands include it. In the docs, Specifying the files to distribute says:
If you don’t supply an explicit list of files (or instructions on how to generate one), the sdist command puts a minimal default set into the source distribution:
… setup.py (or whatever you called your setup script) …

How to build a single python file from multiple scripts?

I have a simple python script, which imports various other modules I've written (and so on). Due to my environment, my PYTHONPATH is quite long. I'm also using Python 2.4.
What I need to do is somehow package up my script and all the dependencies that aren't part of the standard Python install, so that I can email a single file to another system where I want to execute it. I know the target version of Python is the same, but it's on Linux whereas I'm on Windows. Otherwise I'd just use py2exe.
Ideally I'd like to send a .py file that somehow embeds all the required modules, but I'd settle for automatically building a zip I can just unzip, with the required modules all in a single directory.
I've had a look at various packaging solutions, but I can't seem to find a suitable way of doing this. Have I missed something?
[edit] I appear to be quite unclear in what I'm after. I'm basically looking for something like py2exe that will produce a single file (or 2 files) from a given python script, automatically including all the imported modules.
For example, if I have the following two files:
[\foo\module.py]
def example():
    print "Hello"
[\bar\program.py]
import module
module.example()
And I run:
cd \bar
set PYTHONPATH=\foo
program.py
Then it will work. What I want is to be able to say:
magic program.py
and end up with a single file, or possibly a file and a zip, that I can then copy to linux and run. I don't want to be installing my modules on the target linux system.
I found this useful:
http://blog.ablepear.com/2012/10/bundling-python-files-into-stand-alone.html
In short, you can .zip your modules and include a __main__.py file inside, which will enable you to run it like so:
python3 app.zip
Since my app is small I made a link from my main script to __main__.py.
Addendum:
You can also make the zip self-executable on UNIX-like systems by adding a single line at the top of the file. This may be important for scripts using Python3.
echo '#!/usr/bin/env python3' | cat - app.zip > app
chmod a+x app
The archive can now be executed without specifying the Python interpreter:
./app
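On later Python versions (3.5+), the standard library's zipapp module automates the same zip-plus-shebang steps. A minimal sketch, assuming an application directory named myapp/ containing a __main__.py (the names here are illustrative, not from the original answer):

```shell
# Demo app: a directory with a __main__.py entry point.
mkdir -p myapp
printf 'print("Hello from the bundle")\n' > myapp/__main__.py

# Build an executable archive with the shebang line baked in;
# zipapp marks the output executable when -p is given.
python3 -m zipapp myapp -o app.pyz -p "/usr/bin/env python3"
./app.pyz
```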
Use the stickytape module:
stickytape scripts/blah --add-python-path . > /tmp/blah-standalone
This will result in a functioning script, but not necessarily a human-readable one.
You can try converting the script into an executable file.
First, use:
pip install pyinstaller
After installation, change to the directory containing your script and run:
pyinstaller --onefile --windowed filename.py
This will create an executable version of your script containing all the necessary modules. You can then copy this executable to the PC or machine where you want to run your script.
I hope this helps.
You should create an egg file. This is an archive of Python files.
See this question for guidance: How to create Python egg file
Update: Consider wheels in 2019
The only way to send a single .py file would be if the code from all of the various modules were moved into that single script, and then you'd have to redo everything to reference the new locations.
A better way of doing it would be to move the modules in question into subdirectories under the same directory as your command. You can then make sure that the subdirectory containing each module has an __init__.py that imports the primary module file. At that point you can reference things through it.
For example:
App Directory: /test
Module Directory: /test/hello
/test/hello/__init__.py contents:
import sayhello
/test/hello/sayhello.py contents:
def print_hello():
    print 'hello!'
/test/test.py contents:
#!/usr/bin/python2.7
import hello
hello.sayhello.print_hello()
If you run /test/test.py you will see that it runs the print_hello function from the module directory under the existing directory, no changes to your PYTHONPATH required.
If you want to package your script with all its dependencies into a single file (it won't be a .py file) you should look into virtualenv. This is a tool that lets you build a sandbox environment to install Python packages into, and manages all the PATH, PYTHONPATH, and LD_LIBRARY_PATH issues to make sure that the sandbox is completely self-contained.
If you start with a virgin Python with no additional libraries installed, then easy_install your dependencies into the virtual environment, you will end up with a built project in the virtualenv that requires only Python to run.
The sandbox is a directory tree, not a single file, but for distribution you can tar/zip it. I have never tried distributing the env so there may be path dependencies, I'm not sure.
You may need to, instead, distribute a build script that builds out a virtual environment on the target machine. zc.buildout is a tool that helps automate that process, sort of like a "make install" that is tightly integrated with the Python package system and PyPI.
I've come up with a solution involving modulefinder, the compiler, and the zip function that works well. Unfortunately I can't paste a working program here as it's intermingled with other irrelevant code, but here are some snippets:
import os
import subprocess
import sys
from modulefinder import ModuleFinder
from zipfile import ZipFile, ZIP_DEFLATED

# dest_dir, zip_name, source_name, python_exe and dest_path are
# defined elsewhere in the (omitted) surrounding program.
archive = ZipFile(os.path.join(dest_dir, zip_name), 'w', ZIP_DEFLATED)
sys.path.insert(0, '.')
finder = ModuleFinder()
finder.run_script(source_name)
for name, mod in finder.modules.iteritems():  # Python 2
    filename = mod.__file__
    if filename is None:
        continue  # built-in module, nothing to bundle
    if "python" in filename.lower():
        continue  # crude filter to skip standard-library files
    # byte-compile with optimizations before bundling
    subprocess.call('"%s" -OO -m py_compile "%s"' % (python_exe, filename))
    archive.write(filename, dest_path)
Have you taken into consideration Automatic script creation with distribute, the official packaging solution?
What you do is create a setup.py for your program and provide entry points that will be turned into executables that you will be able to run. This way you don't have to change your source layout, while still being able to easily distribute and run your program.
You will find an example on a real app of this system in gunicorn's setup.py

Why doesn't my Python 2.6 auto-unzip egg files on import?

I'm under the impression that Python import is supposed to automatically unzip egg files in site-packages.
My installation doesn't seem to want to auto-unzip the egg. What I tried:
(1) I used easy_install to install the suds module, which copied the
egg file into site-packages. Python couldn't import it. (import suds)
(2) Then I used the --always-unzip option to easy_install. This time it
gave me a directory instead of a zip file. Python still couldn't import the suds module.
(3) Then I renamed the directory to suds. Python still couldn't find it.
(4) Finally I copied the suds directory out of the unzipped egg directory into site-packages, and Python found it (no surprise there).
For me, easy_install wasn't. What's missing here?
Rufus
By default (if you haven't specified multi-version mode), easy_installing an egg will add an entry to the easy-install.pth file in site-packages. Check there to see if there's a reference to the suds egg. You can also check the Python import path (which is the list of places Python will search for modules) like this:
import sys
print sys.path
Did you try import suds in a Python shell that was started before you easy_installed suds?
That would explain the behaviour you saw. The .pth files are only read at Python startup, so the egg directory or zip file wouldn't have appeared in sys.path. Copying the suds dir from inside the egg directory worked because site-packages itself was already in sys.path. So make sure you restart Python after installing an egg.
Python will import from zip archives, but it won't unzip the archive into site-packages. That is, it won't leave the unzipped directory there after you import. (I think it reads from the zip file in-place without extracting it anywhere in the file system.) I've seen problems where some packages didn't work as zipped eggs (they tried to read data from their location in the file-system), so I'd recommend always using the --always-unzip flag as you do in (2).
You haven't given the command lines you used. Did you specify the -m option to easy_install? That will cause the egg to be installed in multi-version mode. It won't be in sys.path by default, and you'd need to use the pkg_resources.require function before trying to import it.
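For the multi-version case mentioned above, the pkg_resources dance looks roughly like this (using setuptools itself as a stand-in for the real distribution name, e.g. suds, since the latter may not be installed):

```python
import pkg_resources

# require() activates the distribution, adding its egg directory or zip
# file to sys.path if needed, and returns the matching Distribution objects.
dist = pkg_resources.require("setuptools")[0]
print(dist.project_name)  # -> setuptools
```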

How do I setup python to always include my directory of utility files

I have been programming in Python for a while now, and have created some utilities that I use a lot. Whenever I start a new project, I start writing, and as I need these utilities I copy them from where ever I think the latest version of the particular utility is. I have enough projects now that I am losing track of where the latest version is. And, I will upgrade one of these scripts to fix a problem in a specific situation, and then wish it had propagated back to all of the other projects that use that script.
I am thinking the best way to solve this problem is to create a directory in the site-packages directory, and put all of my utility modules in there. And then add this directory to the sys.path directory list.
Is this the best way to solve this problem?
How do I modify my installation of Python so that this directory is always added to sys.path, so that I don't have to explicitly modify sys.path at the beginning of each module that needs these utilities?
I'm using Python 2.5 on Windows XP, and Wing IDE.
The site-packages directory within the Python lib directory should always be added to sys.path, so you shouldn't need to modify anything to take care of that. That's actually just what I'd recommend, that you make yourself a Python package within that directory and put your code in there.
Actually, something you might consider is packaging up your utilities using distutils. All that entails is basically creating a setup.py file in the root of the folder tree where you keep your utility code. The distutils documentation that I just linked to describes what should go in setup.py. Then, from within that directory, run
python setup.py install
to install your utility code into the system site-packages directory, creating the necessary folder structure automatically. Or you can use
python setup.py install --user
to install it into a site-packages folder in your own user account.
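A minimal sketch of such a setup.py (the package name myutils is hypothetical, not from the original answer; it expects a myutils/ directory with an __init__.py next to setup.py):

```python
# setup.py at the root of your utilities tree, using distutils as above.
from distutils.core import setup

setup(name="myutils",
      version="0.1",
      description="Personal utility modules",
      packages=["myutils"])
```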
Add your directory to the PYTHONPATH environment variable. For windows, see these directions.
If it's not in site-packages then you can add a file with the extension .pth to your site-packages directory.
The file should have one path per line that you want included in sys.path.
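For example, a file such as site-packages\my-utils.pth (the file name and the path inside it are hypothetical) containing:

```
C:\Utilities\python
```

would put that directory on sys.path for every interpreter session.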
