SQLite Library - Python Portability

I am writing some code that is intended to run with a pre-bundled version of Python included with a program.
The issue is that it doesn't yet include the sqlite3 library, which is a requirement of my code. Currently I have a wrapper that calls a system-installed Python 2.7 to do the import.
Is there any way I can manually include or package this library with my code to make it more portable? My concern is that I need this to run on Windows systems, where it is less likely that Python 2.7 will be available. (2.7 is the minimum, as it included an update to the library.)

Yes, in Python you have setuptools. You can list your dependencies in a requirements.txt file and run pip install -r requirements.txt to install them.
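For illustration, a requirements.txt is just one distribution per line, optionally pinned to a version (the entries here are examples, not taken from the question):

pysqlite>=2.6.0
mock==1.0.1

pip then fetches and installs each entry from PyPI.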

pysqlite is also maintained as an external project; it looks like they've got downloads going all the way back to Python 2.3.
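If you go that route, one way to make the code portable is to ship the built pysqlite package next to your script and prepend its location to sys.path before importing. A minimal sketch, assuming the bundled copy lives in a lib/ directory beside the script (the directory name is a hypothetical layout, not a convention):

import os
import sys

# Make the bundled copy importable before trying the import.
_here = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(_here, 'lib'))

try:
    from pysqlite2 import dbapi2 as sqlite3  # external pysqlite build
except ImportError:
    import sqlite3  # fall back to the stdlib module if present

Note that pysqlite contains a C extension, so the bundled build must match the target platform (a Windows build for Windows machines).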

Related

Does building pure python modules w/ conda require setuptools?

This weekend I've been reading up on conda and the Python Packaging User Guide because I have a simple pure-Python project that depends on numpy. It seemed to me that distributing/installing this project via conda was better than pip due to this dependency.
One thing on which I'm still not clear: conda will install a Python package from a recipe via its build.sh, but it seems like build.sh just ends up calling python setup.py install for most Python packages.
So even if I want to distribute/install my python package with conda, I still end up depending on setuptools (or distutils) for the actual installation, correct? I was unable to find a conda utility analogous to setuptools; am I missing something?
FWIW, I posted this question on the conda issue tracker.
Thanks!
Typically you will still be using distutils (or setuptools if the library requires it) to install things, yes. It is not technically required; build.sh can be anything. If you wanted to, you could just copy the code into site-packages. Using setup.py install is recommended, though: libraries will already have a working setup.py, it installs metadata that pip can read, and it compiles any extension modules and installs any data files.
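For reference, the build.sh of a pure-Python conda recipe is usually a one-liner; conda-build sets the $PYTHON variable to the interpreter inside the build environment:

$PYTHON setup.py install

Since build.sh is an ordinary shell script, anything else the package needs at build time can go in the same file.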

Automatically installing Python dependencies using CMake

I've had a quick look around, but because of terminology like dependencies and packages being used in different ways, it's quite tricky to pin down an answer.
I'm building a mixed-language source (Fortran, some C and Python), and the Fortran calls a Python script which depends on the networkx package from PyPI. Normally I just have networkx installed anyway, so it isn't a problem for me when rebuilding.
However, for distribution, I want the best way to:
Install pip or equivalent, if it is not installed.
Possibly install virtualenv and create a virtual environment, if appropriate.
Download and install networkx using the --user option with pip.
Is there a standard way? Or should I just use CMake dependencies with custom commands that install pip etc.?
It depends. For a "manual" install, you definitely should detect whether all the tools required to build are installed, and issue an error if they aren't; then use execute_process() to run pip and whatever else you want.
On the other hand, if you are going to produce a real package for some particular Linux, you just pack your binaries and declare (via the corresponding syntax of the particular package format, like *.rpm or *.deb) that your package depends on some other packages. That way you can be sure they will be installed with (or even before) your package.
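As an illustrative sketch of the execute_process() approach (the networkx requirement comes from the question; everything else here is an assumption, not a fixed convention):

find_package(PythonInterp REQUIRED)

# Fail the configure step early if pip is missing.
execute_process(
    COMMAND ${PYTHON_EXECUTABLE} -m pip --version
    RESULT_VARIABLE PIP_STATUS
    OUTPUT_QUIET ERROR_QUIET)
if(NOT PIP_STATUS EQUAL 0)
    message(FATAL_ERROR "pip is required to install networkx but was not found")
endif()

# Install networkx into the user site-packages at configure time.
execute_process(COMMAND ${PYTHON_EXECUTABLE} -m pip install --user networkx)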

Can't install Python's IMAPClient

Trying to install IMAPClient using the command pip install IMAPClient. I'm on Windows, with Python 3.3.
It downloads fine and begins to install, then complains with ImportError: No module named 'response_parser'. I think that is an internal module of this library, so it isn't something I can install separately.
I also tried to download the tarball manually, unpack it, and run python setup.py install, but get the same error.
This is a popular and stable library, so I realise I'm doing something wrong. I'm still a Python noob. What should I do?
From the home page:
Python versions 2.4 through 2.7 are currently supported. Python 3 support is in the works.
It would be nicer if they updated the setup file and/or the package metadata so it could immediately give you a clear error saying "Python 3 is not yet supported" or the like, instead of a mysterious failure from the middle of the setup process. But many projects don't bother to do that, preferring to put that energy into finishing the Python 3 port instead.
However, if you look at the source page:
Python versions 2.6, 2.7, 3.2, and 3.3 are officially supported.
So, it looks like the Python 3 support is actually done, but just hasn't been pushed to PyPI yet.
Which means that, if you have mercurial installed, you should be able to do this:
pip-3.3 install hg+https://bitbucket.org/mjs0/imapclient
If you don't have mercurial (and don't want to install it), download the zipfile from the source page, unzip it, and pip-3.3 install . or python3.3 setup.py install from inside the source tree.
If you plan on distributing Python 3 code that requires IMAPClient, you may want to scan the mailing list archives (or join the mailing list and ask) to find out when it will be updated on PyPI.

Using google's protobuf in python without installing it

It seems to me that when I'm using protobuf in Python I need to install it first, and for that I also need setuptools installed. This seems to severely limit portability, as I would have to install protobuf on every machine on which I want to run any Python code that uses it.
So my question is: Is there a way to package protobuf for python in such a way, that it can be distributed with the python code using it?
Any info on this would be appreciated.
The package contains an experimental C++ extension and the setup file generates Python files, but since the extension is disabled by default, you should be able to include the setup.py build result with your script just fine.
Note that the Python package still needs the protoc command-line tool to be installed; the tool is used to generate some of the Python code for you.
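The same tool handles your own message definitions. For example, assuming they live in a file named addressbook.proto (the filename is illustrative):

protoc --python_out=. addressbook.proto

This writes a matching addressbook_pb2.py module next to the .proto file.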
Once that is available, run:
cd python
python setup.py build
and copy the build/lib/google directory into your script distribution; the directory you copy it into needs to be on your sys.path for the package to be importable.
Alternatively, use setup.py bdist --formats=zip and add the path to the resulting zipfile (located in dist/protobuf-<version>.<platform>-<architecture>.zip) to your sys.path. Renaming it should be fine.
Do note that the package uses a namespace, and thus the pkg_resources module needs to be available as well. It is only used to declare the google namespace in google/__init__.py.

Python packages installation in Windows

I recently began learning Python, and I am a bit confused about how packages are distributed and installed.
I understand that the official way of installing packages is distutils: you download the source tarball, unpack it, and run python setup.py install; the module then automagically installs itself.
I also know about setuptools, which comes with the easy_install helper script. It uses eggs for distribution and, from what I understand, is built on top of distutils and does the same thing as above, plus it takes care of any required dependencies, all fetched from PyPI.
Then there is also pip, and I'm still not sure how it differs from the others.
Finally, as I am on a Windows machine, a lot of packages also offer binary builds through a Windows installer, especially the ones that require compiling C/Fortran code, which would otherwise be a nightmare to compile manually on Windows (assuming you have an MSVC or MinGW/Cygwin dev environment with all the necessary libraries set up... nonetheless, try to build numpy or scipy yourself and you will understand!).
So can someone help me make sense of all this and explain the differences and pros/cons of each method? I'd like to know how each keeps track of packages (Windows Registry, config files, ...). In particular, how would you manage all your third-party libraries (be able to list installed packages, disable/uninstall, etc.)?
I use pip, and not on Windows, so I can't provide comparison with the Windows-installer option, just some information about pip:
Pip is built on top of setuptools, and requires it to be installed.
Pip is a replacement (improvement) for setuptools' easy_install. It does everything easy_install does, plus a lot more (make sure all desired distributions can be downloaded before actually installing any of them to avoid broken installs, list installed distributions and versions, uninstall, search PyPI, install from a requirements file listing multiple distributions and versions...).
Pip currently does not support installing any form of precompiled or binary distributions, so any distributions with extensions requiring compilation can only be installed if you have the appropriate compiler available. Supporting installation from Windows binary installers is on the roadmap, but it's not clear when it will happen.
Until recently, pip's Windows support was flaky and untested. Thanks to a lot of work from Dave Abrahams, pip trunk now passes all its tests on Windows (and there's a continuous integration server helping us ensure it stays that way), but a release has not yet been made including that work. So more reliable Windows support should be coming with the next release.
All the standard Python package installation mechanisms store all metadata about installed distributions in a file or files next to the actual installed package(s). Distutils uses a distribution_name-X.X-pyX.X.egg-info file, pip uses a similarly-named directory with multiple metadata files in it. Easy_install puts all the installed Python code for a distribution inside its own zipfile or directory, and places an EGG-INFO directory inside that directory with metadata in it. If you import a Python package from the interactive prompt, check the value of package.__file__; you should find the metadata for that package's distribution nearby.
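For example, the check from the interactive prompt might look like this (the package and path are purely illustrative):

>>> import networkx
>>> networkx.__file__
'C:\\Python27\\Lib\\site-packages\\networkx\\__init__.py'

Here a distutils install would have left a networkx-1.8-py2.7.egg-info file in that same site-packages directory.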
Info about installed distributions is only stored in any kind of global registry by OS-specific packaging tools such as Windows installers, Apt, or RPM. The standard Python packaging tools don't modify or pay attention to these listings.
Pip (or, in my opinion, any Python packaging tool) is best used with virtualenv, which allows you to create isolated per-project Python mini-environments into which you can install packages without affecting your overall system. Every new virtualenv automatically comes with pip installed in it.
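A typical workflow on Windows might look like this (the environment and package names are arbitrary):

virtualenv myproject-env
myproject-env\Scripts\activate
pip install networkx

After activation, everything pip installs lands inside myproject-env rather than the system site-packages.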
A couple other projects you may want to be aware of as well (yes, there's more!):
distribute is a fork of setuptools which has some additional bugfixes and features.
distutils2 is intended to be the "next generation" of Python packaging. It is (hopefully) adopting the best features of distutils/setuptools/distribute/pip. It is being developed independently and is not ready for use yet, but eventually should replace distutils in the Python standard library and become the de facto Python packaging solution.
Hope all that helped clarify something! Good luck.
I use Windows and Python. It is somewhat frustrating, because pip doesn't always manage to install things. Python is moving toward pip, so I still use it. Pip is nice because you can uninstall packages and use:
pip freeze > requirements.txt
pip install -r requirements.txt
Another reason I like pip is for virtual environments, like venv with Python 3.4. I have found venv a lot easier to use on Windows than virtualenv.
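A quick sketch of creating and activating a venv on Windows, using the py launcher that ships with recent Windows installers of Python (the environment name is arbitrary):

py -3.4 -m venv myenv
myenv\Scripts\activate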
If you cannot install a package with pip, you have to find a binary for it: http://www.lfd.uci.edu/~gohlke/pythonlibs/
I have found these binaries to be very useful.
Pip is trying to make something called a wheel for binary installations.
pip install wheel
wheel convert path\to\binary.exe
pip install converted_wheel.whl
You will also have to do this for any required libraries of that package that fail to install on their own.
