Installing a Python library with C dependencies - python

I'd like to install the pypoker-eval package. The author provided a nice readme for how to install the package on Windows:
pypoker-eval on Windows
install python
open visual studio project and change directories to include python directories and fix the path to the poker-eval project
build the project, there may be an undefined ssize you can typedef as int
locate the newly built dll pypokereval.dll then look at the exports and you will find something similar to init_pokereval_2_4
rename the dll to _pokereval_2_4.pyd (your version numbers) and move it into the test.py directory
edit the .py files so the imported module is _pokereval_2_4 (your version numbers) not a dynamically generated one
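As I read it, the last step boils down to editing the import in the example scripts to something like this (module name taken from the README above; the actual version numbers may differ):

    # hypothetical edit to test.py: import the renamed _pokereval_2_4.pyd directly
    # instead of a dynamically generated module name
    import _pokereval_2_4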
To be quite honest though, these instructions really lost me. I've tried doing some research on my own, and I understand that because the library has some C code I need to compile it first. I wasn't able to figure out, though, what the purpose of changing the directories to include the Python directories is. Does this part mean changing the working directory? Also, does the compiler I use matter? I've dabbled some in C, but that was with the Code::Blocks IDE.
I think my main problem is that I don't really understand what happens when I install a package. Pip has spoiled me and kept this whole process a black box, and now with something like this I feel completely lost. Any recommended reading on installing packages with C dependencies, or tips on how I can get this package installed, would be much appreciated!
For reference I'm using Python 2.7 on Windows 7.

Related

Packaging a Python program that needs a C compiler

I'm creating and packaging a Python (2.7) program to create a Lambda function on Amazon AWS. The program I have depends on the PySha3 package, which is essentially a Python wrapper around the optimized SHA-3 C package. PySha3 imports a C file; the source code of these two files can be seen in this GitHub repository.
In order to import a package into the Lambda platform, the whole file and all dependencies must be zipped and uploaded. When I do this and attempt to test the package after the upload, I get an error that the SHA-3 classes imported from the C file included in PySha3's package cannot be found. As a result of my own research and troubleshooting, I have decided to install a PySha3 wheel in the root of my program's directory, which I do believe is the right move. However, when I do this I get the error below:
I'm really hoping that if I can get the wheel to install correctly by installing a C compiler, and can somehow package the C compiler along with my program, the Lambda function will then work. I found this compiler for Python, and I'm hoping that installing it will fix my error locally, and that maybe including it in the zip file will fix the error on AWS Lambda.
So, my question is: does my thought process seem valid? Also, any tips/guidance on how to include the compiler in the zip file so that it will actually be used? I'm very new to all of this, so anything will help. Thank you very much!
In order to package a binary for Lambda, you need to build any native code in the correct environment. It appears that you are trying to build on a Windows machine -- even if your build worked, the resulting binary would not be suitable for the Lambda execution environment.
The currently supported environment is documented in the Lambda Execution Environment guide, which also states:
If you are using any native binaries in your code, make sure they are compiled in this environment.
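As a rough sanity check before uploading, you can list the compiled extensions your deployment package actually contains; any .pyd entries are Windows builds that will not load on Lambda (the zip file name below is illustrative):

    # sketch: list compiled extension modules inside a Lambda deployment zip
    # ('.pyd' means a Windows build, '.so' means a Linux build)
    import zipfile

    def native_extensions(zip_path):
        with zipfile.ZipFile(zip_path) as zf:
            return [name for name in zf.namelist()
                    if name.endswith(('.so', '.pyd'))]

    print(native_extensions('lambda_package.zip'))  # hypothetical file name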

Why is python27.dll not in the Python install folder but in the Windows system folder?

As described in http://bugs.python.org/issue22139, python27.dll is installed in the Windows system folder (in my case C:\Windows\System32).
But I would like to know why? Why is it not installed next to the python.exe, for example in C:\Python27\?
Reason I ask: I've made a Mercurial hook in Python that our developers need to use to check whether a commit message is valid. Among other things, it checks for a valid JIRA issue number. To save all our developers from installing Python themselves and installing the required modules manually (a lot of work and error-prone), I zipped the Python installation and asked the developers to unzip it locally. But they can't run it, because python27.dll is missing, or worse, they already have another minor version of Python installed, and the hook will fail due to the wrong python27.dll being used. Confusing.
If I just add the python27.dll (the correct version) to the zip file, it all seems to work great. So, why is it not installed in that location in the first place? What is the advantage of installing it in C:\Windows\System32?
Hope someone can explain this to me!
Thanks in advance,
Tallandtree.
I use the Anaconda Python distribution from http://continuum.io. They put python27.dll into c:\anaconda right next to its python.exe. This distribution is also superior in that you can have multiple python environments with precisely the packages you need and switch between them easily (http://conda.pydata.org/docs/using/envs.html). You can also get the package list of one of your environments and distribute it to others.
I recommend this Python distribution over the one from python.org and Enthought, because of this issue.
.dll files are quite Windows-specific; I imagine you will have shared object (.so) files for the Linux/Unix-specific Python stuff. You said your developers couldn't run it, because they didn't have the correct DLL (i.e. the one relevant to their Python installation).
Also, the advantage of installing it to System32 is that it's on the default PATH. Additionally, if any other application internally uses Python, requires access to the .dll file, and does not reference your Python directory, it will probably look for it in a location that actually exists (I wanted to say guaranteed to exist, but... never mind). That location would be C:\Windows\System32.
I found it to work just fine to put python27.dll in the Python directory (c:\Python27 or wherever). As long as it's in the PATH, it seems to work. I did this for a "relocatable" installation of Python. I can copy the installation directory to a Windows machine that has no Python installed, set the PATH to include that directory, and run Python, including all the libraries I had installed with pip install on the original machine.
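If you want to verify which python27.dll a given machine will actually pick up, a small check like this (just using the standard library's search over PATH) can help:

    # sketch: report which python27.dll would be found on this machine's PATH
    import sys
    import ctypes.util

    print(sys.executable)                        # the python.exe being run
    print(ctypes.util.find_library('python27'))  # path to python27.dll, or None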

How can I import Python libraries with .pyx and .c files without installing them on the computer?

I am writing code for a number of other people, none of whom are particularly computer savvy. I installed python 2.7 for all of them, but I really do not want to have to install anything else.
To get around installing every library that I wanted to use, I've simply been including the library source code in the same folder as my project source code. Python automatically searches for the necessary files in the working directory, and all goes well.
The problem came when I tried to install pandas. Pandas is a library that includes .pyx and .c files that are compiled on install. I cannot just include these files with my source code, because they are not in the proper form.
How can I either compile these on launch or pre-compile them for ease of transfer? (And the kicker: I need a solution that works cross-platform; I work on Windows 7, my colleagues work on OS X.)
Thank you in advance.

What is Building and Installing?

This is probably a question with a very easy and straightforward answer; however, despite having a few years' programming experience, for some reason I still don't quite get the exact concepts of what it means to "build" and then to "install". I know how to use them and have used them a lot, but have no idea about the exact processes which happen in the background...
I have looked across the web, wikipedia, etc... but there is no one simple answer to it, neither can I find one here.
A good example, which I tried to understand, is adding new modules to python:
http://docs.python.org/2/install/index.html#how-installation-works
It says that "the build command is responsible for putting the files to install into a build directory"
And then for the install command: "After the build command runs (whether you run it explicitly, or the install command does it for you), the work of the install command is relatively simple: all it has to do is copy everything under build/lib (or build/lib.plat) to your chosen installation directory."
So essentially what this is saying is:
1. Copy everything to the build directory and then...
2. Copy everything to the installation directory
There must be a process missing somewhere in the explanation... compilation?
Would appreciate some straightforward not too techy answer but in as much detail as possible :)
Hopefully I am not the only one who doesn't know the detailed answer to this...
Thanks!
Aivoric
Building means compiling the source code to binary in a sandbox location where it won't affect your system if something goes wrong, like a build subdirectory inside the source code directory.
Install means copying the built binaries from the build subdirectory to a place in your system path, where they become easily accessible. This is rarely done by a straight copy command, and it's often done by some package manager that can track the files created and easily uninstall them later.
Usually, a build command does all the compiling and linking needed, but Python is an interpreted language, so if there are only pure Python files in the library, there's no compiling step in the build. Indeed, everything is copied to a build directory and then copied again to a final location. Only if the library depends on code written in other languages that needs to be compiled will you have a compiling step.
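As a small, concrete illustration of that final location, this prints the directory that install ultimately copies the built files into (the exact path depends on your platform and Python version):

    # sketch: where 'setup.py install' ultimately puts a pure-Python library
    from distutils.sysconfig import get_python_lib

    print(get_python_lib())
    # e.g. C:\Python27\Lib\site-packages on Windows,
    # or /usr/lib/python2.7/site-packages on Linux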
You want a new chair for your living room and you want to make it yourself. You browse through a catalog and order a pile of parts. When they arrive at your door, you can't immediately use them. You have to build the chair at your workshop. After a bit of elbow grease, you can sit down in it. Afterwards, you install the chair in your living room, in a convenient place to sit down.
The chair is a program you want to use. It arrives at your house as source code. You build it by compiling it into a runnable program. You install it by making it easier to use.
The build and install commands you are referring to come from a setup.py file, right?
Setup.py (http://docs.python.org/2/distutils/setupscript.html)
This file is created by 3rd party applications / extensions of Python. They are not part of:
Python source code (bunch of c files, etc)
Python libraries that come bundled with Python
When a developer makes a library for Python that he wants to share with the world, he creates a setup.py file so the library can be installed on any computer that has Python. Maybe this is the MISSING STEP.
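A minimal setup.py for a pure-Python library might look roughly like this (the package name is made up for illustration):

    # illustrative setup.py for a pure-Python library
    from distutils.core import setup

    setup(
        name='mycustomlibrary',        # hypothetical name
        version='0.1',
        packages=['mycustomlibrary'],  # the directory containing the .py files
    )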
Setup.py sdist
This creates a source distribution (the .tar.gz file). What this does is copy all the files used by the library into a folder, include the setup.py file for the module, and archive everything so the library can be built somewhere else.
Setup.py build
This builds the module from the source distribution into a usable library (SPECIFICALLY FOR THIS OS).
As you may know, the computer that the Python library originally came from will be different from the computer that you are installing it on.
It might have a different version of python
It might have a different operating system
It might have a different processor / motherboard / etc
For all the reasons listed above, the compiled code might not work on another computer. So setup.py sdist creates a module with only the source files needed to rebuild the library on another computer.
What setup.py build does is similar to what a makefile would do: it compiles sources, creates libraries, all that stuff.
Now we have a copy of all the files we need in the library and they will work on our computer / operating system.
Setup.py install
Great, we have all the files we need. But they won't work yet. Why? Well, they have to be added to Python, that's why. This is where install comes in. Now that we have a local copy of the library, we need to install it into Python so you can use it like so:
import mycustomlibrary
In order to do this we need to do several things including:
Copy files to their library folders in our version of python.
Make sure the library can be imported using the import command
Run any special install instructions for this library (setting up paths, etc.)
This is the most complicated part of the task. What if our library uses BeautifulSoup? That is not part of the Python standard library. We'd have to install it in a way such that our library and any others can use BeautifulSoup without interfering with each other.
Also what if python was installed someplace else? What if it was installed on a server with many users?
Install handles all these problems transparently. What it does is make the library that we just built able to run. All you have to do is use the import command; install handles the rest.

Python compile all modules into a single python file

I am writing a program in Python to be sent to other people who are running the same Python version; however, there are some 3rd-party modules that need to be installed to use it.
Is there a way to compile it into a .pyc (I only say .pyc because it's a Python compiled file) that has all the dependent modules inside it as well?
So they can run the programme without needing to install the modules separately?
Edit:
Sorry if it wasn't clear, but I am aware of things such as cx_Freeze etc.; what I'm trying to get is just a single Python file.
So they can just type "python myapp.py" and then it will run. No installation of anything. As if all the module code were in my .py file.
If you are on Python 2.3 or later and your dependencies are pure Python:
If you don't want to go the setuptools or distutils routes, you can provide a zip file with the .pyc files for your code and all of its dependencies. You will have to do a little work to make any complex pathing inside the zip file available (if the dependencies are just lying around at the root of the zip, this is not necessary). Then just add the zip location to your path, and it should work just as if the dependency files had been installed.
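A rough sketch of that approach, with the zip file name and module name made up for illustration:

    # sketch: make modules shipped inside a zip file importable
    import sys

    sys.path.insert(0, 'deps.zip')  # zip containing the dependencies' .pyc files

    import some_dependency          # hypothetical module, now loaded from deps.zip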
If your dependencies include .pyds or other binary dependencies you'll probably have to fall back on distutils.
You can simply include .pyc files for the required libraries, but no, a .pyc cannot work as a container for multiple files (unless you collect all the source into one .py file and then compile it).
It sounds like what you're after is the ability for your end users to run one command, e.g. install my_custom_package_and_all_required_dependencies, and have it assemble everything it needs.
This is a perfect use case for distutils, with which you can make manifests for your own code that link out to external dependencies. If your 3rd party modules are available publicly in a standard format (they should be, and if they're not, it's pretty easy to package them yourself), then this approach has the benefit of allowing you to very easily change what versions of 3rd party libraries your code runs against (see this section of the above linked doc). If you're dead set on packaging others' code with your own, you can always include the required files in the .egg you create with distutils.
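As a rough sketch of declaring those external dependencies (this uses setuptools, which builds on distutils; the names and the dependency are illustrative):

    # sketch: declare 3rd-party dependencies so they are fetched at install time
    from setuptools import setup

    setup(
        name='myapp',                         # hypothetical name
        version='0.1',
        py_modules=['myapp'],
        install_requires=['beautifulsoup4'],  # illustrative external dependency
    )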
Two options:
build a package that will install the dependencies for them (I don't recommend this if the only dependencies are python packages that are installed with pip)
Use virtual environments. You use an existing Python on their system, but the Python modules are installed into the virtualenv (a rough sketch follows below).
or I suppose you could just punt, and create a shell script that installs them, and tell them to run it once before they run your stuff.
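A hedged sketch of the virtualenv option (assumes the virtualenv package is installed; on OS X/Linux the pip path would be env/bin/pip instead):

    # sketch: create an isolated environment and install the 3rd-party modules into it
    import subprocess

    subprocess.check_call(['virtualenv', 'env'])
    subprocess.check_call(['env\\Scripts\\pip', 'install', 'requests'])  # example package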
