I'm following Google's OR-Tools instructions and reading this instruction:
> "Then you can download all dependencies and build them using:
>
> make third_party"
What is this make command? Should I run it from Windows command prompt? Where is this third_party file located?
Sorry for this basic question. I'm new to this realm.
That page seems very clear to me.
Please make sure that svn.exe, nmake.exe and cl.exe are in your path.
You need to do exactly that. From the sound of things, nmake.exe provides the make command here. As for where to run it: as the page says, run it from the terminal available in the Tools menu of Visual Studio.
NAME
make - GNU make utility to maintain groups of programs
SYNOPSIS
make [ -f makefile ] [ option ] ... target ...
Simply put, make is a build tool: the make command is commonly used on Linux to carry out all necessary recompilations. Make requires a configuration file (a makefile). Once this file is written for your project, you usually just type make to rebuild whatever has changed.
Take a look at this link for some make examples.
http://linuxdevcenter.com/pub/a/linux/2002/01/31/make_intro.html
As per the link you provided, the instructions are straightforward:
Compiling libraries
All build rules use make (gnu make), even on windows. A make.exe binary is provided in the tools sub-directory. They are providing you with make.exe, which means that on Windows you can use it to execute the following commands; just make sure you are within a path that includes the make binary.
If you do not find svn.exe, please install an SVN version that offers the command-line tool.
http://www.collab.net/downloads/subversion
Just execute the following commands to build the dependencies:
make
To compile in debug mode on Windows, use the following:
make DEBUG="/Od /Zi" all
If you need to clean everything and do it again, run:
make clean
This will clean all downloaded sources, all compiled dependencies, and Makefile.local. It is useful to get a clean state, or if you have added an archive in dependencies.archives.
Finally, to compile the library run:
make all
When everything is compiled, you will find under or-tools/bin and or-tools/lib:
some static libraries (libcp.a, libutil.a and libbase.a, and more)
One binary per C++ example (e.g. nqueens)
C++ wrapping libraries (pywrapcp.so, libjniwrapconstraint_solver.so)
Java jars (com.google.ortools.constraintsolver.jar...)
C# assemblies
Then we can edit Makefile.local.
First off, download Python 2.7 and JDK 7, install them.
Edit Makefile.local to point to the correct Python and Java installation. For instance, on my system, it is:
WINDOWS_JDK_DIR = c:\\Program Files\\Java\\jdk1.7.0_02
WINDOWS_PYTHON_VERSION = 27
WINDOWS_PYTHON_PATH = C:\\python27
Afterwards, to use python, you need to install google-apputils.
cd dependencies/sources/google-apputils
c:\python27\python.exe setup.py install
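Once that works, a quick way to check that the Python wrapper actually built is to import it and create a solver. This is only a sketch based on the old constraint_solver API exposed as pywrapcp; the exact module path may differ in your or-tools version.
from ortools.constraint_solver import pywrapcp

solver = pywrapcp.Solver("smoke-test")   # create a constraint programming solver
x = solver.IntVar(0, 10, "x")            # declare a simple integer variable
print(x)                                 # if this prints, the wrapper loaded correctly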
Related
I want to distribute a python program on, say, Windows and/or Mac, but I don't want to give the user the headache of ensuring there is an appropriate python runtime installed on their machine. And I don't want to interfere with their machine's configuration by, say, requesting root privileges and installing a system-wide python runtime tailored to my program, because that's too invasive and might cause compatibility collisions with other installed versions of the runtime.
I would much rather have a self-contained executable that could be, for example, stored on a USB flash drive and inserted into the system. Then, perhaps with a stepping-stone binary executable that just invokes the device-portable runtime on a python script I provide, I could run the program as if it were a self-contained binary executable (with only standard-library dependencies).
A link to this binary executable could be published into main-menu program lists, docks, or desktops. And it could be invoked by shell scripts or other executed-by-proxy mechanisms. Such a no-install/self-contained python program could potentially be a first-class user-invokable application. This is what I want to achieve.
I googled around for projects that provide a device-portable/mobile python installation, and so far I've only found portablepython.com. Unfortunately it says the project is discontinued and no download link is provided. It listed some similar projects, but they all seemed defunct or had a very different focus.
Does anyone know of an active project that is or includes such an independent/portable/mobile/no-install distribution for python?
Or is there some way I could configure python's build system to build a no-install-friendly product?
Any ideas welcome. Thanks for your input!
After more searching I found that Python.org publishes its own standalone-python distribution called the embeddable zip file.
This is exactly what I was searching for. It's a basic python standalone runtime that requires relatively few megabytes of storage.
I started with this embeddable distro and then cajoled a standalone copy of pip to work with it. Problem solved.
Improving upon @oreus2020's answer, you can download the embeddable zip file from here. Then unzip the compressed file to a folder of your choice.
Go to the root of your install, find the python._pth file and open it in a text editor. Remove the "#" before import site. (This file is the one that manages the environment of the portable install; if you want anything to be recognized by the portable python interpreter, just put the path in here and that's it.)
If you want pip, go to this page, save get-pip.py in the root of your portable install and run it with the portable interpreter, like ./python get-pip.py, from a command line opened at the root of your install. Pip installed!
To use pip, run ./python -m pip <commands> from a command line opened at the root of your install, then open the python._pth file and insert the following below the ".": ./Lib/site-packages and ./Scripts.
Voila, you've got yourself a portable python install!
My python._pth file looks like:
python39.zip
.
# Uncomment to run site.main() automatically
./Repo
./Repo/Code
./Repo/Code/cogs
./Lib/site-packages
./Scripts
import site
If you are still wondering, here is the link to the one I made for myself.
P.S. Pardon my bad English
I want to implement a new language model for spaCy.
I have installed spaCy (following the guide on the official website) on my Windows OS, but I haven't understood where and how I should write and run my future files.
Please help me. Thanks.
I hope I understand your question correctly: If you only want to use spaCy, you can simply create a Python file, import spacy and run it.
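For example, a minimal sketch (it assumes you've downloaded an English model, e.g. via python -m spacy download en; the exact model name depends on your spaCy version):
import spacy

nlp = spacy.load("en")                                  # load the installed English model
doc = nlp(u"This is a sentence.")
print([(token.text, token.pos_) for token in doc])      # tokens with their part-of-speech tags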
However, if you want to add things to the spaCy source – for example to add new language data that doesn't yet exist – you need to compile spaCy from source. On Windows, this needs a little more preparation – but it's not that difficult:
Install the Visual C++ Build Tools, which include the compiler you need.
Fork and clone the spaCy repository on GitHub.
Navigate to that directory and install spaCy's dependencies (other packages plus developer requirements like Cython) by running pip install -r requirements.txt.
Then run python setup.py build_ext --inplace from the same directory. This will build and compile spaCy into the directory.
Make sure your PYTHONPATH is set to the new spaCy directory. This is important so Python knows that you want to execute this exact version of spaCy, and not some other one you have installed somewhere else. On Windows, I normally use this command: set PYTHONPATH=C:\path\to\spacy\directory. There's also this thread with more info. (I'm no Windows expert, though – so if anyone reads this and disagrees, feel free to correct me here.)
You can now edit the source, add files and run them. If you want to add a new language, I'd recommend starting by adding a new directory to spacy/lang and creating an __init__.py. You can find more info on how this should look in the usage guide on adding languages.
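As a rough sketch of what such a module could look like (hypothetical spacy/lang/is/__init__.py; the exact attributes spaCy expects depend on the version, so mirror an existing language module such as spacy/lang/da):
from ...language import Language

class IcelandicDefaults(Language.Defaults):
    pass                                # language data (stop words, tokenizer exceptions, ...) goes here

class Icelandic(Language):
    lang = "is"
    Defaults = IcelandicDefaults

__all__ = ["Icelandic"]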
To test if everything works, start the Python interpreter and import and initialise your language. For example, let's assume you've added Icelandic. You should then be able to do this:
import importlib
Icelandic = importlib.import_module("spacy.lang.is").Icelandic  # "is" is a Python keyword, so import the module dynamically
nlp = Icelandic()
I don't even know the right way to put my question, but I will try my best.
I downloaded the source code of the Python interpreter (Python-3.4.2.tar) from www.python.org.
I extracted the files (using 7-zip).
Now let's say I later want to use the unzipped/extracted files to create an installer, i.e. put it in a form that I can double-click so that Python-3.4.2 gets installed on my computer.
I guess this is called creating a built distribution.
I know I can just download Python-3.4.2.exe from the site and install it right away, but I want to know how it goes from being source code to becoming something one can install.
Here's a starting point for compiling:
https://github.com/python/cpython/blob/2.7/PCbuild/readme.txt
I was able to compile 2.7.13 with:
cd PCBuild
cmd /c get_externals.bat
cmd /c build.bat -e --no-tkinter "/p:PlatformToolset=v100"
3.4 may require a later compiler.
Here's how to build an installer:
https://github.com/python/cpython/blob/2.7/Tools/msi/README.txt
For building the CPython python interpreter from source, you'll want to have a look at the instructions at https://docs.python.org/devguide/
These instructions probably also (somewhere) contain the steps to produce a Windows installer, which seems to be done with some tool called PCbuild.
Basically, I think what you are asking is how you can distribute the Python runtime along with your program. The process is pretty simple. First, you may want to take a look at Python's distutils. Secondly, you will need to distribute the Python runtime. Python is currently released as installation binaries for pretty much every major operating system. You also have the option of compiling Python for your target system(s) yourself in order to make it install silently.
You will then need to decide where you want the files to go. Most people don't like this sort of behavior, so I recommend putting the files in your program's directory. Shortcuts, system variables, et cetera will need to be looked into.
As another option, you can consider porting your script(s) to Jython, as most people tend to have at least the Java runtime. When the code is written a certain way, Jython can produce Java .class files from it, which can then be placed into an executable JAR file. Easy peasy, but I haven't tried this myself, to be honest.
If you want to stick with pure Python, the last hurdle is putting everything in one file. There are plenty of tools for that; I would try searching GitHub for starters. For bonus points you can always code your own self-extracting binary.
Post Script: With the title changed, I think you are looking for Creating Build Distributions. It is the second link within the Google results for the query "how to compile python installer". Further down the same page I found this gem.
This is probably a question that has a very easy and straightforward answer, however, despite having a few years programming experience, for some reason I still don't quite get the exact concepts of what it means to "build" and then to "install". I know how to use them and have used them a lot, but have no idea about the exact processes which happen in the background...
I have looked across the web, Wikipedia, etc., but there is no one simple answer to it, nor can I find one here.
A good example, which I tried to understand, is adding new modules to python:
http://docs.python.org/2/install/index.html#how-installation-works
It says that "the build command is responsible for putting the files to install into a build directory"
And then for the install command: "After the build command runs (whether you run it explicitly, or the install command does it for you), the work of the install command is relatively simple: all it has to do is copy everything under build/lib (or build/lib.plat) to your chosen installation directory."
So essentially what this is saying is:
1. Copy everything to the build directory and then...
2. Copy everything to the installation directory
There must be a process missing somewhere in the explanation... compilation?
Would appreciate some straightforward not too techy answer but in as much detail as possible :)
Hopefully I am not the only one who doesn't know the detailed answer to this...
Thanks!
Aivoric
Building means compiling the source code to binary in a sandbox location where it won't affect your system if something goes wrong, like a build subdirectory inside the source code directory.
Install means copying the built binaries from the build subdirectory to a place in your system path, where they become easily accessible. This is rarely done by a straight copy command, and it's often done by some package manager that can track the files created and easily uninstall them later.
Usually, a build command does all the compiling and linking needed, but Python is an interpreted language, so if there are only pure Python files in the library, there's no compiling step in the build. Indeed, everything is copied to a build directory, and then copied again to a final location. Only if the library depends on code written in other languages that needs to be compiled will you have a compiling step.
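For instance, a setup.py that declares a C extension is exactly the case where build really compiles something (the package and file names here are made up for illustration):
from distutils.core import setup, Extension

setup(
    name="fastmath",                                               # hypothetical package name
    version="0.1",
    ext_modules=[Extension("fastmath", sources=["fastmath.c"])],   # this C file gets compiled during build
)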
You want a new chair for your living room and you want to make it yourself. You browse through a catalog and order a pile of parts. When they arrive at your door, you can't immediately use them. You have to build the chair at your workshop. After a bit of elbow grease, you can sit down in it. Afterwards, you install the chair in your living room, in a convenient place to sit down.
The chair is a program you want to use. It arrives at your house as source code. You build it by compiling it into a runnable program. You install it by making it easier to use.
The build and install commands you are referring to come from a setup.py file, right?
Setup.py (http://docs.python.org/2/distutils/setupscript.html)
This file is created by 3rd party applications / extensions of Python. They are not part of:
Python source code (a bunch of C files, etc.)
Python libraries that come bundled with Python
When a developer makes a library for Python that he wants to share with the world, he creates a setup.py file so the library can be installed on any computer that has Python. Maybe this is the MISSING STEP you were looking for.
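A minimal setup.py might look something like this (a sketch, reusing the mycustomlibrary name used further down as a stand-in for the developer's package):
from distutils.core import setup

setup(
    name="mycustomlibrary",            # package directory containing an __init__.py
    version="1.0",
    packages=["mycustomlibrary"],
)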
Setup.py sdist
This creates a source distribution (the .tar.gz file). What it does is copy all the files used by the Python library into a folder, include the setup.py for the module, and archive everything so the library can be built somewhere else.
Setup.py build
This builds the Python module back into a usable library (SPECIFICALLY FOR THIS OS).
As you may know, the computer that the Python library originally came from may be different from the computer you are installing it on.
It might have a different version of python
It might have a different operating system
It might have a different processor / motherboard / etc
For all the reasons listed above, prebuilt code may not work on another computer. So setup.py sdist creates an archive with only the source files needed to rebuild the library on another computer.
What setup.py build does is similar to what a makefile would do: it compiles sources, creates libraries, all that stuff.
Now we have a copy of all the files we need in the library and they will work on our computer / operating system.
Setup.py install
Great, we have all the files we need. But they won't work yet. Why? Well, they have to be added to Python, that's why. This is where install comes in. Now that we have a local copy of the library, we need to install it into Python so you can use it like so:
import mycustomlibrary
In order to do this we need to do several things including:
Copy files to their library folders in our version of python.
Make sure library can be imported using import command
Run any special install instructions for this library (setting up paths, etc.)
This is the most complicated part of the task. What if our library uses BeautifulSoup? That is not part of the Python standard library. We'd have to install it in a way such that our library and any others can use BeautifulSoup without interfering with each other.
Also what if python was installed someplace else? What if it was installed on a server with many users?
Install handles all these problems transparently. What it does is make the library that we just built able to run. All you have to do is use the import command; install handles the rest.
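If you're curious where install actually puts things, the directories Python searches on import are listed in sys.path, and an installed library ends up in one of them (typically site-packages):
import sys

for path in sys.path:
    print(path)        # the installed library lives in one of these directories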
I'm looking to create the following:
A portable version of python that can be run on any system (with any previous version of python or no python installed) and have it pre-configured with various python packages (ie, django, lxml, pysqlite, etc)
The closest I've found to the above is virtualenv, but this only goes so far.
If I package up a nice virtualenv for python on one machine, it contains symlinks to a lot of the libraries it needs. I can take those symlinks and convert them to their actual files, but if I try to move this entire directory to another machine, I get seg fault after seg fault.
To launch python on a different machine, I'm using:
LD_LIBRARY_PATH=lib/ ./bin/python
and in lib/ I have all of the shared libraries I copied from the original machine. The problem here is these shared libraries might rely on other shared libraries that I'm not including, so executing this on other linux distros does not work, probably because it falls back on older shared libraries installed on the system that do not work with what I copied over.
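To see which shared objects the interpreter really ends up loading, I've been checking /proc/self/maps with a small snippet like this (Linux only):
def loaded_shared_objects():
    # collect the shared objects currently mapped into this process
    libs = set()
    with open("/proc/self/maps") as maps:
        for line in maps:
            path = line.split()[-1]
            if ".so" in path:
                libs.add(path)
    return sorted(libs)

if __name__ == "__main__":
    for lib in loaded_shared_objects():
        print(lib)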
Anyone have an idea on how to get this working? Is this even possible?
EDIT:
To clarify, the desired outcome is to create a tar.gz of a python binary and associated packages (django, lxml, pysqlite, etc) that can be extracted and run on any linux-based system, e.g. Ubuntu 8.04, Red Hat 5, SUSE 11, etc., all 32-bit distros, where the locally installed version of python doesn't impact what's in the tar.gz.
I just tested this and it works great.
Get the copy of python you want to install, untar it, and cd into the untarred folder first.
Also get a copy of setuptools and untar that.
/opt/portapy used below is of course just the name I came up with for this post; it could be any path. The full path should be tarred up and the same path should be used on any system you put this on, due to absolute path linking.
mkdir /opt/portapy
cd <python source dir>
./configure --prefix=/opt/portapy && make && make install
cd <setuptools source dir>
/opt/portapy/bin/python ./setup.py install
Make the virtual env folder inside the portapy folder.
mkdir /opt/portapy/virtenv
/opt/portapy/bin/virtualenv /opt/portapy/virtenv
cd /opt/portapy/virtenv
source bin/activate
Done. You are ready to install all of your libraries here and have the option of creating multiple virtual envs this way.
You can then tar up the whole /opt/portapy folder and transport it to any Linux system of the same arch, within reason I suspect.
I compiled 2.7.5 on CentOS 5.8 64-bit and moved the folder to a CentOS 6.9 system, and it runs perfectly.
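After untarring on the target machine, a quick sanity check I'd suggest is running something like this with /opt/portapy/bin/python (just a sketch; the imported modules are an arbitrary choice of extension modules that link against shared libs):
import sys
print(sys.prefix)        # should point inside /opt/portapy
print(sys.version)
import ssl, sqlite3      # an ImportError here usually means a system library is missing on the target
print("portable install looks usable")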
I don't know how this is even possible. If it were, they wouldn't need to distribute binary packages of python for different platforms. You can't simply distribute python that will run on any platform. It has to be built from source for that arch. Virtualenv will expect you to tell it which system python to use (using links).
This pretty much goes for almost any binary package that links against system libs. Again, if it were possible, we wouldn't need any platform specific binary distributions.
You can, however, achieve part of what you want. That is, running python on another machine that doesn't have python installed as long as its the same arch. This is the same concept behind freezing, or py2exe/py2app/pyinstaller. An interpreter is bundled into a standalone environment. So the app can run on any similar platform.
Edit
I just realized that while your question speaks about "system" agnostically, your title contains the reference "linux". There are different flavors of linux, so in order for it to work you would have to build it fat for multiple archs and also completely contain the standalone links. You might try building a package with pyinstaller and using that to include in your project.
You can try just building python from source, in your virtualenv:
$ ./configure --prefix=/path/to/virtualenv && make && make install
If you still have problems with the links to libs, you can also investigate building it statically.
I'm not sure that working solely in Python is the way to go here. You might have better luck with Puppet or Chef, which are configuration tools that can be used to create a local environment. There is plenty of code out there to install virtualenv and python on just about any Linux plus OS X (probably not Windows though).
Your workflow would be to install Chef or Puppet (your choice), run a script to install the Python you want, then enter a virtualenv and pip install any packages you might need.
Sorry this isn't as easy as virtualenv alone, but it is much more robust.
Well, since I rarely accept "can't be done", there is a way to do it. Warning: it isn't pretty and you should probably look into a different scenario.
What you will need to do is determine a standard location for this top-level directory. Second, using that directory as your root, you will need to compile Python on each Linux distribution you want to run this on. For this you would use something like "/usr/local/myappname/platform/" to configure and compile Python to live in. In each case substitute "platform" with the name of the platform, such as "/usr/local/myappname/rhel/". If memory serves, the configure option you are looking for here is --prefix.
Once you have each distribution compiled you will need a script to determine which one to use and either set environment variables or have it create symlinks to the appropriate "installation" of python. I would then use virtualenv and bootstrap in that tree to keep the "in-use" python libraries even more specific.
I can't think of a common Linux distribution that doesn't have Python by default. As such you could use setup.py and/or basic python scripts to script this out, since you should be able to rely on Python being present - even if it's ye olde version as in RHEL installs. Personally I find the above method overly complicated, but it would meet your stated requirements with the allowance for a final script. Of course, you could use shar (SHell ARchive) to tar all of this into a runnable shell script to do the installation and avoid the need for secondary scripts. If you gzip the resulting shell archive then you can decompress it on target systems and execute it to set everything up.
All that said, I would not recommend this. I would recommend determining the minimum Python version you can run on and ensuring that it is installed by the distribution whenever possible, and if need be pulling it down from a repo and installing it. Then, use virtualenv and bootstrap with a requirements.txt to install the necessary python libraries and apps into the virtualenv. For that, see this documentation.
I faced the same problem, so I created PortableVirtualenv. Your question is just the definition of it.
I use it as a base for commercial multiplatform app I develop. (But PortableVirtualenv is public domain - use it freely.)
If needed, you can pip-install any package and zip the whole directory to also distribute the packages you need.
One nice option is to make a "snap" portable Linux application. They have a Python mode which lets you specify exactly what modules you need. From https://snapcraft.io/first-snap#python :
Snaps let you distribute a dependency-isolated Python app in an app store experience for end users.
Another option is to containerize your application with something like docker. Then instead of executing your script directly, the user is actually running a small OS with just your application and its dependencies. https://www.infoq.com/articles/docker-executable-images/ has more about executable containers.
Container images can also be used for short lived processes: a containerized executable meant to be run on your computer. These containers execute a single task, are short lived and can generally be removed after use. We call these executable images. Examples are compilers (Golang) or build tools (Maven), presentation software (I love to hack a simple presentation in Markdown format and let a RevealJS Docker image serve that) and browsers (a fresh contained browser to follow that fishy link). A real evangelist for executable images is Docker's own Jessie Frazelle. To get some great inspiration be sure to read her blog about them or check out this presentation at DockerCon 2015.