I am trying to build a minimal docker image, capable of nothing more than running the Python interpreter. During development I start from alpine; the final image will be built FROM scratch. I am working on a Linux Mint machine.
I am able to compile Python and install it into my current working directory like this:
cd Python-3.8.7
./configure --prefix=$PWD/../python
make install
However, I have not figured out how to tweak the --prefix setting so that the generated shebangs will later work in the docker container.
The first line of ./python/pip3 contains the absolute path on my host and reads
#!/home/orion/minimal_py_image/Python-3.8.7/../python/bin/python3.8
But it should read
#!/python/bin/python3.8
because /python is the location under which the Python interpreter will be found in the docker image.
How can I trick the make install script so that the final destination will be /python/bin?
I would like to keep the build contained in the current directory, i.e. not use the folder /python on the host where I do the compilation.
Additional Information
Probably not directly relevant for this question but as reference: Here is the Dockerfile I am trying to get working:
FROM alpine
COPY python /python
COPY lib64/* /lib64/
ENV LD_LIBRARY_PATH=/usr/lib64/:/lib64/:/python/lib
ENV PATH="${PATH}:/python/bin"
I can already run Python with docker run -it mini python3 -c "print('hello from python')", but pip is not working yet due to the wrong shebang.
A common convention in Autoconf-based build systems is to support the Make variable DESTDIR. When you run make install with DESTDIR set, the files are installed into the configured directory under DESTDIR, but all embedded paths (such as shebangs) still refer to the original --prefix. You can then create an archive of the target directory or, in a Docker context, use that directory as the build context.
cd Python-3.8.7
# Use the final install target as the --prefix
./configure --prefix=/python
# Installs into e.g. ../python/python/bin/python3.8
make install DESTDIR=../python
cd ../python
tar cvzf ../python.tar.gz .
You can see this variable referenced in the Python Makefile in many places.
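If you have already built with a host-absolute prefix, the shebang lines can also be patched after the fact. A minimal self-contained sketch (the file name and old path are illustrative stand-ins, not your actual build output):

```shell
# Create a stand-in script whose shebang carries a host-absolute path,
# then rewrite everything before /python/bin/ to the container path.
printf '#!/home/orion/minimal_py_image/python/bin/python3.8\nprint("hi")\n' > pip3.demo
sed -i '1s|^#!.*/python/bin/|#!/python/bin/|' pip3.demo
head -n 1 pip3.demo
# should print: #!/python/bin/python3.8
```

The DESTDIR approach above is cleaner, though, because it fixes every generated script in one pass instead of patching files individually.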
Related
My goal is to create a .zip file which includes my python code and a batch file that runs that python code, without installing anything else on the user's system, using only files within the .zip. This means the batch file can't run python main.py, since that would first require the user to install python, which for my purposes is inconvenient and bad.
My first instinct was to put my code in a folder. Then, create a virtualenv with all dependencies installed. Create a batch file with the following code:
".venv/Scripts/python.exe" code_path/main.py
Package the code, virtualenv, and batch file in a zip file then pass that around to the users.
The problem I have with this is that I feel that it's dumb to add the virtualenv to the zip file. EDIT: Even if I were to add it, the .venv wouldn't even work for other systems apparently.
I tried other solutions, like building a .exe with pyinstaller instead, but it kept triggering false-positive detections from security vendors like Chrome and Windows Defender, so I strayed away from that path and tried using .zip files instead.
EDIT: question currently limited to solutions for Windows 10/11.
Docker is a nice solution for the problem you face.
You could build the environment and app in a container based on the official Python Docker image, then pack and distribute your python-app image.
The following Dockerfile is an example:
Prepare your app's .py files in /usr/src/app and list the requirements in requirements.txt:
FROM python:3
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD [ "python", "./your-daemon-or-script.py" ]
Visit https://hub.docker.com/_/python for more detail.
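With that Dockerfile in the project root, the image can be built once and handed to users as a tarball, so they only need Docker installed. A rough sketch (the image tag and archive name are illustrative):

```
docker build -t my-python-app .
docker save my-python-app | gzip > my-python-app.tar.gz
# on the user's machine:
docker load < my-python-app.tar.gz
docker run --rm my-python-app
```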
I have a python project and I am using pipenv to handle deps.
I need to create a zip file that includes the source code and all the dependencies code as well. I need this zip file for uploading it to AWS Lambda.
When working with pipenv, it downloads the dependency libraries somewhere in the computer, but for packaging/distribution of the project I need all the necessary code to be contained in the same place (a zip file).
Is there a way to run pipenv and set it to install dependencies at a specific path? If not, does someone know where those dependencies are located on my machine?
Thanks
This has worked for me:
#!/bin/bash
# this is b/c pipenv stores the virtual env in a different
# directory so we need to get the path to it
SITE_PACKAGES=$(pipenv --venv)/lib/python3.6/site-packages
echo "Library Location: $SITE_PACKAGES"
DIR=$(pwd)
# Make sure pipenv is good to go
echo "Do fresh install to make sure everything is there"
pipenv install
cd $SITE_PACKAGES
zip -r9 $DIR/package.zip *
cd $DIR
zip -g package.zip posts.py
I've specifically tried it with numpy and it works correctly. It includes the .so files as well which is great because everything is self contained.
I'm new to python. I have a maven project which uses the org.jolokia maven-docker-plugin to create a docker image that consumes a python library.
Currently the container uses a pip install to install the python library.
I have forked the python library and made some changes, and now I would like my docker container to consume MY version of the python library. How can I do this?
What I have tried:
Copied my changed python file to overwrite the folder located in /usr/local/lib/python2.7/dist-packages/ which was generated after pip install (via mount directory into container).
Created tar of entire python project, added it into image using fileSets, ran pip install /maven/mypythonversion.tar.gz.
Any help much appreciated!
The way to do this is to get your python project into the image and run python setup.py install. This method assumes your image already has the python interpreter and the relevant dependencies installed for your project to run.
Copy your python project code into a folder in your maven project i.e. one called input.
Use fileSets in the assembly of the image to assemble the python source code into the image. <directory> should point to the input folder containing your python source code:
<assembly>
<mode>tar</mode>
<inline>
<fileSets>
<fileSet>
<directory>C:/.../input</directory>
<outputDirectory>/output</outputDirectory>
<includes>
<include>**/*</include>
</includes>
</fileSet>
</fileSets>
</inline>
</assembly>
Then you want a script to run python setup.py install from the output directory (/maven/output/mypythonproject).
I.e. use a runCmd:
<runCmds>
<run>
cd /maven/output/mypythonproject && \
python setup.py install
</run>
</runCmds>
This installs the python module and puts the egg file in the /usr/local/lib/python2.7/dist-packages folder, where it will be found by your python interpreter.
I have a virtualenv located at /home/user/virtualenvs/Environment. Now I need this environment at another PC. So I installed virtualenv-clone and used it to clone /Environment. Then I copied it to the other PC via USB. I can activate it with source activate, but when I try to start the python interpreter with sudo ./Environment/bin/python I get
./bin/python: 1: ./bin/python: Syntax Error: "(" unexpected
Executing it without sudo gives me an error telling me the binary format is wrong.
But how can this be? I just copied it. Or is there a better way to do this? I cannot just use pip freeze, because there are some packages in /Environment/lib/python2.7/site-packages/ which I wrote myself and need to copy too. As I understand it, pip freeze just creates a list of packages which pip then downloads and installs.
Do the following steps on the source machine:
workon [environment_name]
pip freeze > requirements.txt
copy requirements.txt to other PC
On the other PC:
create a virtual environment using mkvirtualenv [environment_name]
workon [environment_name]
pip install -r requirements.txt
You should be done.
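For reference, the requirements.txt written by pip freeze pins exact versions, one package per line; a made-up example (package names and versions are illustrative):

```
numpy==1.24.2
requests==2.28.2
```

pip install -r on the new machine then recreates exactly these versions.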
Other Resources:
How to Copy/Clone a Virtual Environment from Server to Local Machine
Pip Freeze Not Applicable For You?
Scenario: you have libraries installed on your current system that are very hard to migrate using pip freeze (and I am talking really hard), because you have to download and install the wheels manually, such as gdal, fiona and rasterio, and even then the project may still crash, possibly because they were installed in the wrong order or the dependencies were wrong, and so on.
This is the experience I had when I was brought on board a project.
For such a case, when you finally get the environment right you basically don't want to go through the same hell again when you move your project to a new machine. Which I did, multiple times. Until finally I found a solution.
Now, disclaimer before I move on:
I don't advocate for this method as the best, but it was the best for my case at the time.
I also cannot guarantee it will work when switching between different OSes, as I have only tried it between Windows machines. In fact, I don't expect it to work when you move from Windows to another OS, as the structure of the virtualenv folder on Unix-based OSes is different from that on Windows.
Finally, the best way to do all of this is to use Docker. My plan is to eventually do so. I have just never used Docker for a non-web-app project before and I needed a quick fix as my computer broke down and the project could not be delayed. I will update this thread when I can once I apply Docker to the project.
THE HACK
So this is what I did:
Install the same base Python on your new machine. If you have 3.9 on the old one, install 3.9 on the new one, and so on. Note where the executable is located, usually something like C:\Users\User\AppData\Local\Programs\Python\PythonXX
Compress your virtual env folder and copy it into the project directory on your new machine. Extract all files there.
Using a text editor of your choice, or preferably an IDE, use the 'Search in all files' feature to look for all occurrences of references to your old machine paths: C:\Users\your-old-username
Replace these with your new references. In my case I had to do it in the following files inside the virtual env folder: pyvenv.cfg, Scripts/activate, Scripts/activate.bat, Scripts/activate.fish and Scripts/activate.nu.
And that's it!
Good luck everyone.
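On a Unix-like system the same search-and-replace could be scripted; a rough self-contained sketch with made-up usernames and a stand-in config file (the steps above are for Windows, where you would edit the files by hand or use PowerShell):

```shell
# Simulate a venv config file that still points at the old machine,
# then rewrite the old home path to the new one everywhere it appears.
printf 'home = /home/olduser/project/.venv\n' > pyvenv.cfg.demo
sed -i 's|/home/olduser|/home/newuser|g' pyvenv.cfg.demo
cat pyvenv.cfg.demo
# should print: home = /home/newuser/project/.venv
```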
I think what happened is that the symbolic links from the source machine were copied to the target machine as regular binary files (no longer links). You should copy with rsync -l to preserve those links.
Usually I use virtualenv to create a new environment, then I go to the environment where I want to copy from, copy all the folders and paste it into the environment folder I just created, but most importantly when asking if you want to replace the Destination files, choose to skip these files. This way you keep your settings.
At least for me, this has worked very well.
I hope it works for you too.
I share my experience.
Suppose the other PC does not have Python installed.
Python version: 3.7.3
Platform: Windows 10, 7 (64-bit)
The following worked for me.
Step:
download the Windows embeddable zip file
download get-pip.py (because the embeddable zip file does not provide pip)
[Optional] install tkinter, see this article: Python embeddable zip: install Tkinter
Choose a packaging method (I use NSIS: https://nsis.sourceforge.io/Download)
Folder structure:
- main.nsi
- InstallData
- contains: Step1 & Step2
- YourSitePackages # I mean that packages you do not intend to publish to PyPI.
- LICENSE
- MANIFEST.in
- README.rst
- ...
- requirements.txt
- setup.py
The abbreviated content of main.nsi is as follows:
!define InstallDirPath "$PROGRAMFILES\ENV_PYTHON37_X64"
!define EnvScriptsPath "${InstallDirPath}\Scripts"
...
CreateDirectory "${InstallDirPath}" # Make sure the directory exists before the writing of Uninstaller. Otherwise, it may not write correctly!
SetOutPath "${InstallDirPath}"
SetOverwrite on
File /nonfatal /r "InstallData\*.*"
SetOutPath "${InstallDirPath}\temp"
SetOverwrite on
File /nonfatal /r "YourSitePackages\*.*"
nsExec::ExecToStack '"${InstallDirPath}\python.exe" "${InstallDirPath}\get-pip.py"' # install pip
nsExec::ExecToStack '"${InstallDirPath}\Scripts\pip.exe" install "${InstallDirPath}\temp\."' # install your library. same as: `pip install .`
RMDir /r "${InstallDirPath}\temp" # remove source folder.
...
/*
Push ${EnvScriptsPath} # Be Careful about the length of the HKLM.Path. it is recommended to write it to the HKCU.Path, it is difficult for the user path to exceed the length limit
Call AddToPath # https://nsis.sourceforge.io/Path_Manipulation
*/
Hope someone will benefit from this.
I am trying to create a local development space on my laptop, running Apache-MySQL-Python. I have each component installed, but am having difficulty connecting Python to MySQL. I have used these instructions, including installing pip and PyMySQL: https://github.com/PyMySQL/PyMySQL#installation
When I get to the part that says to enter this:
$ cp .travis.databases.json pymysql/tests/databases.json
I get this:
$ cp .travis.databases.json pymysql/tests/databases.json
cp: .travis.databases.json: No such file or directory
I can't locate the .travis.databases.json file (I have hidden files showing), even though my $PATH is:
/Library/Frameworks/Python.framework/Versions/3.4/bin:/opt/local/bin:/opt/local/sbin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:
Is my path wrong, or is there something else I'm missing? If it makes a difference, all of my tools (installers, pkgs, etc.) are in a folder on my desktop. Apache Server is up and running, too.
Did you install it from source or using pip? .travis.databases.json is used only for running the test suite; if you're building from source, it is in the git repo. If you installed it using pip, you'll want to copy that file locally.