I have downloaded a Python package to install on my Ubuntu machine. The package already comes with a setup.py file, but I want to change the default Python installation location to something else, for this package specifically (and not permanently). So what I tried is:
First in the terminal, I export that address of the new folder:
export PYTHONPATH=${PYTHONPATH}:${HOME}/Documents/testfolder/lib/python2.7/site-packages
Then I add this exported address as prefix to the installation command:
python setup.py install --prefix=~/Documents/testfolder
The installation goes through. Now, to make Python always look for this new path as well (next to the default installation path), I export the address in my .bashrc file:
export PYTHONPATH="${PYTHONPATH}:~/Documents/testfolder/lib/python2.7/site-packages"
But now whenever I open a terminal and try to import the installed package, Python cannot see it ("no module named..."). Only when I open a terminal in the folder containing the installation files (namely setup.py) and run python there can it see the package, and it works.
Why isn't my export in bashrc making the package available from anywhere?
Is there something I have done wrong in the above?
To answer your question about the export path: do you have $PYTHONPATH as part of your $PATH? If not, you should add it to PATH.
The best way to handle this scenario, in my opinion, is to use a virtual Python environment. There are a couple to choose from, but I like virtualenv the best. The reason to take this approach is that you can manage different versions of Python in separate folders, and have separate packages installed in those folders. I recommend looking into it, as it is a very useful tool. If you want an example of how to use it, I can provide that: https://virtualenv.pypa.io/en/stable/
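As a debugging aside on the setup in the question: the shell does not perform tilde expansion inside double quotes, so a quoted `~` in a PYTHONPATH entry reaches Python as a literal `~` that matches no real directory. A minimal sketch for checking what Python actually sees (the path is the one from the question):

```python
import os
import sys

# The entry as written in .bashrc, with "~" inside double quotes:
entry = "~/Documents/testfolder/lib/python2.7/site-packages"

# Python never expands "~" in sys.path entries; os.path.expanduser
# shows the form the entry would need to have to match a real directory.
print("literal entry:", entry)
print("expanded form:", os.path.expanduser(entry))

# Inspect what actually ended up on the import search path:
for p in sys.path:
    print(p)
```

If the expanded form is missing from the printed search path, the import failure in new terminals is explained.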
Related
I'd like to collect all the functions I wrote in Python and store them in a folder. I'm an Ubuntu user; which environment variable do I have to add in my ~/.profile?
I tried
export PATH:$PATH:/home/functionFolder
or
export PYTHONPATH:/home/functionFolder
I also added an init.py file in /home/functionFolder, but it doesn't work.
My aim is to import the functions with
from myFunctions import function1
with myFunctions located in /home/functionFolder.
Do not mess with PYTHONPATH. It will lead to hard-to-debug situations, non-portable code and general misery. (If you really, really want to, you could mess with it within your program with e.g. sys.path.insert(0, '...'), but that's also non-portable.)
If you want to maintain a personal toolbox library, it's better to just make it a package you can install. This will also pave the way to making it distributable later on should you want to.
The canonical guide to packaging Python libraries lives at packaging.python.org, but the very simplest tl;dr edition (at the time of writing) is:
This assumes
you have a project directory, e.g. /home/me/project
there is a package directory in the project directory, e.g. if your package is named myFunctions, you have a myFunctions/__init__.py
you know how to use virtualenvs for your other projects
Add pyproject.toml to /home/me/project with the contents listed below.
Add setup.cfg to /home/me/project with the contents listed below.
Now, in a project virtualenv where you want to use myFunctions, run pip install -e /home/me/project. This will install the package that lives in that path in editable mode, i.e. just link that folder into the project's environment.
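After the editable install, you can confirm which copy of the package Python resolves to with importlib. Sketched here with a stdlib module, since myFunctions only exists in your own environment; substitute the name:

```python
import importlib.util

# find_spec reports where a module would be loaded from, without importing it.
# Replace "json" with "myFunctions" to check your editable install.
spec = importlib.util.find_spec("json")
print("loaded from:", spec.origin)
```

For an editable install, the reported origin should point back into /home/me/project.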
pyproject.toml
[build-system]
requires = [
"setuptools>=42",
"wheel"
]
build-backend = "setuptools.build_meta"
setup.cfg
[metadata]
name = myFunctions
[options]
packages = find:
You can do it by adding the following commands to your .bashrc file, which is located in ~/:
PYTHONPATH=/home/functionFolder/:$PYTHONPATH
export PYTHONPATH
Once you have added the above commands to the .bashrc file, save it and type the following in your console:
source ~/.bashrc
Afterward, you can access and import your functions from anywhere.
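To see the mechanism at work, you can hand a child interpreter a PYTHONPATH and check that the directory lands on its sys.path. A small sketch, using a throwaway temp folder standing in for /home/functionFolder:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as folder:
    # Launch a fresh interpreter with the folder on PYTHONPATH,
    # exactly as the .bashrc export would do for new terminals.
    env = dict(os.environ, PYTHONPATH=folder)
    result = subprocess.run(
        [sys.executable, "-c", "import sys; print('\\n'.join(sys.path))"],
        env=env, capture_output=True, text=True,
    )
    # Prints True when the folder made it onto the child's search path.
    print(folder in result.stdout.splitlines())
```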
Python makes use of a specific user-related directory for storing packages and modules that are installed by and for that user only (that is, not system-wide). The place for modules and packages is
$HOME/.local/lib/pythonx.y/site-packages/<module-or-package>
Where x.y denotes the relevant Python version, e.g. 3.9, and <module-or-package> is the .py file for a module, or a directory for a package. (Note that this means you can use multiple minor Python versions next to each other without them interfering, but you do have to install packages for each minor Python version separately.)
This directory is automatically picked up by Python when the relevant user uses Python; no need to involve PYTHONPATH.
pip also installs into this directory when you specify the --user flag, or when pip finds it can't install into the system directory, in which case it uses the $HOME/.local directory instead.
If you have installable packages, you can even use pip install . --user or similar to install them properly into $HOME/.local.
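You can ask the interpreter for these locations directly instead of constructing them by hand; a quick sketch:

```python
import site
import sys

# The per-user site-packages directory Python adds automatically,
# e.g. $HOME/.local/lib/python3.9/site-packages on Linux:
print(site.getusersitepackages())

# The x.y version that appears in that path:
print("python %d.%d" % sys.version_info[:2])
```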
I have a Python app running on windows that has imports for the following packages:
requests
json
psycopg2
I copied the entire project (I used PyCharm to write the app and import the packages) to a new machine and expected it would work. The new machine is also Windows, and I'm trying to run my script from the command line (i.e. no PyCharm on the new machine).
Instead, I get an error saying "ModuleNotFoundError: No module named 'requests'"
If I look at the project, I have the directories:
venv
Lib
site-packages
requests
What am I missing/doing wrong?
You have a couple of options here, but first the problem: you are exporting your code base to a new machine without the required modules installed on that machine and/or within your Python project's environment. After you have Python installed on your new machine, you need to be sure to point your PyCharm project to the proper environment.
File > Default Preferences > Project Interpreter
The window that appears on the right will contain a drop down menu labeled Project Interpreter. If you click on the drop down, it should reveal a list of the available Python environments on your machine.
Based on your description of your site-packages directory, I would assume you do not have your interpreter pointed at the proper environment on your new machine. With that said, you would be better served by creating a new virtual Python environment on your machine and installing each relevant dependency within that environment.
Take a look at this post here for your first best option on re-creating your old python environment on your new machine.
EDIT: I apologize for not reading the question more thoroughly before answering. If this is running on a Windows machine, you will need to double-check the environment path Python is using. It is very easy to install Python at a different PATH than the one the command-line environment is checking on a Windows box. If, for example, your PATH is pointing to a different version of Python and pip is installing packages somewhere else, this issue can occur. Double-check your system PATH for Python and which version the command line is running.
On the new machine you must source venv/bin/activate so your path environment variables are set properly. In particular, which python should say venv/bin/python rather than /usr/bin/python. Also, take care to conda env update or pip install -r requirements.txt so you'll have suitable venv libraries on the new machine.
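A quick in-Python counterpart to `which python`: for environments created with the stdlib venv module, sys.prefix points at the environment while sys.base_prefix still points at the base interpreter, so comparing the two tells you whether you are inside one. A minimal sketch:

```python
import sys

# True inside an activated venv, False when running the base interpreter.
in_venv = sys.prefix != sys.base_prefix
print("inside a virtual environment:", in_venv)
print("interpreter:", sys.executable)
```

(Environments made with the older virtualenv 1.x tool set sys.real_prefix instead, so older setups may need an extra check.)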
I have a virtualenv located at /home/user/virtualenvs/Environment. Now I need this environment on another PC, so I installed virtualenv-clone and used it to clone /Environment, then copied it to the other PC via USB. I can activate it with source activate, but when I try to start the Python interpreter with sudo ./Environment/bin/python I get
./bin/python: 1: ./bin/python: Syntax Error: "(" unexpected
Executing it without sudo gives me an error telling me that there is an error in the binary's format.
But how can this be? I just copied it. Or is there a better way to do this? I can not just use pip freeze because there are some packages in /Environment/lib/python2.7/site-packages/ which I wrote myself and I need to copy them, too. As I understand it pip freeze just creates a list of packages which pip then downloads and installs.
Do the following steps on the source machine:
workon [environment_name]
pip freeze > requirements.txt
copy requirements.txt to other PC
On the other PC:
create a virtual environment using mkvirtualenv [environment_name]
workon [environment_name]
pip install -r requirements.txt
You should be done.
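If pip is for some reason unavailable on the source machine, a rough equivalent of the `pip freeze` step can be produced from the standard library (Python 3.8+); a sketch of what ends up in requirements.txt:

```python
from importlib import metadata

# Enumerate installed distributions, pip-freeze style: name==version.
lines = sorted(
    "%s==%s" % (dist.metadata["Name"], dist.version)
    for dist in metadata.distributions()
)
print("\n".join(lines))
```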
Other Resources:
How to Copy/Clone a Virtual Environment from Server to Local Machine
Pip Freeze Not Applicable For You?
Scenario: you have libraries installed on your current system that are very hard to migrate using pip freeze (and I am talking really hard), because you have to download and install the wheels manually, such as gdal, fiona, and rasterio, and even then the project may still crash, possibly because they were installed in the wrong order or the dependencies were wrong, and so on.
This is the experience I had when I was brought on board a project.
For such a case, when you finally get the environment right, you basically don't want to go through the same hell again when you move your project to a new machine. Which I did, multiple times, until I finally found a solution.
Now, disclaimer before I move on:
I don't advocate for this method as the best, but it was the best for my case at the time.
I also cannot guarantee it will work when switching between different OSes, as I have only tried it between Windows machines. In fact, I don't expect it to work when you move from Windows to another OS, as the structure of the virtualenv folder on a Unix-based OS is different from that on Windows.
Finally, the best way to do all of this is to use Docker. My plan is to eventually do so. I have just never used Docker for a non-web-app project before and I needed a quick fix as my computer broke down and the project could not be delayed. I will update this thread when I can once I apply Docker to the project.
THE HACK
So this is what I did:
Install the same base Python on your new machine. If you have 3.9 on the old one, install 3.9 on the new one, and so on. Make a note of where the executable is located, usually something like C:\Users\User\Appdata\Local\Programs\Python\PythonXX
Compress your virtual env folder and copy it into the project directory on your new machine. Extract all files there.
Using a text editor of your choice, or preferably an IDE, use the 'Search in all files' feature to look for all occurrences of references to your old machine's paths: C:\Users\<your-old-username>
Replace these with your new references. In my case I had to do it in the following files inside the virtual env folder: pyvenv.cfg, Scripts/activate, Scripts/activate.bat, Scripts/activate.fish and Scripts/activate.nu.
And that's it!
Good luck everyone.
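The search-and-replace step above can also be scripted; a sketch under the assumption that only a handful of small text files need patching (OLD and NEW are placeholder usernames, not real paths):

```python
from pathlib import Path

OLD = r"C:\Users\old-username"   # hypothetical old-machine prefix
NEW = r"C:\Users\new-username"   # hypothetical new-machine prefix

def rewrite_paths(env_dir):
    """Replace old-machine path references in the copied venv's text files."""
    candidates = [
        "pyvenv.cfg",
        "Scripts/activate",
        "Scripts/activate.bat",
        "Scripts/activate.fish",
        "Scripts/activate.nu",
    ]
    for name in candidates:
        f = Path(env_dir, name)
        if f.exists():
            # Only small text files are patched; binaries are left alone.
            f.write_text(f.read_text().replace(OLD, NEW))
```

Usage would be `rewrite_paths("path/to/copied/env")` after extracting the folder on the new machine.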
I think what occurs is that you just copied the symbolic links in the source environment to the target machine as regular files (no longer links). You should copy it using rsync -l to keep those links.
Usually I use virtualenv to create a new environment, then I go to the environment I want to copy from, copy all the folders, and paste them into the environment folder I just created. Most importantly, when asked if you want to replace the destination files, choose to skip those files. This way you keep your settings.
At least for me, this has worked very well.
I hope it works for you too.
I'll share my experience.
Suppose the other PC does not have Python installed.
Python version: 3.7.3
Platform: Windows 10, 7 (64-bit)
The following is what worked for me.
Steps:
download Windows embeddable zip file
download get-pip.py (because the embeddable zip file does not provide pip)
[Optional] install tkinter, see this article: Python embeddable zip: install Tkinter
Choose a packaging method (I use NSIS: https://nsis.sourceforge.io/Download)
folder structure:
- main.nsi
- InstallData
- contains: Step1 & Step2
- YourSitePackages # i.e. packages you do not intend to publish to PyPI.
- LICENSE
- MANIFEST.in
- README.rst
- ...
- requirements.txt
- setup.py
The abbreviated content of main.nsi is as follows:
!define InstallDirPath "$PROGRAMFILES\ENV_PYTHON37_X64"
!define EnvScriptsPath "${InstallDirPath}\Scripts"
...
CreateDirectory "${InstallDirPath}" # Make sure the directory exists before the writing of Uninstaller. Otherwise, it may not write correctly!
SetOutPath "${InstallDirPath}"
SetOverwrite on
File /nonfatal /r "InstallData\*.*"
SetOutPath "${InstallDirPath}\temp"
SetOverwrite on
File /nonfatal /r "YourSitePackages\*.*"
nsExec::ExecToStack '"${InstallDirPath}\python.exe" "${InstallDirPath}\get-pip.py"' # install pip
nsExec::ExecToStack '"${InstallDirPath}\Scripts\pip.exe" install "${InstallDirPath}\temp\."' # install your library; same as `pip install .`
RMDir /r "${InstallDirPath}\temp" # remove source folder.
...
/*
Push ${EnvScriptsPath} # Be careful about the length of the HKLM Path; it is recommended to write to the HKCU Path instead, since the user path is unlikely to exceed the length limit
Call AddToPath # https://nsis.sourceforge.io/Path_Manipulation
*/
Hope someone will benefit from this.
It seems that python can be found in three different places in my Mac OS. See below. Is there anything wrong? Should I and how can I clean up my python installation without reinstalling the OS? In fact, I recently experience some strange behavior when using Python.
/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/
/usr/local/lib/python2.7/site-packages
/Library/Python/2.7/site-packages
What you need to do is update the paths in your .bashrc (or, more likely, .profile, as you are on a Mac). This should be accessible from your home directory as ~/.profile and can be edited using nano.
You can then tell your terminal which set of libraries and version of Python to use by adding the following. Note that more than one path may be added this way, so make sure only one bin directory contains an executable python program!
export PATH=$PATH:/usr/local/lib/python2.7/site-packages
If you want to add other libraries / execute your own programs as if they were in the library or save yourself reinstalling everything, you can use the following:
export PYTHONPATH=/Library/Python/2.7/site-packages
Finally, if you want to run a script / preload libraries every time you open Python, you can make a .pythonstartup file in your home directory as well:
export PYTHONSTARTUP=$HOME/.pythonstartup
As for cleaning up: most distributions tend to update your paths when they are installed, which is most likely what is causing your problems. So all you probably have to do is look at your .profile file and remove two of the three paths above.
Hope that helps!
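Rather than inferring which Python you are running from .profile alone, you can ask the interpreter itself; a short sketch:

```python
import sys
import sysconfig

# Which binary this session is actually running:
print("executable:   ", sys.executable)

# Where this interpreter installs pure-Python packages:
print("site-packages:", sysconfig.get_paths()["purelib"])
```

Running this under each python on your PATH makes it clear which of the three locations belongs to which installation.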
I am currently writing a command line application in Python, which needs to be made available to end users in such a way that it is very easy to download and run. For those on Windows, who may not have Python (2.7) installed, I intend to use PyInstaller to generate a self-contained Windows executable. Users will then be able to simply download "myapp.exe" and run myapp.exe [ARGUMENTS].
I would also like to provide a (smaller) download for users (on various platforms) who already have Python installed. One option is to put all of my code into a single .py file, "myapp.py" (beginning with #! /usr/bin/env python), and make this available. This could be downloaded, then run using myapp.py [ARGUMENTS] or python myapp.py [ARGUMENTS]. However, restricting my application to a single .py file has several downsides, including limiting my ability to organize the code and making it difficult to use third-party dependencies.
Instead I would like to distribute the contents of several files of my own code, plus some (pure Python) dependencies. Are there any tools which can package all of this into a single file, which can easily be downloaded and run using an existing Python installation?
Edit: Note that I need these applications to be easy for end users to run. They are not likely to have pip installed, nor anything else which is outside the Python core. Using PyInstaller, I can generate a file which these users can download from the web and run with one command (or, if there are no arguments, simply by double-clicking). Is there a way to achieve this ease-of-use without using PyInstaller (i.e. without redundantly bundling the Python runtime)?
I don't like the single file idea because it becomes a maintenance burden. I would explore an approach like the one below.
I've become a big fan of Python's virtual environments because it allows you to silo your application dependencies from the OS's installation. Imagine a scenario where the application you are currently looking to distribute uses a Python package requests v1.0. Some time later you create another application you want to distribute that uses requests v2.3. You may end up with version conflicts on a system where you want to install both applications side-by-side. Virtual environments solve this problem as each application would have its own package location.
Creating a virtual environment is easy. Once you have virtualenv installed, it's simply a matter of running, for example, virtualenv /opt/application/env. Now you have an isolated python environment for your application. Additionally, virtual environments are very easy to clean up, simply remove the env directory and you're done.
You'll need a setup.py file to install your application into the environment. Say your application uses requests v2.3.0, your custom code is in a package called acme, and your script is called phone_home. Your directory structure looks like this:
acme/
__init__.py
models.py
actions.py
scripts/
phone_home
setup.py
The setup.py would look something like this:
from setuptools import setup  # setuptools, not distutils: install_requires is a setuptools feature
install_requires = [
'requests==2.3.0',
]
setup(name='phone_home',
version='0.0.1',
description='Sample application to phone home',
author='John Doe',
author_email='john@doe.com',
packages=['acme'],
scripts=['scripts/phone_home'],
url='http://acme.com/phone_home',
install_requires=install_requires,
)
You can now make a tarball out of your project and host it however you wish (your own web server, S3, etc.):
tar cvzf phone_home-0.0.1.tar.gz .
Finally, you can use pip to install your package into the virtual environment you created:
/opt/application/env/bin/pip install http://acme.com/phone_home-0.0.1.tar.gz
You can then run phone_home with:
/opt/application/env/bin/phone_home
Or create a symlink in /usr/local/bin so the script can be called simply as phone_home:
ln -s /opt/application/env/bin/phone_home /usr/local/bin/phone_home
All of the steps above can be put in a shell script, which would make the process a single-command install.
And with slight modification this approach works really well for development environments; i.e. using pip to install / reference your development directory: pip install -e . where . refers to the current directory and you should be in your project directory alongside setup.py.
Hope this helps!
You could use pip as suggested in the comments. You need to create a MANIFEST.in and setup.py in your project to make it installable. You can also add modules as prerequisites. More info can be found in this question (not specific to Django):
How do I package a python application to make it pip-installable?
This will make your module available in Python. You can then have users run a file that runs your module, either with python path/run.py, ./path/run.py (with +x permission), or python -c "some code here" (e.g. for an alias).
You can even have users install from a public git repository, like this:
pip install git+https://bitbucket.org/yourname/projectname.git
...in which case they also need git.