I'm working on my master's thesis on a university server, so I have an account and can log in and do everything I need as long as I stay inside /home/myname/.
I'm developing some Python scripts, and now I want to integrate Python with the octave module, which is not currently installed on the system; of course, I cannot do anything with sudo apt-get install.
How can I overcome this problem without asking my teacher?
Thank you all,
Fabio
Please don't copy Python and pip around. You should use a virtualenv to install project-specific packages. This is particularly useful in your use case, where you can't install things at the system level. Even if you could, virtualenvs are recommended so the dependencies of each project stay isolated.
Here is a quick primer that should get you going.
Create the virtualenv
virtualenv ~/project/env
Activate the virtualenv
source ~/project/env/bin/activate
This will modify your bash prompt by placing the name of your virtualenv in parentheses to indicate that your virtualenv is activated.
(env) hostname:current_folder user$
Install Packages into the virtualenv
pip install -r requirements.txt
Use the virtualenv
python script.py
Use virtualenv by default in a script
script.py
#!/home/myname/project/env/bin/python
print('hello world!')
(The shebang must be an absolute path; ~ is not expanded on a shebang line.)
Then from the command line
chmod ugo+x script.py
./script.py
hello world!
Deactivate the virtualenv
deactivate
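Back to the original question: if the "octave module" you need is the oct2py bridge (an assumption on my part; substitute whatever package you actually mean), you can pip install oct2py inside the activated virtualenv. Note that oct2py only wraps an existing GNU Octave installation, so the octave binary must already be present on the server. A minimal sketch:
# octave_demo.py - minimal sketch, assuming the package you want is
# oct2py and that the `octave` executable already exists on the server.
from oct2py import Oct2Py

oc = Oct2Py()
oc.eval("x = magic(3)")   # run raw Octave code
x = oc.pull("x")          # fetch the result as a NumPy array
print(x)
print(oc.sqrt(2.0))       # call Octave functions directly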
Make yourself a local copy of Python and pip; then you can install whatever modules you want without having to get a sysadmin to help you.
There are some good instructions here
Go here to get the link to the version of Python you need and substitute it in the instructions above.
In your .bashrc add alias and path to your local copy - you may need to modify this for your own situation:
alias python="~/bin/python"
PATH=~/.local/bin:~/bin:$PATH
For the PATH: when you install local copies of modules through pip, they go to ~/.local by default; change this if you prefer.
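To confirm that the alias and PATH changes took effect, you can ask Python itself which interpreter is running; a quick sketch:
# check_interpreter.py - prints the interpreter actually being used; it
# should point at ~/bin/python if the alias/PATH changes took effect.
import sys
print(sys.executable)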
Begin your scripts with:
#!/usr/bin/env python
so they use your preferred Python version.
Related
I'm a newbie to Python (3), but not to programming in general.
I'd like to distribute a git repo with myprogram consisting of these files:
requirements.txt
myprogram.py
lib/modulea.py
lib/moduleb.py
My question is: What is the best-practice and least surprising way to let users run myprogram.py using the dependencies in requirements.txt? So that after git clone, and some idiomatic installation command(s), ./myprogram.py or /some/path/to/myprogram.py "just works" without having to first set magical venv or python3 environment variables?
I want to be able to run it using the #! shebang so that /path/to/myprogram.py and double-clicking it from the file manager GUI does the correct thing.
I already know I can create a wrapper.sh or make a clever shebang line. But I'm looking for the best-practice approach, since I'm new to python.
More details
I'm guessing that users would
git clone $url workdir
cd workdir
python3 -m venv .
./bin/pip install -r requirements.txt
And from now on this uses the modules from requirements.txt:
./myprogram.py
If I knew that the project directory was always /home/peter/workdir, I could start the myprogram.py with:
#!/home/peter/workdir/bin/python3
but I'd like to avoid hard-coding the project directory in myprogram.py.
This also seems to work in my tiny demo; it's clearly brittle and not best-practice, but it illustrates what I'm trying to do:
#!/usr/bin/env python3
import os
import sys
# Prepend the venv's site-packages (located relative to this script) to the import path.
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'lib', 'python3.10', 'site-packages'))
I'm sure I could come up with some home-grown shebang line that works, but what is the idiomatic way to do this in python3?
Again: after pip install, I absolutely refuse to have to set any environment variables or call any setup code in future shells before running myprogram.py. (Unless that strongly conflicts with "idiomatic", which I hope isn't the case)...
Expanding @sinoroc's comment into an answer:
I've looked at https://packaging.python.org/en/latest/tutorials/packaging-projects/ and also at "entrypoints", and this is the smallest example I can think of. Create an empty directory with these two files:
pyproject.toml:
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "example_module_pmorch"
version = "0.0.1"
[project.scripts]
runme = "example_module_pmorch:cli_main"
src/example_module_pmorch/__init__.py:
def cli_main():
print("I'm the entrypoint")
Now if I run this:
$ python3 -m venv .
# Adding -e during development is optional
$ ./bin/pip install .
Then ./bin/runme does the right thing and prints I'm the entrypoint.
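The reason this solves the shebang problem: pip generates ./bin/runme as a small wrapper script whose shebang is the absolute path of the venv's own interpreter. Roughly (a simplified sketch; the real generated file handles argv a bit differently), it looks like:
#!/absolute/path/to/venv/bin/python3
# Simplified sketch of the console script pip generates for the
# [project.scripts] entry point; details of the real file differ.
import sys
from example_module_pmorch import cli_main

if __name__ == '__main__':
    sys.exit(cli_main())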
I do not see why you would need to hardcode anything. From your last snippet, it looks like you are forcing the Python import system to include the target directory of the virtual environment you first create.
Based on your explanation, it seems you are using venv as your virtual environment manager. As long as your users install the dependencies into the virtual environment and activate it before running the script, the dependencies will be available to your module/script.
The line you share, ./bin/pip install -r requirements.txt, manually uses the package manager of the virtual environment you created with python3 -m venv .. Instead, you would want your user to create the environment (python3 -m venv example-env), activate it (source example-env/bin/activate), and then run the pip install command: python3 -m pip install -r requirements.txt.
The user of the package has to make sure that the environment is active before running the script.
I am trying to install a Python package with this command
pip install <name of package>
I'm getting permission errors and I'm not sure why. I could run it with sudo, but someone told me that was a bad idea, and I should use a virtualenv instead.
What is a virtualenv? What does it do for me?
Running with the system Python and libraries limits you to one specific Python version, chosen by your OS provider. Trying to run all Python applications on one Python installation makes it likely that version conflicts will occur among the collection of libraries. It's also possible that changes to the system Python will break other OS features that depend on it.
Virtual environments, or "virtualenvs", are lightweight, self-contained Python installations, designed to be set up with a minimum of fuss and to "just work" without requiring extensive configuration or specialized knowledge.
virtualenv avoids the need to install Python packages globally. When a virtualenv is active, pip will install packages within the environment, which does not affect the base Python installation in any way.
In Python 3.3 or later, you can create a virtualenv as follows:
$ python3 -m venv ENV_DIR
For Windows, you should replace python3 with the full path to python.exe:
>C:\Python34\python.exe -m venv ENV_DIR
(This is a typical Python installation; your system may vary.)
In older versions of Python, including Python 2, one of the following commands should work in most cases:
$ virtualenv ENV_DIR
$ venv ENV_DIR
$ pyvenv ENV_DIR
$ pyvenv3 ENV_DIR
ENV_DIR should be a non-existent directory. The directory can have any name, but to keep these instructions simple, I will assume you have created your virtualenv in a directory called venv (e.g. with python3 -m venv ./venv).
To work in your virtualenv, you activate it:
$ . ./venv/bin/activate
(venv)$
Or use this if you have a Windows system:
$ venv\Scripts\activate
The (venv) in the shell prompt lets you know which virtualenv you have activated, but you can turn this feature off if you do not like it. You can run all the usual Python commands, and they will be local to your virtualenv:
(venv)$ pip install requests numpy
[...]
(venv)$ python
[...]
>>> import requests
>>> import numpy as np
>>>
python will run the version of Python that you installed into your virtualenv, so (for example) you don't have to type python3 to get Python 3. The Python that it runs will have access to all the standard library modules and all the packages you installed into the virtualenv, but (by default) none of the packages installed in the system-wide site-packages directory.
This last rule is important: by restricting your virtualenv to only use locally-installed packages, you can ensure that you control exactly which dependencies your project is using, even if some new system-wide package gets installed or updated next week. If you like, you can get a listing of your installed packages:
(venv)$ pip freeze
requests==2.13.0
numpy==1.12.0
(venv)$
pip can also parse this format and install from it, and it will install the same versions, even if updates have been released in the meantime:
(venv)$ pip freeze >requirements.txt
(some-other-venv)$ pip install -r requirements.txt
[...]
(some-other-venv)$ python
>>> import requests
>>> import numpy as np
>>>
You can get out of the virtualenv by deactivating it:
(venv)$ deactivate
$ python
[...]
>>> import requests
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named 'requests'
You can create as many virtualenvs as you like, and they won't interfere with each other, nor with your system packages. A virtualenv is "just" a directory with a bunch of binaries and scripts under it, so you can remove a virtualenv the same way you remove any directory (rm -r venv on Unix). If the virtualenv is activated when you remove it, you may confuse your shell, so it's probably a good idea to deactivate first in that case.
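If you ever want to check whether the interpreter you are running lives inside a virtualenv, you can compare sys.prefix with sys.base_prefix (this works for venv-style environments on Python 3.3+; the older virtualenv tool sets sys.real_prefix instead). A quick sketch:
# venv_check.py - detect whether this interpreter runs inside a venv.
import sys

# Inside an active venv, sys.prefix points into the venv directory,
# while sys.base_prefix still points at the base installation.
print("prefix:     ", sys.prefix)
print("base prefix:", sys.base_prefix)
print("inside a virtualenv" if sys.prefix != sys.base_prefix else "base Python")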
Sometimes you are not given root privileges and end up unable to use sudo. At other times, it's not advisable to use sudo to install packages, as doing so might overwrite a package that other applications depend on.
Virtualenv helps you create a separate environment where you don't need root privileges and can tailor the environment to your needs. It consists of a self-contained Python installation that only interacts with the environment you created.
So basically, it gives you a bit of freedom while avoiding damage to (or modification of) the system environment, which may be hosting many older applications.
Installation is pretty easy too.
Installing packages with sudo pip will install packages globally, which may break some system tools.
Installing globally means your packages go to a location like /usr/lib/python2.7/site-packages, so if some application needs a previous version of one of your Python packages, the installation may break it.
virtualenv allows you to avoid installing Python packages globally by making an isolated Python environment. That means it will install packages just in your desired project folder.
On macOS and Linux
Install
python3 -m pip install --user virtualenv
Creating a Virtual Env: Go to your desired project folder
python3 -m virtualenv env
Activating a virtualenv: In your desired project folder
source env/bin/activate
After activating you can install your packages using pip.
For more information about using it in Windows:
How to use virtualenv in Windows
I am going to break your question into two parts.
What is a virtualenv?
Python has its own way of downloading, storing, and resolving site packages, but it cannot differentiate between different versions of the same package in the site-packages directory. Packages are installed into one of the directories whose names can be found by running the site.getsitepackages() command.
>>> import site
>>> site.getsitepackages()
This means package_v2.0.1 and package_v3.0.1 have to be in the same directory with the same name i.e. package, which is obviously not possible. Now, you may ask why we would need the same package with different versions on our system. This is because multiple projects may require different versions of Python packages or even different Python versions themselves. So there was a need to have something which will resolve these conflicts and Virtualenv came into the picture to solve this issue.
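You can see which single version of a package an environment exposes with importlib.metadata (Python 3.8+; the package name here is just an example):
# which_version.py - only one installed version of a package is ever
# importable in a given environment; this shows which one you got.
import importlib.metadata
print(importlib.metadata.version("requests"))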
What does it do for me?
It isolates the environment for Python projects so that each project can have its own dependencies.
I'm on macOS Catalina and I'm trying to install and run Mephisto (see https://github.com/facebookresearch/mephisto/blob/master/docs/quickstart.md). I created a python3 virtual environment and then went to the directory and ran
sudo pip3 install -e .
This seems to have run fine, as I can now run mephisto and see the list of commands and options. However, when I run mephisto register mturk, it throws No module named 'mephisto.core.argparse_parser' because of an import statement in the Python file. This seems like a general issue of a module installing but not importing properly; I would appreciate help in how to fix it. Is it because my $PYTHONPATH is currently empty?
Mephisto Lead here! This seems to have been a case of unfortunate timing, as we were in the midst of a refactor there and some code got pushed to master that should've received more scrutiny. We'll be moving to stable releases via PyPI in the near future to prevent things like this!
I created a python3 virtual environment and then went to the directory and ran
sudo pip3 install -e .
You should not have used sudo to install this library if you meant to install it in a virtual environment. By using sudo, the library probably got installed in the global environment (not in the virtual environment).
Typically:
create a virtual environment:
python3 -m venv path/to/venv
install tools and libraries in this environment with:
path/to/venv/bin/python -m pip install Mephisto
use python in the virtual environment:
path/to/venv/bin/python -c 'import mephisto'
use a tool in the virtual environment:
path/to/venv/bin/mephisto
Is it because my $PYTHONPATH is currently empty?
Forget PYTHONPATH. Basically, one should never have to modify this environment variable (advice to get PYTHONPATH involved is almost always ill-informed).
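If you want to see where imports will actually be resolved from, instead of reaching for PYTHONPATH, you can inspect the interpreter's search path directly:
# show_path.py - the directories Python searches for imports, in order.
import sys
for entry in sys.path:
    print(entry)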
Check that an __init__.py file is present in the module's directory. If not, try creating an empty one.
I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working in my local virtualenv, where I installed all the required libraries. What I want to do now is run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all these libraries in a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. Instead, you export the list of all installed packages:
pip freeze > requirements.txt
Then push the requirements.txt file to wherever you want to deploy the code, and just do what you did on the dev machine:
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact version.
You can also look into Fabric to automate this task, with a function like this (Fabric 1.x API):
# fabfile.py
from fabric.api import cd, env, prefix, run

def pip_install():
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
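You would then run this from the project directory with fab pip_install, after setting env.path and your hosts in the fabfile. Treat this as a sketch for Fabric 1.x; Fabric 2+ changed the API considerably.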
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning: Copying virtual environments is not well supported. Each virtualenv has path information hard-coded into it, and there may be cases where the copy code does not know it needs to update a particular file. Use with caution.
If it is going to live at the same path, you can tar it and extract it on another machine. If all the same dependencies, libraries, etc. are available on the target machine, it will work.