Install Python libraries using a shell script - python

I'm very new to Python, sorry if my question is very basic. I have a shell script that I use to run a .py file on a cluster. Here is my shell script:
#!/bin/bash
module add python/2.6
python Myfile.py
Python has been installed on the cluster, but some of the libraries and packages need to be installed. For example, I need to install the NumPy package. Is there any way I can do that inside my shell script or my .py file before importing it?
Thanks

For this (and similar) use case, I would recommend a combination of pip and virtualenv.
You would install pip into your system Python install (e.g. sudo apt-get install python-pip), and then install virtualenv via pip (i.e. pip install virtualenv).
You can then create a specific virtualenv for this project. This represents a sandboxed environment with specific versions of libraries, traditionally specified through a requirements file (via pip's -r option) but also specifiable individually on the command line.
You would do this via a command like virtualenv venv_test, which will create a virtualenv directory named venv_test in the current directory. You can then run pip from that virtualenv's bin dir to install packages.
For example, to install the flask package in that virtualenv, you would run:
venv_test/bin/pip install flask
You can then either run source venv_test/bin/activate to put the current shell into the virtualenv, or invoke a script directly with the virtualenv's interpreter, i.e.:
venv_test/bin/python foo.py
Here's a link to a virtualenv introduction for some additional details/steps.
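Putting it together for the original cluster question, a minimal sketch of the shell script might look like this (assuming the virtualenv command is available on the cluster; the venv_test path and the choice of numpy are just illustrations):
#!/bin/bash
module add python/2.6
# Create the sandboxed environment once, if it does not exist yet
if [ ! -d venv_test ]; then
    virtualenv venv_test
fi
# Install the packages the script needs (here numpy) into the sandbox
venv_test/bin/pip install numpy
# Run the script with the sandboxed interpreter, so "import numpy" works
venv_test/bin/python Myfile.py
After the first run, the virtualenv is reused, so subsequent runs only pay the cost of the pip check.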

Related

Create a fresh Python venv while the local environment has some packages installed [duplicate]

I am very new at command line usage. I am using Python 3.7.2, Bash and the VSCode Integrated Terminal. I am trying to create a virtual environment using venv, following the Python documentation:
https://docs.python.org/3/tutorial/venv.html#creating-virtual-environments
The command to use is this one:
$ python3 -m venv test-env
and I get:
bash: python3: command not found
Later I found a similar answer in a Stack Overflow post:
How to create and activate virtual environment in windows 10 using bash command
And I use the command:
py -m virtualenv test-env
and I get this:
No module named virtualenv
I am very new to using the command line, so I don't really know what is going on or how to work around it.
Hi, I can see that you are using two different tools to create your environment.
Those are venv and virtualenv.
venv is a module that already comes with your Python installation.
virtualenv is an external one.
I had the same problem before and the solution is very simple.
I recommend you stick with venv because it works pretty well and you don't need the extra job of installing external libraries.
As for solving your problem: the Bash shell is telling you that the command python3 has not been found.
So try instead just:
python -m venv test-env
Sometimes the Python documentation is not accurate enough, and I know that when you start using commands, accuracy in the syntax is extremely important.
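If you are unsure which launcher your shell actually has, a small sketch like this (plain Bash; command -v is POSIX) can pick whichever of python3, python, or py exists and use it to create the venv:
for cmd in python3 python py; do
    if command -v "$cmd" >/dev/null 2>&1; then
        # Use the first launcher that exists to create the environment
        "$cmd" -m venv test-env && break
    fi
done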
Try these steps, they'll help you:
First, make a directory:
mkdir testing
Then, move into the directory named testing:
cd testing
When you type the following command in this directory:
python3 -m venv env (OR, python -m venv env)
you may get an error like:
The virtual environment was not created successfully because ensurepip is not
available. On Debian/Ubuntu systems, you need to install the python3-venv
package using the following command.
apt install python3.8-venv
Type the following command, but before that keep an eye on the version of Python installed on your machine; in my case it's python3.8:
sudo apt install python3.8-venv
Now we can create a virtual environment and store its tools in the "bhandari" folder.
python3 -m venv bhandari
Note: you can name this "bhandari" folder anything you like (standard practice is to name it "env"...).
Now, to activate your virtual environment, type the following command from the directory that contains your folder; this will activate the virtual environment in the "bhandari" folder:
source bhandari/bin/activate
If you have successfully activated your virtual environment, you should see (bhandari) in your prompt, indicating that you are working in a virtual environment.
After this, anything we install will be isolated from the rest of the system....
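For example, once the environment is active, installs stay inside the sandbox (requests here is just an arbitrary example package, not something the steps above require):
(bhandari)$ pip install requests
(bhandari)$ python -c "import requests; print(requests.__version__)"
(bhandari)$ deactivate
deactivate returns you to the system shell, leaving the packages in the "bhandari" folder untouched.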

Is there any way to package Python code so that another machine does not need to install all the dependencies using pip

I have installed Python on my system and wrote a simple script that fetches Jenkins data via its REST API (GET requests).
I have installed all the required modules using pip. Now I want to package this script with all its dependencies and run it on another machine. However, on the other machine, I don't want to perform all the pip installation steps.
I know we can list all the modules in requirements.txt and use pip install -r requirements.txt. But is there any way to avoid installing each dependency with pip, so that I only need to install Python and all the other dependencies come along when I run the zip file?
You can install pip dependencies to a certain directory using -t (target).
pip install -r requirements.txt -t .
That will install your pip modules into the current directory. You can then zip the whole thing and deploy it. Make sure that the environment you install the dependencies in matches your intended deployment environment. For consistency you can run the command in a Docker container, for example.
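If the target machine has Python 3.5+, the standard-library zipapp module can turn such a directory into a single runnable archive. A sketch under that assumption (myscript.py and its main function are placeholders for your actual script; note that C-extension dependencies generally cannot be imported from inside a zip):
mkdir build
pip install -r requirements.txt -t build
cp myscript.py build/
# Bundle the directory into one archive; "myscript:main" names the entry point
python -m zipapp build -m "myscript:main" -o myscript.pyz
# On the other machine, only a Python interpreter is needed:
python myscript.pyz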
I think you should use the virtualenv module, which makes your project easy to deploy.
A virtual environment should be used whenever you work on any Python based project. It is generally good to have one new virtual environment for every Python based project you work on, so the dependencies of every project are isolated from the system and from each other.
I came across a link which can help: Virtual Env explained

Adding an installed pip package to PATH automatically

For my package, foo, I'm using the following setup.py:
from setuptools import setup

setup(name='foo',
      version='0.0.1',
      description='Lol',
      url='https://github.com/foo/foo',
      author='legend',
      author_email='lol@gmail.com',
      license='GPLv3',
      packages=['foo'],
      install_requires=['bar'],
      entry_points={'console_scripts': ['foo = foo:main']},
      keywords=['foo'],
      zip_safe=False)
When testing on my Arch system, it added the script to PATH automatically, so I could just run foo on the command line and it'd run the function main() automatically. Then I booted up a VM and tested it on Windows 7. Pip installed the package just fine, but it wasn't on my PATH!
Help?
setuptools, pip and easy_install don't modify the system PATH variable. The <python directory>\Scripts directory, where all of them install the script by default, is normally added to PATH by the Python installer during installation.
If the scripts folder was not added to your PATH during installation, you can fix that by running <python directory>\Tools\scripts\win_add2path.py. (See How can I find where Python is installed on Windows?)
The above sample setup.py file worked fine for me (with the Scripts directory in PATH), by the way. I tested it with
python setup.py bdist_wheel
pip install dist\foo-0.0.1-py3-none-any.whl
and
python setup.py sdist
pip install dist\foo-0.0.1.zip
Do not expect pip or easy_install to modify your PATH; their task is to install a package into the current environment.
On Linux, if you use the global Python environment, you are likely to need root privileges, so you typically do:
$ sudo pip install <package>
However, this is not the recommended method, as it spoils the system-wide Python environment (imagine two applications with slightly different requirements for the same package version and you can see the problem).
The recommended method is to use some sort of virtualenv, which allows installing Python packages into a separate Python environment that is also easy to remove and recreate.
How I install Python based scripts into the system
It seems like you have a custom Python based script which you want to use on your system.
For this scenario I use the following method (assuming the virtualenv tool is installed into the system-wide Python):
$ mkdir ~/apps
$ mkdir ~/apps/myutil
$ cd ~/apps/myutil
$ virtualenv .env
$ source .env/bin/activate
(.env)$ pip install <package-or-more>
Now the ~/apps/myutil/.env/bin directory will contain all the scripts installed by pip; let us call one of them myscript (there can be more).
The remaining step is to make a symlink from some directory which is already on PATH, e.g. /usr/local/bin:
$ cd /usr/local/bin
$ sudo ln -s ~/apps/myutil/.env/bin/myscript
From now on, you will be able to call the command myscript even without the virtualenv being activated.
Updating the script
If you need to install a later version of the script:
$ cd ~/apps/myutil
$ source .env/bin/activate
(.env)$ pip install --upgrade <package-or-more>
As you have the script symlinked, the latest version will automatically be available.
Naming with virtualenvwrapper
virtualenvwrapper allows you to create multiple named virtualenvs and gives you easy activation and deactivation. In such a case I do the following:
$ mkvirtualenv bin-myscript
(bin-myscript)$ pip install <package-or-more>
(bin-myscript)$ which myscript
~/.Envs/bin-myscript/bin/myscript
(bin-myscript)$ cd /usr/local/bin
(bin-myscript)$ sudo ln -s ~/.Envs/bin-myscript/bin/myscript
Upgrade is even simpler:
$ workon bin-myscript
(bin-myscript)$ pip install --upgrade <package-or-two>
and you are done.
Alternative with tox
tox is a great tool for the automated creation of virtualenvs and for testing. I use it for creating virtualenvs in directories I like. For more information see my other SO answer.

Installing python packages in a virtual environment via bash script and pip

To clarify, I have read these questions already:
How to source virtualenv activate in a Bash script
How to activate python virtual environment by shell script
Activating virtualenv in Bash script not working
Bash: How _best_ to include other scripts?
My goal is to create a script that will automate the process of installing several (roughly 27) Python packages via pip in a virtual environment, using (preferably) a bash script.
So far in bash scripts I have tried:
source env/bin/activate
pip install numpy Scipy ez_setup
As well as
activate() {
    . ../.env/bin/activate
}
activate
pip install numpy Scipy ez_setup
Neither of these works. What is the best way to approach this problem, given that it must be executable as part of a larger bash (or Python) script?
Update: So I figured out my problem. The solution for me was to force the script to pull from the correct pip directory with
./env/bin/pip install numpy Scipy ez_setup
The second part was to break the install instructions down to one package per line, because otherwise the registries were not being updated properly.
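Combining both fixes, a sketch of the automation script (assuming the virtualenv lives in env/ next to the script; extend the list to your roughly 27 packages):
#!/bin/bash
# Create the virtualenv if it does not exist yet
if [ ! -d env ]; then
    virtualenv env
fi
# Call the virtualenv's own pip directly, one package per line,
# so each install is registered before the next one starts
env/bin/pip install numpy
env/bin/pip install scipy
env/bin/pip install ez_setup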

How can I make pip install my package to PATH?

I have followed the pip tutorial at http://peterdowns.com/posts/first-time-with-pypi.html. But when I run pip install mypackage, it installs the source code into site-packages under my Python folder.
How can I make it install onto PATH, so I can run $ mypackage?
That is, I would like to be able to use my Python package as a regular binary application. Right now I can cd into the folder under site-packages, run chmod +x mypackage.py, then ./mypackage.py to run it. But I would like to be able to run it from any directory.
Ideally you do it by defining console scripts in your package configuration. Then you install the package with pip just as you have done before (or better, in a virtualenv), and during the installation a link to the console script you have configured is created in the bin directory. See e.g. http://calvinx.com/2012/09/09/python-packaging-define-an-entry-point-for-console-commands/
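As a concrete sketch (assuming a setup.py with an entry_points declaration like the sample shown earlier on this page, run from the package's source directory):
$ python -m venv venv
$ venv/bin/pip install .
# The console script from entry_points is now in the venv's bin directory:
$ venv/bin/foo
The install step writes a foo launcher into the virtualenv's bin directory, which is exactly the link described above.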
