How can I make pip install my package to PATH?

I have followed the pip tutorial at http://peterdowns.com/posts/first-time-with-pypi.html. But when I run pip install mypackage, it installs the source code into site-packages under my Python folder.
How can I make it install onto my PATH, so that I can run $ mypackage?
That is, I would like to be able to use my Python package as a regular command-line application. Right now I can cd into the folder under site-packages, run chmod +x mypackage.py, and then ./mypackage.py to run it. But I would like to be able to run it from any directory.

Ideally you do it by defining console scripts in your package configuration. You then install the package with pip just as you have done before (or, better, in a virtualenv), and during the installation a wrapper script is created in the bin directory for the console script you have configured. See e.g. http://calvinx.com/2012/09/09/python-packaging-define-an-entry-point-for-console-commands/
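A minimal sketch of what that configuration might look like in setup.py (the package name mypackage and its main() function are assumptions taken from the question, not from the linked post):

from setuptools import setup

setup(
    name='mypackage',
    version='0.1.0',
    packages=['mypackage'],
    entry_points={
        'console_scripts': [
            # running "mypackage" on the command line calls mypackage/__init__.py's main()
            'mypackage = mypackage:main',
        ],
    },
)

After pip install ., setuptools generates a small wrapper script named mypackage in the environment's bin (or Scripts on Windows) directory; as long as that directory is on your PATH, you can run mypackage from anywhere.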

Related

Python install on a machine with no internet but with a basic Python interpreter

Following is my question/problem:
I have installed Python on two machines: machine A with internet access and machine B without.
I need to install packages (for example the Pillow package) on machine B.
I tried pip download pillow on machine A, into a folder.
It created a wheel file which does not work on machine B (some packages download as a zip which can be installed on machine B, but not the ones that come as a wheel file).
I am also trying the virtualenv route.
On machine A I am doing:
C:\pro1> myenv\scripts\activate
(myenv) C:\vi\pro1> pip install pillow
Then I take the whole folder to machine B.
I assumed it would work because the package is in the virtualenv folder, but it doesn't. :(
How can I make the Pillow package work on the offline machine?
Thank you.
Try to follow this. Obviously you must replace the packages with the ones you need.
virtualenv my-new-virtual-env
cd my-new-virtual-env
Activate the environment using the commands shown above.
For convenience, create a Wheelhouse/Tarhouse folder in the root of the environment, into which we will download all our packages.
(Windows) mkdir Wheelhouse and afterwards cd Wheelhouse
(Linux) mkdir Tarhouse and afterwards cd Tarhouse
pip download virtualenv django numpy
pip freeze > requirements.txt
Take the downloaded .whl/.tar files to the offline station.
Make sure you are in the root folder of the virtual environment and issue the following command in your command line:
pip install -r requirements.txt --find-links=(Wheelhouse or Tarhouse)
A consolidated sketch of the whole online/offline sequence is shown after these steps.
I just wanted to add that you can also issue the following command (if you want to install a single package at a time):
pip install Wheelhouse/some-package-file.whl (or .tar on Linux)
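As a rough end-to-end sketch (the Wheelhouse folder name follows the steps above; requirements.txt is assumed to list the packages you need; the --no-index flag is an assumption that keeps pip from trying to reach the internet on the offline machine):
On machine A (online):
pip download -r requirements.txt -d Wheelhouse
On machine B (offline), after copying the folder over:
pip install --no-index --find-links=Wheelhouse -r requirements.txt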

Installing Python dependencies locally in a project

I am coming from Node.js, am learning Python, and was wondering how to properly install the packages listed in a requirements.txt file locally in the project.
For Node, this is done by managing and installing the packages in package.json via npm install. However, the convention for a Python project seems to be to add packages to a directory called lib. When I do pip install -r requirements.txt, I think this does a global install on my computer, similar to Node's npm install -g global install. How can I install the dependencies from my requirements.txt file in a folder called lib?
Use this command:
pip install -r requirements.txt -t <path-to-the-lib-directory>
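Note that Python will not look in such a lib directory by itself. A minimal sketch, assuming the folder is called lib and sits next to your entry-point script (the requests import is just a hypothetical dependency for illustration):

import os
import sys

# Make the locally installed dependencies importable before anything tries to import them.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), 'lib'))

import requests  # hypothetical dependency installed into lib/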
If you're looking to install dependencies in a special (non-standard) local folder for a specific purpose (e.g. AWS Lambda), see this question: install python package at current directory.
For normal workflows, the following is the way to install dependencies locally (instead of globally; the equivalent of npm i instead of npm i -g in Node):
The recommended way to do this is by using a virtual environment. You can install virtualenv via pip with
pip install virtualenv
Then create a virtual environment in your project directory:
python3 -m venv env # previously: `virtualenv env`
This will create a directory called env (you can call it anything you like) which mirrors your global Python installation. Inside env/ there is a lib directory which holds the environment's site-packages and will store your dependencies.
Then activate the environment with:
source env/bin/activate
Then install your dependencies with pip and they will be installed in the virtual environment env/:
pip install -r requirements.txt
Then any time you return to the project, run source env/bin/activate again so that the dependencies can be found.
When you deploy your program, if the deployed environment is a physical server, or a virtual machine, you can follow the same process on the production machine. If the deployment environment is one of a few serverless environments (e.g. GCP App Engine), supplying a requirements.txt file will be sufficient. For some other serverless environments (e.g. AWS Lambda) the dependencies will need to be included in the root directory of the project. In that case, you should use pip install -r requirements.txt -t ./.
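A quick way to confirm the environment is actually active before installing or running anything (the printed paths are illustrative):
which python                                # should point at .../your-project/env/bin/python
python -c "import sys; print(sys.prefix)"   # should print the path of the env/ directory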
I would suggest getting Anaconda Navigator.
You can download it here: https://www.anaconda.com
Anaconda allows you to create virtual environments through a graphical interface, and you can install any pip package that is available through Anaconda.
Then, after you have created and populated your environment, all you have to do is go to your Python editor of choice (I mainly use PyCharm) and point the project's interpreter setting at the virtual environment's interpreter when you select or change the interpreter for your project.
Hope this helps.

pip has to reinstall all packages in an exported virtualenv

I have a question about Python virtualenvs. I received a virtualenv for a project containing all the packages required to run it. But when I run the project for the first time, it crashes because Python reports that some requirements are not satisfied. So I checked whether all the packages are inside:
virtualenv/lib/python2.7/site-packages/
and all the required packages are there.
But when I type:
pip list
the packages are not shown. So I have to run:
pip install -r requirements.txt
and pip downloads them all again.
So my question is: why does pip download and reinstall the packages if they are already installed? And how can I make pip use the packages that are already inside the virtualenv?
The problem was that all the scripts inside the virtualenv had been created on another PC, with that machine's paths hard-coded into them. So when I launched python or pip from the virtualenv, they actually ran against my global installation because the virtualenv's script paths could not be found, and pip in particular showed my global packages.
Fixing the interpreter paths (the shebang lines) of all the scripts inside virtualenv/bin/ to point at my real virtualenv path solved the issue.
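A quick way to see the problem, assuming the copied environment lives in ./virtualenv: the first line of each script in virtualenv/bin/ is a shebang pointing at the interpreter path on the machine that created it (the path below is illustrative).
head -1 virtualenv/bin/pip
#!/home/other-user/project/virtualenv/bin/python
The more robust approach is usually not to copy virtualenvs between machines at all, but to recreate them on each machine from requirements.txt.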

Adding an installed pip package to PATH automatically

For my package, foo, I'm using the following setup.py:
from setuptools import setup

setup(name='foo',
      version='0.0.1',
      description='Lol',
      url='https://github.com/foo/foo',
      author='legend',
      author_email='lol#gmail.com',
      license='GPLv3',
      packages=['foo'],
      install_requires=["bar"],
      entry_points={'console_scripts': ['foo = foo:main']},
      keywords=['foo'],
      zip_safe=False)
When testing on my Arch system, the script was added to my PATH automatically, so I could just run foo on my command line and it would run the function main(). Then I booted up a VM and tested it on Windows 7. pip installed the package just fine, but the script wasn't on my PATH!
Help?
setuptools, pip and easy_install don't modify the system PATH variable. The <python directory>\Scripts directory, where all of them install scripts by default, is normally added to PATH by the Python installer during installation.
If the scripts folder was not added to your PATH during installation, you can fix that by running <python directory>\Tools\scripts\win_add2path.py. (See How can I find where Python is installed on Windows?)
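If you are not sure where that Scripts directory is on your machine, a quick way to print it using the standard sysconfig module is:
python -c "import sysconfig; print(sysconfig.get_path('scripts'))"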
The above sample setup.py file worked fine for me (with the Scripts directory in PATH), by the way. I tested it with
python setup.py bdist_wheel
pip install dist\foo-0.0.1-py3-none-any.whl
and
python setup.py sdist
pip install dist\foo-0.0.1.zip
Do not expect pip or easy_install to modify your PATH; their task is to install a package into the current environment.
On Linux, if you use the global Python environment, you are likely to need root privileges, so you typically do:
$ sudo pip install <package>
However, this is not a recommended method, as it pollutes the system-wide Python environment (imagine two applications with slightly different requirements on the same package version and you have a problem).
The recommended method is to use some sort of virtualenv, which lets you install Python packages into a separate Python environment that is also easy to remove and recreate.
How I install Python-based scripts on my system
It seems like you have a custom Python-based script which you want to use on your system.
For this scenario I use the following method (assuming the virtualenv tool is installed in the system-wide Python):
$ mkdir ~/apps
$ mkdir ~/apps/myutil
$ cd ~/apps/myutil
$ virtualenv .env
$ source .env/bin/activate
(.env)$ pip install <package-or-more>
Now the ~/apps/myutil/.env/bin directory contains all the scripts installed by pip; let us call ours myscript (there can be more).
The remaining step is to make a symlink from some directory which is already on PATH, e.g. /usr/local/bin:
$ cd /usr/local/bin
$ sudo ln -s ~/apps/myutil/.env/bin/myscript
From now on, you will be able to call myscript even without the virtualenv being activated.
Updating the script
If you need to install a later version of the script:
$ cd ~/apps/myutil
$ source .env/bin/activate
(.env)$ pip install --upgrade <package-or-more>
As the script is symlinked, the latest version will automatically be available.
Naming with virtualenvwrapper
virtualenvwrapper allows you to create multiple named virtualenvs and gives you easy activation and deactivation. In that case I do the following:
$ mkvirtualenv bin-myscript
(bin-myscript)$ pip install <package-or-more>
(bin-myscript)$ which myscript
~/.Evns/bin-myscript/bin/myscript
(bin-myscript)$ cd /usr/local/bin
(bin-myscript)$ sudo ln -s ~/.Evns/bin-myscript/bin/myscript
Upgrade is even simpler:
$ workon bin-myscript
(bin-myscript)$ pip install --upgrade <package-or-two>
and you are done.
Alternative with tox
tox is a great tool for automated creation of virtualenvs and for testing. I use it to create virtualenvs in the directories I like. For more information see my other SO answer.
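For reference, a minimal tox.ini sketch for the "virtualenv in a directory I like" use case (the env name, work directory and package placeholder are assumptions, not taken from the answer above):

[tox]
envlist = myscript
skipsdist = true
toxworkdir = {toxinidir}/.tox

[testenv:myscript]
deps =
    <package-or-more>
commands =
    myscript --help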

Install Python libraries using a shell script

I'm very new to Python, sorry if my question is very basic. I have a shell script that I use to run a .py file on a cluster. Here is my shell script:
#!/bin/bash
module add python/2.6
python Myfile.py
Python is installed on the cluster, but some of the libraries and packages still need to be installed. For example, I need to install the NumPy package. Is there any way I can do that inside my shell script or my .py file before I import it?
Thanks
For this (and similar) use case, I would recommend a combination of pip and virtualenv.
You would install pip into your system Python installation (e.g. sudo apt-get install python-pip), and then install virtualenv via pip (pip install virtualenv).
You can then create a specific virtualenv for this project. It represents a sandboxed environment with specific versions of libraries, which are traditionally specified through a requirements file (using the -r option) but can also be specified individually on the command line.
You would do this via a command like virtualenv venv_test, which will create a virtualenv directory named venv_test in the current directory. You can then run pip from that virtualenv's bin directory to install packages.
For example, to install the flask package into that virtualenv, you would run:
venv_test/bin/pip install flask
You can then either run source venv_test/bin/activate to activate the virtualenv in your current shell, or invoke a script directly with the virtualenv's interpreter, e.g.:
venv_test/bin/python foo.py
Here's a link to a virtualenv introduction for some additional details and steps.
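Putting this together with the original shell script, a sketch (the venv_test path and the requirements.txt file are assumptions, and it assumes the virtualenv tool is available once the Python module is loaded):

#!/bin/bash
module add python/2.6

# Create the virtualenv once and install the dependencies into it (skipped if it already exists)
if [ ! -d venv_test ]; then
    virtualenv venv_test
    venv_test/bin/pip install -r requirements.txt   # e.g. a file listing "numpy"
fi

# Run the script with the virtualenv's interpreter; no activation needed
venv_test/bin/python Myfile.py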
