I want to download Python libraries like NumPy, SciPy, etc. into a separate folder. I want to include that folder in the Python project so that whenever I switch to another laptop, I don't need to install the libraries again; instead, I can import them from that folder. Is there any way to do this?
You can install Python's virtualenv.
Your libraries will be installed in the directory created by virtualenv.
https://pypi.org/project/virtualenv/
Another option: you can also use Docker.
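For example, a typical virtualenv workflow looks like this (the folder name venv is just a convention, and the package names are placeholders):

pip install virtualenv
virtualenv venv
source venv/bin/activate    # on Windows: venv\Scripts\activate
pip install numpy scipy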
I suggest using a virtual environment in this case. You could use pipenv so that the project has exactly the libraries it needs to run.
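With pipenv, for instance, the setup might look like this (package names are just examples):

pip install pipenv
pipenv install numpy scipy
pipenv shell

pipenv records the exact versions in a Pipfile.lock, which is what pins the project to exactly the libraries it needs.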
You can do it.
For the NumPy library, for example: https://pypi.org/project/numpy/#files
You can download the files statically from PyPI.
I would not recommend going with this approach, for several reasons.
NumPy itself depends on other libraries, so you would have to keep those dependencies alongside the NumPy package.
These libraries get updated over time with newly added functionality and bug fixes, so eventually other libraries might no longer be compatible with the versions you bundled.
Recommended way:
Just create a requirements.txt file that lists all the dependencies with their version numbers.
Whenever you want to use your project elsewhere, install all these libraries with the command below.
pip install -r requirements.txt
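To produce that file from an environment that already has everything installed, pip can generate it for you:

pip freeze > requirements.txt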
There are two major ways you can install Python libs to a separate folder: a virtual environment or a container.
A virtual environment (like venv, pipenv, etc.) is good because it is the simplest way to give your project its own set of libraries without impacting any other Python script on your system. The downside is that you really have to set up the environment (including installing the libs) on every computer you move your script to. This can and should be automated, of course, but it has to be done either way.
A container, on the other hand, requires additional resources to handle and to build, but it is exactly a box containing a specific version of your script along with all the libs and binaries it requires. No need to reinstall libs when moving to a new laptop/desktop/server/cloud/whatever. For this case I would recommend Docker/Kubernetes, but it's better to start with Docker.
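For illustration, a minimal Dockerfile for such a box might look like this (the app.py entry point and the myapp tag are placeholders, and a requirements.txt is assumed to exist):

FROM python:3
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app.py"]

Build once, then run it on any machine that has Docker:

docker build -t myapp .
docker run --rm myapp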
I have a PyCharm project where I have copied two GitHub projects (downloaded the zip and copy-pasted it into the project). My problem is that I cannot access them from main, which is located at the root.
For some unknown reason the from . syntax only shows me files/folders I have created myself.
I'm trying to access the build module, in which there is a class TFNET that I want to import:
from darkflow-master.darkflow.net.build import TFNET
Downloading directories like this isn't how you'd typically include external dependencies in a python project.
A more conventional approach would be to install the project using pip; it looks like darkflow gives information on how to do this.
Alternatively, just ensure the libraries are in your PYTHONPATH, it looks like pycharm has a way to do this: PyCharm and PYTHONPATH
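For example, pip can install straight from a Git repository; assuming the upstream darkflow repo, that would be roughly:

pip install git+https://github.com/thtrieu/darkflow.git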
Reading between the lines, it sounds like you have downloaded two libraries from github, and copied them into your project.
Python needs to know where to find source files.
If you want to do it this way, you need to tell your Python environment where to find the new sources. PyCharm looks after Python environments for you while you are working inside PyCharm.
Please see https://www.jetbrains.com/help/pycharm/configuring-folders-within-a-content-root.html
but this won't tell Python outside of PyCharm where to find your library source.
pip is most likely the answer. pip can install globally or inside python virtual environments; in either case, it puts the library code in a location where Python expects to find it.
On this point, please learn about Python virtual environments. These are self-contained Python mini-worlds. In a venv, you can run a specific version of Python with specific packages. PyCharm works well with them; it is easy to set up virtual envs from within PyCharm. When you are 'inside' a venv, pip will install into the venv, not touching your system Python or the Python of any other project.
Also, pip normally installs from an official repository (pypi) but you can tell it to use a git repository as the source of your install. Normally people who write libraries send their mature versions to pypi so it is unusual to fetch from a git repository, but if you want the very latest version, or if the author has not published the library, it's an option.
Note that pip doesn't work with any arbitrary python code. It must be set up to be seen by pip as a python package.
I have an installable Python package (mypackage) and it needs to use specific versions of a number of dependencies. At the moment I have some .sh scripts that just pip-install these into an internal package folder (e.g. C:\Python27\Lib\site-packages\mypackage\site-packages). When mypackage executes, it adds this internal folder to the beginning of the Python path so that it will override any other versions of the required dependencies elsewhere on the path.
I know this will only work if the user doesn't import the dependencies prior to importing mypackage but I will document this.
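For reference, the internal-folder override described above usually amounts to something like this near the top of mypackage's __init__.py (a sketch; the folder name comes from the question):

import os
import sys

# Prepend the bundled site-packages so its versions shadow any others on the path
_internal = os.path.join(os.path.dirname(__file__), "site-packages")
sys.path.insert(0, _internal)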
I want to remove the .sh scripts and integrate the above into either the distutils install or the standard pip installation process. What is the best way to do this? I know about install_requires, but it does not seem to allow specifying a location.
I eventually found the virtualenv tool which solved the above problem much more elegantly than the solution I'd put in place:
https://docs.python.org/3/tutorial/venv.html
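With the venv module that ships with Python 3 (which that tutorial covers), the workflow is roughly:

python -m venv .venv
source .venv/bin/activate    # on Windows: .venv\Scripts\activate
pip install -r requirements.txt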
I have a Python program that has dependencies on other Python libraries. I have used virtualenv and pip to get requirements.txt for all the libs needed to run the app, and thus keeping the environment clean of unnecessary libs. Things work great and I can make progress in developing the app.
This works on my machine, but the issue is that I need to package the app and deploy/distribute to an environment where the requirements.txt and pip cannot be used to simply download the dependencies. The target environment needs a fully functional application.
I'm a bit confused with all these tools offered by Python, such as setuptools and distutils, since none of them seem to offer this (at least easily).
I'm used to the Java way, with Maven/Gradle etc., where one simply states dependencies and they are added to the distributable jar/war unless explicitly stated otherwise.
The dependencies are installed inside my virtual environment, under the scripts/ dir. Is there some easy way to get the dependencies bundled within my app with standard tools, or do I need to roll my own for this?
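For what it's worth, one common pattern is to vendor the wheels next to the app on the build machine and install them offline on the target (the vendor/ directory name is a placeholder):

pip download -r requirements.txt -d vendor/
pip install --no-index --find-links=vendor/ -r requirements.txt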
My windows laptop is being replaced due to some hardware problems. I've spent the last two years installing third party libraries every so often, which I'll need to reinstall on the new machine. I just need to:
Generate a list of third party python libs current installed
Reinstall those on the new machine
My thought right now is to look at c:\python\lib\site-packages and manually write out the list, then try to create a configuration file for Setuptools and EasyInstall which I could run as a script of sorts, which would install all these libraries. Is that feasible?
Take a look at pip's freeze command and install from requirements file:
http://pip.openplans.org/#freezing-requirements
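In practice that is just two commands, the first run on the old machine and the second on the new one:

pip freeze > requirements.txt
pip install -r requirements.txt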
Warning: pip may not be able to install all your packages on Windows (I think it has some limitations for packages with native libraries that must be recompiled).
You might want to look into virtualenv. It lets you have a "virtual environment" for Python that would eliminate this need in the future. As for getting the old libs on the new system, you can check the PYTHONPATH environment variable and/or sys.path from the interpreter to see what all paths are searched and you can do some looking around to see what you'd need to install to get back to where you were.
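For example, a quick way to dump the search paths of the current interpreter:

python -c "import sys; print(sys.path)"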
I have read the documentation but I don't understand.
Why do I have to use distutils to install Python modules?
Why can't I just save the modules in the Python path?
You don't have to use distutils. You can install modules manually, just like you can compile a C++ library manually (compile every implementation file, then link the .obj files) or install an application manually (compile, put it into its own directory, add a shortcut for launching). It just gets tedious and error-prone, as does every repetitive task done manually.
Moreover, the manual steps I listed for the examples are pretty optimistic - often, you want to do more. For example, PyQt adds the .ui-to-.py-compiler to the path so you can invoke it via command line.
So you end up with a stack of work that could be automated. This alone is a good argument.
Also, the devs would have to write installing instructions. With distutils etc, you only have to specify what your project consists of (and fancy extras if and only if you need it) - for example, you don't need to tell it to put everything in a new folder in site-packages, because it already knows this.
So in the end, it's easier for developers and for users.
What Python modules? For installing Python packages, if they exist on PyPI you should do:
pip install <name_of_package>
If not, you should download them (as a .tar.gz or whatever), see if you find a setup.py, and run it like this:
python setup.py install
Or if you want to install it in development mode (you can make changes in the package and see the result without installing it again):
python setup.py develop
This is the usual way to distribute a Python package (the setup.py), and this setup.py is the one that calls distutils.
To summarize: distutils is a Python package that helps a developer create a package installer that will build and install a given package by just running python setup.py install.
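For illustration, a minimal setup.py is little more than a single call to setup() (the name and version here are placeholders):

from distutils.core import setup

setup(
    name="mypackage",        # placeholder project name
    version="0.1.0",
    packages=["mypackage"],  # package directories to install
)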
So basically, what distutils does (I will list only the important stuff):
it searches for the dependencies of the package (installing dependencies automatically);
it copies the package modules into site-packages, or just creates a symlink if it's in develop mode;
you can create an egg of your package;
it can also run tests over your package;
you can use it to upload your package to PyPI.
If you want more detail, see http://docs.python.org/library/distutils.html
You don't have to use distutils to get your own modules working on your own machine; saving them in your python path is sufficient.
When you decide to publish your modules for other people to use, distutils provides a standard way for them to install your modules on their machines. (The "dist" in "distutils" means distribution, as in distributing your software to others.)