Get available modules - python

With PHP you have phpinfo(), which lists installed modules, and from there you can look up what they do.
Is there a way to see what packages/modules are installed to import?

Type help() in the interpreter; it will tell you, among other things:
To get a list of available modules, keywords, or topics, type "modules",
"keywords", or "topics". Each module also comes with a one-line summary
of what it does; to list the modules whose summaries contain a given word
such as "spam", type "modules spam".
help> modules
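If you would rather not enter the interactive help prompt at all, the same listing can be printed by passing the string to help() directly:
help('modules')   # prints the list of importable modules, same as typing modules at the help> prompt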

If you use ipython, which is an improved interactive Python shell (aka "REPL"), you can type import  (note the space at the end) followed by a press of the [TAB] key to get a list of importable modules.
As noted in this SO post, you will have to reset its hash of modules after installing (certain?) new ones. You likely don't need to worry about this yet.
If you don't use ipython, and you haven't tried it, it might be worth checking out. It's a lot better than the basic Python shell, or pretty much any other REPL I've used.
ipython Installation
If you're running linux, there is most likely an ipython package that you can install through your system management tools. Others will want to follow these instructions.
If your installation route requires you to use easy_install, you may want to consider instead using pip. pip is a bit smarter than easy_install and does a better job of keeping track of file locations. This is very helpful if you end up wanting to uninstall ipython.
Listing packages
Note that the above tip only lists modules. For a list which also includes packages —which contain modules— you can do from  + [TAB]. An explanation of the difference between packages and modules can be found in the Modules chapter of the helpful official Python tutorial.
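If you want a similar listing from a script rather than from a shell, here is a minimal sketch using the standard library's pkgutil module (the output format is just illustrative):
import pkgutil

# Walk the finders on sys.path; ispkg distinguishes packages from plain modules
for module_info in pkgutil.iter_modules():
    kind = 'package' if module_info.ispkg else 'module'
    print(module_info.name, kind)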
#rtfm
As an added note, if you are very new to python, your time may be better spent browsing the standard library documentation than by just selecting modules based on their name. Python's core documentation is well-written and well-organized. The organizational groups —File and Directory Access, Data Types, etc.— used in the library documentation's table of contents are not readily apparent from the module/package names, and are not really used elsewhere, but serve as a valuable learning aid.

This was very useful. Here is a script version of this:
# To list all installed packages just execfile THIS file
# execfile('list_all_pkgs.py')
for dist in __import__('pkg_resources').working_set:
    print dist.project_name.replace('Python', '')

You can list installed packages from the command line like so (note the Python 2 print statement):
python -c "for dist in __import__('pkg_resources').working_set:print dist.project_name.replace('Python', '')"

As aaronasterling says, every .py or .pyc file on sys.path is a module, because it can be imported. There are scripts that can show you which external modules are installed in site-packages.
Yolk is a Python command-line tool and library for obtaining information about packages installed by setuptools, easy_install and distutils and it can also query pypi packages.
http://tools.assembla.com/yolk/

You may use the pip module:
from pip._internal.operations.freeze import freeze
for line in freeze():
    print(line.split('=='))
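Be aware that pip._internal is, as the name says, not a public API and can change between pip releases. A more robust sketch is to run pip freeze as a subprocess under the current interpreter:
import subprocess
import sys

# Ask the pip belonging to this interpreter for its freeze listing
output = subprocess.run(
    [sys.executable, '-m', 'pip', 'freeze'],
    capture_output=True, text=True, check=True,
).stdout

for line in output.splitlines():
    name, _, version = line.partition('==')
    print(name, version)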

Related

How do I package a single python script which takes command line arguments and also has dependencies?

I have a single Python file which is supposed to take in a bunch of inputs on the command line.
For eg: python script.py "string_1" "string_2"
I also have a bunch of dependencies including pandas, datetime and even Python3.
I want to package all this code so that anyone can install the package along with the dependencies (in a directory or so) and then just call the script/module in the above manner, without having to actually go into a Python interpreter.
I tried using the python-packaging resource, but with that I would need to go into the interpreter, right?
I found a good article today that explains the procedure quite well: https://medium.com/dreamcatcher-its-blog/making-an-stand-alone-executable-from-a-python-script-using-pyinstaller-d1df9170e263
pyinstaller --onefile <script.py> is the tl;dr on Linux. On Windows you also need py2exe.
If you can rely on a base install of Python already being present, then it's worth looking at Python's zipapp module, introduced in Python 3.5: https://docs.python.org/3/library/zipapp.html#creating-standalone-applications-with-zipapp
For background, see PEP 441: https://www.python.org/dev/peps/pep-0441/
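As an illustration, assuming your code lives in a myapp/ directory with a main() function in myapp/cli.py (both names hypothetical), a minimal zipapp build could look like this:
import zipapp

# Bundle the contents of myapp/ into a single runnable .pyz archive;
# 'cli:main' names the entry-point callable in cli.py, and the
# interpreter line makes the archive directly executable where python3 exists.
zipapp.create_archive(
    'myapp',
    target='myapp.pyz',
    interpreter='/usr/bin/env python3',
    main='cli:main',
)
The resulting myapp.pyz runs with python myapp.pyz; to bundle third-party dependencies, pip install them into the myapp/ directory first, as the linked documentation describes.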
There is also a project called shiv which adds some extra abilities on top of the zipapp module bundled with Python 3.5+:
https://shiv.readthedocs.io/en/latest/
Have a look at pex (https://pex.readthedocs.io/en/stable/). It wraps up your python scripts, files, dependencies, etc into a single executable. You still need the python interpreter installed, but it includes everything else.

Check if one package is installed in my system with Python?

How can I check whether some package is installed on my system? My system is Linux, but it would be even better if it worked on other OSs too. I mean an OS-specific package (like *.rpm or *.deb).
Is there any python module or script that could do it?
To find out whether you've installed a .deb, .rpm, etc. package, you need to use the appropriate tools for your packaging system.
APT has a Python wrapper named python-apt in Debian, or just apt at PyPI.
RPM has a whole slew of Python tools—in fact, most of Redhat's installer ecosystem is built on Python, and you should already have the rpm module installed. Read Programming RPM with Python (or, better, search for a newer version…) before looking for a high-level wrapper, so you understand what you're actually doing; it's only a couple lines of code even with the low-level interface.
As far as I know, nobody has wrapped these up in a universal tool for every packaging format and database that any Linux distro has ever used (and even if they had, that wouldn't do you much good on Linux systems that don't use a packaging system). But if you just want to handle a handful of popular systems, python-apt plus either Redhat's own tools or whatever a PyPI search for RPM turns up will cover almost everything you care about.
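As a rough sketch of the python-apt route on a Debian-based system (this assumes python-apt's Cache mapping interface; the package name is just an example):
import apt  # provided by the python-apt / python3-apt system package

cache = apt.Cache()
package_name = 'curl'  # example .deb package to check

# A package can be known to APT without being installed, so check both
if package_name in cache and cache[package_name].is_installed:
    print(package_name, 'is installed')
else:
    print(package_name, 'is not installed')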
Alternatively, pkg-config is the closest thing to a universal notion of "packages installed on this system". Every linux system will have it (and most other non-Windows systems), but not every package registers with pkg-config. Still, if this is what you're looking for, pkgconfig is the Python answer.
The word "package" has a half-dozen similar but incompatible meanings, but the fact that you said "package or module" implies you specifically want to know about Python packages and modules, as in the things you can import.
In which case, the way to test it is to import them.
Manually, do this:
$ python
>>> import foo
ImportError: No module named foo
Well, foo isn't installed.
Programmatically:
try:
    import foo
except ImportError:
    # do whatever you wanted if foo is missing
    pass
Note that this doesn't actually tell you foo is missing, just that it couldn't be imported. In a simple "test whether you have this" script, that's generally what you want to actually check for. But what if you really want to check "is installed (even if broken)"?
In recent Python (I think 3.4+), the ImportError will have additional information in it that you can access—name for the name you were trying to import, path if it was found, etc. However, this is one of those cases where EAFP may not be better than LBYL. You can use importlib to search for the module without trying to import it, like this:
spec = importlib.util.find_spec('foo')
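A slightly fuller sketch of that check:
import importlib.util

# find_spec returns None when the module cannot be located at all,
# without executing (and possibly crashing on) the module itself
spec = importlib.util.find_spec('foo')
if spec is None:
    print('foo is not installed')
else:
    print('foo is available at', spec.origin)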
What if you're using an older Python? There are similar features going back to 3.2, but not quite as nice, and if you're using 2.7, there's really nothing worth using, because the import machinery wasn't exposed very well.
For that case (and many, many other cool things related to package installation), use setuptools—which isn't in the stdlib, but a huge number of third-party packages depend on it (until recently it was the cornerstone of Python package installation, even if unofficially):
pkg_resources.get_distribution('foo')
However, that looks for a distutils/setuptools/PyPI package, not a Python module or package. There's a lot of overlap there, but they're not exactly the same thing. For a simple example, when you pip install more-itertools, you get the more-itertools PyPI package, which installs the more_itertools Python package into your site-packages.
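In practice you would wrap that call, since it raises an exception when the distribution is missing; a minimal sketch:
import pkg_resources

try:
    dist = pkg_resources.get_distribution('foo')
    print(dist.project_name, dist.version, 'is installed')
except pkg_resources.DistributionNotFound:
    print('the foo distribution is not installed')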

Give installed module a new name

I want to install two versions of the same module in site-packages but want to call one 'deprecated_Bio' and the other 'Bio'
With the standard Biopython installation,
python setup.py install
of course creates a nice module Bio for you under lib/site-packages/Bio
is there a way I can take a past version and install it as 'deprecated_Bio' in the same lib/site-packages (i.e. lib/site-packages/deprecated_Bio) using setuptools, so they can co-exist, one being imported with
import Bio
and the other being called with
import deprecated_Bio
Possible duplicate, but I just can't seem to find the answer!
It is not clear why you want the libraries to co-exist. Do you need to have access to both types of functions within the same program, or are you testing a new library and might want to roll back? If the latter, one approach is virtualenv, which will allow you to install multiple libraries, and even Python interpreters on the same machine.

Use Selenium with Python Portable

My question seems somewhat inane, but I cannot seem to find any resources for what I need to do.
Essentially I'm using my work computer to write python applications in my spare time. I'm using Python Portable (syntax version 3.2) because I do not have administrative access and can't do things with path variables etc.
How (if possible) do I install or import selenium so I can use it in Python Portable?
Thanks all!
Based on the answers found in Importing modules on portable python and How to install external libraries with Portable Python?
Check what import sys; print(sys.path) says.
It displays the list of directories and zipfiles where portable python looks for modules to import. Just copy your modules into one of those directories or zipfiles, or sys.path.append('/whatever/dir') if you have your modules in /whatever/dir and want to keep them there (the latter approach will last only for the current session, be it interactive or a script's execution).
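A minimal sketch of that approach, reusing the /whatever/dir example (selenium stands in for whatever module you copied there):
import sys

# Show where Portable Python currently looks for importable modules
print(sys.path)

# Make an extra directory importable for this session only
sys.path.append('/whatever/dir')

import selenium  # resolvable now, if the selenium package was copied into /whatever/dir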
Also on their FAQs
You don’t have package I need, can I add it?
For simpler packages you can use easy_install or even extract them in the site-packages folder of the Portable Python distribution. However some packages are installing additional dependencies in Windows system folders - in this case your Portable Python distribution will not work once you move it to some other workstation. Make sure to do proper testing!

Python compile all modules into a single python file

I am writing a program in Python to be sent to other people who are running the same Python version; however, there are some 3rd party modules that need to be installed to use it.
Is there a way to compile it into a .pyc (I only say .pyc because it's a Python compiled file) that has all the dependent modules inside it as well?
So they can run the program without needing to install the modules separately?
Edit:
Sorry if it wasn't clear, but I am aware of things such as cx_freeze etc.; what I'm trying to get is just a single Python file.
So they can just type "python myapp.py" and it will run, with no installation of anything, as if all the module code were in my .py file.
If you are on Python 2.3 or later and your dependencies are pure Python:
If you don't want to go the setuptools or distutils routes, you can provide a zip file with the .pyc files for your code and all of its dependencies. You will have to do a little work to make any complex pathing inside the zip file available (if the dependencies are just lying around at the root of the zip, this is not necessary). Then just add the zip location to your path and it should work just as if the dependency files had been installed.
If your dependencies include .pyds or other binary dependencies, you'll probably have to fall back on distutils.
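For the pure-Python case, a minimal sketch of the zip-on-sys.path approach (the archive and module names are just examples):
import sys

# deps.zip holds the pure-Python dependencies, laid out at the root of
# the archive so no extra path manipulation is needed
sys.path.insert(0, 'deps.zip')

import some_dependency  # hypothetical module shipped inside deps.zip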
You can simply include .pyc files for the required libraries, but no, a .pyc cannot work as a container for multiple files (unless you collect all the source into one .py file and then compile it).
It sounds like what you're after is the ability for your end users to run one command, e.g. install my_custom_package_and_all_required_dependencies, and have it assemble everything it needs.
This is a perfect use case for distutils, with which you can make manifests for your own code that link out to external dependencies. If your 3rd party modules are available publicly in a standard format (they should be, and if they're not, it's pretty easy to package them yourself), then this approach has the benefit of allowing you to very easily change what versions of 3rd party libraries your code runs against (see this section of the above linked doc). If you're dead set on packaging others' code with your own, you can always include the required files in the .egg you create with distutils.
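For reference, a minimal setuptools-style sketch of that idea (all names hypothetical): install_requires is what lets the installer pull in the third-party dependencies, and console_scripts gives users a command to run without opening an interpreter:
from setuptools import setup

setup(
    name='myapp',                 # hypothetical distribution name
    version='0.1',
    py_modules=['myapp'],         # the single script, shipped as myapp.py
    install_requires=['some_thirdparty_lib'],  # hypothetical third-party dependency
    entry_points={
        'console_scripts': ['myapp = myapp:main'],  # exposes a `myapp` command
    },
)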
Two options:
build a package that will install the dependencies for them (I don't recommend this if the only dependencies are python packages that are installed with pip)
Use virtual environments. You use an existing python on their system but python modules are installed into the virtualenv.
or I suppose you could just punt, and create a shell script that installs them, and tell them to run it once before they run your stuff.
