Testing for a Python module by importing it in a Makefile - python

I am trying to create a generic Makefile. Is there a way to test whether a Python module exists, and then perform different actions in the Makefile based on the result?
I have tried something like this
all:
ifeq (,$(shell python -c 'import somemodule'))
echo "DEF MODULE = 1" > settings.pxi
else
echo "DEF MODULE = 0" > settings.pxi
endif
python setup.py build_ext --build-lib . --build-temp build --pyrex-c-in-temp
however, doing this does not produce any result. Also, if the module does not exist, Python throws an error; how can I capture that outcome rather than simply crashing?

Consider making use of Python's imp module. Specifically, imp.find_module should be exactly what you're looking for.
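One caveat: imp has been deprecated since Python 3.4 in favour of importlib. As a minimal sketch of the same check using the modern API (importlib.util.find_spec locates a module without importing it):

```python
import importlib.util

def module_exists(name):
    # find_spec searches for the module without executing it;
    # it returns None when the module cannot be found.
    return importlib.util.find_spec(name) is not None
```

A Makefile can run this via python -c and branch on the printed value or the exit status.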

Wrapping everything in bash -c "cmd" works for me.
python_mod := $(shell bash -c "echo -e 'try:\n import toml\n print(\"good\")\nexcept ImportError:\n print(\"bad\")' | python3 -")
ifeq "$(python_mod)" "bad"
$(error "python module is not installed")
endif
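Since the check above only needs a yes/no answer, the import test can also be reduced to an exit status. A minimal sketch (json is a stand-in module name; swap in the module you actually care about):

```shell
# python exits non-zero when the import fails, so the if branch picks
# the right settings line; stderr is silenced to suppress the traceback.
if python3 -c 'import json' 2>/dev/null; then
    echo "DEF MODULE = 1"
else
    echo "DEF MODULE = 0"
fi
```

In a Makefile this would typically be captured with $(shell ...) into a variable, as in the answer above.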

Related

Is there a "which" command for PYTHONPATH, similar to the one for PATH in Linux?

I would like to know where a script like "tensorflow.python.tools.inspect_checkpoint" is located when I use the handy command "python -m tensorflow.python.tools.inspect_checkpoint --file_name xyz". It is somewhere in my PYTHONPATH, but it is tedious to go through every path.
Is there a similar command to "which", aimed at quickly locating python scripts that can be reached from PYTHONPATH? Thanks.
You can access the __file__ attribute of the module. Here is an example:
$ python -c "import tensorflow.python.tools.inspect_checkpoint as m; print(m.__file__)"
/srv/conda/envs/notebook/lib/python3.6/site-packages/tensorflow/python/tools/inspect_checkpoint.py
You can make a shell function that takes a module name as an argument and prints the __file__ attribute.
function pywhich() {
python -c "import $1 as m; print(m.__file__)"
}
$ pywhich numpy
/home/jakub/miniconda3/lib/python3.8/site-packages/numpy/__init__.py
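For the record, a variant in pure Python: importlib.util.find_spec locates a module the way the import system would, without executing it, so it also works for modules whose import has side effects. A sketch (the pywhich name just mirrors the shell function above):

```python
import importlib.util

def pywhich(name):
    # spec.origin is the file path for ordinary modules, or the
    # string 'built-in' for C builtins such as sys.
    spec = importlib.util.find_spec(name)
    return spec.origin if spec is not None else None
```

For example, pywhich("json") returns the path to json/__init__.py, while pywhich("sys") returns 'built-in' and an unknown name returns None.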

To print python's sys.path from the command line

I want to print python's sys.path from the command line, but this one is not working:
python -m sys -c "print (sys.path)"
although python -c "import sys; print (sys.path)" would work. It seems that "-m" in the first one above does not load the module "sys". Any clarification on how to correctly import a module via Python's command-line flags? Thanks.
There is no such flag. -m does something completely different from what you want. You can go through the Python command line docs to see the lack of such a flag if you want.
Just put the import in the command.
python -c "import sys; print (sys.path)"
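Relatedly, the standard library already ships a module for exactly this: site, when run with -m, prints the sys.path entries (this works because site is a real importable module, unlike the bare sys invocation above):

```shell
# Prints 'sys.path = [...]' one entry per line, plus USER_BASE/USER_SITE info.
python3 -m site
```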

Python doctest does not run on files named as signal.py

Problem
I'm using python 3.6.6 on Fedora 28. I have a project structure as follows :
test/__init__.py
test/signal.py
test/notsignal.py
If I run $ python3 -m doctest -v test/signal.py
I get:
10 items had no tests:
signal
signal.Handlers
signal.ItimerError
signal.Sigmasks
signal.Signals
signal._enum_to_int
signal._int_to_enum
signal.struct_siginfo
signal.struct_siginfo.__reduce__
signal.struct_siginfo.__repr__
0 tests in 10 items.
0 passed and 0 failed.
Test passed.
which, to me, clearly shows that doctest is running against the standard library's signal module. By renaming the file I was able to run doctest. Am I missing something, or is this a bug?
To reproduce
You may use the following shell script.
#!/bin/sh
mkdir -p test
touch test/__init__.py
cat > test/signal.py <<'EOF'
"""
>>> _ = print(f'Doctest at {__name__} was run.')
"""
EOF
cp test/signal.py test/notsignal.py
python3 -m doctest -v test/signal.py
python3 -m doctest -v test/notsignal.py
If you look at the doctest source, you can see that doctest tries to import the modules that you pass to it.
It's very likely that the standard library's signal module has already been imported:
$ python -c 'import sys;import doctest;print("signal" in sys.modules)'
True
When doctest tries the import, the Python interpreter finds that there is already a module named "signal" in sys.modules and returns it rather than your signal module.
Perhaps this is a bug (maybe doctest could be smarter about how it imports), but in practice I think the best course of action is to rename your module. In general, giving a module the same name as a standard library module almost always causes problems.
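The sys.modules caching is easy to demonstrate directly. A small sketch (the module name demo_mod is made up and has no file on disk):

```python
import sys
import types

# Plant a fake module under a name that has no corresponding file.
fake = types.ModuleType("demo_mod")
fake.marker = "cached"
sys.modules["demo_mod"] = fake

# The import system consults sys.modules first, so this import never
# touches the filesystem; it simply returns the cached object.
import demo_mod

print(demo_mod.marker)  # -> cached
```

This is exactly what happens with test/signal.py: the name "signal" is already cached, so the file is never loaded.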

Not getting LD_LIBRARY_PATH

I am amending an existing script in which I want to compare the set of libraries linked into an executable against the shared libraries loaded at run time. I have the list of libraries that I need to compare with the shared libraries. To find the shared libraries I am trying to read LD_LIBRARY_PATH with the code below, but I have had no luck. I tried checking the variable on the command line with
echo $LD_LIBRARY_PATH
and it returned /opt/cray/csa/3.0.0-1_2.0501.47112.1.91.ari/lib64:/opt/cray/job/1.5.5-0.1_2.0501.48066.2.43.ari/lib64
Here is what I have already tried (this is a Python script):
#! /usr/bin/python -E
import os
ld_lib_path = os.environ.get('LD_LIBRARY_PATH')
#ld_lib_path = os.environ["LD_LIBRARY_PATH"]
I think you are just missing a print in your script? This works for me from the command line:
python -c 'import os; temp=os.environ.get("LD_LIBRARY_PATH"); print temp'
script:
#! /usr/bin/python -E
import os
ld_lib_path = os.environ.get('LD_LIBRARY_PATH')
print ld_lib_path
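On Python 3 the same idea needs print() as a function; a small sketch that also splits the value into individual directories (an unset variable yields an empty list instead of None):

```python
import os

def ld_library_dirs():
    # os.environ.get avoids a KeyError when the variable is unset.
    raw = os.environ.get("LD_LIBRARY_PATH", "")
    return [p for p in raw.split(":") if p]

print(ld_library_dirs())
```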

Check for existence of Python dev files from bash script

I am creating a simple bash script to download and install a python Nagios plugin. On some older servers the script may need to install the subprocess module and as a result I need to make sure the correct python-devel files are installed.
What is an appropriate, cross-platform method of checking for these files? I would like to stay away from rpm and apt.
If you can tell me how to do the check from within python that would work. Thanks!
Update:
This is the best I have come up with. Anyone know a better or more conclusive method?
if [ ! -e $(python -c 'from distutils.sysconfig import get_makefile_filename as m; print m()') ]; then echo "Sorry"; fi
That would be pretty much how I would go about doing it. Seems reasonably simple.
However, if I need to be really sure that python-devel files are installed for the current version of Python, I would look for the relevant Python.h file. Something along the lines of:
# first, makes sure distutils.sysconfig usable
if ! python -c "from distutils.sysconfig import get_config_vars" &> /dev/null; then
echo "ERROR: distutils.sysconfig not usable" >&2
exit 2
fi
# get include path for this python version
INCLUDE_PY=$(python -c "from distutils import sysconfig as s; print s.get_config_vars()['INCLUDEPY']")
if [ ! -f "${INCLUDE_PY}/Python.h" ]; then
echo "ERROR: python-devel not installed" >&2
exit 3
fi
Note: distutils.sysconfig may not be supported on all platforms so not the most portable solution, but still better than trying to cater for variations in apt, rpm and the likes.
If you really need to support all platforms, it might be worth exploring what is done in the AX_PYTHON_DEVEL m4 module. This module can be used in a configure.ac script to incorporate checks for python-devel during the ./configure stage of an autotools-based build.
IMHO your solution works well.
Otherwise, a more "elegant" solution would be to use a tiny script like:
testimport.py
#!/usr/bin/env python2
import sys
try:
__import__(sys.argv[1])
print "Successfully imported", sys.argv[1]
except ImportError:
print "Error!"
sys.exit(4)
sys.exit(0)
And call it with testimport.py distutils.sysconfig
You can adapt it to check for an internal function if needed...
For those looking for a pure python solution that also works for python3:
python3 -c 'from distutils.sysconfig import get_makefile_filename as m; from os.path import isfile; import sys; sys.exit(not isfile(m()))'
Or as a script file check-py-dev.py:
from distutils.sysconfig import get_makefile_filename as m
from os.path import isfile
import sys
sys.exit(not isfile(m()))
To get a string in bash, just use the exit output:
python3 check-py-dev.py && echo "Ok" || echo "Error: Python header files NOT found"
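One forward-looking note: distutils was removed from the standard library in Python 3.12, so on recent interpreters the distutils-based one-liners above will fail. The sysconfig module exposes the same paths; a sketch of the equivalent header check:

```python
import os.path
import sysconfig

# 'include' is the directory that contains Python.h when the
# development headers are installed.
include_dir = sysconfig.get_paths()["include"]
print(os.path.isfile(os.path.join(include_dir, "Python.h")))
```

As before, the printed boolean (or an exit status via sys.exit) can be consumed from bash.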
