Python 2.7 comes with the json library included. My PYTHONPATH includes third-party sources, and one of them is also called json. As a result, the wrong json library ends up being loaded. What would be a good practice for handling and avoiding situations like this? Is there a way to instruct Python to explicitly load the native library, in the fashion of from ? import json?
You could try
from some_package import json as anotherjson
This way the namespace conflict can be removed; here some_package stands for whatever package the third-party json module lives in.
You can also look at the discussion of relative/absolute imports:
http://docs.python.org/whatsnew/2.5.html#pep-328
It says:
In Python 2.5, you can switch import's behaviour to absolute imports using a from __future__ import absolute_import directive. This absolute-import behaviour will become the default in a future version (probably Python 2.7). Once absolute imports are the default, import string will always find the standard library's version. It's suggested that users should begin using absolute imports as much as possible.
from __future__ import absolute_import
# from standard path
import json as _json
# from a package
from pkg import json as pkgjson
The other technique is to use the imp module
import imp
json = imp.load_source('json', '/path/to/json.py')
There really is no good way to have multiple modules with the same name on PYTHONPATH (see the docs), which means you should probably move the third-party json module to an alternate location that is not on PYTHONPATH, and then import it using some other method.
The easiest way to do this is to move the third-party json module into a subdirectory of the location it is already in, and then make that subdirectory a package by adding an __init__.py to it.
If you named this new directory 'thirdparty', you could then import your third party json module using from thirdparty import json, and import json would always import Python's json module.
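A minimal sketch of that layout, with thirdparty used as the example name (the file name main.py is just for illustration):
project/
    thirdparty/
        __init__.py    # empty file; makes thirdparty a package
        json.py        # the third-party module, moved here
    main.py
# main.py
import json                                      # always the standard library's json
from thirdparty import json as thirdparty_json   # the third-party version
print(json.dumps({"stdlib": True}))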
Alternatively, you could rename the module to something that does not conflict.
Related
I would like to import a module from inside a function. For example, from this:
from directory.folder.module import module
def do_import():
app.register_blueprint(module)
To this:
def do_import():
from directory.folder.module import module
But, without hardcoding it. For example:
def do_import():
m = "module"
from directory.folder.m import m
Is it possible? Thanks in advance
You want the importlib module.
Here's the most simplistic way to use this module. There are lots of different ways of weaving the results of calls to the module into the environment:
import importlib
math = importlib.import_module("math")
print(math.cos(math.pi))
Result:
-1.0
I've used this library a lot. I built a whole plug-in deployment system with it. Scripts for all the various deploys were dropped in directories and only imported when they were mentioned in a config file rather than everything having to be imported right away.
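As a rough sketch of that pattern (the config file name plugins.conf and its one-module-name-per-line format are just assumptions for illustration):
import importlib

# Read the list of plug-in module names from a config file; nothing is
# imported until a name is actually mentioned there.
with open('plugins.conf') as f:
    plugin_names = [line.strip() for line in f if line.strip()]

plugins = {name: importlib.import_module(name) for name in plugin_names}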
Something I find very cool about this module is what's stated at the very top of its documentation:
The purpose of the importlib package is two-fold. One is to provide the implementation of the import statement (and thus, by extension, the __import__() function) in Python source code.
The intro in the 2.7 docs is interesting as well:
New in version 2.7.
This module is a minor subset of what is available in the more full-featured package of the same name from Python 3.1 that provides a complete implementation of import. What is here has been provided to help ease in transitioning from 2.7 to 3.1.
No, Python's import does not work this way.
Say, for example, you try to import a module named mod, so you run import mod. The interpreter will then search for mod.py in a list of directories gathered from the following sources:
The directory from where the input script was run or the current directory if the interpreter is being run interactively.
The list of directories contained in the PYTHONPATH environment variable, if it is set. (The format for PYTHONPATH is OS-dependent but should mimic the PATH environment variable.)
An installation-dependent list of directories configured at the time Python is installed.
So if you have a variable named m='mod' and run import m it will search for m.py not mod.py.
A silly but dangerous workaround is to use exec() (beware of malicious input):
m = "module"
exec(f'from directory.folder.{m} import {m}')
If you don't mind using another module, try importlib.
You can use the importlib module to programmatically import modules.
import importlib
full_name = "package." + "module"
m = importlib.import_module(full_name)
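Applied to the question above, a hedged sketch could look like this (assuming directory.folder.module is importable from where the script runs):
import importlib

m = "module"
# Import the submodule whose name is only known at runtime ...
mod = importlib.import_module("directory.folder." + m)
# ... and then pull out the attribute of the same name,
# mirroring "from directory.folder.module import module".
module = getattr(mod, m)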
Is it a good idea to use a standard library module through another imported module? For example, I wrote an xyz.py module, and within xyz.py I have this statement: import json
I have another script where I import xyz. In this script I need to make use of json functions. I could of course import json in my script, but the json lib was already imported when I imported xyz. So can I just use xyz.json, or is that bad practice?
You should use import json again to explicitly declare the dependency.
Python will optimize the way it loads the modules so you don't have to be concerned with inefficiency.
If you later don't need xyz.py anymore and you drop that import, then you still want import json to be there without having to re-analyze your dependencies.
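For illustration, a minimal sketch of the two files (the helper name dumps_something is made up):
# xyz.py
import json

def dumps_something(obj):
    return json.dumps(obj)

# your script
import xyz
import json   # explicit dependency, even though xyz already imported json;
              # Python just reuses the cached copy from sys.modules

print(json.loads('{"a": 1}'))
print(xyz.dumps_something({"a": 1}))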
I found the following line in Python code:
from six.moves import urllib
At the same time, I can't find urllib.py anywhere. I found that there is a file six.py in the package root, and it has class Module_six_moves_urllib(types.ModuleType): inside.
Is this it? How is this defined?
UPDATE
Sorry, I am new to Python and the question is about Python syntax. I learned that what comes after import is a Python file name without the .py extension. So where is this file in this case?
six is a package that helps in writing code that is compatible with both Python 2 and Python 3.
One of the problems developers face when writing code for Python2 and 3 is that the names of several modules from the standard library have changed, even though the functionality remains the same.
The six.moves module provides those modules under a common name for both Python2 and 3 (mostly by providing the Python2 module under the name of the Python 3 module).
So your line
from six.moves import urllib
imports urllib when run with Python3 and imports a mixture of urllib, urllib2 and urlparse with Python2, mimicking the structure of Python3's urllib. See also here.
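For example, a minimal sketch that runs unchanged on both versions (the URL is just a placeholder):
from six.moves import urllib

# Under Python 3 this resolves to urllib.request.urlopen, under Python 2
# to urllib2.urlopen; six.moves hides that difference.
response = urllib.request.urlopen('http://example.com')
print(response.read()[:60])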
EDIT to address the update of the question:
TLDR; There is not necessarily a direct relation between the imported module urllib and a file on the filesystem in this case. The relevant file is exactly what six.__file__ points to.
Third-party modules are defined in a file/directory that is listed in sys.path. Most of the time you can find the name of the file a module is imported from by inspecting the __file__ attribute of the module in question, e.g. six.__file__. However, with six.moves things are not as simple, precisely because the exposed modules might not map one-to-one to actual Python modules, but rather to hacked-up versions of those.
I need help with the following situation. There is a project that requires two versions of one library. Let this lib be lib, and its versions libold and libnew. These libs are not accessible via PyPI; they are each in their own folder. Let the paths of these folders be /path/to/libold and /path/to/libnew.
In my project I need instances of classes from both of these libs, but I can't import both of them, only either the old or the new one.
I tried the following method:
import sys
sys.path.insert(0,'path/to/libold')
import lib as libold
sys.path.pop(0)
sys.path.insert(0,'path/to/libnew')
import lib as libnew
After running these commands, libold and libnew refer to the same library, the old one.
I also tried importlib and imp and got the same result.
How can I import two versions of a lib?
Python adds imported modules to sys.modules. When you write import lib as libnew, sys.modules['lib'] already exists, and therefore the new lib is not imported.
To import the new lib, you should delete the old one from sys.modules, like this:
import sys
sys.path.insert(0, 'path/to/libold')
import lib as libold
sys.path.pop(0)
del sys.modules['lib']
sys.path.insert(0, 'path/to/libnew')
import lib as libnew
However, you may encounter serious problems by doing so. In particular, if the old lib tries to import a submodule (say, e.g. lib.submodule), it will get the new one instead. For this reason, you'd better import all submodules of the old lib before deleting sys.modules['lib'] and before importing the new one.
However, that's a dirty hack, not a real solution. In Python, modules and packages are identified by name, not by path. This is how it works and has always worked, and there's nothing you can do about it.
Consider using multiprocessing instead to overcome these "limitations". With multiprocessing you can run two processes: one that uses the old lib and the other that uses the new one. multiprocessing gives you many tools to make interprocess communication easy.
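A minimal sketch of that idea, with the attribute lib.__version__ standing in for whatever you actually need from each lib:
import multiprocessing
import sys

def use_lib(path, conn):
    # Each process is a separate interpreter, so each one can import
    # its own copy of `lib` without fighting over sys.modules.
    sys.path.insert(0, path)
    import lib
    conn.send(lib.__version__)
    conn.close()

if __name__ == '__main__':
    for path in ('/path/to/libold', '/path/to/libnew'):
        parent, child = multiprocessing.Pipe()
        p = multiprocessing.Process(target=use_lib, args=(path, child))
        p.start()
        print(parent.recv())
        p.join()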
When is it useful to use the imp.load_source() method for importing a Python module? Does it have some advantage in certain scenarios compared to normal importing with the import keyword?
import always looks in the following order:
already imported modules (sys.modules)
import hooks
builtin modules
files in the locations in sys.path
If you want to import a module which would not be found by any of these mechanisms, but you know the filename, then you could use imp.load_source(). Or if you want to import a module that would be shadowed by an earlier import mechanism, for example if you want to import foo from a directory in sys.path but there is a custom import hook that would find its own version of foo first, then you could use imp.load_source() for that too. Basically it lets you control the source of the module's code in a way that import does not.
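For example, a short sketch (the path is hypothetical):
import imp

# Load this exact file as the module `foo`, regardless of what an ordinary
# `import foo` would have found via sys.path or import hooks.
foo = imp.load_source('foo', '/opt/plugins/foo.py')
print(foo.__file__)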