I'm using wadofstuff's serializer https://pypi.python.org/pypi/wadofstuff-django-serializers on django 1.5. However, it uses simplejson which is now deprecated so I have to go directly into the library file wadofstuff/django/serializers/json.py and change simplejson into json. The problem is that I cannot import json because the library file is named json.py as well, so it tries to load itself. I don't want to change the file name because other developers in my team will definitely kill me. Is there any way to import json from absolute path?
Thank you.
The simplest, and probably best, fix for this (in Python 2.x) is to not have a module that shadows the name of a top-level stdlib/site-packages module.
In other words, rename json.py to something else. Then you can just import json from within the renamed file (or, better, try that, and on ImportError fall back to simplejson, so you don't break 2.5 compatibility). Then just change whatever code was importing serializers.json to import the new thing.
You should be able to file a bug against the wadofstuff package and submit your patch, and it may make it into version 1.2. (There seems to be an update about once a year, and it doesn't seem too unreasonable to finally get up to speed with Python 2.6 and Django 1.5 in 2013…)
Another way is to remove the current directory from the module search path.
Assuming "" (which denotes the current directory) is at the beginning of sys.path (the default):
import sys
sys.path.remove("")
import json
sys.path.insert(0, "")
You can put this line on top of json.py:
from __future__ import absolute_import
This will tell Python that, while importing this module, it should look for modules only on sys.path and not relative to the current package (see also http://www.python.org/dev/peps/pep-0328/#rationale-for-absolute-imports).
Edit:
Also note that the wadofstuff module doesn't actually import simplejson directly: it does from django.utils import simplejson. Django code will use the stdlib version by default if you don't have simplejson installed.
1. Try changing Python's import path:
sys.path.insert(0, "/path/to/your/json.py/dir")
import json
2. Try adding an __init__.py file to every directory on the path to your json.py; then you can use:
import a.b.c.json as myjson
3. If you don't want to change anything else, try something deeper with
__import__()
Related
I have a module that conflicts with a built-in module. For example, a myapp.email module defined in myapp/email.py.
I can reference myapp.email anywhere in my code without issue. However, I need to reference the built-in email module from my email module.
# myapp/email.py
from email import message_from_string
It only finds itself, and therefore raises an ImportError, since myapp.email doesn't have a message_from_string method. import email causes the same issue when I try email.message_from_string.
Is there any native support to do this in Python, or am I stuck with renaming my "email" module to something more specific?
You will want to read about Absolute and Relative Imports which addresses this very problem. Use:
from __future__ import absolute_import
Using that, any unadorned package name will always refer to the top level package. You will then need to use relative imports (from .email import ...) to access your own package.
NOTE: The above from ... line needs to be put into any 2.x Python .py files above the import ... lines you're using. In Python 3.x this is the default behavior and so is no longer needed.
I feel really dumb asking this question, but it's a quirk of Python I've put up with for a while now that I finally want to fix.
On CentOS 7, given that I have "roflmao.py" and "__init__.py" in the directory:
/usr/lib/python2.7/site-packages/roflmao
Why is it that when I'm using the python interpreter (and not in the directory containing roflmao.py), I must type:
from roflmao import roflmao
Instead of simply:
import roflmao
To gain access to "roflmao.py"'s functions and variables? I can import re, collections, requests, or any PIP-installed module just fine, but not my own custom one.
How can I set things up to accomplish this?
Put from roflmao import * into __init__.py.
If you do this, then you don't really need roflmao.py as a separate module, because it would then be pointless to do from roflmao import roflmao. So it's best to just move the code from roflmao.py into __init__.py.
I would like to get the path of a library before importing the library itself.
That is, something different from:
import module, os
library_path = os.path.dirname(module.__file__)
Is that possible?
Thank you.
I think what you need is the imp module:
import imp
file_handle, module_path, module_doc = imp.find_module(module_name)
The second return value is the path to the actual file (assuming there is one, since the requested module could be a built-in). The first return value is a file handle, already opened for you.
As long as your use case is simple, you shouldn't have any issues. If you try for a generic solution you will need to read the imp module documentation carefully, as there are lots of possible situations and return values for this function.
https://docs.python.org/2/library/imp.html
import commands
print commands.__file__
/usr/lib/python2.7/commands.py
import os
print os.__file__
/usr/lib/python2.7/os.pyc
Yes you can, but note that not all modules support __file__ (built-in modules, for example, don't have it).
What is the proper way to access resources in python programs.
Basically in many of my python modules I end up writing code like that:
DIRNAME = os.path.split(__file__)[0]
(...)
template_file = os.path.join(DIRNAME, "template.foo")
Which is OK but:
It will break if I start to use Python zip packages
It is boilerplate code
In Java I had a function that did exactly the same --- but worked both when code was lying in bunch of folders and when it was packaged in .jar file.
Is there such function in Python, or is there any other pattern that I might use.
You'll want to look at using either get_data in the stdlib or pkg_resources from setuptools/distribute. Which one you use probably depends on whether you're already using distribute to package your code as an egg.
Since version 3.7 of Python, the proper way to access a file in resources is the importlib.resources module.
One can, for example, use the path function to access a particular file in a Python package:
import importlib.resources
with importlib.resources.path("your.package.templates", "template.foo") as template_file:
...
Starting with Python 3.9, this package introduced the files() API, to be preferred over the legacy API.
One can use the files function to access a particular file in a Python package:
template_res = importlib.resources.files("your.package.templates").joinpath("template.foo")
with importlib.resources.as_file(template_res) as template_file:
...
For older versions, I recommend installing and using the importlib-resources backport library. Its documentation also explains in detail how to migrate an old implementation using pkg_resources to importlib-resources.
Trying to understand how we could combine the two aspects together:
Loading for resources in native filesystem
Packaged in zipped files
Reading through the quick tutorial on zipimport : http://www.doughellmann.com/PyMOTW/zipimport/
I see the following example:
import sys
sys.path.insert(0, 'zipimport_example.zip')
import os
import zipimport
importer = zipimport.zipimporter('zipimport_example.zip')
module = importer.load_module('example_package')
print module.__file__
print module.__loader__.get_data('example_package/README.txt')
I think that the output of __file__ is "zipimport_example.zip/example_package/__init__.pyc"
Need to check how it looks from inside.
But then we could always do something like this:
if ".zip" in example_package.__file__:
    ...  # load using get_data
else:
    ...  # load by building the correct file path
[Edit:] I have tried to work out the example a bit better.
If the package gets imported as a zipped file, then two things happen:
__file__ contains ".zip" in its path.
__loader__ is available in the namespace.
If these two conditions are met then within the package you could do:
print __loader__.get_data(os.path.join('package_name','README.txt'))
else the module was loaded normally and you can follow the regular approach to loading the file.
I guess the zipimport standard python module could be an answer...
EDIT: well, not the use of the module directly, but using sys.path as shown in the example could be a good way:
I have a zip file test.zip with one python module test and a file test.foo inside
to test that the zipped python module test can be aware of test.foo, it contains this code:
import os
DIRNAME = os.path.dirname(__file__)
if os.path.exists(os.path.join(DIRNAME, 'test.foo')):
    print 'OK'
else:
    print 'KO'
Test looks ok:
>>> import sys
>>> sys.path.insert(0, r'D:\DATA\FP12210\My Documents\Outils\SVN\05_impl\2_tools\test.zip')
>>> import test
OK
>>>
So a solution could be to loop in your zip file to retrieve all python modules, and add them in sys.path; this piece of code would be ideally the 1st one loaded by your application.
I'm using python and virtualenv/pip. I have a module installed via pip called test_utils (it's django-test-utils). Inside one of my django apps, I want to import that module. However I also have another file test_utils.py in the same directory. If I go import test_utils, then it will import this local file.
Is it possible to make python use a non-local / non-relative / global import? I suppose I can just rename my test_utils.py, but I'm curious.
You can switch the search order by changing sys.path:
import sys
del sys.path[0]
sys.path.append('')
This puts the current directory after the system search path, so local files won't shadow standard modules.
My problem was even more elaborate:
importing a global/site-packages module from a file with the same name
Working on aero, the pm recycler, I wanted access to the pip API, in particular pip.commands.search.SearchCommand, from my adapter class Pip in the source file pip.py.
In this case trying to modify sys.path is useless, I even went as far as wiping sys.path completely and adding the folder .../site-packages/pip...egg/ as the only item in sys.path and no luck.
I would still get:
print pip.__package__
# 'aero.adapters'
I found two options that did eventually work for me, they should work equally well for you:
using __builtin__.__import__() the built-in function
global_pip = __import__('pip.commands.search', {}, {}, ['SearchCommand'], -1)
SearchCommand = global_pip.SearchCommand
Reading the documentation though, suggests using the following method instead.
using importlib.import_module(), the convenience wrapper around __import__.
The documentation explains that import_module() is a minor subset of functionality backported from Python 3.1 to help ease the transition from 2.7 to 3.1.
from importlib import import_module
SearchCommand = import_module('pip.commands.search').SearchCommand
Both options get the job done while import_module() definitely feels more Pythonic if you ask me, would you agree?
nJoy!
I was able to force python to import the global one with
from __future__ import absolute_import
at the beginning of the file (this is the default in python 3.0)
You could reset your sys.path:
import sys
first = sys.path[0]
sys.path = sys.path[1:]
import test_utils
sys.path = [first] + sys.path
The first entry of sys.path is "always" (as in "per default"; see the Python docs) the current directory, so if you remove it you will do a global import.
Since my test_utils was in a django project, I was able to go from ..test_utils import ... to import the global one.
Though, in the first place, I would always consider keeping the name of the local file distinct from any global module name. But an easy workaround, without modifying sys.path, can be to import the global module in some other file and then import the global module from that file.
Remember, this file must be in some folder other than the folder containing the file whose name matches the global module.
For example.
./project/root/workarounds/global_imports.py
import test_utils as tutil
and then in
./project/root/mycode/test_utils.py
from project.root.workarounds.global_imports import tutil
# tutil is global test_utils
# you can also do
from project.root.workarounds.global_imports import test_utils