Please assume the following project structure:
/project
    /packages
        /files
            __init__.py
            fileChecker.py
        /hasher
            __init__.py
            fileHash.py
    mainProject.py
    /test
I would like to access the module fileChecker.py from within the module fileHash.py.
The files package is some kind of global package for the project.
One way is to append paths to sys.path. (Is that the same thing as PYTHONPATH, by the way?)
What would be a solution when distributing the project?
The same as above? But then PYTHONPATH could already contain modules with the same name?
Does setuptools do all the work?
How can I achieve this in a nice and clean way?
Thanks a lot.
Update:
Also see my answer below.
When fileHash.py (which contains an import like from files import fileChecker) is called directly from within its package directory, the project's path needs to be added to sys.path (described below).
Test cases located in /test (see the structure above) also need the path added to sys.path when they are run from within /test.
Thanks, mguijarr.
I found a solution here on Stack Overflow:
source:
How to fix "Attempted relative import in non-package" even with __init__.py
When I am in the project folder /project, I can call the module like this:
python -m packages.hasher.fileHash
(no .py here, because it is addressed as a module inside a package, not as a file)
This is working well.
In this case the project directory is on the import path, and the import can look like this:
from packages.files import fileChecker
If it is not called as a module, but directly from within the package directory (in my case /packages/hasher), the project path has to be added to sys.path:
if __package__ is None:
    import sys
    from os import path
    # append the project directory (/project), three levels up from this file
    sys.path.append(path.dirname(path.dirname(path.dirname(path.abspath(__file__)))))
    from packages.files import fileChecker
else:
    from packages.files import fileChecker
The important thing for me here is that the path to include is the PROJECT path.
The code snippet above already covers both cases (called as a package and called directly).
Thanks a lot for your help.
Update:
Just to make my answer more complete:
Python automatically adds the script's directory to sys.path when you run
python fileHash.py
Another option, in addition to the one above, is to set the PYTHONPATH when running the program, like this:
PYTHONPATH=/path/to/project python fileHash.py
I gained some experience that I would like to share:
I don't run modules from within their directories anymore.
Starting the app, running tests, Sphinx, pylint, or anything else is always done from the project directory.
This ensures that the project directory is on the Python path, so all packages and modules are found without any extra work on the imports.
The only place I still add the project folder to the path using sys.path is in my setup.py, in order to make Codeship work.
Still, in my opinion this is not an easy matter, and I find myself thinking about the PYTHONPATH often enough :)
I finally found another solution that is quite intuitive and needs no sys.path manipulation at all: https://www.tutorialsteacher.com/python/python-package
Then do what they explain under "Install a Package Globally".
In your case, put the setup.py file in the packages directory and list the two packages:
from setuptools import setup

setup(name='mypackage',
      version='0.1',
      description='Testing installation of Package',
      url='#',
      author='malhar',
      author_email='mlathkar#gmail.com',
      license='MIT',
      packages=['files', 'hasher'],  # the package names go here
      zip_safe=False)
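Once the packages are installed this way (a sketch, assuming you run the setup.py above from the packages directory, for example with pip install -e .), fileHash.py can import fileChecker without touching sys.path at all:

# fileHash.py (sketch): 'files' and 'hasher' are now installed as top-level
# packages, so the import is resolved via site-packages, not path manipulation
from files import fileChecker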
Related
I'm developing a system in Python that includes a calculation engine and a front end. I've split them up into two projects as the calculation engine can be used for other front ends as well.
I'm using Eclipse and PyDev. Everything works perfectly in Eclipse/PyDev, but as soon as I try to run it outside of PyDev (from the command line) I get import errors. I've done quite a bit of research to find the problem, but I just don't see a solution that works nicely. I believe that PyDev modifies the Python path.
In my project layout below I have two packages (package1 and tests) within one project (Calculations). I can't seem to import anything from package1 in tests. I also have another project (Frontend). Here I also can't import anything from package1.
What I want to understand is the proper way of calling my script/test files from the command line, both for two separate projects and for two packages in the same project. I assume it would be similar to how PyDev does it. So far I think I have the following options:
Create Python code that appends to sys.path (seems hacky / not good practice)
Modify the PYTHONPATH when I call test_some_calc.py, like this: PYTHONPATH= python test_some_calc.py (I think this is how PyDev does it, but it seems a bit long - there must be a simpler way?)
Make an installable package (eventually I might go this route, but not yet)
I have the following project layout.
CodeSolution/
    Calculations/
        package1/
            __init__.py
            subpackage/
                __init__.py
                some_calc.py
            subpackage2/
                __init__.py
                another_calc.py
        tests/
            __init__.py
            subpackage/
                __init__.py
                test_some_calc.py      # Unable to import from package1
            subpackage2/
                __init__.py
                test_another_calc.py   # Unable to import from package1
    Frontend/
        some_script.py                 # Unable to import from package1
Comments on my project layout will also be appreciated.
A clean, quick, and modular way to make certain Python code importable from anywhere on your system is to create a file named mymodule.pth and put it inside your site-packages directory.
mymodule.pth should contain the path to your project. The project folder must have an __init__.py file.
For example, on Linux put:
/home/user/myproject
inside
/usr/lib/python2.7/site-packages/mymodule.pth
or, on Windows, put:
C:\Users\myUsername\My Documents\myproject
inside
C:\PythonXY\Lib\site-packages\mymodule.pth
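If you are not sure where your site-packages directory lives (a small check for illustration; site.getsitepackages() is available in a standard CPython 3 installation), you can print it from Python:

# print the directories in which a .pth file would be picked up
import site
print(site.getsitepackages())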
I wrote a script that loads the paths from PyDev's project properties. It allows you to run your code from the console without problems like "ModuleNotFoundError: No module named ...".
import os
import sys
from xml.dom import minidom

print(sys.path)

def loadPathsFromPyDev():
    sys_path = sys.path[0]
    # Load the PyDev project file (XML)
    xmldoc = minidom.parse(os.path.join(sys_path, '.pydevproject'))
    # Get the <path> elements
    xmlpaths = xmldoc.getElementsByTagName('path')
    # Collect the paths' values (entries look like /${PROJECT_DIR_NAME}/src)
    paths = list()
    for xmlpath in xmlpaths:
        paths.append(xmlpath.firstChild.data)
    # Add each path to sys.path
    for path in paths:
        # Substitute the PyDev placeholder with the project directory
        path = path.replace('/${PROJECT_DIR_NAME}', sys_path)
        # Normalize the separators for the current platform
        path = os.path.normpath(path)
        if path not in sys.path:
            # Add to the system path
            sys.path.insert(1, path)

loadPathsFromPyDev()
print(sys.path)
I hope it helps you :)
I just got set up to use pytest with Python 2.6. It has worked well so far with the exception of handling "import" statements: I can't seem to get pytest to respond to imports in the same way that my program does.
My directory structure is as follows:
src/
    main.py
    util.py
    test/
        test_util.py
    geom/
        vector.py
        region.py
        test/
            test_vector.py
            test_region.py
To run, I call python main.py from src/.
In main.py, I import both vector and region with
from geom.region import Region
from geom.vector import Vector
In vector.py, I import region with
from geom.region import Region
These all work fine when I run the code in a standard run. However, when I call "py.test" from src/, it consistently exits with import errors.
Some Problems and My Solution Attempts
My first problem was that, when running "test/test_foo.py", py.test could not import foo directly. I solved this by using the "imp" module. In "test_util.py":
import imp
util = imp.load_source("util", "util.py")
This works great for many files. It also seems to imply that when pytest is running "path/test/test_foo.py" to test "path/foo.py", it is based in the directory "path".
However, this fails for "test_vector.py". Pytest can find and import the vector module, but it cannot locate any of vector's imports. The following imports (from "vector.py") both fail when using pytest:
from geom.region import *
from region import *
These both give errors of the form
ImportError: No module named [geom.region / region]
I don't know what to do next to solve this problem; my understanding of imports in Python is limited.
What is the proper way to handle imports when using pytest?
Edit: Extremely Hacky Solution
In vector.py, I changed the import statement from
from geom.region import Region
to simply
from region import Region
This makes the import relative to the directory of "vector.py".
Next, in "test/test_vector.py", I add the directory of "vector.py" to the path as follows:
import sys, os
sys.path.append(os.path.realpath(os.path.dirname(__file__)+"/.."))
This enables Python to find "../region.py" from "geom/test/test_vector.py".
This works, but it seems extremely problematic because I am adding a ton of new directories to the path. What I'm looking for is either
1) An import strategy that is compatible with pytest, or
2) An option in pytest that makes it compatible with my import strategy
So I am leaving this question open for answers of these kinds.
The issue here is that Pytest walks the filesystem to discover files that contain tests, but then needs to generate a module name that will cause import to load that file. (Remember, files are not modules.)
Pytest comes up with this test package name by finding the first directory at or above the level of the file that does not include an __init__.py file, and declaring that to be the "basedir" for the module tree containing the module generated from this file. It then adds the basedir to sys.path and imports using the module name that will find that file relative to the basedir.
There are some implications of this of which you should be aware:
The basedir may not match your intended basedir, in which case the module will have a name that doesn't match what you would normally use. E.g., what you think of as geom.test.test_vector will actually be named just test_vector during the Pytest run, because Pytest found no __init__.py in src/geom/test/ and so added that directory to sys.path.
You may run into module naming collisions if two files in different directories have the same name. For example, lacking __init__.py files anywhere, adding geom/test/test_util.py will conflict with test/test_util.py because both are loaded as test_util, with both test/ and geom/test/ on the path.
The system you're using here, without explicit __init__.py modules, is having Python create implicit namespace packages for your directories. (A package is a module with submodules.) Ideally we'd configure Pytest with a path from which it would also generate this, but it doesn't seem to know how to do that.
The easiest solution here is simply to add empty __init__.py files to all of the subdirectories under src/; this will cause Pytest to import everything using package/module names that start with directory names under src/.
The question How do I Pytest a project using PEP 420 namespace packages? discusses other solutions to this.
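As a quick illustration of the easiest solution (a sketch using pathlib, which assumes a modern Python 3 rather than the 2.6 mentioned in the question, and the src/ layout shown above), the empty __init__.py files can be created like this:

# create empty __init__.py files in the subdirectories under src/
# (run from the directory that contains src/)
from pathlib import Path

src = Path("src")
for d in [src / "test", src / "geom", src / "geom" / "test"]:
    (d / "__init__.py").touch()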
import looks in the following directories to find a module:
The home directory of the program. This is the directory of your root script. When you are running pytest, your home directory is where pytest is installed (probably /usr/local/bin). It doesn't matter that you are running it from your src directory, because the location of your pytest determines the home directory. That is the reason why it doesn't find the modules.
PYTHONPATH. This is an environment variable. You can set it from the command line of your operating system. On Linux/Unix systems you can do this by executing 'export PYTHONPATH=/your/custom/path'. If you want Python to find your modules from the test directory, you should include the src path in this variable.
The standard libraries directory. This is the directory where all your libraries are installed.
There is a less common option using a .pth file.
sys.path is the result of combining the home directory, PYTHONPATH, and the standard libraries directory. What you are doing, modifying sys.path, is correct. It is something I do regularly. You could try using PYTHONPATH if you don't like messing with sys.path.
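A quick way to see how these pieces combine on your machine (a small check you can drop into any script):

# show where imports will be searched: the home directory, any PYTHONPATH
# entries, and the standard library / site-packages locations
import os
import sys

print(os.environ.get("PYTHONPATH", "<not set>"))
print(sys.path)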
If you include an __init__.py file inside your tests directory, then when the program is looking to set a home directory it will walk 'upwards' until it finds a directory that does not contain an __init__.py file. In this case that will be src/.
From here you can import by saying:
from geom.region import *
You must also make sure that you have an __init__.py file in any other subdirectories, such as the other nested test directory.
I was wondering what to do about this problem too. After reading this post, and playing around a bit, I figured out an elegant solution. I created a file called "test_setup.py" and put the following code in it:
import sys, os
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
I put this file in the top-level directory (such as src). When pytest is run from the top-level directory, it collects this file along with the real test files because its name is prefixed with "test"; there are no tests in it, but it is still executed.
The code appends the directory containing test_setup.py to the system path within the test environment. This is done only once, so the path does not end up cluttered with lots of entries.
Then, from within any test function, you can import modules relative to that top-level folder (such as import geom.region) and it knows where to find it since the src directory was added to the path.
If you want to run a single test file (such as test_util.py) instead of all the files, you would use:
pytest test_setup.py test\test_util.py
This runs both the test_setup and test_util code so that the test_setup code can still be used.
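With that invocation, test_util.py itself can use plain absolute imports (a sketch using the module names from the question; the test body is made up for illustration):

# src/test/test_util.py (sketch): these imports resolve against src/
# because test_setup.py has already added that directory to sys.path
import util
from geom.region import Region

def test_imports_resolve():
    assert Region is not None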
I am late to answer this question, but with Python 3.9 or 3.10 you just need to add an __init__.py file to the test folders.
When you add this file, Python interprets these folders as packages.
It would look like this:
src/
    main.py
    util.py
    test/
        __init__.py
        test_util.py
    geom/
        vector.py
        region.py
        test/
            __init__.py
            test_vector.py
            test_region.py
Then you just run pytest.
Sorry for my poor English.
Not the best solution, but maybe the fastest one:
cd path/python_folder
python -m pytest python_file.py
I have the following directory structure and I'm new to Python. In the folder bin I have a single file, "script.py", in which I want to import "module.py" from the code package. I have seen many solutions on Stack Overflow to this problem that involve modifying sys.path or giving the full pathname. Is there any way I could import these with an import statement that works relatively, instead of specifying the full path?
project/
    bin/
        script.py
    code/
        module.py
        __init__.py
To make sure that bin/script.py can be run without configuring the environment, add this preamble to script.py before the from code import module line (adapted from twisted/bin/_preamble.py):
# insert the parent directory of `code/__init__.py` into `sys.path`
import sys, os

path = os.path.abspath(sys.argv[0])   # path to bin/script.py
while os.path.dirname(path) != path:  # stop at the top-most directory
    if os.path.exists(os.path.join(path, 'code', '__init__.py')):
        sys.path.insert(0, path)
        break
    path = os.path.dirname(path)
The while loop is to support running it from bin/some-other-directory/.../script.py
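With the preamble in place, the import mentioned above works regardless of the directory the script is launched from; a minimal sketch of the rest of bin/script.py:

# bin/script.py (sketch): the preamble put the project root on sys.path,
# so the project's own 'code' package is importable with an absolute import
from code import module
print(module)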
While correct, I think that "dynamically add my current directory to the path" is a dirty, dirty hack.
Add a (possibly blank) __init__.py to project. Then add a .pth file containing the path to the directory that contains project to your site-packages directory. Then:
from project.code import module
This has two advantages, in my opinion:
1) If you refactor, you just need to change the from project.code line and avoid messing with anything else.
2) There will be nothing special about your code -- it will behave exactly like any other package that you've installed through PyPI.
It may seem messy to add your project directory to your PYTHONPATH but I think it's a much, much cleaner solution than any of the alternatives. Personally, I've added the parent directory that all of my python code lives in to my PYTHONPATH with a .pth file, so I can deal with all of the code I write just like 3rd party libraries.
I don't think that there is any issue with cluttering up your PYTHONPATH, since only folders with an __init__.py will be importable.
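Concretely (a sketch; the directory /home/user/dev and the file name myproject.pth are just example names), the .pth file and the resulting import would look like this:

# contents of site-packages/myproject.pth (one directory per line),
# pointing at the directory that contains project/:
#     /home/user/dev
# with project/__init__.py in place, this import then works from anywhere:
from project.code import module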
I created a project in Python, and I'm curious about how packages work in Python.
Here is my directory layout:
top-level dir/
    tests/
        __init__.py
    examples/
        __init__.py
        example.py
    module.py
How would I go about importing module.py in my example.py module? I know I could set PYTHONPATH to the top-level directory, but that doesn't seem like a good solution. This is how PyDev gets around this problem, but I'd like a solution that doesn't require updating environment variables.
I could put something at the top of example.py to update sys.path, like this:
from os import path
import sys
sys.path.append( path.dirname(path.abspath(path.dirname(__file__))) )
I don't think this is an appropriate solution either.
I feel like I'm missing some basic part of Python packages. I'm testing this on Python 2.6. If any further clarification is needed, please let me know.
Python packages are very simple: a package is any directory under any entry in sys.path that has an __init__.py file. However, a module is only considered to be IN a package if it is imported via the package, such as import package.module or from package import module. Note that this means that in general, someone must set up sys.path to contain the directories above any package you want to be importable, whether via PYTHONPATH or otherwise.
The primary wrinkle is that main modules (those run directly from the command line) are always __main__ no matter their location. They therefore have to do absolute imports, and either rely on PYTHONPATH being set up, or munge sys.path themselves.
In your case, I'd recommend having a small Python script that runs your examples after setting up the correct path. Say you put it in the top-level directory:
#!/usr/bin/env python
import sys
import os.path

# make the top-level directory importable, then import the examples package,
# passing the example name given on the command line as the fromlist
sys.path.append(os.path.dirname(__file__))
example = __import__("examples", globals(), locals(), sys.argv[1])
Then your example can do "import module".
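So examples/example.py itself only needs a plain import (a sketch; the print is just for illustration):

# examples/example.py (sketch): 'module' is found because the runner script
# added the top-level directory to sys.path before importing this file
import module
print(module.__name__)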
Alternately, if module.py is meant to also be in a package, add the PARENT of your "top level directory" to sys.path and then use the from .. import module syntax in your example modules. Also, change the first parameter to the __import__ in the wrapper to "tldname.examples" where tldname is the name of your top-level directory.
If you don't want to rely on an "example runner" module, each example needs sys.path boilerplate, or you need a PYTHONPATH setting. Sorry.
http://docs.python.org/tutorial/modules.html#intra-package-references
from .. import module
I'm working on a project where all the code in the source tree is separated into module directories, e.g.:
modules/check/lib/check.py
modules/edit/lib/edit.py
During installation, the Python files are put into a single directory, program_name, under Python's site-packages. All the modules therefore use the syntax import program_name.edit.
Because of the directory and import structure, the source modules are unable to import each other, so you'd have to install them each time you want to run anything in the source tree.
My questions are therefore: Without modifying the directory structure, how can I make sure that modules/check/lib/check.py imports from modules/edit/lib/edit.py and that site-packages/program_name/check.py imports from site-packages/program_name/edit.py? And for a possible reorganization, what are best practices for the directory structure and imports in an environment like this?
You can just add the modules/*/lib directories to your PYTHONPATH in your dev environment. Once installed in site-packages, calling import edit inside check.py will import the correct module, since they are in the same directory. Calling import edit from your dev environment will import the one you added to your PYTHONPATH.
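So check.py can keep a single plain import that works in both situations (a sketch following that suggestion, which differs from the import program_name.edit style mentioned in the question):

# modules/check/lib/check.py (sketch)
# in the source tree this resolves via the PYTHONPATH entry for modules/edit/lib;
# after installation it resolves next to check.py in site-packages/program_name/
import edit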
Why don't you install symlinks underneath prog_name on your dev machine?