Accessing resource files in Python unit tests & main code

I have a Python project with the following directory structure:
project/
project/src/
project/src/somecode.py
project/src/mypackage/mymodule.py
project/src/resources/
project/src/resources/datafile1.txt
In mymodule.py, I have a class (let's call it "MyClass") which needs to load datafile1.txt. This sort of works when I do:
open("../resources/datafile1.txt")
assuming the code that creates the MyClass instance is run from somecode.py.
The gotcha however is that I have unit tests for mymodule.py which are defined in that file, and if I leave the relative pathname as described above, the unittest code blows up as now the code is being run from project/src/mypackage instead of project/src and the relative filepath doesn't resolve correctly.
Any suggestions for a best practice type approach to resolve this problem? If I move my testcases into project/src that clutters the main source folder with testcases.

I usually use this to get a relative path from my module. Never tried it in a unittest, though.
import os
print(os.path.join(os.path.dirname(__file__),
                   '..',
                   'resources',
                   'datafile1.txt'))
Note: the .. trick works pretty well, but if you change your directory structure you would need to update that part. (Also note the comma after 'resources': without it, Python concatenates the two adjacent string literals into 'resourcesdatafile1.txt'.)

On top of the above answers, I'd like to add some Python 3 tricks to make your tests cleaner.
With the help of the pathlib library, you can make your resource imports explicit in your tests. It even handles the difference in separators between Unix (/) and Windows (\).
Let's say we have a folder structure like this:
`-- tests
    |-- test_1.py <-- You are here!
    |-- test_2.py
    `-- images
        |-- fernando1.jpg <-- You want to import this image!
        `-- fernando2.jpg
You are in the test_1.py file, and you want to import fernando1.jpg. With the help of the pathlib library, you can read your test resource with object-oriented logic as follows:
from pathlib import Path

current_path = Path(__file__).resolve().parent
image_path = current_path / "images" / "fernando1.jpg"
with image_path.open(mode='rb') as image:
    ...  # do what you want with your image object
There are also convenience methods that make your code more explicit than mode='rb', such as:
image_path.read_bytes()       # reads the file's contents as bytes
text_file_path.read_text()    # returns a text file's contents as a string
And there you go!

In each directory that contains Python scripts, put a Python module that knows the path to the root of the hierarchy. It can define a single global variable with the relative path. Import this module in each script. Python searches the current directory first, so it will always use the version of the module in the current directory, which will hold the relative path from that directory to the root. Then use this to find your other files. For example:
# rootpath.py
rootpath = "../../../"

# in your scripts
import os
from rootpath import rootpath
datapath = os.path.join(rootpath, "src/resources/datafile1.txt")
If you don't want to put additional modules in each directory, you could use this approach:
Put a sentinel file in the top level of the directory structure, e.g. thisisthetop.txt. Have your Python script move up the directory hierarchy until it finds this file. Write all your pathnames relative to that directory.
Possibly some file you already have in the project directory can be used for this purpose (e.g. keep moving up until you find a src directory), or you can name the project directory in such a way to make it apparent.
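A minimal sketch of that walk-up, assuming thisisthetop.txt is the sentinel name suggested above (find_project_root is a made-up helper name):

```python
import os

def find_project_root(marker="thisisthetop.txt", start=None):
    """Walk up from `start` until a directory containing `marker` is found."""
    path = os.path.abspath(start if start is not None else os.getcwd())
    while True:
        if os.path.exists(os.path.join(path, marker)):
            return path
        parent = os.path.dirname(path)
        if parent == path:  # reached the filesystem root without a match
            raise FileNotFoundError("no %s found above %s" % (marker, path))
        path = parent
```

Every path in the project can then be written relative to the returned root, e.g. os.path.join(find_project_root(), 'src', 'resources', 'datafile1.txt').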

You can access files in a package using importlib.resources (mind Python version compatibility of the individual functions; backports are available as importlib_resources), as described here. Thus, if you put your resources folder into your mypackage, like
project/src/mypackage/__init__.py
project/src/mypackage/mymodule.py
project/src/mypackage/resources/
project/src/mypackage/resources/datafile1.txt
you can access your resource file in code without having to rely on inferring file locations of your scripts:
import importlib.resources

file_path = importlib.resources.files('mypackage').joinpath('resources/datafile1.txt')
with open(file_path) as f:
    do_something_with(f)
Note, if you distribute your package, don't forget to include the resources/ folder when creating the package.
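One refinement, in case the package is ever imported from a zip archive: files() then returns a traversable that the built-in open() cannot use directly, and importlib.resources.as_file (Python 3.9+) is the documented way to obtain a real filesystem path. A sketch (read_resource is a made-up helper name):

```python
import importlib.resources

def read_resource(package, relpath):
    """Return the text of a resource bundled inside `package`.

    as_file() yields a concrete path, extracting to a temporary
    file if the package lives inside a zip archive.
    """
    resource = importlib.resources.files(package).joinpath(relpath)
    with importlib.resources.as_file(resource) as path:
        with open(path) as f:
            return f.read()

# e.g. read_resource('mypackage', 'resources/datafile1.txt')
```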

The filepath will be relative to the script that you initially invoked. I would suggest that you pass the relative path in as an argument to MyClass. This way, you can have different paths depending on which script is invoking MyClass.
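A minimal sketch of that suggestion; the default value simply preserves the relative path from the question, and tests can pass an absolute path instead:

```python
class MyClass:
    def __init__(self, data_path="../resources/datafile1.txt"):
        # Callers (somecode.py, unit tests, ...) may override the path.
        self.data_path = data_path

    def load(self):
        with open(self.data_path) as f:
            return f.read()
```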

Related

File operations not working in relative paths

I am working on a python3 app with a fairly simple file structure, but I'm having issues reading a text file in a script, both of which are lower in the file structure than the script calling them. To be absolutely clear, the file structure is as follows:
app/
|- cli-script
|- app_core/
|- dictionary.txt
|- lib.py
cli-script calls lib.py, and lib.py requires dictionary.txt to do what I need it to, so it gets opened and read in lib.py.
The very basics of cli-script looks like this:
from app_core import lib

def cli_func():
    x = lib.Lib_Class()
    x.lib_func()
The problem area of lib is here:
class Lib_Class:
    def __init__(self):
        dictionary = open('dictionary.txt')
The problem I'm getting is that while I have this file structure, the lib file can't find the dictionary file, returning a FileNotFoundError. I would prefer to only use relative paths for portability reasons, but otherwise I just need to make the solution OS agnostic. Symlinks are a last resort option I've figured out, but I want to avoid it at all costs. What are my options?
When you run a Python script, calls involving paths are resolved relative to the directory you run it from, not the directory the files actually live in.
The __file__ variable stores the path of the current file (no matter where it is), so you can build paths relative to the file itself.
In your structure, __file__ refers to the path app/app_core/lib.py, so to get the path app/app_core/dictionary.txt you just join its directory with the filename.
app/app_core/lib.py
import os.path

class Lib_Class:
    def __init__(self):
        path = os.path.join(os.path.dirname(__file__), 'dictionary.txt')
        dictionary = open(path)
or using pathlib:
import pathlib
path = pathlib.Path(__file__).parent / 'dictionary.txt'
Because you are expecting dictionary.txt to be present in the same directory as your lib.py file, you can do the following.
Instead of dictionary = open('dictionary.txt'), use:
from pathlib import Path
dictionary = open(Path(__file__).parent / 'dictionary.txt')

How do I ensure that a python package module saves results to a sub-directory of that package?

I'm creating a package with the following structure
/package
    __init__.py
    /sub_package_1
        __init__.py
        other_stuff.py
    /sub_package_2
        __init__.py
        calc_stuff.py
    /results_dir
I want to ensure that calc_stuff.py will save results to /results_dir, unless otherwise specified (yes, I'm not entirely certain having a results directory in my package is the best idea, but it should work well for now). However, since I don't know from where, or on which machine, calc_stuff will be run, I need the package, or at least calc_stuff.py, to know where it is saved.
So far the two approaches I have tried:
from os import path
saved_dir = path.join(path.dirname(__file__), 'results_dir')
and
from pkg_resources import resource_filename
filepath = resource_filename(__name__, 'results_dir')
have only given me paths relative to the root of the package.
What do I need to do to ensure a statement along the lines of:
pickle.dump(my_data, open(os.path.join(full_path,
                                       'results_dir',
                                       'results.pkl'), 'wb'))
will result in a pickle file being saved into results_dir?
As for "I'm not entirely certain having a results directory in my package is the best idea": me either :)
But if you were to put a function like the following inside a module in sub_package_2, it should return a path consisting of the module's directory, 'results_dir', and the filename you pass as an argument:
def get_save_path(filename):
    import os
    return os.path.join(os.path.dirname(__file__), "results_dir", filename)
For example:
C:\Users\me\workspaces\workspace-oxygen\test36\TestPackage\results_dir\foo.ext
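One caveat: open(path, 'wb') raises FileNotFoundError if results_dir does not exist yet, so it is worth creating it first. A sketch along those lines (save_results is a made-up name; base_dir is overridable mainly for testing):

```python
import os
import pickle

def save_results(data, filename, base_dir=None):
    """Pickle `data` into results_dir next to this module."""
    if base_dir is None:
        base_dir = os.path.dirname(os.path.abspath(__file__))
    save_dir = os.path.join(base_dir, "results_dir")
    os.makedirs(save_dir, exist_ok=True)  # create results_dir if missing
    path = os.path.join(save_dir, filename)
    with open(path, "wb") as f:
        pickle.dump(data, f)
    return path
```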

Where to place absolute paths in a python project?

I have the following project structure
ts_tools
    /bin
    /docs
    /lib
    /ts_tools
        /data_transfer
            /tests
            data_import.py
            __init__.py
        /data_manipulation
            /tests
            data_smoothing.py
            __init__.py
        __init__.py
    config.yaml.conf
    setup.py
    LICENSE
    README.md
    TODO.md
I would like to import data with the data_import.py file from an external source. I use the config.yaml.conf file to specify the absolute paths of the data with:
root_path:
  windows:
    data_fundamental: C:\\Users\\Malcom\\Data_Fundamental
    data_event: C:\\Users\\Malcom\\Data_Event
  linux:
    data_fundamental: /home/data/data_fundamental
    data_event: /home/data/data_event
The respective paths should be available for all tools in the ts_tools package (i.e. data_import.py and data_smoothing.py). Furthermore, the program should identify the os and choose the path structure accordingly.
I know how to set the paths with the yaml file using
import yaml

with open("config.yaml.conf", "r") as ymlfile:
    cfg = yaml.safe_load(ymlfile)  # safe_load avoids the unsafe default loader
and I know how to discriminate between the os with
import platform

if platform.system().lower() == 'windows':
    ROOT_DATA_PATH = cfg['root_path']['windows']
else:
    ROOT_DATA_PATH = cfg['root_path']['linux']
but I don't know where to place these code snippets. I don't think it is appropriate to put them in the setup.py file. On the other hand, I consider it inappropriate to create a new .py file just for this. What is a good design for this problem? Where should I specify absolute file paths? Is my approach a step in the right direction?
Thank you in advance.
In this case, you can make it relative to the home directory, so you can have ~/data_fundamental and ~/data_event (which should be equivalent on both platforms). You can expand the ~ with os.path.expanduser:
import os.path

def get_path(s):
    return os.path.normpath(os.path.normcase(
        os.path.expanduser(os.path.expandvars(s))
    ))

# get_path('~/data_fundamental') on Windows:
# r'c:\users\malcom\data_fundamental'
# get_path('~/data_fundamental') on Linux:
# '/home/data/data_fundamental'
# (assuming your username on Windows is Malcom
# and on Linux is data, and you haven't changed
# the default home path)
In any case, having two different configurations might be overly confusing, and you should expand ~, %VARS% and ${VARS} anyway to make setup easier and behave as expected.
Your other alternatives include:
Reading from environment variables
Writing it in setup.py (you should probably allow some way to change where the config file is, as setup.py might put it in a write-protected location)
You could also not have a default at all, and when none is given either compute a default based on sys.platform or raise an error telling the user to set it.
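The environment-variable alternative could look like this (TS_TOOLS_DATA_ROOT is a made-up variable name, and the fallbacks simply mirror the paths from the question):

```python
import os
import sys

def get_data_root():
    # An explicit environment variable wins; otherwise fall back
    # to a per-platform default (or raise, if you prefer no default).
    env = os.environ.get("TS_TOOLS_DATA_ROOT")
    if env:
        return os.path.expanduser(env)
    if sys.platform.startswith("win"):
        return os.path.expanduser(r"~\Data_Fundamental")
    return "/home/data/data_fundamental"
```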
Let's identify two different types of files/data.
Files/data written by the user or for the user during installation/deploy
Files/data written by the coder
It can be okay to have absolute paths in files/data defined by the user or generated by the program executing on the user machine.
Absolute paths are intrinsically more fragile than relative paths, but it's not that bad in the first case.
In the second case you should never use absolute paths. I see that you are even using two different paths for Windows and Linux. You don't have to do that, and you shouldn't.
In Python you have tools such as os.path.expanduser('~') to find the user's home directory, or packages like appdirs. You want to be as cross-platform as possible, and with Python that is almost always possible.

How to make python config file, in which relative paths are defined, but when scripts in other directories import config, paths are correct?

I have the following directory structure for a program I'm writing in python:
\code\
    main.py
    config.py
    \module_folder1\
        script1.1.py
    \data\
        data_file1
        data_file2
My config.py is a set of global variables that are set by the user, or generally fixed all the time. In particular config.py defines path variables to the 2 data files, something like path1 = os.path.abspath("../data/data_file1"). The primary use is to run main.py which imports config (and the other modules I wrote) and all is good.
But sometimes I need to run script1.1.py by itself. Ok, no problem. I can add to script1.1 the usual if __name__ == '__main__': and I can import config. But then I get path1 = "../code/data/data_file1" which doesn't exist. I thought that since the path is created in config.py the path would be relative to where config.py lives, but it's not.
So the question is, how can I have a central config file which defines relative paths, so I can import the config file to scripts in different directories and have the paths still be correct?
I should mention that the code repo will be shared among multiple machines, so hardcoding an absolute path is not an option.
You know the correct relative path to the file from the directory where config.py is located.
You know the correct relative path to the directory where config.py is located (in your case, ..).
Both of these things are system-independent and do not change unless you change the structure of your project. Just add them together using os.path.join('..', config.path_relative_to_config).
(Not sure who posted this as a comment, then deleted it, but it seems to work so I'm posting as an answer.) The trick is to use os.path.dirname(__file__) in the config file, which gives the directory of the config file (/code/) regardless of where the script that imports config is.
Specifically to answer the question, in the config file define
path1 = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'data', 'data_file1'))

Proper design of Python project

I'm planning to have project with following structure:
./runme.py
./my_modules/__init__.py
./my_modules/global_imports.py
./my_modules/user_defined_functions.py
The idea is to store important variables in global_imports.py, from where they will be imported into runme.py using from my_modules.global_imports import * (I know it is bad practice to import modules this way, but I promise there will be just a few variables with non-colliding names).
Four questions:
Two of the variables contained inside global_imports.py should be SCRIPT_PATH and SCRIPT_DIR. I've tried SCRIPT_PATH = os.path.realpath(__file__) and SCRIPT_DIR = os.path.dirname(SCRIPT_PATH), but they return the path (directory) of global_imports.py, not of runme.py. How can I get the path (directory) of runme.py?
Inside global_imports.py I will probably import modules such as os and sys. I also need to import those modules inside runme.py. Is this considered a problem, when modules are imported first from another module and later from the main script, or vice versa?
Is it possible to import variables from global_imports.py into user_defined_functions.py? I consider this bad practice; I'm just curious.
Is there better approach to separate project into modules?
Addressing your questions in order:
In the first variable, SCRIPT_PATH, you are getting the full path of the file global_imports.py, which would be something like this:
SCRIPT_PATH = '/home/../../my_project/my_modules/global_imports.py'
now in order to get the directory of runme.py, we should consider another variable:
SCRIPT_PATH_DIR = os.path.dirname(SCRIPT_PATH)
this will give us the path
SCRIPT_PATH_DIR = '/home/../../my_project/my_modules/'
now, to get its parent directory, which contains runme.py, we can do this:
SCRIPT_DIR = os.path.abspath(os.path.join(SCRIPT_PATH_DIR, os.pardir))
Now SCRIPT_DIR gives the directory containing runme.py, i.e.:
SCRIPT_DIR = '/home/../../my_project/'
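An alternative for finding the directory of whichever script was actually invoked, rather than deriving it from the module layout, is sys.argv[0] (a sketch; note this is unreliable when the program is started with -m or -c):

```python
import os
import sys

# Directory containing the script passed to the interpreter,
# e.g. the directory of runme.py when running `python runme.py`.
SCRIPT_DIR = os.path.dirname(os.path.abspath(sys.argv[0]))
```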
As per your project structure, runme.py should only contain an import of the main module and the command to run the app, so it shouldn't contain any other imports. Still, if you need to use a module, explicitly import it in runme.py; as the Zen of Python says, 'Explicit is better than implicit'.
Yes, it is possible, you can do it like this:
from .global_imports import variable_name
but in general you should have a separate config.py or settings.py file in the '/../my_project/' directory, containing all the settings and variables you may need anywhere in the project.
This approach is good enough as far as I've seen. Your main project directory contains runme.py and the modules used inside the project; my_modules is one such module, and you can have more of them inside the project directory. A better approach is to keep settings and configuration inside one module (such as my_modules) only, and to use the other modules for functionality.
Hope this helps, please comment if something is unclear.
