How to import a module statically within a dynamically loaded module - python

I am dynamically loading filters into my app. Now I want to be able to extend existing filters.
In my main.py I do:
import importlib.util

# 'filter' and 'file' hold the filter's module name and its file path
spec = importlib.util.spec_from_file_location(filter, file)
inst = importlib.util.module_from_spec(spec)
spec.loader.exec_module(inst)
I have a file called varianceFilter.py that loads and runs fine:
varianceFilter.py:
def run(images):
    # do something
    return result
Now I want to reuse and extend this filter like so:
testFilter.py:
import varianceFilter as vf

def run(images):
    ret = vf.run(images)
    # do something with ret
    return ret
However, as soon as I try to import varianceFilter.py, the exception
No module named 'varianceFilter'
is thrown. Both files are in the same directory.
What am I doing wrong?
EDIT:
My directory structure is:
main.py
filters/varianceFilter.py
filters/testFilter.py
After creating a copy of varianceFilter in the main directory, testFilter.py works fine.

Adding the empty file
filters/__init__.py
and changing the import statement in testFilter.py to
import filters.varianceFilter as vf
fixed the problem.
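For completeness, a minimal sketch of how the loading side in main.py can look after that change (a sketch under the layout above, not the asker's exact code):

import importlib.util
import os

def load_filter(name):
    # filters/ is now a package (it has __init__.py), and main.py's directory is on
    # sys.path when main.py is run, so the plain 'import filters.varianceFilter'
    # inside testFilter.py resolves while the filter itself is loaded from its file.
    file = os.path.join('filters', name + '.py')
    spec = importlib.util.spec_from_file_location(name, file)
    inst = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(inst)
    return inst

test_filter = load_filter('testFilter')
# result = test_filter.run(images)  # 'images' is whatever the app normally passes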

Related

Import pyarmor obfuscated code using importlib

Suppose I have 2 modules - one has been obfuscated by PyArmor. The other imports the obfuscated module and uses it:
# obfuscated.py
def run_task(conn):
    conn.send_msg("Here you go")
    print(conn.some_val + 55)
    return 0
# Non-obfuscated (user) code
import importlib.util

class conn:
    some_val = 5
    def send_msg(msg):
        print(msg)

def main():
    # import obfuscated  # This works...but I need to dynamically load it:
    # This does not:
    spec = importlib.util.spec_from_file_location("module.name", r'c:\Users\me\obfuscated.py')
    obfuscated = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(obfuscated)
    ret = obfuscated.run_task(conn)
    print("from main: ", ret)

if __name__ == "__main__":
    main()
If I import the obfuscated file with a plain import, it is fine. But I need to use importlib to dynamically import the obfuscated file, and that does not work - I get:
AttributeError: module 'module.name' has no attribute 'obfuscated'
The idea is that the user can write a script using the API available within obfuscated.py, but needs to load the module from wherever it resides on their system.
Is there any way to achieve this?
I think I have a method based on what I read here: https://pyarmor.readthedocs.io/en/latest/mode.html#restrict-mode
I use a proxy between the user code and the obfuscated code.
The user code may or may not be obfuscated.
The obfuscated code is obviously obfuscated!
The proxy must not be obfuscated (for simplicity, I obfuscated everything, then copied the original proxy.py over the obfuscated one).
So user code now imports proxy.py using importlib instead of obfuscated.py, and the proxy merely imports obfuscated.py:
# proxy.py
import obfuscated
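The loading side under this scheme might look as follows (a sketch; the path is illustrative, conn is the class from the question, and it assumes obfuscated.py sits in a directory that is already importable, e.g. next to the running script, so the proxy's plain import can find it):

import importlib.util

spec = importlib.util.spec_from_file_location("proxy", r'c:\Users\me\proxy.py')
proxy = importlib.util.module_from_spec(spec)
spec.loader.exec_module(proxy)

# the obfuscated API is reached through the proxy module
ret = proxy.obfuscated.run_task(conn)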
I managed to import modules dynamically in this way:
import importlib.util

module_name = 'obfuscated'  # the name the new module object will carry
path = r'c:\Users\me\obfuscated.py'  # raw string, so the backslashes are not treated as escapes

code = open(path, 'r').read()
spec = importlib.util.spec_from_loader(module_name, loader=None)
module = importlib.util.module_from_spec(spec)
module.__file__ = path
globals_dict = {"__file__": module.__file__}
exec(code, globals_dict)
# copy the executed module's top-level names onto the module object
for name in [x for x in globals_dict if not x.startswith("_")]:
    setattr(module, name, globals_dict[name])
It reads the code from a file, creates a module object, and executes the source into a dictionary. The module's top-level names end up in globals_dict, from which they are copied onto the module object.
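With the module object populated this way, the user code can call into it as if it had been imported normally, e.g. reusing run_task and the conn class from the question:

ret = module.run_task(conn)
print("from main: ", ret)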

Failed to import defined modules in __init__.py

My directory looks like this:
- HttpExample:
  - __init__.py
  - DBConnection.py
  - getLatlong.py
I want to import DBConnection and getLatlong in __init__.py. There is no error in my __init__.py until I run it; then I receive:
System.Private.CoreLib: Exception while executing function: Functions.HttpExample. System.Private.CoreLib: Result: Failure
Exception: ModuleNotFoundError: No module named 'getLatlong'
I'm trying to use a function in getLatlong on the information the user passes in, from __init__.py to getLatlong. Below is the code:
__init__.py:
import logging
import azure.functions as func
from getLatlong import result
from DBConnection import blobService, container_name, account_key, file_path

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    section = req.params.get('section')
    bound = req.params.get('bound')
    km_location = req.params.get('km_location')
    location = req.params.get('location')
    if not section:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            section = req_body.get('section')
    if section and bound and km_location:
        result(section, km_location, bound, location).getResult() # HERE
        return func.HttpResponse(f"Hello {section}, {bound}!")
    #elif section is None or bound is None or km_location is None:
    #    return func.HttpResponse("All values are mandatory!")
I am also getting a compile error in getLatlong when it imports DBConnection. The following values are passed to getLatlong.py. The code:
from DBConnection import blobService, container_name, account_key, file_path # Another import error here says: Unable to import DBConnection

class result:
    def __init__(self, section, bound, km_location, location):
        self.section = section
        self.bound = bound
        self.km_location = km_location
        self.location = location

    def getResult(self):
        print(self.section)
        print(self.bound)
        print(self.km_location)
        print(self.location)
I've tried every way to import these files before losing my mind...
You get these errors because Python does not know where to look for the files you want to import. Depending on which Python version you are using, I see three ways to solve this:
You could add HttpExample to your PYTHONPATH, and then your imports should work as you have them currently.
Another way would be to use the sys module and append the path to HttpExample, e.g.
import sys
sys.path.append('PATH/TO/HttpExample')
But you would have to do this in all files, where you want to import something from the parent folder.
Or you use relative imports, which have been available since Python 2.5 (see PEP 328). Those are only available inside packages, but since you have your __init__.py file, it should work. For relative imports you use dots . to tell Python where to look for the import. A single dot . tells Python to look for the desired import in the containing package. You could also use .. to go up one more level. But one level should be enough in your case.
So in your case, changing your code to this should solve your problem.
In __init__.py:
from .getLatlong import result
from .DBConnection import blobService, container_name, account_key, file_path
In getLatlong.py:
from .DBConnection import blobService, container_name, account_key, file_path
You could try from __app__.HttpExample import getLatlong.
There is a document about how to import a module from the shared code folder; check this doc: Folder structure.
It says shared code should be kept in a separate folder in __app__, and in my test this worked for me.
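For reference, a sketch of the layout that doc describes, assuming the two helper modules are moved into a shared folder (the shared_code name follows the doc's convention, not the question):

- __app__:
  - HttpExample:
    - __init__.py
  - shared_code:
    - DBConnection.py
    - getLatlong.py

and in HttpExample/__init__.py:

from __app__.shared_code.getLatlong import result
from __app__.shared_code.DBConnection import blobService, container_name, account_key, file_path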

Creating PyTest fixture parameters dynamically from another fixture

I have an AWS S3 directory containing several JSON files, which are used as test inputs.
I've created a PyTest module that downloads all the JSON files once, using a module-wide fixture, and then runs several test functions, each parameterized over the set of JSONs:
import pytest
import os
from tempfile import mkdtemp, TemporaryDirectory
from glob import glob

JSONS_AWS_PATH = 's3://some/path/'

def download_jsons():
    temp_dir = mkdtemp()
    aws_sync_dir(JSONS_AWS_PATH, temp_dir)
    json_filenames = glob(os.path.join(temp_dir, "*.json"))
    return json_filenames

@pytest.fixture(scope='module', params=download_jsons()) # <-- Invoking download_jsons() directly
def json_file(request):
    return request.param

def test_something(json_file):
    ... # Open the json file and test something

def test_something_else(json_file):
    ... # Open the json file and test something else

def test_another_thing(json_file):
    ... # you get the point...
This test module works in itself - the only pain point is how to clean up temp_dir at the end of the module/session.
Since download_jsons() is invoked directly, before the json_file fixture even starts, it has no context of its own, so I can't make it clean up temp_dir after all the tests are done.
I would like to make download_jsons() a module/session-scoped fixture in itself. Something like:
@pytest.fixture(scope='module')
def download_jsons():
    temp_dir = mkdtemp()
    # Download and glob, as above
    yield json_filenames
    shutil.rmtree(temp_dir)
or
@pytest.fixture(scope='module')
def download_jsons(tmpdir_factory):
    # ...
as @Gabriela Melo has suggested.
The question is how to make the json_file fixture parameterized over the list returned by download_jsons(), without invoking it directly?
I've tried implementing this solution with mark.parametrize, setup_module(), or pytest_generate_tests(metafunc), but wasn't able to get exactly the functionality I was looking for.
This seems to be what you're looking for: https://docs.pytest.org/en/latest/tmpdir.html#the-tmpdir-factory-fixture
(Using Pytest's tmpdir_factory fixture and setting the scope of your json_file function to session instead of module)
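A sketch of that suggestion, reusing aws_sync_dir and JSONS_AWS_PATH from the question (pytest manages the lifetime of tmpdir_factory directories, pruning old runs automatically, while the parametrization itself still needs the hook-based approach shown in the next answer):

import os
import pytest
from glob import glob

@pytest.fixture(scope='session')
def json_files(tmpdir_factory):
    temp_dir = str(tmpdir_factory.mktemp('jsons'))
    aws_sync_dir(JSONS_AWS_PATH, temp_dir)  # helper from the question
    return glob(os.path.join(temp_dir, '*.json'))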
If you want to use a resource for parametrization, it can't be returned by a fixture (at least with the current version of pytest). However, you can move the setup/teardown code out to the hooks - this will also enable parametrizing via pytest_generate_tests hook. Example: in the root dir of your project, create a file named conftest.py with the following contents:
from tempfile import TemporaryDirectory
from pathlib import Path

def pytest_sessionstart(session):
    # this is where we create the temp dir and download the files
    session.config._tempdir = TemporaryDirectory()
    d = session.config._tempdir.name
    aws_sync_dir(JSONS_AWS_PATH, d)  # helper and constant from the question
    session.config._json_files = Path(d).glob("*.json")

def pytest_sessionfinish(session):
    # this is where we delete the created temp dir
    session.config._tempdir.cleanup()

def pytest_generate_tests(metafunc):
    if "file" in metafunc.fixturenames:
        # parametrize the file argument with the downloaded files
        files = metafunc.config._json_files
        metafunc.parametrize("file", files)
You can now use the file argument in tests as usual, e.g.
def test_ends_with_json(file):
    assert file.suffix == ".json"

Set path of dynamically loaded module in Python 3.7

OK, so this will be a little tricky to explain well. I will try my best.
So consider my naive implementation of "load python module by filename" as follows:
import importlib.util

def load_module_by_filepath(module_name, module_filepath):
    module = None
    failure = None
    try:
        spec = importlib.util.spec_from_file_location(module_name, module_filepath)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
    except Exception as e:
        failure = f"Import of module '{module_name}' from file '{module_filepath}' had errors ({e})"
        module = None
    return module, failure
Let's say I have the following code in a file my_mod.py:
from my_other_mod import something

def somefunc():
    something()
Then I load it using my loader like this:
module, failure = load_module_by_filepath('my_mod', 'my_mod.py')
At this point I will be able to call somefunc from my loaded module like so:
my_callable = getattr(module, 'somefunc')
my_callable()
All good, except that if you look carefully, there is an import statement in my_mod.py for something from my_other_mod.
So what happens when I run this code is that the module fails to load with "No module named 'my_other_mod'".
I am guessing this is because my_other_mod was installed in a location that is not available to my newly loaded module. They don't belong to the same environment or context, if you will.
So my question is, how can I change my loader to add my_other_mod to the path of my_mod so that it will load and run?
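One way to do this (a sketch, assuming my_other_mod lives next to the loaded file or in some other known directory) is to put that directory on sys.path before executing the module:

import os
import sys
import importlib.util

def load_module_by_filepath(module_name, module_filepath):
    module, failure = None, None
    # make the loaded file's own directory importable, so that
    # 'from my_other_mod import something' inside it can resolve
    module_dir = os.path.dirname(os.path.abspath(module_filepath))
    if module_dir not in sys.path:
        sys.path.insert(0, module_dir)
    try:
        spec = importlib.util.spec_from_file_location(module_name, module_filepath)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
    except Exception as e:
        failure = f"Import of module '{module_name}' from file '{module_filepath}' had errors ({e})"
        module = None
    return module, failure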

Import all files in current directory

I have just started a python project. The directory structure is as follows:
/algorithms
----/__init__.py
----/linkedlist
--------/__init__.py
--------/file1.py
--------/file2.py
/tests
----/test_linkedlist
You can also check the Github repository.
In the __init__ file of each subfolder under algorithms, I am including the following for all the files, one by one:
from .file1 import *
from .file2 import *
And so on.
The task that I am trying to achieve is running all tests together with the command:
python3 -m unittest discover tests
Each file in the tests directory starts as follows:
from algorithms.linkedlist import *
import unittest
Right now if I want to add a new file to the linkedlist directory, I create the file and then add another from .filename import * in the __init__ file.
How do I write a script in the __init__ file so that each time I create a new file, I do not have to manually insert the import command?
So the __init__ is in the same folder? As the docs say, the import statement is syntactic sugar for the __import__ function.
So we can use:
import importlib
import glob

for file in glob.iglob('*.py'):
    importlib.__import__(file)
Some reasons why this does not work:
You want to load the functions in the module - the from module import * syntax. With this code you can only run file1.test.
You run the script from another directory, which confuses glob. We have to specify the actual working directory.
__import__ prefers to know the module name.
To find the solution, I combine the from module import * function from this answer with pkgutil.walk_packages from this blog.
import importlib
import pkgutil

def custom_import_all(module_name):
    """Use to dynamically execute from module_name import *"""
    # get a handle on the module
    mdl = importlib.import_module(module_name)
    # is there an __all__? if so, respect it
    if "__all__" in mdl.__dict__:
        names = mdl.__dict__["__all__"]
    else:
        # otherwise we import all names that don't begin with _
        names = [x for x in mdl.__dict__ if not x.startswith("_")]
    # now drag them in
    globals().update({k: getattr(mdl, k) for k in names})

__path__ = pkgutil.extend_path(__path__, __name__)
for importer, modname, ispkg in pkgutil.walk_packages(path=__path__, prefix=__name__ + '.'):
    custom_import_all(modname)
