Is there any sort of Python code that will display a Python file's imports and the locations they are loaded from?
E.g. TestA.py imports 3 modules from 3 different directories:
Import01 : /u/ext/TestA/UI
Import02 : /u/ext/TestA/src
Import03 : /user_data/setup/localImports
And hence, while executing the code, it would display the list of directories used in the Python file.
I am asking because I am working on several (and maybe, in the future, tons of) scripts that are heavily involved in Maya. There are times when I locate a path, but it is the wrong one: another file with the same name is actually being loaded from a different path.
Add this code to the module:
import inspect

frame = inspect.currentframe()
if frame and frame.f_back:
    print('module "{}" is imported by "{}"'.format(__file__, frame.f_back.f_locals['__file__']))
If module_a.py contains the code above, and main.py imports it, the output is:
module "/path/to/module_a.py" is imported by "/path/to/main.py"
As documented, this may not be an exact solution, because where stack frame inspection is not supported, currentframe() returns None:
CPython implementation detail: This function relies on Python stack frame support in the interpreter, which isn’t guaranteed to exist in all implementations of Python. If running in an implementation without Python stack frame support this function returns None.
At any point while the code is running, you can determine the origin of a module by checking its __file__ attribute:
import sys

for name, each_mod in sys.modules.items():
    try:
        print(name, each_mod.__file__)
    except AttributeError:  # built-in module or DLL
        print(name, '?')
To check the imports without running the code, you'd need to do more complex analysis. Here's an example tool that could probably be adapted to figure it out: http://www.tarind.com/depgraph.html
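For a quick version of that static analysis, the standard library's modulefinder can list a script's imports without executing it (a rough sketch; 'TestA.py' stands in for the script you want to inspect):

```python
# Sketch: list the modules a script imports, without executing it,
# using the standard library's modulefinder (the linked depgraph tool
# builds on the same idea). 'TestA.py' is a placeholder filename.
import os
from modulefinder import ModuleFinder

script = 'TestA.py'
if os.path.exists(script):
    finder = ModuleFinder()
    finder.run_script(script)       # scans the import statements; does not run the script
    for name, mod in finder.modules.items():
        if mod.__file__:            # built-in modules have no file
            print(name, '->', mod.__file__)
```

Note that modulefinder follows the whole import graph, so the output can be long; filtering on the directories you care about helps.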
You could also create a custom module finder that prints out file sources as imports are processed. Something like this, which prints out the names of py/pyc files when trying to load them (Python 2 only: imp and ihooks are gone in Python 3):
import os
import sys
import imp
import ihooks

class ReportingFinder(object):
    """Print the file paths that are tried as modules are imported."""
    def __init__(self, path_entry):
        self.path_entry = path_entry
        if not os.path.isdir(path_entry):
            raise ImportError

    def find_module(self, fullname, path=None):
        for suffix in (".py", ".pyc"):
            test_path = os.path.join(self.path_entry, fullname + suffix)
            print test_path
            if os.path.exists(test_path):
                print "attempting to load from %s" % test_path
                return self
        return None

    def load_module(self, name):
        stuff = imp.find_module(name)
        return ihooks.FancyModuleLoader(verbose=1).load_module(name, stuff)

sys.path_hooks.insert(0, ReportingFinder)
HACK WARNING!!!! Please be aware this code is a quick diagnostic hack! Don't use it for production :) Among other flaws, it will print out py path names even if the code comes from the pyc, and it's dumb about packages -- I only provided it because it sounds like you're using single-file scripts rather than packages. It is handy for catching imported modules as they get loaded. It won't print out the names of zip files.
It sounds like the real problem is having too many competing paths: you should try to get down to as few as you can so that there are fewer surprises.
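One way to spot those surprises is to ask the import system which file a module name would resolve to, without importing it (a sketch using importlib.util.find_spec; 'json' is just a stand-in for the module you suspect is being shadowed):

```python
# Sketch: show the file a module name resolves to on the current
# sys.path, without importing it.
import importlib.util

spec = importlib.util.find_spec('json')
if spec is not None:
    print(spec.origin)   # the file the import would actually load
else:
    print('module not found on sys.path')
```

If the printed path is not the one you expected, a same-named module earlier on sys.path is shadowing yours.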
I'm developing a Python application that controls hardware. The hardware consists of FPGA-based systems. Their internal structure from the control point of view is managed with an automated system AGWB ( http://github.com/wzab/agwb.git ).
The description of each system is compiled into a Python module "agwb" containing a few submodules.
After the module is loaded, it is used to produce an object handling further communication with the associated system.
When my application works with a few different systems, it has to load a few different "agwb" modules (of course, each one from a different directory).
Loading of the module from a specified directory can be easily achieved by adding that directory at the beginning of the sys.path (and removing it, after the module is loaded).
Unfortunately, Python imports the module "agwb" only once; later attempts to import it result in reusing the previously loaded one. Using importlib.reload also does not help.
After some reading of https://docs.python.org/3/reference/import.html, I found the following solution:
import sys
import importlib

def load_agwb(lddir, top):
    sys.path.insert(0, lddir)
    importlib.import_module('agwb')
    # Get the class for the top entity
    res = getattr(sys.modules['agwb'], top)()
    sys.path.remove(lddir)
    sys.modules.pop('agwb')
    mods_to_remove = []
    for mod in sys.modules.keys():
        if mod.startswith('agwb.'):
            mods_to_remove.append(mod)
    for mod in mods_to_remove:
        sys.modules.pop(mod)
    return res
at1 = load_agwb('./t1','top_v0')
at2 = load_agwb('./t2','top_v0')
print("Call at1.show")
at1.show()
print("Call at2.show")
at2.show()
The directories "./t1", and "./t2" contain two different versions of "agwb" module.
After executing the script we get the following output proving that indeed two different versions of "agwb" have been loaded and are used simultaneously:
Call at1.show
object from t1, desc= That's version from directory t1
type ID= 1111
Call at2.show
object from t2, desc= That's version from directory t2
type ID= 2222
The whole minimal demonstrator is available (as a shar archive) at https://pastebin.com/ZGZj8qV1 (unfortunately, I can't attach a file to my question).
And here come my questions:
Is the above-described method a reliable, legitimate solution in Python?
Maybe it works just by chance and may stop working in the next version of Python?
Are there other, more "official" methods of solving my problem?
I have created an answer based on pavel's comment (thanks a lot!).
I've used the temporary directory and symlinks to solve the problem of long, and complex paths. Below is the result:
import sys
import os
import importlib
import tempfile
# Directory with description of the first board
dir1 = './xt1/very/long/path'
# Directory with description of the second board
dir2 = './xt2/completely/different/path'
# Name of the top entity in the design
top = 'top_v0'
# Create a temporary directory
tmpdir = tempfile.TemporaryDirectory()
# Add it to the Python path
sys.path.insert(0,tmpdir.name)
# Create symlinks to the directories with descriptions of the boards
os.symlink(os.path.abspath(dir1),tmpdir.name+"/t1")
os.symlink(os.path.abspath(dir2),tmpdir.name+"/t2")
# Import description of the first board as agwb
import t1.agwb as agwb
# use the imported agwb
at1 = getattr(agwb,top)()
# if the name of the top entity is constant, you can do it easier
bt1 = agwb.top_v0()
# Import description of the second board as agwb
import t2.agwb as agwb
# use the imported agwb
at2 = getattr(agwb,top)()
# if the name of the top entity is constant, you can do it easier
bt2 = agwb.top_v0()
# Now we can remove the temporary directory from the python path
sys.path.remove(tmpdir.name)
# We can even delete it (otherwise it will be deleted when Python session ends)
tmpdir.cleanup()
# Check if both versions are loaded and available
print("Call at1.show")
at1.show()
print("Call at2.show")
at2.show()
print("Call bt1.show")
bt1.show()
print("Call bt2.show")
bt2.show()
As previously, the whole minimal reproducer is available in the shar format at https://pastebin.com/GrRUh8U9 .
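For completeness: when the same-named things to load are single-file modules rather than packages, the sys.path and symlink juggling can be avoided entirely by registering each load under a unique name (a sketch; the file paths in the usage comment are hypothetical):

```python
# Sketch: load two same-named single-file modules side by side by
# giving each spec a unique name, so they never collide in
# sys.modules. A package with submodules (like agwb) is harder,
# because its submodule imports still go through sys.modules.
import importlib.util

def load_isolated(name, path):
    spec = importlib.util.spec_from_file_location(name, path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod

# Hypothetical usage:
# m1 = load_isolated('agwb_t1', './t1/agwb.py')
# m2 = load_isolated('agwb_t2', './t2/agwb.py')
```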
I am currently writing automation scripts for a proprietary Windows desktop application at work using WinAppDriver with Python. Our application has the user upload a handful of files, does some behind the scenes calculating based on the files uploaded and then spits out results. I have automation in place that uploads these files using the UI and am not having any issues with this specifically. The process to do this is as follows:
Click the ‘Choose File’ button. Browse to file location in pop up window
Click in ‘File Name’ field and input the direct path to the file. Click OK (This is being done with the Python Keyboard library)
Repeat previous steps for all necessary files
Click ‘Go’
To tidy up my scripts, I have set the file paths to variables instead of using their direct paths in my code. Then I just call the variable name for the file I need.
E.g. file_to_upload_1: str = r"C:\Users\user\...\filename.txt"
I have created a separate filePaths.py where all these file paths set to variables are stored so adding/modifying them in the future is easy and all in one place.
The issue that I am running into with all of this is when I import the .py that contains my file paths set to variables. Right now, I am doing from filePaths import * for simplicity's sake. This is generally frowned upon, and VS Code throws warnings at me advising I have unused imports. I went ahead and set my variables in separate classes and then tried to import them in the following way: from filePaths import dataset_1. When I do this I get the following error: Undefined variable "variable_name", and my tests fail to run. It seems like I can only get this all to work if I import everything, and I would like to avoid doing that if possible. All my scripts are in the same directory. What am I missing here?
Sample of code:
from filePaths import * <-- THIS WORKS!
# from filePaths import class_1 <-- THIS DOES NOT
#Open App
desired_caps = {}
desired_caps["app"] = "C:\\Users\\Public\\Desktop\\Application_Being_Tested.lnk"
driver = webdriver.Remote("http://127.0.0.1:4723", desired_caps)
#Login
driver.find_element_by_accessibility_id("Username").send_keys("tester")
driver.find_element_by_accessibility_id("UserPassword").send_keys("password")
driver.find_element_by_accessibility_id("btnLogin").click()
###Upload Files###
#First File To Upload
driver.find_element_by_accessibility_id("ChooseFile").click()
time.sleep(.1)
driver.find_element_by_accessibility_id("FileName").click()
keyboard.write(filePaths_variable)
keyboard.press_and_release('enter')
You have three options:
Import everything using the wildcard (i.e. from filePaths import *)
Import select objects (i.e. from filePaths import object1, object2, object3 #...)
Use dot notation (i.e. import filePaths then filePaths.object1 #etc)
Some options may be considered better programming style than others.
The reason the wildcard works is that it is equivalent to option 2 above with every object created in filePaths listed on your import statement. In general, you should either selectively import only the methods and objects you need, or just import the script and use dot notation to selectively use methods and objects as needed.
The following example code shows how to use dot notation.
file 1:
# objects_to_import.py
bob = 127
string = 'my string'
def foo():
    print('bar')

def bar():
    print('foo')

def print_var(var):
    print(var)
file 2:
# main.py in the same directory as objects_to_import.py
import objects_to_import
print(objects_to_import.bob)
objects_to_import.print_var(objects_to_import.bob)
objects_to_import.foo()
objects_to_import.bar()
try:
    print(string)
except NameError:
    print("You didn't import that variable or use correct notation!")
Then, running main.py outputs:
"""
127
127
bar
foo
You didn't import that variable or use correct notation!
"""
The results are identical if main.py instead read:
from objects_to_import import bob, foo, bar, print_var
print(bob)
print_var(bob)
foo()
bar()
try:
    print(string)
except NameError:
    print("You didn't import that variable or use correct notation!")
Note that if we add the following code to both versions of main.py:
if 'bob' in globals():
    print('Bob is in your globals!')
else:
    print("Can't find bob in your globals")
We find that bob is in your globals when explicitly imported, but is not present when using dot notation with the general non-explicit import statement. There might therefore be pragmatic reasons to choose one import method over the other (e.g. if your program is long and complex and you would like to more easily manage potential name collisions, you should use dot notation).
Alright I've come up with a solution!
I have my filePaths.py module with class_1 containing a set of variables: var_1, var_2, etc.
In my script that wants these variables, I'm bringing the module in like so:
import filePaths
path = filePaths.class_1
When I call one of the variables in class_1 instead of just var_1 I call path.var_1 and it comes in with no issues. Thank you everyone for helping out with this!
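For reference, a minimal sketch of the arrangement described above (all names and paths are illustrative, not the actual files):

```python
# filePaths.py -- related paths grouped as class attributes
# (illustrative names: class_1, var_1, var_2 follow the description above)
class class_1:
    var_1 = r"C:\Users\user\files\filename.txt"
    var_2 = r"C:\Users\user\files\filename2.txt"

# consuming script:
# import filePaths
# path = filePaths.class_1
# keyboard.write(path.var_1)
```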
I am trying to upgrade a 10 year old event listener that I didn't write from Python 2.7 to python 3.7. The basic issue I'm running into is the way the original script was importing its plugins. The idea behind the original script was that any python file put into a "plugins" folder, with a "registerCallbacks" function inside it would auto-load itself into the event listener and run. It's been working great for lots of studios for years, but Python 3.7 is not liking it at all.
The folder structure for the original code is as follows:
EventListenerPackage
src
event_listener.py
plugins
plugin_1.py
plugin_2.py
From this, you can see that both the event listener and the plugins are held in folders that are parallel to each other, not nested.
The original code read like this:
# Python 2.7 implementation
import imp

class Plugin(object):
    def __init__(self, path):
        self._path = 'c:/full/path/to/EventListenerPackage/plugins/plugin_1.py'
        self._pluginName = 'plugin_1'

    def load(self):
        try:
            plugin = imp.load_source(self._pluginName, self._path)
        except:
            self._active = False
            self.logger.error('Could not load the plugin at %s.\n\n%s', self._path, traceback.format_exc())
            return
        regFunc = getattr(plugin, 'registerCallbacks', None)
Due to the nature of the changes (as I understand them) in the way that Python 3 imports modules, none of the other message boards seem to be getting me to the answer.
I have tried several different approaches, the best so far being:
How to import a module given the full path?
I've tried several different methods, including adding the full path to the sys.path, but I always get "ModuleNotFoundError".
Here is roughly where I'm at now.
import importlib.util
import importlib.abc
import importlib

class Plugin(object):
    def __init__(self, path):
        self._path = 'c:/full/path/to/EventListenerPackage/plugins/plugin_1.py'
        self._pluginName = 'plugin_1'

    def load(self):
        try:
            spec = importlib.util.spec_from_file_location('plugins.%s' % self._pluginName, self._path)
            plugin = importlib.util.module_from_spec(spec)
            # OR I HAVE ALSO TRIED
            plugin = importlib.import_module(self._path)
        except:
            self._active = False
            self.logger.error('Could not load the plugin at %s.\n\n%s', self._path, traceback.format_exc())
            return
        regFunc = getattr(plugin, 'registerCallbacks', None)
Does anyone have any insights into how I can actually import these modules with the given folder structure?
Thanks in advance.
You're treating plugins like it's a package. It's not. It's just a folder you happen to have your plugin source code in.
You need to stop putting plugins. in front of the module name argument in spec_from_file_location:
spec = importlib.util.spec_from_file_location(self._pluginName, self._path)
Aside from that, you're also missing the part that actually executes the module's code:
spec.loader.exec_module(plugin)
Depending on how you want your plugin system to interact with regular modules, you could alternatively just stick the plugin directory onto the import path:
sys.path.append(plugin_directory)
and then import your plugins with import or importlib.import_module. Probably importlib.import_module, since it sounds like the plugin loader won't know plugin names in advance:
plugin = importlib.import_module(plugin_name)
If you do this, plugins will be treated as ordinary modules, with consequences like not being able to safely pick a plugin name that collides with an installed module.
As an entirely separate issue, it's pretty weird that your Plugin class completely ignores its path argument.
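Putting those pieces together, a minimal working load for Python 3 might look like this (a sketch; the logging, error handling, and class plumbing from the question are omitted, and the usage paths are hypothetical):

```python
# Sketch: a Python 3 replacement for imp.load_source, combining
# spec_from_file_location with module_from_spec and exec_module.
import importlib.util

def load_plugin(name, path):
    spec = importlib.util.spec_from_file_location(name, path)
    plugin = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(plugin)   # actually runs the plugin's code
    return plugin

# Hypothetical usage:
# plugin = load_plugin('plugin_1', 'c:/full/path/to/EventListenerPackage/plugins/plugin_1.py')
# regFunc = getattr(plugin, 'registerCallbacks', None)
```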
When writing throwaway scripts it's often needed to load a configuration file, image, or some such thing from the same directory as the script. Preferably this should continue to work correctly regardless of the directory the script is executed from, so we may not want to simply rely on the current working directory.
Something like this works fine if defined within the same file you're using it from:
from os.path import abspath, dirname, join

def prepend_script_directory(s):
    here = dirname(abspath(__file__))
    return join(here, s)
It's not desirable to copy-paste or rewrite this same function into every module, but there's a problem: if you move it into a separate library, and import as a function, __file__ is now referencing some other module and the results are incorrect.
We could perhaps use this instead, but it seems like sys.argv may not be reliable either:
def prepend_script_directory(s):
    here = dirname(abspath(sys.argv[0]))
    return join(here, s)
How to write prepend_script_directory robustly and correctly?
I would personally just os.chdir into the script's directory whenever I execute it. It is just:
import os
os.chdir(os.path.split(__file__)[0])
However, if you did want to refactor this thing into a library, you are in essence asking for a function that is aware of its caller's state. You would thus have to call it as:
prepend_script_directory(__file__, blah)
If you just wanted to write
prepend_script_directory(blah)
you'd have to resort to CPython-specific tricks with stack frames:
import inspect

def getCallerModule():
    # gets the globals of the calling module and prints its __file__
    print(inspect.currentframe().f_back.f_globals['__file__'])
I think the reason it doesn't smell right is that $PYTHONPATH (or sys.path) is the proper general mechanism to use.
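For what it's worth, the stack-frame trick above can be turned into a working, CPython-only version of prepend_script_directory (a sketch):

```python
# Sketch: resolve paths relative to the *caller's* file by peeking
# one frame up the stack. CPython-specific, per the caveat above.
import inspect
from os.path import abspath, dirname, join

def prepend_script_directory(s):
    caller = inspect.currentframe().f_back
    here = dirname(abspath(caller.f_globals['__file__']))
    return join(here, s)
```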
You want pkg_resources
import pkg_resources
foo_fname = pkg_resources.resource_filename(__name__, "foo.txt")
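On Python 3.9+, the standard library's importlib.resources offers the same lookup without depending on setuptools (a sketch; 'mypkg' and 'foo.txt' are placeholders for your own package and data file):

```python
# Sketch: locate a data file shipped inside a package using the
# importlib.resources files() API (Python 3.9+).
from importlib import resources

def data_path(package, name):
    # returns a Traversable pointing at `name` inside `package`
    return resources.files(package) / name

# Hypothetical usage:
# foo_text = data_path('mypkg', 'foo.txt').read_text()
```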
I'm writing a python application in which I want to make use of dynamic, one-time-runnable plugins.
By this I mean that at various times during the running of this application, it looks for python source files with special names in specific locations. If any such source file is found, I want my application to load it, run a pre-named function within it (if such a function exists), and then forget about that source file.
Later during the running of the application, that file might have changed, and I want my python application to reload it afresh, execute its method, and then forget about it, like before.
The standard import system keeps the module resident after the initial load, and this means that subsequent "import" or "__import__" calls won't reload the same module after its initial import. Therefore, any changes to the python code within this source file are ignored during its second through n-th imports.
In order for such packages to be loaded uniquely each time, I came up with the following procedure. It works, but it seems kind of "hacky" to me. Are there any more elegant or preferred ways of doing this? (note that the following is an over-simplified, illustrative example)
import sys
import imp

# The following module name can be anything, as long as it doesn't
# change throughout the life of the application ...
modname = '__whatever__'

def myimport(path):
    '''Dynamically load python code from "path"'''
    # get rid of previous instance, if it exists
    try:
        del sys.modules[modname]
    except:
        pass
    # load the module
    try:
        return imp.load_source(modname, path)
    except Exception, e:
        print 'exception: {}'.format(e)
        return None

mymod = myimport('/path/to/plugin.py')
if mymod is not None:
    # call the plugin function:
    try:
        mymod.func()
    except:
        print 'func() not defined in plugin: {}'.format(path)
Addendum: one problem with this is that func() runs within a separate module context, and it has no access to any functions or variables within the caller's space. I therefore have to do inelegant things like the following if I want func_one(), func_two() and abc to be accessible within the invocation of func():
def func_one():
    pass  # whatever

def func_two():
    pass  # whatever

abc = '123'

# Load the module as shown above, but before invoking mymod.func(),
# the following has to be done ...
mymod.func_one = func_one
mymod.func_two = func_two
mymod.abc = abc

# This is a PITA, and I'm hoping there's a better way to do all of
# this.
Thank you very much.
I use the following code to do this sort of thing.
Note that I don't actually import the code as a module, but instead execute the code in a particular context. This lets me define a bunch of api functions automatically available to the plugins without users having to import anything.
def load_plugin(filename, context):
    source = open(filename).read()
    code = compile(source, filename, 'exec')
    exec(code, context)
    return context['func']

context = {'func_one': func_one, 'func_two': func_two, 'abc': abc}
func = load_plugin(filename, context)
func()
This method works in python 2.6+ and python 3.3+
The approach you use is totally fine. As for this concern:
one problem with this is that func() runs within a separate module context, and it has no access to any functions or variables within the caller's space.
It may be better to use execfile function:
# main.py
def func1():
    print('func1 called')

# similar to import, except everything is executed in the current module:
exec(open('trackableClass.py', 'r').read(), globals())
#execfile('/path/to/plugin.py', globals())  # Python 2 version
func()
Test it:
#/path/to/plugin.py
def func():
    func1()
Result:
python main.py
# func1 called
One potential problem with this approach is namespace pollution: because every file is run in the current namespace, it increases the chance of name conflicts.
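One way to limit that pollution is to run each plugin in its own namespace dict, exposing only the names you choose, instead of passing globals() (a sketch building on the exec approach above; the inline plugin_source string stands in for a plugin file's contents):

```python
# Sketch: execute plugin source in a private dict so nothing leaks
# into (or reads from) the caller's globals except what we pass in.
def func1():
    print('func1 called')

# Stands in for the contents of /path/to/plugin.py:
plugin_source = "def func():\n    func1()\n"

namespace = {'func1': func1}   # only what the plugin may see
exec(compile(plugin_source, 'plugin.py', 'exec'), namespace)
namespace['func']()            # prints: func1 called
```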