Checking if all modules that will be required are available on PYTHONPATH

I have written a large system in Python that I now want to distribute to some colleagues. There are a few folders that need to be added to PYTHONPATH for all of my modules (files) to be found. I am looking for a way to give them sane error messages if they have not set up their PYTHONPATH correctly. Say the structure is:
ParentModule
    calls Child
        calls GrandChild
            calls MyModule
If they run ParentModule, it could be running for a long time before it ever ends up in GrandChild and needs MyModule, and if MyModule's directory is not on PYTHONPATH, it will crash complaining that it can't find MyModule. I was hoping to be able to do something like:
for file in (all files that could ever be reached from here):
    if any module needed by 'file' is not available:
        print "error: Please make sure your PYTHONPATH points to x, y, and z"
Is something like this possible?

At the top of your main module I would just try to import all of the modules your program depends on, and wrap the imports in a try/except that prints your sane error if any of them fail:
import sys
try:
    import Child
    import GrandChild
    import MyModule
except ImportError:
    print "Error: Please make sure your PYTHONPATH points to x, y, and z"
    sys.exit(1)
# current module contents

Depending on how you're "calling" Child, GrandChild, and MyModule, this should happen automatically.
If by "call" you mean import, and you're doing your imports at the type of the module, as is conventional, then all of the import chaining will happen automatically on the import of the parent module. So if a downstream import is unavailable, then you'll get an ImportError when you import the ParentModule. If you're "calling" the scripts, by say, executing them in a subprocess, then no I don't think there's an easy way to ensure the availability of modules, given the totally dynamic nature of what you're doing. Similarly if you're doing totally dynamic imports. This is one of the down sides to dynamic programming in general - there's often no rigorous way to ensure that things will be the way you intended them to be.
Edit:
You could definitely do something heuristic like F.J. suggests, though.
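For reference, here is a minimal sketch of such an up-front check using pkgutil.find_loader, which tests whether a module can be found without actually importing it (the module list is hypothetical, taken from the question):
import sys
import pkgutil

# hypothetical list of every module the program will eventually need
REQUIRED_MODULES = ['Child', 'GrandChild', 'MyModule']

# find_loader returns None when a module cannot be found on the path
missing = [name for name in REQUIRED_MODULES
           if pkgutil.find_loader(name) is None]
if missing:
    print "Error: cannot find %s; please make sure your PYTHONPATH points to x, y, and z" % ', '.join(missing)
    sys.exit(1)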

Related

Conventions of Importing Python Main Programs

Often I write command line utilities that are only meant to be run as main. For example, I might have a file that looks like this:
#!/usr/bin/env python
if __name__ == '__main__':
    import sys
    # do stuff
In other words, there is nothing going on that isn't under the if statement checking that this file is being run as main. I tried importing a file like this to see what would happen, and the import was successful.
So as I expected, one is allowed to import files like this, but what is the convention surrounding this practice? Is one supposed to throw an error telling the user that there is nothing to be imported? Or if all the contents of the file are supposed to be run as main, does one need to check if the program is being run as main? Or is the conditional not necessary?
Also, if I have import statements, should they be at the top of the file, or under the conditional? If the modules are only being used under the conditional, it would seem to me that they should be imported under the conditional and not at the top of the file.
If you are writing simple utilities that you are entirely certain you will never import as a module in another program, then you really do not need to include the if __name__ == '__main__' stuff. The fundamental point of that construct is to allow a module to be written that can both be imported for use elsewhere and run as a stand-alone program for some other purpose. For example, if you had a module with some test vectors that you wanted to run on it regularly, you would put the trigger mechanism for those test vectors in the if __name__ block.
Another example is a stand-alone program that you develop which also provides useful functions for others. The pip module is an excellent example of this technique.
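As an illustration, here is a minimal sketch of that test-vector pattern (the module and function names are hypothetical):
#!/usr/bin/env python
# mymath.py -- hypothetical module that is both importable and runnable
def add(a, b):
    """Return the sum of a and b."""
    return a + b

def _run_tests():
    # trivial test vectors, run only when the file is executed directly
    assert add(1, 2) == 3
    assert add(-1, 1) == 0
    print "all tests passed"

if __name__ == '__main__':
    _run_tests()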

Rename Refactoring in PyDev broken?

I am a bit surprised to find the Rename refactoring facility in PyDev broken. To reproduce my error:
create a new PyDev project,
create a new module within it (say util.py),
create a constant in the module, e.g. MYCONST = "some const value",
create a second script in the project, say scriptA.py, which imports util and uses the constant: print util.MYCONST
create a third script in the project, say scriptB.py, which also imports util and also uses the constant: print util.MYCONST
Renaming of the constant MYCONST should rename it in all three files now.
Things like go-to-declaration (Ctrl-left-mouse-click or F3) also work, so the connection between util.py and scriptA.py is known to PyDev.
But if you rename the constant (using Shift+Alt+R on the word MYCONST) in the file scriptA.py, it gets renamed in scriptA.py and in scriptB.py, but not in util.py (effectively breaking the code, of course). If you try renaming it in util.py, it gets renamed only within that file and neither in scriptA.py nor in scriptB.py.
Questions:
Can other people recreate my problem?
Is there a configuration issue causing the problem so that I can remove the effect myself?
Is this a known bug (I didn't find anything concerning it), maybe even with a fix or workaround?
Is this only present in my product versions?
I'm using Eclipse "Luna Service Release 2 (4.4.2)" and PyDev 3.9.2.201502050007.
EDIT:
(removed; the bug is not connected to whether the module lives in a package, as it at first appeared to be).
EDIT2:
I just found out that the problem only appears if I import the module name and then use qualified names to access the constant:
import util
print util.MYCONST
But if I import the name directly:
from util import MYCONST
print MYCONST
then I cannot reproduce the error.
Though this seems like a workaround (and it might be!), I'd like to be able to use qualified names, at least sometimes. So the main question remains open.

importing modules/scripts in python

So at work I'm developing a way for us to push out tools/plugins to the team as a whole. I actually have the system up and running, and it's completely dynamic except for the topic I'm about to talk about (this is all done in Python). On start-up, Maya checks the local folder against a folder on the server, sees whether they differ, and handles copying down the files/dirs that are needed as well as deleting old plugins that we delete on the server. The system is flexible enough that users can create custom shelves of all the plugins, and we can reorganize the folders in the back end without breaking the shelves of all the users. The plugins are accessed through a drop-down menu in Maya's main interface, and we can add sub-folders and plugins into the system freely without touching the code. We can also arrange the order in which menu items and plugins are displayed through a simple numbering system of the folders.
This is all working fine until I get to making plugins that import a module from their own folder dynamic as well. When I start moving the plugins folder around the root directory, if I have an imported module that I created the path for, the imported module's path in the plugin script becomes wrong at that point. I already have a way of getting the proper path info to the plugin through my menu setup. I'm having issues with the import of the module and accessing classes within that module.
So if the standard way of importing a module's class is
from fileName import className
then the __import__ way that I'm using looks like
className = __import__("folderN.folderN.folderN.fileName", {}, {}, ["className"])
But with that method I lose the ability to just call on that class name like I can with the regular from ... import method. I got around that by doing
className = className.className
but this is a rather ugly method and I'd prefer to be able to just import and call on the name without that extra step. I do not know this import method very well and I know I'm missing some things with it.
Am I just going about this import process the wrong way? Is there a way to make it look in the plugin's local directory without appending to Maya's paths, so that I can import the regular way without a weird path that has to change any time I move the plugin?
__import__ doesn't work the way you are assuming. It returns a module object for the import path provided, with a guarantee that all of the children you specify in the fromlist have been explicitly imported. For class names in the fromlist, it doesn't really make a difference.
mod = __import__('a.b', {}, {}, ['c', 'd'])
Is more accurately the equivalent of
import a.b
try:
    import a.b.c
except ImportError:
    pass
try:
    import a.b.d
except ImportError:
    pass
mod = a.b
What you actually probably want here is something like
child = getattr(__import__(import_path, {}, {}, [child_name]), child_name)
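On Python 2.7 or later, a slightly more readable equivalent is importlib.import_module; this is just a sketch reusing the hypothetical names from the question:
import importlib

# hypothetical dotted path and class name from the question above
import_path = "folderN.folderN.folderN.fileName"
child_name = "className"

# import_module returns the leaf module itself, not the top-level package
mod = importlib.import_module(import_path)
cls = getattr(mod, child_name)
instance = cls()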
As to your versioning and distribution system, have you looked at using an SCM/VCS like SVN or Git, and/or a package management system? These give you much better change tracking and synchronization than a straight file share, and you could easily integrate them with a sync/install script on your client machines that could be customized as needed for each client's specific configuration demands.

Python: Create virtual import path

Is there any way to create a virtual import path in Python?
My directory structure is like this:
/
    native/
        scripts/
            some.py
            another.py
    [other unrelated dirs]
The root is the directory from which the program is executed. At the moment I add native/scripts/ to the search path so I can do import some, another instead of from native.scripts import some, another, but I'd like to be able to do it like this:
from native import some
import native.another
Is there any way to achieve this?
Related questions:
Making a virtual package available via sys.modules
Why not move some.py and another.py out into the native directory so that everything Just Works and so that people returning to the source code later won't be confused about why things are and aren't importable? :)
Update:
Thanks for your comments; they have usefully clarified the problem! In a case like yours, I generally put the functions and classes that I might want to import inside, say, native.some where I can easily get to them. Then I take the script code, and only the script code (the thin shim that interprets arguments and starts everything running by passing them to a main() or go() function), and put that inside a scripts directory. That keeps external-interface code cleanly separate from code that you might want to import, and means you don't have to try to fool Python into having modules in several places at once.
In /native/__init__.py, include:
from scripts import some, another
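Note that from scripts import some, another relies on Python 2's implicit relative imports; under Python 3 (or with from __future__ import absolute_import) the equivalent would be an explicit relative import:
# /native/__init__.py, explicit relative form required under Python 3
from .scripts import some, another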

python import hooks: accessing the full dotted module name before processing

I understand that the importer protocol operates at the level of individual imports. However, in my case it would be really helpful if I could know the full "dotted module name" BEFORE experiencing it one path element at a time (that is, "a priori", if I'm allowed to say that) from my import hook.
class myimporter(object):
    def find_module(self, fullname, path=None):
        if fullname == 'rootnamespace':
            # at this moment I need to know the full dotted module name
            # that I am going to process (e.g. rootnamespace.foo.bar.baz)
            return self
        return None

    def load_module(self, fullname):
        pass
Is there a way (hopefully safe and sound) to know beforehand the full "dotted module name" from within an import hook?
More info, probably not really needed (so you can skip straight to answering me ;) ):
In case the reader wonders why I would need that, or in case a little more context helps: I want to import Perl modules in Python. How strange does that sound? :) In reality, the import process will just make a Python stub for a Perl module available. The stub is generated by introspecting the Perl package to see what it has to offer; I won't go into the implementation details here:
import perlmod.www.mechanize as wwwmech
mymech = wwwmech()  # returns a Python proxy object to be used with later calls
mymech.get(uri)  # returns another proxy object for the newly generated HTTP::Response Perl object
# etc. etc.
My problem lies at
import perlmod.www.mechanize
because I need to know beforehand whether the requested module really exists in the system's Perl installation, without being forced to wait for the full module name before deciding whether to fail. Moreover, if the user does:
import perlmod.www
I have no chance to get back and say "wait wait wait, this is wrong" after loading a www stub package. My hook won't be called after that.
Basically, when my import hook is called for "www" I must already know whether the import will fail or succeed for the full dotted module name. I will stop here with the details; please don't forget the real question :)
Thank you very much, even only for reading this.
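For illustration, a minimal sketch of the behavior described above: a PEP 302 meta-path hook (Python 2 style, matching the snippet in the question) is consulted once per dotted segment, so it never sees the full target name up front. The class name here is hypothetical.
import sys

class TracingImporter(object):
    # hypothetical finder that just logs what the import machinery asks for
    def find_module(self, fullname, path=None):
        # for 'import perlmod.www.mechanize' this gets called separately
        # with 'perlmod', then 'perlmod.www', then 'perlmod.www.mechanize'
        print "find_module called with:", fullname
        return None  # decline, so the normal import machinery continues

sys.meta_path.insert(0, TracingImporter())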
