Rename Refactoring in PyDev broken? - python

I am a bit surprised to find the Rename refactoring facility in PyDev broken. To see my error
create a new PyDev project,
create a new module within it (say util.py),
create a constant in the module, e.g. MYCONST = "some const value",
create a second script in the project, say scriptA.py which
imports util and
uses the constant: print util.MYCONST
create a third script in the project, say scriptB.py which
also imports util and
also uses the constant: print util.MYCONST
Renaming of the constant MYCONST should rename it in all three files now.
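Concretely, the three files look like this:
# util.py
MYCONST = "some const value"

# scriptA.py
import util
print util.MYCONST

# scriptB.py
import util
print util.MYCONST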
Things like go-to-declaration (Ctrl-left-mouse-click or F3) also work, so the connection between util.py and scriptA.py is known to PyDev.
But if you rename the constant (using Shift+Alt+R on the word MYCONST) in the file scriptA.py, it gets renamed in scriptA.py and in scriptB.py, but not in util.py (effectively breaking the code, of course). If you try renaming it in util.py, it gets renamed only within that file and neither in scriptA.py nor in scriptB.py.
Questions:
Can other people recreate my problem?
Is there a configuration issue causing the problem so that I can remove the effect myself?
Is this a known bug (I didn't find anything concerning it), maybe even with a fix or workaround?
Is this only present in my product versions?
I'm using Eclipse "Luna Service Release 2 (4.4.2)" and PyDev 3.9.2.201502050007.
EDIT:
(removed; the bug is not connected to whether the module lives in a package or not, as it at first appeared to be).
EDIT2:
I just found out that the problem only appears if I import the module name and then use qualified names to access the constant:
import util
print util.MYCONST
But if I import the name directly:
from util import MYCONST
print MYCONST
then I cannot reproduce the error.
Though this seems like a workaround (and it might be!), I'd like to be able to use qualified names, at least sometimes. So the main question remains open.

Related

pycharm project: source root and imports not updated?

I have a python project (in Pycharm), and I have, let's say, 2 folders in it. One is called data and the other is algorithms. The data folder has a python file where I import some data from an excel sheet. And another file where I have defined some constants.
The algorithm folder has, let's say, one python file where I import the constants and the data from the data folder. I use:
from data.constants import const
from data.struct import table
When I run the algorithms (that are in the algorithm folder), things work perfectly. But when I change a constant in the constant file or the data in the excel sheet, nothing much changes. In other words, the constants are not updated when imported again and the same for the excel data imported. The old values of the constants and the table are used.
I tried to mark both folders as source root, but the problem persists.
What I do now, is close pycharm and reopen it again, but if there is a better way to handle this rather than closing and losing vars in the python console, I would be grateful to know about it!
I am not sure if I understand correctly, but try the following. Once you change the constants in the constants file, try to import it again, i.e. do:
from data.constants import const
After this, do you still see that the constants are not changed?
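Note that simply repeating the import is usually a no-op: Python caches imported modules in sys.modules, so the file is not re-executed. A minimal sketch of forcing a re-read (assuming the module is data.constants, as in the question):
import importlib
import data.constants

importlib.reload(data.constants)     # re-executes the module file
from data.constants import const     # re-bind the refreshed value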
Please try this:
from constants.constant import v
print('v =', v)
del v
The problem can be connected with caching. Here is a similar problem to yours, but for Spyder:
Spyder doesn't detect changes in imported python files
Check this post:
pycharm not updating with environment variables
As it suggests, you may have to take a few steps to set the environment variables,
or check the solution suggested here:
Pycharm: set environment variable for run manage.py Task.
I found the answer in this post :
https://stackoverflow.com/a/5399339/13890967
Basically, add these two lines into Settings > Console > Python Console:
%load_ext autoreload
%autoreload 2
see this answer as well for better visualization:
https://stackoverflow.com/a/60099037/13890967
and this answer for syntax errors:
https://stackoverflow.com/a/32994117/13890967

I'm having trouble understanding importing in python3

I've looked on many sites and many related questions, but following the solutions to those questions still didn't seem to help. I figured maybe I am missing something, so here goes.
My project is to create a DM's tool for managing tabletop role playing games. I need to be able to split my project into many different files in order to keep everything organized. So far I have only three files I'm trying to work with. I have my main file which I called dmtool.py3, I have a file for class definitions called classdef.py3, and I have a file for creating race objects called races.py3.
1] The first of my questions is regarding importing singular files. I've tried organizing the files in several different ways, so for this let's assume all three of my files are in the same directory.
If I want to import my class definitions from classdef.py3 into my main file dmtool.py3, how would I do that? import classdef and import classdef.py3 do not seem to work properly, saying there is no module with that name.
2] So I then made a module, and it seemed to work. I did this by creating a sub-directory called defs and putting the classdef.py3 and races.py3 files into it. I created the __init__.py3 file, and put import defs in dmtool.py3. As a test I put x = 1 at the very top of races.py3 and put print("X =", defs.x) in dmtool.py3. I get an error saying that module doesn't have an attribute x.
So I guess my second question is if it is possible to just use variables from other files. Would I use something like defs.x or defs.races.x or races.x or maybe simply x? I can't seem to find the one that works. I need to figure this out because I will be using specific instances of a class that will be defined in the races.py3 file.
3] My third question is a simple one that kind of spawned from the previous two. Now that races.py3 and classdef.py3 are in the same module, how do I make one access the other? races.py3 has to use the classes defined in classdef.py3.
I would really appreciate any help. Like I said I tried looking up other questions related to importing, but their simple solutions seemed to come up with the same errors. I didn't post my specific files because other than what I mentioned, there is just very simple print lines or class definitions. Nothing that should affect the importing.
Thanks,
Chris
Firstly, do not use .py3 as a file extension. Python doesn't recognize it.
Python 3's import system is actually quite simple. import foo looks through sys.path for a package (directory) or module (.py file) named foo.
sys.path contains various standard directories where you would normally install libraries, as well as the Python standard library. The first entry of sys.path is usually the directory in which the __main__ script lives. If you invoke Python as python -m foo.bar, the first entry will instead be the directory which contains the foo package.
Relative imports use a different syntax:
from . import foo
This means "import the foo module or package from the package which contains the current module." It is discussed in detail in PEP 328, and can be useful if you don't want to specify the entire structure of your packages for every import.
Start python and type these commands:
>>> import sys
>>> sys.path
The path is the list of directories where Python looks for libraries. If the directories containing your modules are not on that list, they won't be found.
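If the directory holding your modules is not in that list, one quick option is to add it before importing (a sketch; the path is a placeholder, and this assumes the file has been renamed to classdef.py):
import sys
sys.path.append("/path/to/your/project")   # placeholder; use the real directory
import classdef                            # now findable via sys.path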

Checking if all modules that will be required are available on PYTHONPATH

I have written a large system in Python that I now want to distribute to some colleagues. There are a few folders that need to be added to PYTHONPATH for all of my modules (files) to be found. I am looking for a way to give them sane error messages if they have not setup their PYTHONPATH correctly. Say the structure is:
ParentModule
    calls Child
        calls GrandChild
            calls MyModule
If they run ParentModule, it could be running for a long time before it ever ends up in GrandChild and needs MyModule, and if MyModule's directory is not on PYTHONPATH, it will crash complaining that it can't find MyModule. I was hoping to be able to do something like:
for file in (all files that could ever be reached from here):
    if any module needed by 'file' is not available:
        print "error: Please make sure your PYTHONPATH points to x, y, and z"
Is something like this possible?
At the top of your main module I would just try to import all of the modules your program depends on, and wrap it in a try/except for printing your sane error if any of the import statements fail:
import sys
try:
    import Child
    import GrandChild
    import MyModule
except ImportError:
    print "Error: Please make sure your PYTHONPATH points to x, y, and z"
    sys.exit(1)
# current module contents
Depending on how you're "calling" Child, GrandChild, and MyModule, this should happen automatically.
If by "call" you mean import, and you're doing your imports at the type of the module, as is conventional, then all of the import chaining will happen automatically on the import of the parent module. So if a downstream import is unavailable, then you'll get an ImportError when you import the ParentModule. If you're "calling" the scripts, by say, executing them in a subprocess, then no I don't think there's an easy way to ensure the availability of modules, given the totally dynamic nature of what you're doing. Similarly if you're doing totally dynamic imports. This is one of the down sides to dynamic programming in general - there's often no rigorous way to ensure that things will be the way you intended them to be.
Edit:
You could definitely do something heuristic like #F.J. suggests though.
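A rough sketch of such a heuristic (Python 3; it walks the project, parses each file with ast, and checks every absolute top-level import with importlib.util.find_spec; the directory and error text are placeholders):
import ast
import importlib.util
import os

def missing_imports(project_dir):
    """Collect import names that cannot be resolved on the current path."""
    missing = set()
    for root, _dirs, files in os.walk(project_dir):
        for name in files:
            if not name.endswith(".py"):
                continue
            path = os.path.join(root, name)
            with open(path) as f:
                tree = ast.parse(f.read(), filename=path)
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    names = [alias.name for alias in node.names]
                elif isinstance(node, ast.ImportFrom) and node.level == 0:
                    names = [node.module] if node.module else []
                else:
                    continue
                for modname in names:
                    try:
                        # note: resolving a dotted name may import its parent packages
                        if importlib.util.find_spec(modname) is None:
                            missing.add(modname)
                    except (ImportError, ValueError):
                        missing.add(modname)
    return missing

if missing_imports("."):
    print("error: Please make sure your PYTHONPATH points to x, y, and z")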

importing modules/scripts in python

So at work I'm developing a way for us to push out tools/plugins to the team as a whole. I actually have the system up and running, and it's completely dynamic except for the topic I'm about to talk about (this is all done in Python). On startup, Maya checks the local folder against a folder on the server, sees whether they differ, and handles copying down the files/dirs that are needed as well as deleting old plugins that we delete on the server. The system is flexible enough that users can create custom shelves of all the plugins, and we can reorganize the folders in the back end without breaking the shelves of all the users. The plugins are accessed through a drop-down menu in Maya's main interface, and we can add sub-folders and plugins into the system freely without messing with the code. We can also arrange the order in which menu items and plugins are displayed through a simple numbering system of the folders.
This all works fine until I get to making the plugins' own imports (of modules in their folders) dynamic as well. When I start moving the plugins folder around the root directory, any imported module whose path I hard-coded now has the wrong path in the plugin script. I already have a way of getting the proper path info to the plugin through my menu setup; what I'm having trouble with is the import of the module and accessing classes within that module.
So if the standard way of importing a module's class is
from fileName import className
then the __import__ way that I'm using looks like:
className = __import__("folderN.folderN.folderN.fileName", {}, {}, ["className"])
But with that method I lose the ability to just call on that class name like I can with the regular from ... import method. I got around that by doing
className = className.className
but this is a rather ugly method and I'd prefer to be able to just import and call on the name without doing that extra step. I do not know this import method very well and I know I'm missing some things with it.
Am I just going about this import process the wrong way? Is there a way to make it look in the plugin's local directory without appending to Maya's paths, so that I can use the regular way of importing without a path that has to change any time I move the plugin?
__import__ doesn't work the way you are assuming. It returns a module object for the import path provided, with a guarantee that all the children you specify in the list have been explicitly imported. For classes (as opposed to submodules) in the list, it doesn't really make a difference.
mod = __import__('a.b', {}, {}, ['c', 'd'])
Is more accurately the equivalent of
import a.b
try:
    import a.b.c
except ImportError:
    pass
try:
    import a.b.d
except ImportError:
    pass
mod = a.b
What you actually probably want here is something like
child = getattr(__import__(import_path, {}, {}, [child_name]), child_name)
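On Python 2.7+ and 3 the same thing reads a little more cleanly with importlib (a sketch using the question's hypothetical dotted path and class name):
import importlib

mod = importlib.import_module("folderN.folderN.folderN.fileName")
className = getattr(mod, "className")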
As to your versioning and distribution system, have you looked at using an SCM/VCS like SVN or Git, and/or a package management system? These give you much better change tracking and synchronization than a straight file share, and you could easily integrate them with a sync/install script on your client machines that could be customized as needed for each client's specific configuration.

Python: Create virtual import path

Is there any way to create a virtual import path in Python?
My directory structure is like this:
/
    native
        scripts
            some.py
            another.py
    [Other unrelated dirs]
The root is the directory from which the program is executed. At the moment I add native/scripts/ to the search path so I can do import some, another instead of from native.scripts import some, another, but I'd like to be able to do it like this:
from native import some
import native.another
Is there any way to achieve this?
Related questions:
Making a virtual package available via sys.modules
Why not move some.py and another.py out into the native directory so that everything Just Works and so that people returning to the source code later won't be confused about why things are and aren't importable? :)
Update:
Thanks for your comments; they have usefully clarified the problem! In your case, I generally put the functions and classes that I might want to import inside, say, native.some, where I can easily get to them. But then I take the script code, and only the script code (the thin shim that interprets arguments and starts everything running by passing them to a main() or go() function as parameters), and put that inside a scripts directory. That keeps external-interface code cleanly separate from code that you might want to import, and means you don't have to try to fool Python into having modules in several places at once.
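The layout being described would look roughly like this (a sketch; run_some.py and main() are placeholder names):
native/
    __init__.py
    some.py          # importable classes/functions, plus a main(args) entry point
    another.py
scripts/
    run_some.py      # thin shim: parses sys.argv and calls native.some.main(...)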
In /native/__init__.py, include:
from scripts import some, another
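On Python 3, where implicit relative imports are gone, that line needs to be from .scripts import some, another (and scripts/ needs its own __init__.py). If you also want import native.some and import native.another to resolve as real submodule imports, one option is the sys.modules aliasing mentioned in the related question above; a rough sketch:
# native/__init__.py
import sys
from .scripts import some, another

# register the re-exported modules so "import native.some" also succeeds
sys.modules[__name__ + ".some"] = some
sys.modules[__name__ + ".another"] = another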
