PyCharm project: source root and imports not updated? - python

I have a Python project (in PyCharm) with, let's say, two folders in it. One is called data and the other is algorithms. The data folder has a Python file where I import some data from an Excel sheet, and another file where I have defined some constants.
The algorithms folder has, let's say, one Python file where I import the constants and the data from the data folder. I use:
from data.constants import const
from data.struct import table
When I run the algorithms (in the algorithms folder), everything works perfectly. But when I change a constant in the constants file, or the data in the Excel sheet, nothing changes: the constants are not updated when imported again, and the same goes for the imported Excel data. The old values of the constants and the table are still used.
I tried marking both folders as source roots, but the problem persists.
What I do now is close PyCharm and reopen it, but if there is a better way to handle this than closing and losing the variables in the Python console, I would be grateful to know about it!

I am not sure I understand correctly, but try the following: once you change the constants in the constants file, import it again, i.e.
from data.constants import const
After this, do you still see that the constants are not changed?
Please try this:
from data.constants import const
print('const =', const)
del const
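Deleting the name alone won't help, because Python caches imported modules in sys.modules and a repeated import just returns the cached module. A minimal, self-contained sketch of this behavior and of importlib.reload as the fix (the module and file names here are stand-ins, not the asker's actual files):

```python
import importlib
import os
import sys
import tempfile

sys.dont_write_bytecode = True  # avoid a stale .pyc interfering with quick rewrites

# Stand-in for the asker's data/constants.py
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "demo_constants.py")
with open(path, "w") as f:
    f.write("const = 1\n")
sys.path.insert(0, tmp)

import demo_constants
print(demo_constants.const)   # 1

# Simulate editing the file on disk, as the asker does between runs
with open(path, "w") as f:
    f.write("const = 2\n")

# Re-importing is a no-op: Python returns the cached module
import demo_constants
print(demo_constants.const)   # still 1

# reload() re-executes the module's code and picks up the change
importlib.reload(demo_constants)
print(demo_constants.const)   # 2
```

The same applies to a module that reads an Excel sheet at import time: the read happens only once per interpreter session unless the module is reloaded.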
The problem may be connected with caching. Here is a similar problem to yours, but for Spyder:
Spyder doesn't detect changes in imported python files
Check this post:
pycharm not updating with environment variables
As it suggests, you may have to take a few steps, such as setting environment variables,
or check the solution suggested here:
Pycharm: set environment variable for run manage.py Task.

I found the answer in this post:
https://stackoverflow.com/a/5399339/13890967
Basically, add these two lines in Settings > Console > Python Console:
%load_ext autoreload
%autoreload 2
See this answer as well for better visualization:
https://stackoverflow.com/a/60099037/13890967
and this answer for syntax errors:
https://stackoverflow.com/a/32994117/13890967

Related

What is the difference between 'Run' and 'Run selection' in Spyder?

I have source code that consists of many custom modules.
In the first few lines, there are import statements such as:
import custom_module_1
import custom_module_2
import custom_module_3
....
When I run this code (shortcut F9) with a partial or full selection, I get ModuleNotFoundError for custom_module_1.
However, when I run the whole file (F5), it works well: runfile('C:/Users/user/Desktop/test.py', wdir='C:/Users/user/Desktop')
I am confused because I thought 'Run (F5)' and 'Run selection (F9)' were the same,
but the results are very different.
Is there any difference between 'Run (F5)' and 'Run selection (F9)' in Spyder?
(Spyder maintainer here) The difference is the following:
Run selection takes the code you've selected in the Editor, pastes it in the console and runs it.
Run file is similar to executing python myfile.py, but before doing that it changes the working directory to the one your file is in. It also runs the file in a clean namespace, so it isn't affected by the variables currently defined in the console. Especially because of this last feature, you should avoid using Run selection as much as possible.
In your case I think the problem is that Run selection doesn't change directories, so Python can't find the modules you have next to test.py.
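If you do need Run selection, one workaround sketch is to put the script's directory on sys.path yourself first, so sibling modules resolve regardless of the console's working directory (the path below is taken from the question's runfile call and is only illustrative):

```python
import sys

# Directory that contains test.py and the custom_module_* files
# (path assumed from the question's runfile() output)
script_dir = "C:/Users/user/Desktop"

# Prepend it so 'import custom_module_1' etc. can be found even when
# the selection runs in a console with a different working directory
if script_dir not in sys.path:
    sys.path.insert(0, script_dir)
```

Run these lines once in the console before running selections from the editor.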

I'm having trouble understanding importing in python3

I've looked on many sites and many related questions, but following the solutions to those questions still didn't seem to help. I figured maybe I am missing something, so here goes.
My project is to create a DM's tool for managing table top role playing games. I need to be able to split my project into many different files in order to keep everything organized. (so far) I have only three files I'm trying to work with. I have my main file which I called dmtool.py3, I have a file for class definitions called classdef.py3, and I have a file for creating race objects called races.py3.
1] The first of my questions is regarding importing singular files. I've tried organizing the files in several different ways, so for this lets assume all of my three files are in the same directory.
If I want to import my class definitions from classdef.py3 into my main file dmtool.py3, how would I do that? import classdef and import classdef.py3 do not seem to work, saying there is no module with that name.
2] So I then made a module, and it seemed to work. I did this by creating a sub-directory called defs and putting the classdef.py3 and races.py3 files into it. I created the __init__.py3 file, and put import defs in dmtool.py3. As a test I put x = 1 at the very top of races.py3 and put print("X =", defs.x) in dmtool.py3. I get an error saying that module doesn't have an attribute x.
So I guess my second question is if it is possible to just use variables from other files. Would I use something like defs.x or defs.races.x or races.x or maybe simply x? I can't seem to find the one that works. I need to figure this out because I will be using specific instances of a class that will be defined in the races.py3 file.
3] My third question is a simple one that kind of spawned from the previous two. Now that races.py3 and classdef.py3 are in the same module, how do I make one access the other. races.py3 has to use the classes defined in classdef.py3.
I would really appreciate any help. Like I said I tried looking up other questions related to importing, but their simple solutions seemed to come up with the same errors. I didn't post my specific files because other than what I mentioned, there is just very simple print lines or class definitions. Nothing that should affect the importing.
Thanks,
Chris
Firstly, do not use .py3 as a file extension. Python doesn't recognize it.
Python 3's import system is actually quite simple. import foo looks through sys.path for a package (directory) or module (.py file) named foo.
sys.path contains various standard directories where you would normally install libraries, as well as the Python standard library. The first entry of sys.path is usually the directory in which the __main__ script lives. If you invoke Python as python -m foo.bar, the first entry will instead be the directory which contains the foo package.
Relative imports use a different syntax:
from . import foo
This means "import the foo module or package from the package which contains the current module." It is discussed in detail in PEP 328, and can be useful if you don't want to specify the entire structure of your packages for every import.
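For concreteness, here is a runnable sketch of the layout the asker describes (files renamed to plain .py, with invented class and variable names), built in a temp directory so it runs anywhere. It answers all three questions: import the submodule and qualify names through it, and let sibling modules reach each other with a relative import:

```python
import os
import sys
import tempfile

# Recreate the asker's layout: a defs package holding classdef.py and races.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "defs")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()  # marks defs as a package
with open(os.path.join(pkg, "classdef.py"), "w") as f:
    f.write("class Race:\n"
            "    def __init__(self, name):\n"
            "        self.name = name\n")
with open(os.path.join(pkg, "races.py"), "w") as f:
    # Question 3: races.py uses its sibling classdef.py via a relative import
    f.write("from .classdef import Race\n"
            "x = 1\n"
            "elf = Race('elf')\n")

sys.path.insert(0, root)

# Questions 1 and 2: import the submodule, then qualify names with it
from defs import races
print(races.x)         # 1
print(races.elf.name)  # elf
```

So the working spelling for the asker's test is defs.races.x (or races.x after from defs import races), not defs.x: names defined in a submodule are attributes of that submodule, not of the package itself.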
Start Python and type these commands:
>>> import sys
>>> sys.path
The path is the list of directories where Python looks for libraries. If the directory containing your modules is not on the list, they will not be found.

Rename Refactoring in PyDev broken?

I am a bit surprised to find the Rename refactoring facility in PyDev broken. To reproduce my error:
create a new PyDev project,
create a new module within it (say util.py),
create a constant in the module, e.g. MYCONST = "some const value",
create a second script in the project, say scriptA.py which
imports util and
uses the constant: print util.MYCONST
create a third script in the project, say scriptB.py which
also imports util and
also uses the constant: print util.MYCONST
Renaming the constant MYCONST should now rename it in all three files.
Things like go-to-declaration (Ctrl-left-mouse-click or F3) also work, so the connection between util.py and scriptA.py is known to PyDev.
But if you rename the constant (using Shift+Alt+R on the word MYCONST) in the file scriptA.py, it gets renamed in scriptA.py and in scriptB.py, but not in util.py (effectively breaking the code, of course). If you try renaming it in util.py, it gets renamed only within that file and neither in scriptA.py nor in scriptB.py.
Questions:
Can other people recreate my problem?
Is there a configuration issue causing the problem so that I can remove the effect myself?
Is this a known bug (I didn't find anything concerning it), maybe even with a fix or workaround?
Is this only present in my product versions?
I'm using Eclipse "Luna Service Release 2 (4.4.2)" and PyDev 3.9.2.201502050007.
EDIT:
(removed; the bug is not connected to whether a package is involved, as it at first appeared to be).
EDIT2:
I just found out that the problem only appears if I import the module name and then use qualified names to access the constant:
import util
print util.MYCONST
But if I import the name directly:
from util import MYCONST
print MYCONST
then I cannot reproduce the error.
Though this seems like a workaround (and it might be!), I'd like to be able to use qualified names, at least sometimes. So the main question remains open.

python modules missing in sage

I have Sage 4.7.1 installed and have run into an odd problem. Many of my older scripts that use functions like deepcopy() and uniq() no longer recognize them as global names. I have been able to fix this by importing the python modules one by one, but this is quite tedious. But when I start the command-line Sage interface, I can type "list2=deepcopy(list1)" without importing the copy module, and this works fine. How is it possible that the command line Sage can recognize global name 'deepcopy' but if I load my script that uses the same name it doesn't recognize it?
Oops, sorry, not familiar with Stack Overflow yet. I type 'sage_4.7.1/sage' to start the command-line interface; then I type 'load jbom.py' to load up all the functions I defined in a Python script. When I use one of the functions from the script, it runs for a few seconds (complex function), then hits a spot where I use some function that Sage normally has as a global name (deepcopy, uniq, etc.), but for some reason the script I loaded does not know what the function is. And to reiterate, my script jbom.py used to work the last time I was working on this particular research, just as I described.
It also makes no difference if I use 'load jbom.py' or 'import jbom'. Both methods get the functions I defined in my script (but I have to use jbom. in the second case) and both get the same error about 'deepcopy' not being a global name.
REPLY TO DSM: I have been sloppy about describing the problem, for which I am sorry. I have created a new script 'experiment.py' that has "import jbom" as its first line. Executing the function in experiment.py recognizes the functions in jbom.py but deepcopy is not recognized. I tried loading jbom.py as "load jbom.py" and I can use the functions just like I did months ago. So, is this all just a problem of layering scripts without proper usage of import/load etc?
SOLVED: I added "from sage.all import *" to the beginning of jbom.py and now I can load experiment.py and execute the functions calling jbom.py functions without any problems. From the Sage doc on import/load I can't really tell what I was doing wrong exactly.
Okay, here's what's going on:
You can only import files ending with .py (ignoring .py[co]). These are standard Python files and aren't preparsed, so 1/3 == int(0), not QQ(1)/QQ(3), and you don't have the equivalent of a from sage.all import * to play with.
You can load and attach both .py and .sage files (as well as .pyx and .spyx and .m). Both have access to Sage definitions but the .py files aren't preparsed (so y=17 makes y a Python int) while the .sage files are (so y=17 makes y a Sage Integer).
So import jbom here works just like it would in Python, and you don't get the access to what Sage has put in scope. load etc. are handy but they don't scale up to larger programs so well. I've proposed improving this in the past and making .sage scripts less second-class citizens, but there hasn't yet been the mix of agreement on what to do and energy to do it. In the meantime your best bet is to import from sage.all.
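The key difference is that load roughly exec-s the file into the current global namespace (where Sage's names are already defined), while import gives the file its own namespace. A plain-Python sketch of that difference, with file and function names invented for illustration:

```python
import os
import sys
import tempfile

# A module that uses a name (deepcopy) it never imports itself,
# like the jbom.py in the question
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "jbom_demo.py")
with open(path, "w") as f:
    f.write("def copy_it(x):\n"
            "    return deepcopy(x)\n")

sys.path.insert(0, tmp)
import jbom_demo

try:
    jbom_demo.copy_it([1, 2])
except NameError as e:
    # deepcopy isn't in the imported module's own namespace
    print("import:", e)

# 'load' behaves like exec-ing the file into the caller's globals,
# where deepcopy may already be defined (as Sage's preamble provides):
from copy import deepcopy
g = {"deepcopy": deepcopy}
exec(open(path).read(), g)
print(g["copy_it"]([1, 2]))  # works; the name is visible
```

This is why adding from sage.all import * at the top of jbom.py fixed the imported case: it puts Sage's globals into the module's own namespace.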

Python: Create virtual import path

Is there any way to create a virtual import path in Python?
My directory structure is like this:
/
    native/
        scripts/
            some.py
            another.py
    [Other unrelated dirs]
The root is the directory from which the program is executed. At the moment I add native/scripts/ to the search path so I can do import some, another instead of from native.scripts import some, another, but I'd like to be able to do it like this:
from native import some
import native.another
Is there any way to achieve this?
Related questions:
Making a virtual package available via sys.modules
Why not move some.py and another.py out into the native directory so that everything Just Works and so that people returning to the source code later won't be confused about why things are and aren't importable? :)
Update:
Thanks for your comments; they have usefully clarified the problem! In your case, I generally put the functions and classes that I might want to import inside, say, native.some, where I can easily get to them. The script code, and only the script code (the thin shim that interprets arguments and starts everything running by passing them to a main() or go() function), goes inside the scripts directory. That keeps external-interface code cleanly separate from code that you might want to import, and means you don't have to try to fool Python into having modules in several places at once.
In /native/__init__.py, re-export the modules (note that in Python 3 this must be a relative import, and scripts needs its own __init__.py):
from .scripts import some, another
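A runnable sketch of that layout, built in a temp directory. The __init__.py re-export covers from native import some; registering sys.modules aliases as well (the trick from the linked "virtual package" question) also makes import native.another work on Python 3.7+:

```python
import os
import sys
import tempfile

# Recreate the question's layout: native/scripts/{some,another}.py
root = tempfile.mkdtemp()
scripts = os.path.join(root, "native", "scripts")
os.makedirs(scripts)
open(os.path.join(scripts, "__init__.py"), "w").close()
with open(os.path.join(scripts, "some.py"), "w") as f:
    f.write("value = 'some'\n")
with open(os.path.join(scripts, "another.py"), "w") as f:
    f.write("value = 'another'\n")
with open(os.path.join(root, "native", "__init__.py"), "w") as f:
    f.write("import sys\n"
            "from .scripts import some, another\n"
            "# Alias the submodules so 'import native.some' also resolves\n"
            "sys.modules[__name__ + '.some'] = some\n"
            "sys.modules[__name__ + '.another'] = another\n")

sys.path.insert(0, root)

from native import some
import native.another
print(some.value)            # some
print(native.another.value)  # another
```

Without the sys.modules lines, from native import some still works (the name is an attribute of the package after __init__ runs), but import native.another fails because the import system looks for an actual native/another.py.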
