Within my project XYZ, I have a file superSource.py, which contains some functions.
Now, I've used the cool new PyCharm feature of creating an IPython notebook, which I call test.ipynb, and saved it in the project's main directory (next to superSource.py).
However, when I run import superSource; foo = superSource.parameters(), nothing happens; I don't even get a warning. PyCharm does underline superSource in the code, though, warning me that there is no module called superSource.
How can I include other files from the same directory using the IPython notebook and/or PyCharm?
I've had the same problem and have a partial solution.
To include your file, add the following to a cell:
execfile("superSource.py").
This should load and execute it and make its contents available for reuse so that you can access variable and call functions defined or imported by it in other cells.
Unfortunately, PyCharm does not know about it, so that as you type, there is no statement completion and if you have "Show import popup" enabled in PyCharm, it will suggest adding an import but highlight it as an error afterwards. It should still work, however.
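Note that execfile() exists only in Python 2. If your notebook kernel runs Python 3, a rough equivalent is sketched below; it assumes superSource.py sits in the notebook's working directory and defines parameters() as in the question:
# Python 3 replacement for execfile(): read the file and exec it at the
# top level of a cell so its names land in the notebook's namespace.
with open("superSource.py") as f:
    exec(f.read())

# Names defined in superSource.py are now available directly:
foo = parameters()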
Hey everyone!
Just moved over to VScode and dealing with some initial transition problems.
I'm using VScode for Python and have been using the interactive window and debugger. For my python interpreter, I've been selecting Python 3.9.7, which is a part of my Anaconda installation.
I've noticed that when I change and save functions in one .py file and then call them from another file, the changes I've made aren't reflected in the output.
It's worth noting that when changes are made and saved in a file and that same file is run, the changes WILL be reflected, so it's purely an issue between files. I reload the functions from the file after I make the changes and save them, so it's not an issue of forgetting to reload the function.
To provide some context for the photo: I have different functions in the file "Metric_Functions.py". I'm testing the code using different tests in the file "UnitTestCode.py". However, as I run the tests (reloading the functions and running the cell with the specific test), I noticed that when I made updates in "Metric_Functions", those changes were not reflected in the unit test results.
Any help/experience with this kind of issue/suggestions of where to start to look would be really appreciated! Really inexperienced with VScode, so any help would be awesome.
Thanks!
In IPython and Jupyter, imported modules persist throughout the session. If a module has already been imported, running the import statement again does nothing at all, since the interpreter sees that the module is already cached in sys.modules.
For changes to an external module to be seen in IPython/Jupyter, you need to kill/restart the current instance and then run the import again.
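Alternatively, if restarting the kernel is too disruptive, importlib.reload (which another answer below also uses) forces the module to be re-executed in place. A minimal sketch, using the Metric_Functions module from the question:
import importlib
import Metric_Functions

# Re-execute Metric_Functions.py so edits saved on disk are picked up
# without restarting the kernel; run this after saving your changes.
importlib.reload(Metric_Functions)

# Keep calling the functions through the module (Metric_Functions.<name>)
# so the reloaded definitions are the ones actually used.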
I have two .py files I wrote that I've imported into a third .py file to use:
(The top of driver.py)
import wafer_diagram
import LightIV
wafer_diagram.py and LightIV.py contain functions that I'm using in driver.py. However, whenever I have a cleared/restarted kernel, I have to run each individual .py file first, otherwise they aren't defined in driver.py. Is there a way to do this automatically when I run driver.py? I am using Spyder with Python 3.8. Thanks!
Edit: To clarify, after running wafer_diagram.py and LightIV.py, I am able to use the functions in driver.py without issue. However, if I restart my kernel and then try running driver.py, an error is thrown saying the two modules do not exist.
I was being very silly! Although I imported my other files, I was not calling on their functions correctly. For example, for the function print_struct() from LightIV.py, I would write in driver.py:
import LightIV
print_struct()
Instead, I should have written:
import LightIV
LightIV.print_struct()
The reason I was able to get away with this for so long is likely how Spyder saves variables between runs. I would run LightIV.py and wafer_diagram.py, "saving" their functions in the namespace, and then use them later on instead of properly importing them.
I am coming to Python from R and installed Python 3.5 with Anaconda. Now the PyCharm console has a prompt identical to an IPython notebook, i.e. instead of >>>, it shows [1] at the command line.
After writing a toy piece of code (below) in a .py file and running it from within PyCharm with no errors, I was under the assumption that the function toss(), which was defined in the .py file, would be ready to use in the console. However, this did not seem to be the case. I ended up copying and pasting the pertinent lines of code into the console and hitting enter, and then, finally, the function toss() was accessible to produce random examples of the roll of a die.
Logically, there has to be a smoother way of moving code from a .py file in the Editor to the environment accessible from the Python Console. But this shorter way doesn't seem to be simply running the .py file.
Code:
import random

def toss():
    return random.randint(1, 6)
So how do you make the code in a Python file in the Editor accessible in the local environment?
You need to import it first. Let's say your function toss() is in a file called foo.py; then you can do
from foo import toss
toss()
in your Python Console to use your function. A Python source file is, by definition, a module and you'll need to import it in order to use any functions defined there.
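Equivalently, you can import the module itself and call the function through it, which also makes it easy to pick up later edits with importlib.reload. A sketch, assuming the file is named foo.py as above:
# Import the whole module and qualify the call.
import foo
foo.toss()

# After editing foo.py in the Editor, refresh the console's copy:
import importlib
importlib.reload(foo)
foo.toss()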
I started programming some scripts with the IPython notebook, but now the project is becoming too big for a notebook. Nevertheless, I love executing my stuff in an IPython notebook (load the data only once, inline figures...).
What I want is to program everything with Eclipse but execute it in IPython. I know I can save the notebooks as .py by adding the --script option at the beginning. But now I want to automate the process the other way around: I want my IPython notebook to reload the code I modify with Eclipse.
Is it possible?
Should I write a program that does this using the converter?
Thanks!!
I found the solution for manually updating the functions without rerunning the whole .ipynb file. However, I do not know how to make it automated. Here is the solution:
Synchronizing code between jupyter/iPython notebook script and class methods
To cut it short, you need to call the reload function from the importlib module in the cell of interest:
import babs_visualizations
from importlib import reload
reload(babs_visualizations)
Just a little addition: make sure that you are addressing the function in the form module.function. If you previously imported the function with from module import function and then reloaded the module, the function will not be reloaded. You can put the function call inside a notebook cell and rerun it to see how the changes in the module affected the function's output in the notebook.
I hope this was helpful for someone.
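For the "automated" part, IPython/Jupyter also ships an autoreload extension that re-imports modified modules before each cell runs, so you don't have to call reload by hand. A minimal sketch (run once near the top of the notebook; the module name is taken from the answer above):
# Enable automatic reloading of all imported modules before running code.
%load_ext autoreload
%autoreload 2

import babs_visualizations
# From now on, edits saved to babs_visualizations.py are picked up
# automatically the next time a cell uses the module.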
I see how to change certain settings for matplotlib in such a way that they are used to configure it each time I launch, including when I launch interactively with
ipython --pylab
but I'm not sure how to run arbitrary code each time I launch in this way, or how to ensure that certain packages have been imported. For example, I'd like to do the following whenever I launch as above:
from mpltools import style
style.use('ggplot')
How do I load and run specific packages whenever I launch ipython --pylab?
From the IPython website, it seems you can place any .py file in the startup folder of your profile and it will be run when IPython is initiated. For help finding the profile where your startup folder is, [see here].
To get the effect you're looking for, you need to check that matplotlib has been imported; the only way I can think of doing that is to add the following:
import sys

if "matplotlib" in sys.modules:
    # Do something
    print()
    print("This seems to be working")
I have left it with the test so you can quickly determine if it is working.
Two things to note: I am using IPython 2.0.0-dev (in case this has any bearing), and I suspect that checking sys.modules like this is not recommended, or at least that was the impression I got from the SO post I borrowed the idea from; perhaps someone better informed than I can enlighten us.
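Putting it together, a minimal startup file might look like the sketch below. The path ~/.ipython/profile_default/startup/00-mpl-style.py is just an example (your profile location may differ), and it assumes mpltools is installed, as in the question:
# ~/.ipython/profile_default/startup/00-mpl-style.py (example path)
# Runs automatically every time IPython starts.
import sys

# Only apply the style if matplotlib was already pulled in, e.g. via --pylab.
if "matplotlib" in sys.modules:
    from mpltools import style
    style.use('ggplot')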