NameError when trying to import a function from another script - Python

I've seen many posts on this and the solution always seems to be to add * when importing the original script to be able to access its functions. I have done this and still get the error.
I have a file called toolbox.py whose path is: /Users/justinbenfit/Desktop/Programming/Python/Explorium /transformations_deduping/ipynb/toolbox.py and contains the following:
def tz():
    print('works')
Then I have a file called explorium_check_acceptable.ipynb whose path is: /Users/justinbenfit/Desktop/Programming/Python/Explorium /transformations_deduping/ipynb/explorium_check_acceptable.ipynb
and contains:
from toolbox import *
tz()
which results in:
NameError Traceback (most recent call last)
/var/folders/cf/ft88j_856fv5rk3whgs12d9w0000gq/T/ipykernel_20696/4114475549.py in <module>
2 import numpy as np
3 from toolbox import *
----> 4 tz()
NameError: name 'tz' is not defined
I don't know what else could be wrong unless you can't import functions from a .py file to a .ipynb file.
Update:
I ran
!export PYTHONPATH=.
!python -m site
in one of my ipynb code blocks and got this:
sys.path = [
'/Users/justinbenfit/Desktop/Programming/Python/Explorium /transformations_deduping/ipynb',
'/Users/justinbenfit/opt/anaconda3/lib/python39.zip',
'/Users/justinbenfit/opt/anaconda3/lib/python3.9',
'/Users/justinbenfit/opt/anaconda3/lib/python3.9/lib-dynload',
'/Users/justinbenfit/opt/anaconda3/lib/python3.9/site-packages',
'/Users/justinbenfit/opt/anaconda3/lib/python3.9/site-packages/aeosa',
'/Users/justinbenfit/opt/anaconda3/lib/python3.9/site-packages/locket-0.2.1-py3.9.egg',
]
USER_BASE: '/Users/justinbenfit/.local' (exists)
USER_SITE: '/Users/justinbenfit/.local/lib/python3.9/site-packages' (exists)
ENABLE_USER_SITE: False

To debug import issues, it is always helpful to view (and post on SO) the output of python -m site, which will reveal the contents of sys.path in a readable way.
You will usually want
export PYTHONPATH=.
or similar, to ensure that the current directory is in sys.path.
Notice that you'll need that export in the context of the running Python kernel, i.e. it must be set when you start the kernel. Also, the current working directory at the time of starting will make a difference.
Inspect sys.path within your notebook to verify, perhaps using:
from pprint import pp
import sys
pp(sys.path)
It appears you have this module in your path:
https://pypi.org/project/toolbox
Try
from toolbox import Item
to verify that.
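Another quick check that doesn't depend on any particular name is to look at where the module was actually loaded from (a site-packages path here would mean a PyPI toolbox is shadowing your file):
import toolbox
print(toolbox.__file__)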
Also, as a very generic way of testing such import details, a rename to e.g. toolbox1.py is a good sanity check. That is, we expect the import's success to track whether it has caught up with the new name yet, or not.
A classic example is some poor student naming a file test.py. Notice that python -c 'import test' always succeeds, since there is a seldom-used builtin test module.
Prefer
from toolbox import tz
over the * form. It makes your code easier to read and to analyze. And here, it would offer a clearer diagnostic, failing in the import statement rather than waiting until the tz() reference.
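For example, if the wrong toolbox wins, the named form reports the problem at the point of import, roughly:
from toolbox import tz
# ImportError: cannot import name 'tz' from 'toolbox' (...)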

Attempted relative import with no known parent package error

Error: While importing "wsgi-contract-test", an ImportError was raised:
Traceback (most recent call last):
File "/Users/karl/Development/tral/bin/tral-env/lib/python3.9/site-packages/flask/cli.py", line 236, in locate_app
__import__(module_name)
File "/Users/karl/Development/tral/test/contract/inject/wsgi-contract-test.py", line 8, in <module>
from . import (
ImportError: attempted relative import with no known parent package
ERROR (run-tral): trap on error (rc=2) near line 121
make: *** [component.mk:114: tral-local-run-api-contract-test] Error 2
wsgi-contract-test.py:
from ...libs.tral import app, setup_from_env
from ...libs.tral import routes
I have the regular source files under the libs/tral directory; however, the entry file for this test is located under test/contract/inject. I do NOT want to move this test file into libs, since this file should be nowhere near production code as it is a rather hazardous file, security-wise.
In Node.js this would have worked fine, but there seems to be something about Python imports I'm not grasping?
Since tests don't belong inside the src tree, there are basically two approaches. If you are testing a library you will frequently install it (in a virtualenv) and then just import it exactly the same way you would anywhere else. Frequently you also do this with a fresh install for the test: this has the great advantage that your tests mirror real setups, and helps to catch bugs like forgetting to commit/include files (guilty!) and only noticing once you've deployed...
The other option is to modify sys.path so that your test is effectively in the same place as your production code:
# /test/contract/inject.py
from pathlib import Path
import sys
sys.path.insert(0, str(Path(__file__).parent.parent.parent))
Since sys.path uses strs you may prefer to use os.path, like every single other example on SO. Personally I'm addicted to pathlib. Here:
__file__ is the absolute path of the current file
Path(str) wraps it in a pathlib.Path object
.parent gets you up the tree, and
str() casts back to the format sys.path expects, or it won't work.
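For reference, a roughly equivalent sketch using os.path (same climb up the tree as the pathlib version above):
# /test/contract/inject.py -- os.path version of the snippet above
import os
import sys

# dirname(__file__) plays the role of the first .parent; the two ".." supply the other two
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..")))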
Using a test runner
Normally we use a test runner for tests. But these tests need a running instance! No problem:
# tests/conftest.py
import sys
from pathlib import Path

# make the src tree importable before importing from it
sys.path.insert(0, str(Path(__file__).parent.parent))

# now pytest can find the src
import pytest
from app import get_instance  # dummy

@pytest.fixture
def instance():
    instance = get_instance(some_param)
    # maybe
    instance.do_some_setup()
    yield instance
    # maybe
    instance.do_some_cleanup()
If you don't need to do any cleanup, you can just return the instance rather than yielding it. Then in another file (for neatness) you write tests like this:
# tests/test_instance.py
def test_instance(instance):  # note how we requested the fixture
    assert instance.some_method()
And you run your tests with pytest:
pytest tests/
Pytest will run the code in conftest.py, discover all test functions starting with test in files whose names start with test, run the tests (supplying all the fixtures you have defined), and report.
Fixture Lifetime
Sometimes spinning up a fixture can be expensive. See the docs on fixture scope for telling pytest to keep the fixture around and supply it to multiple tests.
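As a minimal sketch (reusing the dummy get_instance and some_param placeholders from above), a session-scoped fixture is just a matter of passing scope:
# tests/conftest.py -- hypothetical session-scoped variant
import pytest
from app import get_instance  # same dummy as above

@pytest.fixture(scope="session")
def instance():
    instance = get_instance(some_param)  # some_param is still a placeholder
    yield instance
    instance.do_some_cleanup()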
Using a runner vs. just running a script
You don't have to use a runner. But they do have many advantages:
decent metrics e.g. code/branch coverage
everyone uses them
parallel testing against different envs (e.g. python versions)
tests tend to be a lot neater
I took for granted that Python was able to handle simple relative paths; not the case. Instead I just added the path to the packages I wanted to include to the PYTHONPATH variable and voilà, it found everything.
export PYTHONPATH="${PYTHONPATH}:$(pwd)/libs"
Then running it from the root project directory.
I had to change the code to the following though:
from tral import app, setup_from_env
from tral import routes

How to import scripts as modules in IPython?

So, I have two Python files:
the 1st, "m12345.py":
def my():
    return 'hello world'
the 2nd "1234.py":
from m12345 import *
a = m12345.my()
print(a)
In IPython I try to execute these commands:
exec(open("f:\\temp\\m12345.py").read())
exec(open("f:\\temp\\1234.py").read())
the error for the 2nd command is:
ImportError: No module named 'm12345'
Please help: how do I add the 1st file as a module for the 2nd?
First off, if you use the universal import (from m12345 import *), then you just call the my() function, not m12345.my(), or else you will get a
NameError: name 'm12345' is not defined
Secondly, you should add the following snippet to every script that you want to be able to either run directly or import:
if __name__ == "__main__":
    pass
PS. Add this to the 1st script ("m12345.py").
PS2. Avoid the universal import method, since it can mess up the namespace of your script. (For that reason, it isn't considered best practice.)
Edit: Is m12345.py located in the Python folder (where Python was installed on your hard drive)? If not, then you should add the directory it is located in to sys.path with:
import sys
sys.path.append(directory)
where directory is the string of the location where your m12345.py is located. Note that on Windows you should use forward slashes or a raw string (r"f:\temp"), since bare backslashes are treated as escape characters.
However it would be much easier to just relocate the script (if it's possible).
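For example, assuming the file really is in f:\temp as in the question:
import sys
sys.path.append("f:/temp")  # or r"f:\temp"; a bare "f:\temp" would turn \t into a tab
import m12345
print(m12345.my())  # -> 'hello world'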
You have to create a new module (for example m12345) by calling m12345 = imp.new_module('m12345') and then exec the python script in that module by calling exec(open('path/m12345.py').read(), m12345.__dict__). See the example below:
import imp
pyfile = open('path/m12345.py').read()
m12345 = imp.new_module('m12345')
exec(pyfile, m12345.__dict__)
If you want the module to be importable by name later, you can register it in sys.modules:
import sys
sys.modules['m12345'] = m12345
After this you can do
import m12345
or
from m12345 import *
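Note that the imp module is deprecated (and removed in Python 3.12); on modern Python the same thing can be done with importlib.util. A sketch, assuming the same f:\temp location as the question:
import importlib.util
import sys

spec = importlib.util.spec_from_file_location("m12345", "f:/temp/m12345.py")
m12345 = importlib.util.module_from_spec(spec)
sys.modules["m12345"] = m12345  # register it so a plain 'import m12345' works later
spec.loader.exec_module(m12345)
print(m12345.my())  # -> 'hello world'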

Backing up/copying an entire folder tree in batch or python?

I'm trying to copy an entire directory from one location to another via Python every 7 days, essentially making a backup.
The backup folder/tree may or may not exist, so it needs to create the folder if it doesn't exist; that's why I assumed distutils was better suited than shutil.
Note: Is it better for me to use batch or some other language for this job?
The following code:
import distutils
distutils.dir_util.copy_tree("C:\Users\A\Desktop\Test", "C:\Users\A\Desktop\test_new", preserve_mode=1, preserve_times=1, preserve_symlinks=0, update=1, verbose=0, dry_run=0)
Returns:
Traceback (most recent call last):
File "C:\Users\A\Desktop\test.py", line 2, in <module>
distutils.dir_util.copy_tree("C:\Users\A\Desktop\test", "C:\Users\A\Desktop\test2", preserve_mode=1, preserve_times=1, preserve_symlinks=0, update=1, verbose=0, dry_run=0)
AttributeError: 'module' object has no attribute 'dir_util'
What am I doing wrong?
Thanks in advance
- Hyflex
You need to import dir_util specifically to access its functions:
from distutils import dir_util
If there are other modules in that package that you need, add them to the line, separated by commas. Only import the modules you need.
For Unix/Linux, I suggest 'rsync'.
For Windows: xcopy.
I've been attempting essentially the same thing to back up what I write on a plethora of virtual machines.
I ran into the same problem you did with distutils. From what I can tell, the Python community is using the distutils module to start standardizing how new modules interface with Python. I think they're still in the thick of it though as everything I've seen relating to it seems more complicated, not less complicated. Hopefully, I'm just seeing all the crazy that happens in the middle of a big change.
But I did figure out how to get it working. To use distutils.dir_util.copy_tree():
>>> from distutils import dir_util
>>> dir_util.copy_tree("/home/user/backing_up/temp", "/home/user/backing_up/other")
['/home/user/backing_up/other/stuff.txt'] # Return value indicating success
If you feel like it's worthwhile, you can import distutils.core and make the longer call to distutils.dir_util.copy_tree().
>>> import distutils.core
>>> distutils.dir_util.copy_tree("/home/user/backing_up/temp", "/home/user/backing_up/other")
['/home/user/backing_up/other/stuff.txt'] # Return value indicating success
(I know, I know, there are subtle differences between "import module.submodule" and "from module import submodule" but that's not the intent of the question and so long as you're importing the correct stuff and calling the functions appropriately, it doesn't make a difference.)
Like you, I also explicitly stated that I wanted the default for preserve_mode and preserve_times, but I didn't touch the other variables. Everything worked as expected once I imported and called the function the way it wanted me to.
Now that my backup script works, I realize I should have written it in Bash, since I plan on having it run whenever the machine goes to a specific runlevel. I'm using a wrapper instead for now, even if I should just rewrite it.
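For what it's worth, on current Python versions the standard-library route is shutil rather than distutils (distutils was removed in Python 3.12). A minimal sketch of the same copy, assuming Python 3.8+ for dirs_exist_ok:
import shutil

# dirs_exist_ok lets the destination already exist, like copy_tree did
shutil.copytree(r"C:\Users\A\Desktop\Test", r"C:\Users\A\Desktop\test_new", dirs_exist_ok=True)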

In Python, can one get the path of a relative import imported after and before chdir calls?

I'm looking to get the path of a module after os.chdir has been called.
In this example:
import os
os.chdir('/some/location')
import foo # foo is located in the current directory.
os.chdir('/other/location')
# How do I get the path of foo now? ..is this impossible?
..the foo.__file__ variable will be 'foo.py', as will inspect.stack()[0][1] -- yet, there's no way to know where 'foo.py' is located now, right?
What could I use, outside (or inside, without storing it as a variable at import time) of 'foo', which would allow me to discover the location of foo?
I'm attempting to build a definitive method to determine which file a module is executing from. Since I use IPython as a shell, this is something I could actually run into.
Example usage:
I have two versions of a project I'm working on, and I'm comparing their behavior during the process of debugging them. ..let's say they're in the directories 'proj1' and 'proj2'. ..which foo do I have loaded in the IPython interpreter again?
The ideal:
In [242]: from my_tools import loc
In [243]: loc(foo)
'/home/blah/projects/proj2/foo.py'
** As abarnert noted, that is not possible, as python does not record the base directory location of relative imports. This will, however, work with normal (non-relative) imports.
** Also, regular python (as opposed to IPython) does not allow imports from the current directory, but rather only from the module directory.
The information isn't available anymore, period. Tracebacks, the debugger, ipython magic, etc. can't get at it. For example:
# foo.py
def baz():
    1/0
$ ipython
In [1]: import foo
In [2]: os.chdir('/tmp')
In [3]: foo.baz()
---------------------------------------------------------------------------
ZeroDivisionError Traceback (most recent call last)
<ipython-input-5-a70d319d0d05> in <module>()
----> 1 foo.baz()
/private/tmp/foo.pyc in baz()
ZeroDivisionError: integer division or modulo by zero
So:
the foo.__file__ variable will be 'foo.py', as will inspect.stack()[0][1] -- yet, there's no way to know where 'foo.py' is located now, right?
Right. As you can see, Python treats it as a relative path, and (incorrectly) resolves it according to the current working directory whenever it needs an absolute path.
What could I use, outside (or inside, without storing it as a variable at import time) of 'foo', which would allow me to discover the location of foo?
Nothing. You have to store it somewhere.
The obvious thing to do is to store os.path.abspath(foo.__file__) from outside, or os.path.abspath(__file__) from inside, at import time. Not what you were hoping for, but I can't think of anything better.
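A minimal sketch of that, using the foo module from the example above:
import os
import foo

FOO_PATH = os.path.abspath(foo.__file__)  # resolved while the cwd is still foo's directory
os.chdir('/other/location')
print(FOO_PATH)  # unaffected by the chdir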
If you want to get tricky, you can build an import hook that modifies modules as they're imported, adding a new __abspath__ attribute or, more simply, changing __file__ to always be an abspath. This is easier with the importlib machinery in Python 3.1+.
As a quick proof of concept, I slapped together abspathimporter. After doing an import imppath, every further import you do that finds a normal .py file or package will absify its __file__.
I don't know whether it works for .so/.pyd modules, or .pyc modules without source. It definitely doesn't work for modules inside zipfiles, frozen modules, or anything else that doesn't use the stock FileFinder. It won't retroactively affect the paths of anything imported before it. It requires 3.3+, and is horribly fragile (most seriously, the FileFinder class or its hook function has to be the last thing in sys.path_hooks—which it is by default in CPython 3.3.0-3.3.1 on four Mac and linux boxes I tested, but certainly isn't guaranteed).
But it shows what you can do if you want to. And honestly, for playing around in iPython for the past 20 minutes or so, it's kind of handy.
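For a flavor of the simpler variant (not a real import hook like abspathimporter, just a one-shot pass over everything already imported, which only helps if it runs before any chdir):
import os
import sys

for mod in list(sys.modules.values()):
    f = getattr(mod, "__file__", None)
    if f and not os.path.isabs(f):
        mod.__file__ = os.path.abspath(f)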
import os
import foo

foodir = os.getcwd()  # capture the directory before changing it
os.chdir('/other/location')
foodir now has the original directory stored in it...

python refresh/reload

This is a very basic question - but I haven't been able to find an answer by searching online.
I am using python to control ArcGIS, and I have a simple python script, that calls some pre-written code.
However, when I make a change to the pre-written code, it does not appear to result in any change. I import this module, and have tried refreshing it, but nothing happens.
I've even moved the file it calls to another location, and the script still works fine. One thing I did yesterday was add the folder where all my Python files are to the sys path (using sys.path.append('path')), and I wonder if that made a difference.
Thanks in advance, and sorry for the sloppy terminology.
It's unclear what you mean by "refresh", but the normal behavior of Python is that you need to restart the software for it to take a new look at a Python module and reread it.
If your changes aren't picked up even after a restart, then this is due to one of two errors:
The timestamp on the .pyc file is incorrect and set to some time in the future.
You are actually editing the wrong file.
You can re-read a file without restarting the software by using the reload() command. Note that any variable pointing to anything in the module will need to be reimported after the reload. Something like this:
import themodule
from themodule import AClass
reload(themodule)
from themodule import AClass
One way to do this is to call reload.
Example: Here is the contents of foo.py:
def bar():
    return 1
In an interactive session, I can do:
>>> import foo
>>> foo.bar()
1
Then in another window, I can change foo.py to:
def bar():
    return "Hello"
Back in the interactive session, calling foo.bar() still returns 1, until I do:
>>> reload(foo)
<module 'foo' from 'foo.py'>
>>> foo.bar()
'Hello'
Calling reload is one way to ensure that your module is up-to-date even if the file on disk has changed. It's not necessarily the most efficient (you might be better off checking the last modification time on the file or using something like pyinotify before you reload), but it's certainly quick to implement.
One reason that Python doesn't read from the source module every time is that loading a module is (relatively) expensive -- what if you had a 300kb module and you were just using a single constant from the file? Python loads a module once and keeps it in memory, until you reload it.
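As a hedged sketch of that mtime check (mymodule is a placeholder; importlib.reload is the Python 3 spelling of the builtin reload):
import os
import importlib
import mymodule

_last_mtime = os.path.getmtime(mymodule.__file__)

def reload_if_changed():
    global _last_mtime
    mtime = os.path.getmtime(mymodule.__file__)
    if mtime > _last_mtime:
        importlib.reload(mymodule)
        _last_mtime = mtime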
If you are running in an IPython shell, there are magic commands for this.
The IPython docs cover this feature called the autoreload extension.
Originally, I found this solution from Jonathan March's blog posting on this very subject (see point 3 from that link).
Basically all you have to do is the following, and changes you make are reflected automatically after you save:
In [1]: %load_ext autoreload
In [2]: %autoreload 2
In [3]: import MODULE
In [4]: my_class = MODULE.MyClass()
In [5]: my_class.printham()
Out[5]: ham
In [6]: # make changes to printham and save
In [7]: my_class.printham()
Out[7]: hamlet
I used the following when importing all objects from within a module to ensure web2py was using my current code:
import buttons
import table
reload(buttons)
reload(table)
from buttons import *
from table import *
I'm not really sure that is what you mean, so don't hesitate to correct me. You are importing a module - let's call it mymodule.py - in your program, but when you change its contents, you don't see the difference?
Python will not look for changes in mymodule.py each time it is used; it loads it the first time, compiles it to bytecode, and keeps it internally. It will normally also save the compiled bytecode (mymodule.pyc). The next time you start your program, it will check whether mymodule.py is more recent than mymodule.pyc and recompile it if necessary.
If you need to, you can reload the module explicitly:
import mymodule
[... some code ...]
if userAskedForRefresh:
reload(mymodule)
Of course, it is more complicated than that, and you may have side effects depending on what your program does with the other module, for example if variables depend on classes defined in mymodule.
Alternatively, you could use the execfile function (or exec(), eval(), compile())
I had the exact same issue creating a geoprocessing script for ArcGIS 10.2. I had a python toolbox script, a tool script and then a common script. I have a parameter for Dev/Test/Prod in the tool that would control which version of the code was run. Dev would run the code in the dev folder, test from test folder and prod from prod folder. Changes to the common dev script would not run when the tool was run from ArcCatalog. Closing ArcCatalog made no difference. Even though I selected Dev or Test it would always run from the prod folder.
Adding reload(myCommonModule) to the tool script resolved this issue.
The procedure differs between versions of Python. The following shows an example for Python 3.4 or above:
>>> import hello
>>> hello.hello_world()
HI !!
# now changes are made to hello.py and a reload is needed
>>> import importlib
>>> importlib.reload(hello)
>>> hello.hello_world()
How are you?
For earlier Python versions like 2.x, use the built-in reload function as stated above.
Better still, use IPython, as it provides the autoreload feature.
