When I run $> python ./tools/test.py I get an import error saying that it cannot import a module that exists in the directory from which I am invoking python. However, I can import this module directly: $> python -c "import mod" works.
I'm relying on the fact that ./ is (in effect) on the PYTHONPATH.
What is python doing to the python path when I run the interpreter on a script that exists in a different directory? Is there a way to "lock" the current working directory so that I can get the import to work?
My setup:
./mod.py :
x = 5 # just for demonstration purposes
./tools/test.py :
from mod import x
# ... snip ... actual content
I am invoking python from the directory that contains mod.py and tools/:
$> python -c "from mod import x" # works fine
$> python tools/test.py
Traceback (most recent call last):
File "tools/test.py", line 1, in <module>
from mod import x
ModuleNotFoundError: No module named 'mod'
Note that the current directory, which contains mod.py and tools/, is not on my PYTHONPATH.
I'm relying on the fact that ./ is (in effect) on the PYTHONPATH.
It's not. It's not on PYTHONPATH, and it's not on sys.path. When you run a script by file path, it's the script's directory that gets added to sys.path. You can see what gets added to sys.path for each way of specifying a program to run in the Python command line docs.
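If you want to keep invoking the script by its path, one workaround (a minimal sketch, assuming the layout above with mod.py in the project root and test.py inside tools/) is to prepend the project root to sys.path at the top of tools/test.py:
import os
import sys

# Prepend the parent of tools/ (the project root containing mod.py) so the
# import below resolves no matter which directory the script is run from.
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from mod import x
print(x)
Alternatively, run it as a module from the project root with python -m tools.test; with the -m option the current directory is added to sys.path instead of the script's directory.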
Related
I have a certain project structure:
- azima
  - .vscode
  - core
    - project_setup.py
  - helper
    - log_helper
  - venv
In project_setup.py:
import os
import json
import numpy as np
import pandas as pd
import random
from helper.log_helper import log
if __name__ == "__main__":
    print('hello world')
Running this file in the terminal:
(venv) rmali@rakeshmali:~/git/azima$ /home/rmali/git/azima/venv/bin/python /home/rmali/git/azima/core/project_setup.py
Traceback (most recent call last):
File "/home/rmali/git/azima/core/project_setup.py", line 6, in <module>
from helper.log_helper import log
ModuleNotFoundError: No module named 'helper'
I get this error. What am I doing wrong? Am I missing something?
But running it like this works: python -m core.project_setup
Reason:
The path of the folder azima is not on sys.path (PYTHONPATH).
Solution:
You can modify the PYTHONPATH in one of these ways:
Add this to the settings.json file to modify the PYTHONPATH in the terminal:
"terminal.integrated.env.windows": {
"PYTHONPATH": "xxx/site-packages"
}
Create a .env file under your workspace, and add these settings in it to modify the PYTHONPATH for the extension and debugger:
PYTHONPATH=xxx/site-packages
You can refer to the VS Code Python documentation to understand the effects of these two configurations.
Modify sys.path directly in the Python file. Add this at the top of the file:
import sys; sys.path.append("xxx/Project/src")
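For this project, rather than hardcoding an absolute path, one variation (a minimal sketch, assuming the azima layout above) is to derive the project root from the file's own location at the top of core/project_setup.py:
import sys
from pathlib import Path

# parents[1] is the parent of core/, i.e. the azima project root containing helper/
sys.path.insert(0, str(Path(__file__).resolve().parents[1]))

from helper.log_helper import log  # the import from the question
This keeps the fix working on any machine, since nothing is hardcoded.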
The reason that running
(venv) rmali@rakeshmali:~/git/azima$ python ./core/project_setup.py
fails while
(venv) rmali@rakeshmali:~/git/azima$ python -m core.project_setup
succeeds is that when running python -m <module-name>, Python adds the current directory to the start of sys.path, which allows modules in that directory, such as helper, to be imported as top-level modules, i.e. with import helper. Running python <script> does not add the current directory to the start of sys.path. Instead, Python adds the directory containing the script to the start of sys.path.
Here are the relevant sections of the docs.
The -m switch
As with the -c option, the current directory will be added to the start of sys.path.
and the docs for the -c option add
the current directory will be added to the start of sys.path (allowing modules in that directory to be imported as top level modules)
Docs for python <script>
If the script name refers directly to a Python file, the directory containing that file is added to the start of sys.path, and the file is executed as the __main__ module.
If the script name refers to a directory or zipfile, the script name is added to the start of sys.path and the __main__.py file in that location is executed as the __main__ module.
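A quick way to see this for yourself is a tiny throwaway script (the filename show_path.py is just for illustration) dropped into core/:
import sys

# Print the entry the interpreter prepended for this invocation style.
print("sys.path[0] =", sys.path[0])
Running python core/show_path.py should print the core directory, while python -m core.show_path from the azima directory should print the current directory, i.e. the project root (on some Python versions it shows up as an empty string, which also means the current directory), matching the docs quoted above.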
In python I have a folder with three files
- __init__.py
- module.py
- test_module.py
in which the module module.py is imported inside the file test_module.py as follows:
from . import module
Of course, when I just run test_module.py I get an error
> python test_module.py
Traceback (most recent call last):
File "test_module.py", line 4, in <module>
from . import module
ImportError: attempted relative import with no known parent package
But when I set the PYTHONPATH to the absolute path of the directory I am working in,
export PYTHONPATH=`pwd`
I expect the import to work (as I did set the PYTHONPATH). But to my surprise I get the same error!
So can I fix the relative import error without any code change?
Since the directory you describe (let's call it thatdirectory) is a package, as marked by an __init__.py file, you can "fix" that by cd-ing one directory higher up, then running
python -m thatdirectory.test_module
since running modules with -m fixes up sys.path and sets the module's package context in a way that makes this particular configuration work.
(In general, any code that messes with sys.path manually, or requires changes to PYTHONPATH to work, is broken in my eyes...)
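To see why -m behaves differently, and why setting PYTHONPATH alone did not help, you could temporarily add a few print lines to test_module.py (illustrative only); relative imports are resolved against __package__, not against sys.path:
# temporary diagnostics for test_module.py
import sys
print("__name__    =", __name__)
print("__package__ =", __package__)
print("sys.path[0] =", sys.path[0])
Run directly (python test_module.py), __package__ is empty (or None), so from . import module has no parent package to resolve against. Run as python -m thatdirectory.test_module from one directory up, __package__ is 'thatdirectory' and the relative import works. PYTHONPATH only extends sys.path; it never sets __package__, which is why exporting it made no difference.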
I read that to use a python file as a module for import, I need to put __init__.py in the directory.
I have the following directory structure:
data_load
-- __init__.py
-- rand_data.py
etc
-- __init__.py
-- test.py
In test.py I import a class defined in rand_data and I get the error:
python test.py
Traceback (most recent call last):
File "test.py", line 8, in <module>
from data_load.rand_data import RandData
ModuleNotFoundError: No module named 'data_load'
Change to the parent directory of etc and data_load and type:
python -m etc.test
This should do the job.
Here is a small test case (assuming you're on a Linux machine):
## create the test case
mkdir -p import_issue/data_load import_issue/etc
touch import_issue/data_load/__init__.py import_issue/etc/__init__.py
echo 'print("I am", __name__)' > import_issue/etc/test.py
echo 'from data_load.rand_data import RandData' >> import_issue/etc/test.py
echo 'print("Randdata = ", RandData)' >> import_issue/etc/test.py
echo "class RandData:" > import_issue/data_load/rand_data.py
echo ' pass' >> import_issue/data_load/rand_data.py
#
# now perform the test
cd import_issue
python -m etc.test
The reason things in your initial example didn't work out as expected is that your present working directory was probably etc,
and if you run a Python script located in etc, it tries to import data_load relative to the script's directory (etc), and below etc there is no directory named data_load that has an __init__.py file in it.
My suggested fix is to go up to the common parent directory (this will now be your present working directory) and run etc/test.py as a module: python -m etc.test.
The reason is that test.py is in a directory etc with an __init__.py, so you should import it as etc.test and not call it directly.
Calling a file that is part of a package directly, as etc/test.py, is not really recommended and can provoke some rare, confusing situations.
Instead of running python test.py in the etc directory, run the script from the parent directory of etc and data_load.
i.e. what you need to do in the terminal is (assuming you are still in the etc directory):
1. `cd ..`
2. `python -m etc.test`
This will work.
The reason you are facing the error is that, for the import in test.py, Python looks for the data_load module on sys.path (which starts with the script's directory), and since there is no data_load module in the etc directory it gives this error.
How to solve the above problem:
1. Append to sys.path in code (see the sketch below), or
2. Run the script as a module from the project's main folder, which contains all the packages, modules, and scripts.
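For the first option, a minimal sketch of what could go at the top of etc/test.py (assuming the data_load/etc layout above):
import os
import sys

# Make the common parent directory (which contains data_load/) importable,
# regardless of where the script is invoked from.
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from data_load.rand_data import RandData
That said, running it as python -m etc.test (the second option) avoids touching sys.path at all.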
I have a Python script, say myscript.py, that uses relative module imports, i.e. from .. import module1, where my project layout is as follows:
project
+ outer_module
  - __init__.py
  - module1.py
  + inner_module
    - __init__.py
    - myscript.py
    - myscript.sh
And I have a Bash script, say myscript.sh, which is a wrapper for my python script, shown below:
#!/bin/bash
python -m outer_module.inner_module.myscript "$@"
This works to execute myscript.py and forwards the arguments to my script as desired, but it only works when I call ./outer_module/inner_module/myscript.sh from within the project directory shown above.
How can I make this script work from anywhere? For example, how can I make this work for a call like bash /root/to/my/project/outer_module/inner_module/myscript.sh?
Here are my attempts:
When executing myscript.sh from anywhere else, I get the error: No module named outer_module.inner_module. Then I tried another approach to execute the bash script from anywhere, by replacing myscript.sh with:
#!/bin/bash
scriptdir=`dirname "$BASH_SOURCE"`
python "$scriptdir/myscript.py" "$@"
When I execute myscript.sh as shown above, I get the following:
Traceback (most recent call last):
File "./inner_module/myscript.py", line 10, in <module>
from .. import module1
ValueError: Attempted relative import in non-package
This is due to the relative import in myscript.py mentioned earlier, from .. import module1.
As the error message says:
ValueError: Attempted relative import in non-package
The solution to this is to create a package, and have your script execute with that package in its path.
You already have a package, as you have the __init__.py files in those directories; but you only have that package on your path when you are calling it from the project directory, as you mentioned; that's because python -m adds the current working directory to sys.path.
To fix this, just add the project directory to your Python path, and then invoke it with python -m outer_module.inner_module.myscript:
#!/bin/bash
export PYTHONPATH=$PYTHONPATH:$(dirname "$BASH_SOURCE")/../..
python -m outer_module.inner_module.myscript "$@"
You need to include the path to the outer module's parent directory in the PYTHONPATH environmental variable, then you can use the same command you used in the first script from anywhere.
PYTHONPATH lists additional directories where Python searches for modules you try to import:
#!/bin/bash
export PYTHONPATH=$PYTHONPATH:PATH/TO/MODULE/PARENTDIR
python -m outer_module.inner_module.myscript "$@"
Two .py files:
./src/foo.py
def bar():
    print('bar!')
./tests/testfoo.py
from foo import bar
print('testing')
bar()
in the root folder './', calling
python ./tests/testfoo.py
the result is
Traceback (most recent call last):
File "./tests/testfoo.py", line 1, in <module>
from foo import bar
ImportError: No module named foo
So when running testfoo.py, the module foo naturally could not be found by the interpreter. Can I give any parameter to the interpreter to look up modules in the folder 'src'?
Edit #1
I also added two empty __init__.py files:
./src/__init__.py
./tests/__init__.py
still getting the same error.
Edit #2 (Solution)
I solved the problem by adding a new initialization .py file:
prep.py
import os, sys, runpy

# add ./src to sys.path so that "from foo import bar" resolves
cur = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(cur, 'src'))
# run the script given on the command line (e.g. tests/testfoo.py) as __main__
runpy.run_path(sys.argv[1], run_name='__main__')
then called:
python prep.py tests/testfoo.py
It worked.
Try this:
import sys
sys.path.append('your/path')
import testfoo
You may need to make ./tests an absolute path...
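Concretely for this layout, the placeholders could be filled in like this (a sketch, assuming it lives in a small driver script in the root folder './'):
import os
import sys

root = os.path.dirname(os.path.abspath(__file__))  # the root folder './'
sys.path.append(os.path.join(root, 'src'))    # so "from foo import bar" resolves
sys.path.append(os.path.join(root, 'tests'))  # so testfoo itself is importable

import testfoo  # runs testfoo.py's module-level code, which calls bar()
Deriving the paths from __file__ avoids depending on the current working directory.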
The environment variable PYTHONPATH keeps the search paths for Python modules. For example
$ echo $PYTHONPATH
/home/username/lib/python2.6/site-packages:/usr/local/stsci_python/lib/python:
Modify this to add your directory:
export PYTHONPATH="$PYTHONPATH:/your/new/path"
(This is on BASH, by the way.)
Put an __init__.py in those directories to indicate to Python that they should be treated like packages.
./src/__init__.py
./tests/__init__.py
This helps when you work from the project directory, since Python checks the current directory in cases like the interactive interpreter and the -c and -m options. But it will not fix the issue if you were to run it from some random location like:
/some/other/path/ $ /projects/tests/testfoo.py
What you need is to modify your PYTHONPATH to include that location:
export PYTHONPATH=$PYTHONPATH:/projects
This can be added to either your shell environment, or done manually each time.