Path appended but Python does not find module

I have the following structure:
~/git/
~/git/folder1
~/git/folder2
In ~/git/folder1 I have main.py, which does the following import:
import folder2.future_data as future_data
which throws the following error:
import folder2.future_data as f_d
ImportError: No module named folder2.future_data
Despite my $PATH containing
user#mac-upload:~$ echo $PATH
/home/user/anaconda2/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/home/user/git/folder2
Why am I unable to import from folder2 despite it being in my path?
Am I missing something?

Try putting an empty __init__.py file in each directory (~/git, ~/git/folder1, and ~/git/folder2). Then do export PYTHONPATH=${HOME}/git:$PYTHONPATH (assuming bash shell).
This will also allow you to just set your PYTHONPATH once at the top level and be done with it. If you add more directories (modules) that you need to import, you can just keep adding __init__.py files to your structure (instead of having to constantly modify your PYTHONPATH every time your file/directory structure changes).
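A minimal, self-contained sketch of that approach: the layout from the question is rebuilt in a temporary directory (standing in for ~/git), and the VALUE constant is made up for the demo. With an __init__.py in folder2 and the parent directory on the search path, the import works:

```python
import os
import sys
import tempfile

# Recreate the layout from the question in a temp dir (stand-in for ~/git)
git = tempfile.mkdtemp()
os.makedirs(os.path.join(git, "folder2"))
open(os.path.join(git, "folder2", "__init__.py"), "w").close()
with open(os.path.join(git, "folder2", "future_data.py"), "w") as f:
    f.write("VALUE = 42\n")  # made-up module content for the demo

# The in-process equivalent of: export PYTHONPATH=$HOME/git
sys.path.insert(0, git)

import folder2.future_data as future_data
print(future_data.VALUE)  # 42
```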

You can explicitly add the path inside the main.py script before doing the import. Note that Python does not expand ~ in paths, so expand it first:
import os
import sys
sys.path.append(os.path.expanduser('~/git/folder2'))
import future_data


Getting a ModuleNotFoundError when trying to import from a particular module

The directory I have looks like this:
repository
/src
/main.py
/a.py
/b.py
/c.py
I run my program via python ./main.py and within main.py there's an import statement: from a import some_func. I'm getting ModuleNotFoundError: No module named 'a' every time I run the program.
I've tried running the Python shell and running the commands import b or import c and those work without any errors. There's nothing particularly special about a either, it just contains a few functions.
What's the problem and how can I fix this issue?
repository/
    __init__.py
    src/
        __init__.py
        main.py
        a.py
        b.py
        c.py
In the __init__.py in repository, add the following line:
from . import src
In the __init__.py in src, add the following line:
from . import main
from . import a
from . import b
from . import c
Now from src.a import your_func is going to work in main.py.
Maybe you could try using a relative import, which allows you to import modules from other directories relative to the current file's location.
Note that you need to add a dot (.) before the module name when using a relative import; this indicates that the module is in the same package as the current file:
from . import a
Or try running it from a different directory and appending the path to src like this (note that '/src' alone would point at the filesystem root, so use the full path):
import sys
sys.path.append('/path/to/repository/src')
You could also try using the PYTHONPATH (environment variable) to add a directory to the search path:
Open your terminal and navigate to the directory containing the main.py file (/src).
Set the PYTHONPATH environment variable to include the current directory, by running the following command
export PYTHONPATH=$PYTHONPATH:$(pwd)
Finally, you could run the program with the -m flag from the repository directory, so that Python knows to look for the a module inside the src package:
python -m src.main
I've had similar problems in the past. Imports in Python depend on a lot of things: how you run your program (as a script or as a module) and what your current working directory is.
That is why I've created a new import library, ultraimport. It gives the programmer more control over imports and lets you do file-system-based, relative imports.
Your main.py could look like this:
import ultraimport
a = ultraimport('__dir__/a.py')
This will always work no matter how you run your code or what your sys.path contains, and no __init__.py files are necessary.

Python 3.8.x Intra and Extra Imports

When defining multiple python 3 packages in a project, I cannot determine how to configure the environment (using VS Code) such that I can run a file in any package with intra-package imports without breaking extra-package imports.
Example with absolute paths:
src/
foo/
a.py
b.py
goo/
c.py
# b.py contents
from foo.a import *
# c.py contents
from foo.b import *
where PYTHONPATH="${workspaceFolder}\src", so it should be able to see the foo and goo directories.
I can run c.py, but running b.py gives a ModuleNotFoundError: "No module named 'foo'".
Modifying b.py to use relative paths:
# modified b.py contents
from a import *
Then allows me to run b.py, but attempting to then run c.py gives a ModuleNotFoundError: "No module named 'b'".
I have also added __init__.py files to the foo/ and goo/ directories, but the errors still occur.
I did set the cwd in the launch.json file to be the ${workspaceFolder} directory, if that is relevant.
A solution I was trying to avoid, but does work in-case there are no better solutions, is putting each individual package-dependency onto the PYTHONPATH environment variable.
E.g., "${workspaceFolder}\src;${workspaceFolder}\foo" when running c.py.
But this is not an easily scalable solution and is also something that needs to be tracked across project changes, so I do not think it is a particularly good or pythonic solution.
In fact, you can see that this doesn't do what you might expect if you print the path: it adds the literal string ".." to sys.path instead of the resolved parent directory.
import sys
print(sys.path)
sys.path.append('..')
print(sys.path)
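To see the difference, a quick sketch: the string is stored verbatim on sys.path, while os.path.abspath gives the resolved parent of the current working directory:

```python
import os
import sys

sys.path.append('..')
print(sys.path[-1])               # the literal string '..'
resolved = os.path.abspath('..')  # the actual parent of the current working directory
print(resolved)
```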
So, what we have to do is add the path of the module we need to sys.path. Here we have three ways:
Put your .py file into a directory that is already on the module search path;
Create a new .pth file under \Python\Python310\Lib\site-packages and write the path of the module in it, one path per line;
Use the PYTHONPATH environment variable, as mentioned above.
Then we can use either of these two calls to add the path:
sys.path.append('a')
sys.path.insert(0, 'a')
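For the .pth route, the directory to put the file in varies by installation. A hedged sketch to locate it (sysconfig's "purelib" path is a common, but not universal, choice; the mypaths.pth name is made up):

```python
import sysconfig

# Directory where third-party packages (and .pth files) are typically installed;
# the exact location depends on your interpreter/venv
site_packages = sysconfig.get_paths()["purelib"]
print(site_packages)
# A file like site_packages/mypaths.pth containing one directory per line
# is read at interpreter startup, and each line is appended to sys.path.
```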

How to import module from other directory in python

I have a file tree like this:
app
|---src
| |---vlep/config.py
|
|---tests
|---conftest.py
From conftest.py I am trying to import vlep.config as config and I am getting ModuleNotFoundError: No module named 'vlep'. I am using venv and I added absolute paths for the app, src and vlep directories to venv/bin/activate, so when I activate the virtualenv these directories are in the PYTHONPATH env var. I thought that this would do the job, but no... and I have no idea why. What am I doing wrong?
You can insert the path to the src folder in sys.path like this:
import os
import sys
sys.path.insert(0, f'{os.path.dirname(os.path.abspath(__file__))}/../src')
from vlep import config
This way the absolute path to your src directory will be first when Python resolves where to import the module from. And it also does not matter what directory you run conftest.py from.
You need to set the search path manually to import modules from another directory. You can add the directory to the PYTHONPATH variable to get your program working.
In Linux, you can use the following command in the terminal to set the path:
export PYTHONPATH='path/to/directory'
On a Windows system (cmd, where quotes would become part of the value):
set PYTHONPATH=path\to\directory
To see if PYTHONPATH variable holds the path of the new folder, you can use the following command:
echo $PYTHONPATH
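To confirm that a PYTHONPATH entry actually lands on sys.path, one way is to spawn a child interpreter with the variable set (the /tmp/mylibs path is just a placeholder and does not need to exist):

```python
import os
import subprocess
import sys

# Placeholder directory; PYTHONPATH entries appear on sys.path even if they don't exist
env = dict(os.environ, PYTHONPATH="/tmp/mylibs")
out = subprocess.run(
    [sys.executable, "-c", "import sys; print('/tmp/mylibs' in sys.path)"],
    env=env, capture_output=True, text=True,
)
print(out.stdout.strip())  # True
```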
Then, in conftest.py, you can do your import:
import vlep.config as config
Thanks for your time, guys. After all, there was nothing wrong with what I was doing to allow the import. The export PYTHONPATH=":/home/user/app/src/vlep" added to the end of venv/bin/activate was enough. The problem was that I was running a script that was changing the value of PYTHONPATH.

Basic Python import mechanics

I have the following directory tree:
project/
A/
__init__.py
foo.py
TestA/
__init__.py
testFoo.py
the content of testFoo is:
import unittest
from A import foo
From the project directory I run python TestA/testFoo.py
and get ModuleNotFoundError: No module named 'A'.
I have two questions: how do I import and run A.foo from TestA.testFoo, and why is the import logic in Python so hard to grasp? Isn't there a debugging trick to solve this kind of issue quickly? Sorry to bother you with such basic questions.
When you execute a file, Python builds the module search path (sys.path). Imports are resolved against this path, which starts from the directory of the script you are executing and includes packages (directories containing an __init__.py) beneath it. If you want to import from a directory at the same level, you need to modify the search path or change the architecture of your project so the file you execute is always at the top level.
You can add a directory to your path like this:
import sys
sys.path.insert(0, "/path/to/directory")
You can read more about the import system here: https://docs.python.org/3/reference/import.html
The best way in my opinion is to not touch sys.path at all and instead move your test directory into the directory that contains the tested files:
project/
    A/
        __init__.py
        foo.py
        TestA/
            __init__.py
            testFoo.py
Then run python -m unittest from your A or project directory; it will search the current directory and its subdirectories for tests and execute them.
More on unittest here : https://docs.python.org/3/library/unittest.html
Add the project folder (the parent of A) to sys.path first:
import sys
sys.path.insert(0, "/path/to/project")
and try the import again.
Can you try this?
Create an empty file __init__.py in the subdirectory TestA, and add at the beginning of the main code:
from __future__ import absolute_import
Then import as below:
import A.foo as testfoo
The recommended way in Python 3 may be as below:
$ pwd
/home/user/project
$ python -m TestA.testFoo
Executing a module with python -m is a good way to avoid fragile relative references.
Python cannot find A because it only searches sys.path (which includes PYTHONPATH) for modules.
Python automatically adds the directory of the top-level script to sys.path, not the current working directory. So if you add print(sys.path) in testFoo.py, you will see that only project/TestA has been added to sys.path.
In other words, project itself is not on sys.path, so how could Python find the module A?
So you have to add the project folder to sys.path yourself; this is only needed in the top-level script, something like the following:
import unittest
import sys
import os
file_path = os.path.abspath(os.path.dirname(__file__)).replace('\\', '/')  # directory containing this file
lib_path = os.path.abspath(os.path.join(file_path, '..')).replace('\\', '/')  # its parent: the project directory
sys.path.append(lib_path)
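An equivalent sketch of the same parent-directory computation with pathlib, without the string replacing (the example path is hypothetical):

```python
from pathlib import Path

# Hypothetical location of the test file inside project/TestA/
test_file = Path("/home/user/project/TestA/testFoo.py")
project_root = test_file.parent.parent
print(project_root)  # /home/user/project
```

In the real script you would start from Path(__file__).resolve() instead of a hard-coded path.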

Access file from nested directory under PYTHONPATH

I have checked many SO questions, but they didn't help me solve my issue.
I have a folder structure:
|--test/foo.py
|--library/ #This is set as my PYTHONPATH
|--|--file1.py
|--|--folder1
|--|--|--util.py
I am trying to access util.py from foo.py.
Note: At this point I am able to access all the files under library from the test folder. But whenever I try to access library/folder1/util.py, it gives an error: "ImportError: No module named util".
I have tried this so far:
foo.py
import os
import sys
import file1
sys.path.insert(0, '/folder1/')
import util
util.function_name
#do something
This approach works, but then I am not able to use library/file1.py.
Is there a cleaner way to avoid this?
Note: These are only folders (maintained to differentiate files), not packages, so I believe I cannot use __init__.py and something like import utils.functionname.
If you want to brute force it, run this from your top-level directory (in your case, the unnamed directory that contains test/ and library/):
export PYTHONPATH=$(myarray=($(find "$(pwd -P)" -type d)) \
&& printf '%s\n' "$(IFS=:; printf '%s' "${myarray[*]}")")
This will add your top-level directory and every subdirectory to the python path. No more nonsense.
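The same idea in-process, as a hedged sketch: walk the tree and put every directory on sys.path. The add_tree_to_path helper name is made up, and the demo tree is created in a temporary directory:

```python
import os
import sys
import tempfile

def add_tree_to_path(top):
    """Add `top` and every subdirectory beneath it to sys.path (made-up helper)."""
    for dirpath, _dirnames, _filenames in os.walk(top):
        if dirpath not in sys.path:
            sys.path.insert(0, dirpath)

# Demo on a small temporary tree mirroring the question's layout
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "library", "folder1"))
add_tree_to_path(root)
print(os.path.join(root, "library", "folder1") in sys.path)  # True
```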
