I've been having trouble with this and cannot find the solution... I've browsed all the SO questions about this but nothing worked in my case... So here's the problem:
I clone a project from a git repository.
I set up a virtualenv called env in the project base dir with the proper Python version (2.7) and install all requirements successfully.
Here's where it gets fun... I navigate to the folder that has my manage.py and run python manage.py makemigrations, which results in
from foo.bar.data import CoolioClass
ImportError: No module named foo.bar.data
My directories look like this (just for illustration):
project_root/
├── env/
├── django/
│   ├── app
│   │   └── models.py  (from foo.bar.data import CoolioClass)
│   ├── django
│   │   └── settings.py
│   └── manage.py
└── foo/
    ├── bar
    │   ├── __init__.py
    │   ├── data.py
    │   └── test.py
    ├── baz
    │   ├── __init__.py
    │   ├── data.py
    │   └── test.py
    └── __init__.py
When I print sys.path in the Python shell, it shows:
/home/johnny/project_root/env/lib/python2.7
/home/johnny/project_root/env/lib/python2.7/plat-x86_64-linux-gnu
/home/johnny/project_root/env/lib/python2.7/lib-tk
/home/johnny/project_root/env/lib/python2.7/lib-old
/home/johnny/project_root/env/lib/python2.7/lib-dynload
/usr/lib/python2.7
/usr/lib/python2.7/plat-x86_64-linux-gnu
/usr/lib/python2.7/lib-tk
/home/johnny/project_root/env/local/lib/python2.7/site-packages
/home/johnny/project_root/env/lib/python2.7/site-packages
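Note that none of those entries is /home/johnny/project_root itself, so the top-level foo package has nowhere to be imported from. A minimal workaround sketch, assuming foo should stay where it is rather than be installed into the virtualenv, is to push the project root onto sys.path early, for example at the top of settings.py:

# settings.py (workaround sketch, not necessarily the cleanest fix): make
# project_root importable so "from foo.bar.data import CoolioClass" resolves.
import os
import sys

# settings.py lives at project_root/django/django/settings.py, so go up two levels.
PROJECT_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))
if PROJECT_ROOT not in sys.path:
    sys.path.insert(0, PROJECT_ROOT)

Exporting PYTHONPATH=/home/johnny/project_root before running manage.py would have a similar effect without editing the file.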
I can run test_authentication.py no problem in PyCharm (Shift + F10). However, when I try to run python test_authentication.py in Anaconda Prompt, I am getting this error:
(etradebot) PS H:\My Drive\etradebot\tests> python test_authentication.py
Traceback (most recent call last):
File "H:\My Drive\etradebot\tests\test_authentication.py", line 4, in <module>
from authentication.authentication import Authentication
ModuleNotFoundError: No module named 'authentication'
This is my project's structure.
etradebot/
├── authentication/
│ ├── __init__.py
│ ├── authentication.py
│
├── etrade/
│ ├── __init__.py
│ ├── etrade.py
│
├── model/
│ ├── __init__.py
│ ├── model.py
│
├── tests/
│ ├── __init__.py
│ ├── test_authentication.py
│ ├── test_etrade.py
│ ├── test_model.py
│ ├── test_utils.py
│
├── utils/
│ ├── __init__.py
│ ├── fake_data.py
│ ├── logging_config.py
│
├── .gitignore
├── chromedriver.exe
├── msedgedriver.exe
├── main.py
├── strategy.py
Everywhere else in my project where I import the authentication.py module, it imports fine using the line:
from authentication.authentication import Authentication
Both my Anaconda Prompt and PyCharm IDE are using the same environment etradebot. Why does my test work in PyCharm but not Anaconda Prompt? Any suggestions are greatly appreciated.
I tried running python test_authentication.py in Anaconda Prompt and expected the unit test to run normally. Running test_authentication.py in PyCharm (Shift + F10) did work. I double-checked that both my Anaconda Prompt and PyCharm were using the same Anaconda environment etradebot, and they are. So I am confused.
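One plausible explanation (an assumption to verify, not a confirmed diagnosis): when a script is run directly, Python puts the script's own directory on sys.path, so from the tests directory the sibling authentication package isn't importable, whereas PyCharm run configurations typically also add the project root. A minimal sketch of a workaround at the top of test_authentication.py:

# test_authentication.py (workaround sketch): make the project root importable
# when this file is executed directly from the tests directory.
import os
import sys

sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), os.pardir)))

from authentication.authentication import Authentication

Running the test from the project root instead (for example python -m unittest tests.test_authentication, assuming it is a unittest-style test) also keeps the root on sys.path without editing the file.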
I have a git project with directory structure similar to this:
.
├── .github/
│   └── workflows/
│       └── test.yml
└── my_packages/
    ├── package_1/
    │   ├── tests/
    │   │   └── test_package.py
    │   ├── package_logic.py
    │   ├── configurations.yml
    │   └── requirements.py
    ├── package_2/
    │   ├── tests/
    │   │   └── test_package.py
    │   ├── package_logic.py
    │   ├── configurations.yml
    │   └── requirements.py
    ├── ...
    └── package_n/
        ├── tests/
        │   └── test_package.py
        ├── package_logic.py
        ├── configurations.yml
        └── requirements.py
There are n packages in this repository and every package is independent of the others, with its own requirements.py, logic, and tests.
There is also a test.yml file, currently not implemented, that needs to create a separate testing environment for every package and run its tests (pytest). Another requirement is that every package defines, in its configurations.yml file, the Python version its tests should run under. And finally, the tests should run only for packages whose code was changed.
How can this be implemented in GitHub Actions? And is it possible to run every package in a separate job (or at least in parallel)?
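Not a full answer, but as a sketch of the "only changed packages" part: a small hypothetical helper script (not part of the original repo) that a workflow step could call to list the packages under my_packages/ whose files differ from a base ref, assuming the checkout fetches enough history for the diff:

# changed_packages.py (hypothetical helper, not part of the original repo):
# print the package directories under my_packages/ that contain changes
# relative to a base ref.
import subprocess
import sys

def changed_packages(base_ref):
    # `git diff --name-only` prints one changed file path per line.
    out = subprocess.check_output(
        ["git", "diff", "--name-only", base_ref, "HEAD"], text=True
    )
    packages = set()
    for path in out.splitlines():
        parts = path.split("/")
        if len(parts) >= 2 and parts[0] == "my_packages":
            packages.add(parts[1])
    return sorted(packages)

if __name__ == "__main__":
    base = sys.argv[1] if len(sys.argv) > 1 else "origin/main"
    for name in changed_packages(base):
        print(name)

A job matrix could then be built from that list (one job per changed package), with each job reading the package's configurations.yml to pick its Python version; running packages as separate jobs also gives the parallelism asked about.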
I am very new to Python and I have the following structure for a project:
server/
├── config/
│   ├── __init__.py
│   ├── application.py
│   ├── dev.py
│   └── qa.py
├── lib/
│   ├── __init__.py
│   ├── redisdb.py
│   ├── logger.py
│   └── geo.py
└── scripts/
    ├── __init__.py
    ├── my_first_script.py
    └── my_second_script.py
and in my_first_script.py file, I have the following code:
import pickle
from lib.redisdb import r
import re
import config.application as appconf
print( appconf.DOCUMENT_ENDPOINT )
partnerData = pickle.loads(r.get("partner_date_all"))
print( len(partnerData) )
When I run this code in the terminal using the command
python server/scripts/my_first_script.py
I am getting the following error:
Traceback (most recent call last):
File "my_first_script.py", line 3, in <module>
from lib.redisdb import r
ImportError: No module named lib.redisdb
I am using Python 2.7 here; when I checked with Python 3, I got the same error. Now how can I execute this file? If my code doesn't import from the other local modules, everything works just fine.
Your modules are all siblings and you didn't declare a parent package.
You could modify your structure this way so your modules can find each other.
server/                       (assuming this is your project root)
└── server/                   (assuming you want to call your package "server")
    ├── __init__.py
    ├── server.py             (your package entry point)
    ├── config/
    │   ├── __init__.py
    │   ├── application.py
    │   ├── dev.py
    │   └── qa.py
    ├── lib/
    │   ├── __init__.py
    │   ├── redisdb.py
    │   ├── logger.py
    │   └── geo.py
    └── scripts/
        ├── __init__.py
        ├── my_first_script.py
        └── my_second_script.py
And now your imports can refer to the parent package, for example:
# my_first_script.py
from server.config import application
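With that layout the script then has to be launched so that the outer project root is on sys.path. A sketch of what my_first_script.py could look like under the restructured layout:

# my_first_script.py (sketch under the restructured layout)
import pickle

from server.config import application as appconf
from server.lib.redisdb import r

print(appconf.DOCUMENT_ENDPOINT)
partner_data = pickle.loads(r.get("partner_date_all"))
print(len(partner_data))

Run it from the outer server/ directory as python -m server.scripts.my_first_script, which keeps the current directory (the project root) on sys.path.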
This is my folder structure
src
├── __init__.py
└── foo
    ├── __init__.py
    └── bar
        ├── __init__.py
        ├── events
        │   ├── __init__.py
        │   └── execute
        │       ├── __init__.py
        │       ├── helloworld.py
        │       └── run.py
        └── settings.py
$ cat src/foo/__init__.py outputs...
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
In src/foo/bar/events/execute/run.py, I want to do something like this:
from foo.bar.events.execute.helloworld import HelloWorld
I get the error
No module named 'foo.bar'
This is how I'm running my app. I understand it's not proper, but there's a reason; I've just simplified the question for brevity:
$ python src/foo/bar/events/execute/run
How do I achieve importing this way from src/foo/bar/events/execute/run.py?
from foo.bar.events.execute.helloworld import HelloWorld
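One workaround sketch (assuming the direct-script invocation has to stay): prepend src to sys.path inside run.py before the package-absolute import:

# run.py (workaround sketch): src is four directories above this file, so put
# it on sys.path before the package-absolute import; that way the import also
# works when run.py is executed directly as a script.
import os
import sys

SRC_DIR = os.path.abspath(
    os.path.join(os.path.dirname(__file__), "..", "..", "..", "..")
)
if SRC_DIR not in sys.path:
    sys.path.insert(0, SRC_DIR)

from foo.bar.events.execute.helloworld import HelloWorld

Otherwise, running it as a module from inside src (python -m foo.bar.events.execute.run) avoids touching the file, since -m keeps the current directory on sys.path.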
I have a python Celery project of the following structure:
├── celeryconfig.py
├── Dockerfile
│ └── _templates
├── framework
│ ├── celery
│ │ ├── celery.py
│ │ ├── __init__.py
│ │ └── __init__.cpython-36.pyc
│ ├── tasks
│ │ ├── tasks.py
│ │ ├── __init__.py
│ ├── utilities
│ │ ├── utilities.py
│ │ ├── __init__.py
│ ├── tests
│ │ ├── __init__.py
│ │ └── test_celery.py
I run it from the top-level directory using the command celery framework.celery.celery worker.
I have tests in the tests directory that I run using nose2. When I run nose2 there, the tests pass without issue. However, when the tests are run as part of a Docker build, the nose2 process fails because it can't resolve the imports, for example:
# test_celery.py
from framework.tasks.tasks import my_function

def test_my_function():
    # Do some testing.
    pass
The import of the function itself resolves, but it then fails on the imports inside the imported file, which look like:
# tasks.py
from framework.utilities.utilities import some_other_function
Part of the issue I'm having is that Celery is particular about how it itself is structured, so attempts to restructure the directory have just resulted in celery not being able to start. I'm not quite sure why the tests only fail in a docker build, which looks like this:
FROM google/cloud-sdk:198.0.0-alpine
COPY . /project
WORKDIR /project
RUN apk add --no-cache python3 python3-dev && pip3 install -r requirements.txt
RUN nose2
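One thing worth checking (an assumption, since the Dockerfile alone doesn't show the cause): whether sys.path inside the container actually contains /project when nose2 starts. A minimal workaround sketch is to pin the project root onto sys.path at the top of the test module before the application imports:

# test_celery.py (workaround sketch): framework/tests/ sits two levels below
# the project root, so add that root to sys.path before the application imports.
import os
import sys

PROJECT_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))
if PROJECT_ROOT not in sys.path:
    sys.path.insert(0, PROJECT_ROOT)

from framework.tasks.tasks import my_function

Setting ENV PYTHONPATH=/project in the Dockerfile would have a similar effect without editing the tests; either way this is a workaround for the path issue, not an explanation of why the local run and the Docker build behave differently.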