Importing modules from a folder in Python 2.7

This sounds ridiculous, as there appear to be countless answers to this question on this site, but I can't find a straightforward solution that doesn't involve temporarily changing my system path on every reload (or using __init__, which doesn't work for my setup). I'm looking for a secure, non-hacky way of getting this done.
Simply put - I have a directory structure as follows:
**root**
>main.py
>**modules**
>>rivescript.py
>>js.py
>**plugins**
>>weather.py
>>synd.py
To make it simple: I would like to import every available module in the subdirectories shown (modules, plugins) natively in main.py.
Pseudo:
#main.py
import "./modules/*.py" as modules_*
import "./plugins/*.py" as plugins_*
and be able to call functions with something like:
plugins_weather.get("3088")
modules_rivescript.RiveScript.reply("localuser", language_input)
Any suggestions? Speed and resource consumption are a big thing for this project.

First, put __init__.py files (which can be empty) in the modules/ and plugins/ directories to mark them as packages.
You can then import your modules in main.py:
import modules.js as js
import modules.rivescript as rivescript
import plugins.weather as weather
import plugins.synd as synd
weather.get("3088") # Usage example

Related

Why can a package import a module that does *not* appear to exist in the folder it imports from?

While trying to look at the implementation of some numpy functions, I noticed something strange (described below).
In this Python script: /Users/myname/opt/anaconda3/envs/ml/lib/python3.7/site-packages/numpy/core/multiarray.py
there is this line: from numpy.core._multiarray_umath import *
However, when I use the terminal to ls all the files under /Users/myname/opt/anaconda3/envs/ml/lib/python3.7/site-packages/numpy/core, there is no _multiarray_umath.py among them, as shown below:
__init__.py
__pycache__
_add_newdocs.py
_aliased_types.py
_asarray.py
_dtype.py
_dtype_ctypes.py
_dummy.cpython-37m-darwin.so
_exceptions.py
_internal.py
_methods.py
_multiarray_tests.cpython-37m-darwin.so
_multiarray_umath.cpython-37m-darwin.so
_operand_flag_tests.cpython-37m-darwin.so
_rational_tests.cpython-37m-darwin.so
_string_helpers.py
_struct_ufunc_tests.cpython-37m-darwin.so
_type_aliases.py
_ufunc_config.py
_umath_tests.cpython-37m-darwin.so
arrayprint.py
cversions.py
defchararray.py
einsumfunc.py
fromnumeric.py
function_base.py
generate_numpy_api.py
getlimits.py
include
info.py
lib
machar.py
memmap.py
multiarray.py
numeric.py
numerictypes.py
overrides.py
records.py
setup.py
setup_common.py
shape_base.py
tests
umath.py
umath_tests.py
I'm really confused about how this mechanism works. Can you explain it in plain English, and also point me to some resources where I can learn the motivation behind this kind of package layout?
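Note that the listing does contain _multiarray_umath.cpython-37m-darwin.so: Python's import system loads compiled C extension modules from .so files just as it loads .py files. A minimal check you could run in that same environment (assuming numpy imports there) to see the resolution:
import importlib.machinery
import importlib.util

# Filename suffixes the import system accepts for C extension modules
print(importlib.machinery.EXTENSION_SUFFIXES)

# Where the "missing" module actually comes from
spec = importlib.util.find_spec("numpy.core._multiarray_umath")
print(spec.origin)  # ...site-packages/numpy/core/_multiarray_umath.cpython-37m-darwin.so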

Python import function

root
- Module_1
  - utils.py
- Module_2
  - basic.py
I have a couple of functions in utils.py that I want to use in basic.py. I have tried every possible option and referred to many answers, but I am not able to do it, and I don't want to use sys.path. Is it possible? Thanks in advance (:
If I have to add __init__.py, where exactly should I add it?
In fact there are multiple ways.
I would personally suggest one where you neither have to mess with the PYTHONPATH environment variable nor change sys.path.
The simplest suggestion, and the preferred one if it is possible in your context:
If you can, then just move Module_2/basic.py one level up, so that it is located in the parent directory of Module_1 and Module_2.
Create an empty file Module_1/__init__.py
and use the following import in basic.py:
from Module_1.utils import function
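The resulting layout would then be:
root
- Module_1
  - __init__.py
  - utils.py
- Module_2
- basic.py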
My second preferred suggestion:
I'd suggest creating two empty files:
Module_1/__init__.py
Module_2/__init__.py
By the way, I hope the real directory names do not contain any uppercase characters; that is strongly discouraged. Unlike Python 2, Python 3 contains no (or almost no) standard modules with uppercase letters in their names, so I'd try to keep the same spirit.
Then in basic.py just write:
from Module_1.utils import function
!!! But: instead of typing
python Module_2/basic.py
you now have to be in the parent directory of Module_1 and Module_2 and type
python -m Module_2.basic
or, on a Unix-like operating system, you could also type
PYTHONPATH="/path/to/parent_of_Module_1" python Module_2/basic.py
My not so preferred suggestion:
It looks simpler, but it is a little more magic and might not be desirable in bigger projects: you can get lost, and when you start using style checkers like flake8 you will get coding-style warnings.
Just add the following lines at the beginning of Module_2/basic.py:
import os
import sys

# Directory two levels up from this file: the parent of Module_1 and Module_2
TOPDIR = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
# Prepend it to the module search path
sys.path.insert(0, TOPDIR)
and then import the same way as in my first solution:
from Module_1.utils import function
If you want to make a custom package importable, put an __init__.py in that directory. In your case, __init__.py should be in the Module_1 directory. Similarly, if you want to use the code of basic.py somewhere else, put an __init__.py in the Module_2 directory.
Try adding __init__.py in the Module_1 folder and importing in basic.py like this:
from Module_1.utils import func

Is there any advantage to importing a module by relative path vs actual package name?

Let's say I have a folder structure as below.
project/
-> app/
--> __init__.py (has db = SQLAlchemy(app))
--> model.py
I need to import db in model.py. I can either import it using
from app import db
or
from . import db
Is there a difference between the two? Does one method have any advantages over the other method?
Absolute imports are preferred because they are quite clear and straightforward. It is easy to tell exactly where the imported resource is, just by looking at the statement. In fact, PEP 8 explicitly recommends absolute imports.
Sometimes, however, absolute imports can get quite verbose, depending on the complexity of the directory structure. Imagine having a statement like this:
from package1.subpackage2.subpackage3.subpackage4.module5 import function6
This looks ridiculous, right?
This is where relative imports come into the picture. A relative import specifies the resource to be imported relative to the current location, that is, the location where the import statement is.
The complex import statement above becomes:
from ..subpackage4.module5 import function6
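For context, each leading dot steps up one package level from the module doing the import. A minimal sketch (the caller's location here is an assumption, chosen so the dots line up):
# Assume this file is package1/subpackage2/subpackage3/subpackage_x/caller.py
#   .  -> package1.subpackage2.subpackage3.subpackage_x  (current package)
#   .. -> package1.subpackage2.subpackage3               (its parent)
from ..subpackage4.module5 import function6  # one level up, then back down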
Hope this helps!

ValueError: Attempted relative import in non-package for running standalone scripts In Flask Web App

I have a Flask web app and its structure is as follows:
/app
/__init__.py
/wsgi.py
/app
/__init__.py
/views.py
/models.py
/method.py
/common.py
/db_client.py
/amqp_client.py
/cron
/__init__.py
/daemon1.py
/daemon2.py
/static/
/main.css
/templates/
/base.html
/scripts
/nginx
/supervisor
/Dockerfile
/docker-compose.yml
In app/app/cron I have written standalone daemons which I want to call outside of Docker, e.g.
python daemon1.py
daemon1.py code
from ..common import stats
from ..method import msapi, dataformater
from ..db_client import db_connection
def run_daemon():
......
......
......
if name =="main":
run_daemon()
So when I try to run daemon1.py, it throws ValueError: Attempted relative import in non-package.
Please suggest the right approach for the imports as well as for structuring these daemons.
Thanks in advance.
I ran into the exact same problem with an app that was running Flask and Celery. I spent far too many hours Googling for what should be an easy answer. Alas, there was none.
I did not like the "python -m" syntax, as it was not terribly practical for calling functions within running code. And on account of my seemingly small brain, I was not able to come to grips with any of the other answers out there.
So... there is the wrong way and the long way. Both of them work (for me) and I'm sure I'll get a tongue-lashing from the community.
The Wrong Way
You can load a module directly from a file path using the imp package, like so:
import imp
import os

# Load common.py from the directory of the current file
common = imp.load_source('common', os.path.dirname(os.path.abspath(__file__)) + '/common.py')
result = common.stats()  # not sure how you call stats, but you hopefully get the idea
I had a quick search for the references that said that is a no-no, but I can't find them...sorry.
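Worth noting: imp has been deprecated since Python 3.4, so on Python 3 the supported equivalent goes through importlib. A sketch with the same hypothetical common.py path:
import os
import importlib.util

spec = importlib.util.spec_from_file_location(
    'common', os.path.join(os.path.dirname(os.path.abspath(__file__)), 'common.py'))
common = importlib.util.module_from_spec(spec)
spec.loader.exec_module(common)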
The Long Way
This method involves temporarily appending each of your module paths to sys.path. This has worked for me on my Docker deploys and works nicely regardless of the container's directory structure. Here are the steps:
1) You must import the relevant modules from the parent directories in your __init__ files. This is really the whole point of the __init__: making the modules in its package importable. So, in your case, cron/__init__.py should re-export common from the parent package:
from .. import common
It doesn't look like your directories go any higher than that, but you would do the same for any other package levels up as well.
2) Now you need to append the path of the module to sys.path. You can see what is in there right now by running:
sys.path
As expected, you probably won't see any of your modules in there. That means Python can't figure out what you want when you import the common module. In order to add the path, you need to figure out the directory structure, and you will want to make this dynamic to account for changing directories.
It's worth noting that this will need to run each time your module runs. I'm not sure what your cron module is, but in my case it is Celery. So, this runs only when I fire up workers and the initial crontabs.
Here is the hack I threw together (I'm sure there is a cleaner way to do it):
import os

curr_path = os.getcwd()  # current path where cron is running
parent_path = os.path.abspath(os.path.join(curr_path, '..'))  # the parent directory's path
parent_dir = os.path.basename(parent_path)  # the parent directory's name
while parent_dir != 'project_name':  # walk up until you reach the top directory - should be the project name
    parent_path = os.path.abspath(os.path.join(parent_path, '..'))
    parent_dir = os.path.basename(parent_path)
In your case, this might be a challenge since you have two directories named 'app'. Your top level 'app' is my 'project_name'. For the next step, let's assume you have changed it to 'project_name'.
3) Now you can append the path for each of your modules to sys.path:
import sys
sys.path.append(parent_path + '/app')  # the full path, not just the directory name
Now if you run sys.path again, you should see the path to /app in there.
In summary: make sure all of your __init__'s have imports, determine the paths to the modules you want to import, and append those paths to sys.path.
I hope that helps.
@greenbergé Thank you for your solution. I tried it, but it didn't work for me.
So to make things work I changed my code a little bit. Instead of relying on the __main__ block of daemon1.py, I call run_daemon() directly:
python -c "from app.cron.daemon1 import run_daemon; run_daemon()"
It is not an exact solution to the problem, but things worked for me.

Python Import Conventions: explicit bin/lib imports

ADVICE REQUEST: Best Practices for Python Imports
I need advice re: how major projects do Python imports and the standard / Pythonic way to set this up.
I have a project Foobar that's a subproject of another, much larger and well-used project Dammit. My project directory looks like this:
/opt/foobar
/opt/foobar/bin/startFile.py
/opt/foobar/bin/tests/test_startFile.py
/opt/foobar/foobarLib/someModule.py
/opt/foobar/foobarLib/tests/test_someModule.py
So, for the Dammit project, I add to PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/opt/foobar
This lets me add a very clear, easy-to-track-down import of:
from foobarLib.someModule import SomeClass
to all 3 of:
* Dammit - using foobarLib in the import lets everyone know it's not in the Dammit project, it's in Foobar.
* My startFile.py is an adjunct process that has the same pythonpath import.
* test_someModule.py has an explicit import, too.
PROBLEM: This only works for foobarLib, not bin.
I have tests I want to run in /opt/foobar/bin/tests/test_startFile.py. But how do I set up the path and do the import? Should I manipulate the path relative to the test file, like this:
import os
import sys

PROJ_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))  # -> /opt/foobar/bin
sys.path.insert(0, PROJ_ROOT)
Or, should I rename the bin dir to be foobarBin so I do the import as:
from foobarBin.startFile import StartFile
I'm tempted to do the following:
/opt/foobarProject/foobar
/opt/foobarProject/foobar/bin/startFile.py
/opt/foobarProject/foobar/bin/tests/test_startFile.py
/opt/foobarProject/foobar/foobarLib/someModule.py
/opt/foobarProject/foobar/foobarLib/tests/test_someModule.py
then, I can do all my imports with:
import foobar.bin.startFile
import foobar.foobarLib.someModule
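For that layout to work, each package directory would need an __init__.py, and /opt/foobarProject becomes the single PYTHONPATH entry. A quick sanity check (names taken from the trees above, paths hypothetical):
import sys
sys.path.insert(0, '/opt/foobarProject')  # the only extra search-path entry

from foobar.bin.startFile import StartFile          # needs foobar/__init__.py and bin/__init__.py
from foobar.foobarLib.someModule import SomeClass   # needs foobarLib/__init__.py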
Basically, for large Python projects (and their adjuncts), it seems to me we have these goals:
* minimize the number of dirs added to PYTHONPATH;
* make it obvious where imports are coming from: people use 'import lib.thing' a lot, and if there's more than one directory on the PYTHONPATH named 'lib', it's troublesome and non-obvious where that comes from, short of invoking python and searching sys.modules, etc.;
* minimize the number of times we add paths to sys.path, since that happens at runtime and is somewhat non-obvious.
I've used Python a long time and been part of projects that do it various ways, some of them more stupid and some less so. I guess I'm wondering whether there is a best practice here, and the reason for it. Is my convention of adding the foobarProject layer okay, good, bad, or just one more way among many?
