Import all packages in main.py - python

I'd like to import all my packages in only one file.
Let's assume that I have a main.py file where I call all my classes (from other .py files located in a src folder):

```
.
├── main.py
└── src/
    ├── package1.py
    └── package2.py
```

main.py looks like this:

```python
from src.package1 import *
from src.package2 import *

def main():
    class1 = ClassFromPackage1()
    class2 = ClassFromPackage2()

if __name__ == '__main__':
    main()
```
In package1.py I import, let's say, numpy, scipy and pandas:

```python
import numpy
import scipy
import pandas

class ClassFromPackage1():
    # Do stuff using numpy, scipy and pandas
```
and in package2.py I use numpy and scikit-learn:

```python
import numpy
import sklearn

class ClassFromPackage2():
    # Do stuff using numpy and sklearn
```
Is there a way to import all the packages in one file Foo.py, where I only write:

```python
import numpy
import sklearn
import scipy
import pandas
```

and import this Foo.py in the src .py files? Like this, for example, with package1.py:

```python
import Foo

class ClassFromPackage1():
    # Do stuff using numpy, scipy and pandas
```

Is this a good idea? Does it reduce memory consumption? Will it help Python start main.py faster?

It looks like you want to make the code cleaner? What you can do is create a file like foo.py and put all the imports in it. Then you can import the modules inside foo.py by doing

```python
from foo import *
```

This will indirectly import all the modules.

The way you have already done it is how it is usually done. Similar to header files in C/C++, you make the dependencies explicit, and that is a good thing.
You asked if it will run faster; the answer is no. All imports are shared: a module is executed only once and then cached, so importing it again elsewhere is cheap. This sometimes causes unwanted side effects, but that is not the question here.
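The sharing can be checked directly: Python caches every imported module in sys.modules, so a second import anywhere in the program reuses the same module object rather than re-executing the file. A minimal illustration:

```python
import sys
import json

# Importing a module again does not re-execute it; Python returns the
# cached object from sys.modules.
import json as json_again

print(json is json_again)            # True: same module object
print(sys.modules["json"] is json)   # True: both point at the cache entry
```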

Related

How to build package like pandas/numpy where pd/np is an object with all the functions

As per the title, I am trying to build a Python package myself. I am already familiar with writing Python packages, having read the notes from https://packaging.python.org/en/latest/tutorials/packaging-projects/ and https://docs.python.org/3/tutorial/modules.html#packages. These gave me an idea of how to write a bunch of classes/functions that I can import.
What I want is to write a package like pandas or numpy, where I run an import and it works as an "object", that is to say most/all of the functions are dotted after the package.
E.g. after importing

```python
import pandas as pd
import numpy as np
```

pd and np have all the functions, which can be called with pd.read_csv() or np.arange(), and running dir(pd) and dir(np) gives me all the various functions available from them. I tried looking at the pandas source code to try and replicate its functionality, but I could not do it. Maybe there are some parts that I am missing or misunderstanding. Any help or a pointer in the right direction would be much appreciated.
In a more general example, I want to write a package and import it to have its functionality dotted after it: import pypack and call pypack.FUNCTION(), instead of having to import the function with from pypack.module import FUNCTION and call FUNCTION(), or instead of importing it as just a submodule.
I hope my question makes sense, as I have no formal training in writing software.
Let's assume you have a module (package) called my_library:

```
.
├── main.py
└── my_library/
    └── __init__.py
```

/my_library/__init__.py:

```python
def foo(x):
    return x
```

In your main.py you can import my_library:

```python
import my_library

print(my_library.foo("Hello World"))
```

The directory with __init__.py will be your package and can be imported.
Now consider an even deeper example:

```
.
├── main.py
└── my_library/
    ├── __init__.py
    └── inner_module.py
```

inner_module.py:

```python
def bar(x):
    return x
```

In /my_library/__init__.py you can add:

```python
from .inner_module import bar

def foo(x):
    return x
```

You can then use bar() in your main.py as follows:

```python
import my_library

print(my_library.foo("Hello World"))
print(my_library.bar("Hello World"))
```
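This re-export pattern is exactly what pandas and numpy do in their own `__init__.py` files, which is why their functions appear directly on the module object. A quick check (assuming both packages are installed):

```python
import numpy as np
import pandas as pd

# Both packages pull submodule contents into their top-level namespace
# in __init__.py, so the functions hang directly off the module object
# and show up in dir().
print("read_csv" in dir(pd))  # True
print("arange" in dir(np))    # True
```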

Importing package python

Pretty new to Python, but I'm trying to write a reusable package for some data processing I've been doing a lot recently.
I want to structure my project like:

```
src/
    mypackage/*.py
test/
    test-type1/*.py
    test-type2/*.py
    ...
```

Each 'test-type' will have multiple scripts underneath it. My issue is that when I try to import mypackage as I would from the root directory, it explodes. I'm curious what a good way to handle this is. I can't just use relative paths, since I get an error:

```
attempted relative import with no known parent package
```

Any help would be appreciated.
Edit:
If a test file were in the root directory, I could just do

```python
import mypackage.{file}
```

With the change in directories, I can't do that without getting the relative import error.
Edit2:
Example code from one of my tests. Folder structure:

```
rl/
    __init__.py
    callbacks/
        __init__.py
        checkpoint.py
        earlystop.py
        progress.py
    training/
        __init__.py
        train.py
tests/
    cartpole-v1/
        train.py
```

```python
import gym
import torch
import pandas as pd
import matplotlib.pyplot as plt

from pole_actor import PoleActor
from ...rl.training import train
from ...rl.callbacks import Checkpoint, EarlyStop, ProgressBarCallback

def a2c(env: gym.Env):
    actor = PoleActor(batch_size=50, max_memory=500, optimizer=torch.optim.Adam, lr=3e-4)
    rewards = train(env, actor, 3000, callbacks=[
        Checkpoint(save_path='models/pole-callback.ach5', patience=100),
        EarlyStop(patience=500, delay=1000),
        # Need a better way of selecting the 'best' model; ideally we should
        # always get a good model out of here
        ProgressBarCallback()
    ])
    smoothed = pd.Series.rolling(pd.Series(rewards), 10).mean()
    plt.plot()
    plt.xlabel('Episodes')
    plt.ylabel('Reward')
    plt.plot(rewards)
    plt.plot(smoothed)
    plt.show(block=True)

if __name__ == "__main__":
    env = gym.make("CartPole-v1")
    a2c(env)
```
You could use these lines of code before importing the library:

```python
import os, sys, inspect

current_dir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parent_dir = os.path.dirname(current_dir)
sys.path.insert(0, parent_dir)
```

This code will make imports resolve from the parent directory instead of the current directory.
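A slightly simpler variant of the same idea (a sketch, not the only fix): `__file__` already holds the path of the current file, so `inspect` is not needed.

```python
import os
import sys

# Directory containing this file; __file__ works in any script or module.
current_dir = os.path.dirname(os.path.abspath(__file__))
parent_dir = os.path.dirname(current_dir)

# Prepend the parent directory so absolute imports resolve from there.
if parent_dir not in sys.path:
    sys.path.insert(0, parent_dir)
```

For the tests/cartpole-v1 layout above you would need to go up two levels, to the repository root, so that `import rl` resolves.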

Import python packages like NumPy or Pandas in imported modules

I have a main Python program called mainProgram.py. It looks like this (simplified example):

```python
import numpy as np
import pandas as pd

from packOne import functionOne
from packTwo import functionTwo

ResultOne = functionOne()
ResultTwo = functionTwo()
```

A file __init__.py was also created.
After running the main program mainProgram.py I received a NameError:

```
NameError: name 'np' is not defined
```

In packOne.py I use some functions from NumPy, but I didn't import NumPy in packOne.py. Should I also import NumPy in packOne and all the other modules I'm going to import? Are there any elegant solutions for importing packages like NumPy or Pandas once in the main Python program?
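For context (a minimal sketch, with illustrative names): every module has its own namespace, so packOne.py must import NumPy itself; the `np` bound in mainProgram.py is not visible inside it. Re-importing is cheap, because the module is cached after the first import.

```python
# packOne.py (sketch): the module imports what it uses, regardless of
# what mainProgram.py has already imported.
import numpy as np

def functionOne():
    # np is resolved in this module's own namespace, not the caller's.
    return np.arange(3)
```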

Importing library once for a python module

I want to structure the imports of my Python module/package; namely, I have several *.py files in my module. All of them use:

```python
import numpy as np
```

In some of them I use:

```python
import pandas as pd
```

Can I set a global import for my Python module and say that it uses numpy as np in all the *.py files of the module?
I tried something in __init__.py but it didn't work as expected. Is it at all reasonable to make global imports?

No, you cannot do this; it is fundamentally opposed to the way Python works.

Warning: do not try this at home. It is not good practice.
You can do it by importing all your libraries into a single file. For example:
library.py

```python
import numpy as np
import os
import json
import pandas as pd
```

Then import this file in your code files:
main.py

```python
from library import *

a = np.array([1, 2, 3])
```

Import submodule functions into the parent namespace

I have a package of commonly used utility functions which I import into tons of different projects, using different parts of it. I'm following patterns that I've seen in numpy, by doing things like:

```
utils/
├── __init__.py
├── math/
│   ├── __init__.py
│   ├── core.py
│   └── stats.py
└── plot/
    └── __init__.py
```

utils/__init__.py:

```python
from . import math
from . import plot
```

math/__init__.py:

```python
from . import core
from .core import *
from . import stats
from .stats import *

__all__ = []
__all__.extend(core.__all__)
__all__.extend(stats.__all__)
```

math/core.py:

```python
__all__ = ['cfunc1', 'cfunc2']

def cfunc1(): ...
def cfunc2(): ...
```

math/stats.py:

```python
__all__ = ['sfunc1', 'sfunc2']

def sfunc1(): ...
def sfunc2(): ...
```

The nice thing about this structure is that I can call the submodule functions from the higher-level namespaces, e.g. utils.math.cfunc1() and utils.math.sfunc2().
The issue is: both core and stats take a long time to import, so I don't want to automatically import them both from math/__init__.py.
Is there any way not to import both stats and core by default, and instead use something like import utils.math.core, but still have the functions in core end up in the math namespace? I.e. be able to do:

```python
import utils.math.core
utils.math.cfunc1()
```
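One possible approach (a sketch, not from the original thread): a module-level `__getattr__` (PEP 562, Python 3.7+) in math/__init__.py can defer the submodule import until a name is first looked up. The demo below builds a throwaway package on disk only so the example is self-contained; the package and function names are illustrative.

```python
import sys
import tempfile
from pathlib import Path
from textwrap import dedent

# Create a disposable package so the example runs on its own.
root = Path(tempfile.mkdtemp())
pkg = root / "lazymath"
pkg.mkdir()
(pkg / "core.py").write_text(dedent("""
    __all__ = ['cfunc1']
    def cfunc1():
        return 'core result'
"""))
(pkg / "__init__.py").write_text(dedent("""
    import importlib

    def __getattr__(name):
        # Called only when `name` is not found normally: import core
        # lazily and re-export its public names into this namespace.
        core = importlib.import_module('.core', __name__)
        if name == 'core':
            return core
        if name in core.__all__:
            return getattr(core, name)
        raise AttributeError(name)
"""))

sys.path.insert(0, str(root))
import lazymath                        # core.py is NOT executed yet

print("lazymath.core" in sys.modules)  # False: still lazy
print(lazymath.cfunc1())               # first access triggers the import
```

The same `__getattr__` would go in math/__init__.py for the utils layout above, so `utils.math.cfunc1()` works without paying for stats at import time.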
