Unable to split up large Python module - python

In Python 2.7, I'm getting 'module' has no attribute and/or 'name' is not defined errors when I try to split up a large Python file.
(I have already read similar posts and the Python modules documentation.)
Say you have a Python file that is structured like this:
<imports>
<50 global variables defined>
<100 lengthy functions that each use most or all of the globals defined above, and also call each other>
<main() that calls some of the functions and uses the globals>
So I can easily categorize groups of functions together, create a Python file for each group, and put them there. The problem is that whenever I try to call any of them from the main Python file, I get the errors listed above. I think the problem is related to circular dependencies: since all of the functions rely on the globals, and on each other, they are circularly dependent.
If I have main_file.py, group_of_functions_1.py, and group_of_functions_2.py,
main_file.py will have:
import group_of_functions_1.py
import group_of_functions_2.py
and group_of_functions_1.py will have:
import main_file.py
import group_of_functions_2.py
and group_of_functions_2.py will have:
import main_file.py
import group_of_functions_1.py
Regardless of whether I use "import package_x" or "from package_x import *", the problem remains.
If I take the route of getting rid of the globals, then most of the functions will have 50 parameters to pass around, which would then also need to be returned.
What is the right way to clean this up?

One of the sources of your errors is likely the following:
import group_of_functions_1.py
import group_of_functions_2.py
When importing, you don't add .py to the end of the module name. Do this instead:
import group_of_functions_1
import group_of_functions_2
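Beyond the .py fix, the circular dependency described in the question is usually broken by moving the shared globals into their own module that the function modules import, so that nothing has to import main_file. A minimal sketch, with hypothetical module and variable names, written as three files in one listing:
# shared_globals.py -- hypothetical module holding the shared globals
THRESHOLD = 10

# group_of_functions_1.py
import shared_globals

def func_a(x):
    # globals are reached through the shared module, not through main_file
    return x * shared_globals.THRESHOLD

# main_file.py
import group_of_functions_1

def main():
    print(group_of_functions_1.func_a(3))

if __name__ == "__main__":
    main()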

This seems pretty basic, so I must be missing something obvious. The goal is to import a module from the same directory. I've broken it down about as simply as I can, and I'm still getting the NameError.
file import_this.py:
def my_function(number):
    print number + 2
file import_test.py:
import import_this
my_function(2)
Do I have to specify the directory the import file is in? (It's in the same directory as the test file.) Also, can I test to see what modules are imported?
You are accessing the function incorrectly.
Either use the following:
import import_this
import_this.my_function(2)
or do,
from import_this import my_function
my_function(2)
Alternatively (apart from @mu's answer above):
>>> import import_this as it
...and then:
>>> it.my_function(2)
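As for the question's final point about checking which modules have been imported, one option (a sketch, not covered in the answers above) is to look in sys.modules:
import sys
import import_this
# sys.modules maps module names to the module objects that have already been imported
print('import_this' in sys.modules)   # True
print(sorted(sys.modules)[:5])        # a peek at some of the imported module names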

python calling function from another file while using import from the main file

I'm trying to use multiple files in my program and I ran into a problem.
I have two files: main.py, nu.py
The main file is:
import numpy
import nu
def numarray():
    numpy.array(some code goes here)
    nu.createarray()
The nu file is:
def createarray():
    numpy.array(some code goes here)
When I run main I get an error:
File "D:\python\nu.py", line 2, in createarray
numpy.array(some code goes here)
NameError: name 'numpy' is not defined
numpy is just an example; I'm using about six imports.
As far as I can see, I have to import all of the modules in all of the files, but that creates a problem where certain modules can't be loaded twice; the program just hangs.
What am I doing wrong, and how can I properly import functions from another file while using modules imported in the main file?
I hope I explained it well.
Thanks for helping!
I have years of experience in Python, and importing from other files is still a headache...
The problem here is that you are not importing numpy in nu.py.
But as you say, sometimes it's a little annoying to have to import all the libraries in all the files.
One last thing: how do you get an error saying a module cannot be imported twice? Can you give me an example?
In each separate Python script, if you are using a module you need to import it in order to access it. So you will need to import numpy in your nu.py script, as shown below.
If possible, try to keep the use of a module within one script so you don't have to import the same module multiple times, although this won't always be appropriate.
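A minimal sketch of the corrected nu.py that the answer refers to, with placeholder array contents:
# nu.py
import numpy  # each module must itself import the modules it uses

def createarray():
    # placeholder standing in for "some code goes here"
    return numpy.array([1, 2, 3])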

Nested function causing troubles

I've got a Python script.
I've had several functions in this script which I decided to move to a 'package' folder beside the main script.
In this folder, I created a *.py file where I put all my functions.
I've placed an empty __init__.py near this file within the 'package' folder.
When starting the code of my main script with:
from package_folder.my_functions import *
the script works well when calling any function from that file.
But when trying to import it directly:
import package_folder.my_functions
it doesn't seem to work as well as the above technique.
The cause seems to be that in the file my_functions.py, I have a function that needs another one declared previously in that same file.
I get this obscure error from the function that needs the other one:
TypeError: 'NoneType' object is not callable
Is this permissible, and if not, how do I handle this case?
It's generally not a good idea to use from module import *. Wildcard importing leads to namespace pollution: you import more names than you need, and if you accidentally refer to an imported name you may not get the NameError you would expect.
Also, if a future version of the library adds additional names, you could end up masking other names, leading to strange bugs:
Example
from my_mod1 import func1
from my_mod2 import *
If you upgrade my_mod2 and it now includes a my_mod2.func1, it will shadow the my_mod1.func1 imported on the first line.
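For the original question, a minimal sketch of the explicit alternative, assuming the layout described there (package_folder/ containing __init__.py and my_functions.py; some_function is a hypothetical name):
# main script
from package_folder import my_functions

# functions inside my_functions.py can still call each other freely;
# the caller simply qualifies them with the module name
result = my_functions.some_function()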

Backing up/copying an entire folder tree in batch or python?

I'm trying to copy an entire directory from one location to another via Python every 7 days, to essentially make a backup.
The backup folder may or may not exist, so the script needs to create the folder if it doesn't exist; that's why I assumed distutils is better suited than shutil.
Note: Would it be better for me to use batch or some other language for this job?
The following code:
import distutils
distutils.dir_util.copy_tree("C:\Users\A\Desktop\Test", "C:\Users\A\Desktop\test_new", preserve_mode=1, preserve_times=1, preserve_symlinks=0, update=1, verbose=0, dry_run=0)
Returns:
Traceback (most recent call last):
File "C:\Users\A\Desktop\test.py", line 2, in <module>
distutils.dir_util.copy_tree("C:\Users\A\Desktop\test", "C:\Users\A\Desktop\test2", preserve_mode=1, preserve_times=1, preserve_symlinks=0, update=1, verbose=0, dry_run=0)
AttributeError: 'module' object has no attribute 'dir_util'
What am I doing wrong?
Thanks in advance
- Hyflex
You need to import dir_util specifically to access its functions:
from distutils import dir_util
If there are other modules in that package that you need, add them to the line, separated by commas. Only import the modules you need.
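A short usage sketch following that import (the paths are the question's; raw strings are used so sequences like \t in the paths aren't treated as escapes):
from distutils import dir_util

dir_util.copy_tree(r"C:\Users\A\Desktop\Test", r"C:\Users\A\Desktop\test_new",
                   preserve_mode=1, preserve_times=1, preserve_symlinks=0,
                   update=1, verbose=0, dry_run=0)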
For Unix/Linux, I suggest rsync.
For Windows: xcopy.
I've been attempting essentially the same thing to back up what I write on a plethora of virtual machines.
I ran into the same problem you did with distutils. From what I can tell, the Python community is using the distutils module to start standardizing how new modules interface with Python. I think they're still in the thick of it though as everything I've seen relating to it seems more complicated, not less complicated. Hopefully, I'm just seeing all the crazy that happens in the middle of a big change.
But I did figure out how to get it working. To use distutils.dir_util.copy_tree():
>>> from distutils import dir_util
>>> dir_util.copy_tree("/home/user/backing_up/temp", "/home/user/backing_up/other")
['/home/user/backing_up/other/stuff.txt'] # Return value indicating success
If you feel like it's worthwhile, you can import distutils.core and make the longer call to distutils.dir_util.copy_tree().
>>> import distutils.core
>>> distutils.dir_util.copy_tree("/home/user/backing_up/temp", "/home/user/backing_up/other")
['/home/user/backing_up/other/stuff.txt'] # Return value indicating success
(I know, I know, there are subtle differences between "import module.submodule" and "from module import submodule" but that's not the intent of the question and so long as you're importing the correct stuff and calling the functions appropriately, it doesn't make a difference.)
Like you, I also explicitly stated that I wanted the default for preserve_mode and preserve_times, but I didn't touch the other variables. Everything worked as expected once I imported and called the function the way it wanted me to.
Now that my backup script works, I realize I should have written it in Bash, since I plan on having it run whenever the machine goes to a specific runlevel. I'm using a wrapper instead now, even though I should just rewrite it.

Python: Importing an "import file"

I am importing a lot of different scripts, so the top of my file gets cluttered with import statements, e.g.:
from somewhere.fileA import ...
from somewhere.fileB import ...
from somewhere.fileC import ...
...
Is there a way to move all of these somewhere else and then all I have to do is import that file instead so it's just one clean import?
I strongly advise against what you want to do. You are repeating the "global include file" mistake. Although here only one module imports all the others (as opposed to every module importing a global one), the point remains: if there's a valid reason for all those modules to be collected under a common name, fine; if there's no reason, they should be kept as separate imports. The reason is documentation. If I open your file and see only one import, I get no information about what is imported or where it comes from. If, on the other hand, I have the list of imports, I know at a glance what is needed and what is not.
Also, there's another important mistake I assume you are making. When you say
from somewhere.fileA import ...
from somewhere.fileB import ...
from somewhere.fileC import ...
I assume you are importing, for example, a class, like this:
from somewhere.fileA import MyClass
This is wrong. This alternative is much better:
from somewhere import fileA
# later ...
a = fileA.MyClass()
Why? Two reasons: first, namespacing. If you have two modules each defining a class named MyClass, you would have a clash. Second, documentation. Suppose you use the first option, and I find the following line in your code:
a = MyClass()
Now I have no idea where this MyClass comes from, and I would have to grep through all your files to find it. Having it qualified with the module name lets me immediately understand where it comes from, and immediately find, via a search, where things coming from the fileA module are used in your program.
Final note: when you say "fileA" you are making a mistake. These are modules (or packages), not files. Modules map to files and packages map to directories, but they may also map to egg files, and you can even create a module that has no file at all. This is about the naming of concepts, and it's a side issue.
Of course there is; just create a file called myimports.py in the same directory where your main file is and put your imports there. Then you can simply use from myimports import * in your main script.
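A minimal sketch of that approach, using the question's hypothetical module names:
# myimports.py -- gathers the shared imports in one place
from somewhere import fileA, fileB, fileC

# main script
from myimports import *  # fileA, fileB and fileC are now available here
a = fileA.MyClass()       # MyClass is the hypothetical class from the answer above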
