Python - Importing many libraries for multiple files convention

Regarding code structure and formatting, I can't find any clear information about this small nitpick relating to importing many modules at once.
Say I have two files, crudely named solver.py and data.py. Like most people, I have a set of standard modules that I import for each solver. Is it advisable to create a third module such as importList.py that contains all of those imports (e.g. import xpackagex as xpx), or should I just suck it up and copy over all of the imports for each file I write? Of course I am concerned about compatibility, since in the main function a from importList import * would overwrite any other choices, but it might make for tidier-looking code, particularly when many libraries are imported. Is there a standard approach for this?
Best wishes and thanks in advance.
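For concreteness, a minimal sketch of the pattern the question describes, substituting real standard-library modules for the placeholder xpackagex:

# importList.py -- one file that gathers the shared imports
import math as m
import os
import sys

# solver.py
from importList import *   # copies every public name from importList
                           # (here m, os, sys) into this module's namespace
print(m.sqrt(2.0))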

Related

Accessing A Python Module + General Library/Module Structure

The original post includes a screenshot of part of an article explaining how to access the example Python module dataset.py, for which the article provides the following line:
import my_model.training.dataset
I'd like to know whether the following two forms are equivalent to it and accomplish the same thing:
from my_model.training import dataset
from my_model import training.dataset
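A quick check of these forms (my note, assuming the package layout my_model/training/dataset.py):

import my_model.training.dataset         # binds the top-level name my_model
from my_model.training import dataset    # binds the name dataset directly
# from my_model import training.dataset  # SyntaxError: the name after "import"
#                                        # in a from-import cannot be dotted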
I have a library where I've been accumulating all of my .py files over time. I'm trying to organize it into something neater, but I'm having trouble deciding how to do that.
The library (or rather, the folder I'm dumping everything in) is meant to be just a collection of independent modules, but some of the modules have cross-dependencies. It would be easier if I had a systematic way to group functions/classes within certain files, i.e. modules. Should they be grouped by purpose?
Keep in mind these aren't even packages for projects; they are the building blocks for other packages, just my own personal collection of classes and functions, but it's starting to get hard to manage, so I could use some advice.
Thanks

What is the most Pythonic way to organise a package

I have created a package in python that I want to use in many other modules. The package contains a lot of classes; some large and some small. I have decided to keep each class in its own module as this makes sense in the context of the package and is, I think, what users would want.
I would like to know the most pythonic way to organise the package.
At present it is structured in a top-level directory called 'org' (the directory listing was shown in the original post):
(Remember that I have many more modules than the three shown there; the list of modules is very long.)
I can import any of the classes into different packages using:
import sys
sys.path.append('../org')   # add the package location to the import search path
from org.a import A         # class A is defined in module a inside package org
A()
I would like to reorganise it (the proposed layout was shown in the original post) and still use the same import statements, if possible.
Unfortunately, if I do this, I cannot import any of the classes using the code shown above.
Can someone please show me how they would do it?
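The target layout from the original post isn't reproduced here, but a common pattern for this situation, sketched under the assumption that the modules move into a sub-package (hypothetically named core), is to re-export the classes from org/__init__.py so client code can keep a short, stable import path even though the defining modules moved:

# org/__init__.py -- re-export public classes from their submodules
from .core.a import A
from .core.b import B

# client code can then write:
# from org import A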

How to set up collection of Python Classes with inter-dependency?

So I have a set of .py documents as follows:
/Spider
    Script.py
    /Classes
        __init__.py
        ParseXML.py
        CrawlWeb.py
        TextAnalytics.py
Each .py document in the /Classes subfolder contains a class for a specific purpose; the script schedules the different components. I have a couple of questions:
1) A lot of the classes share frameworks such as urllib2, threading, etc. What is considered the 'best' form for setting up the import statements? I.e. is there a way for me to use something like the __init__.py file to pass the shared dependencies to all of the classes, and then use the specific .py files to import only the singular dependencies?
2) Some of the classes call on the other classes (e.g. the CrawlWeb.py document uses the ParseXML class to update the XML files after crawling). I separated the classes out because they were each quite large and easier to update that way. Would it be considered best form to combine the classes in this case, or are there other ways to get round this?
The classes will only ever be used as part of the script. So far the only real solution I've come up with is putting all of the import statements in the Script.py file, but that seems a little messy. Any advice would be very appreciated.
The best way to handle the common imports is to import them in each module where they're used. While this probably feels annoying because you have to type more, it makes it dramatically clearer to the reader of the code which modules are in scope. You're not missing something by repeating common imports; you're doing it right.
While you certainly can put your classes all into separate files, it's more common in Python to group related classes together in a single module. Given how short it sounds like your script is, that may mean it makes sense for you to pull everything into a single file. This is a judgment call, and I cannot offer a hard-and-fast rule.
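To make the first point concrete, here is a minimal sketch using the question's file names (the class bodies and exact imports are illustrative assumptions): each module imports exactly what it needs, and cross-module use is an explicit import.

# Classes/ParseXML.py
import threading                 # each file imports exactly what it uses

class ParseXML(object):
    def update(self, path):
        pass                     # update the XML file at `path` (body omitted)

# Classes/CrawlWeb.py
import urllib2                   # Python 2, matching the question
from ParseXML import ParseXML    # explicit cross-module dependency
                                 # (implicit relative import, Python 2 style)

class CrawlWeb(object):
    def crawl(self):
        ParseXML().update('results.xml')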

Automatically import to all Python files in the given folder?

I am relatively new to Python, and I am trying to learn the "Pythonic" way of doing things so that I build a solid foundation for Python development. Perhaps what I want to achieve is not Pythonic at all, but I am nonetheless seeking the "right" way to solve this issue.
I am building an application for which I am creating modules. I just noticed that one of my modules has 7 different .py files, all importing the same 3 things.
I tried removing those imports and placing them in the folder's empty __init__.py instead, but that did not do the trick.
Since these imports are needed by all of the module's files, I would prefer, if possible, not to repeat them in each file one by one.
What can I do to perform the common import?
Thank you very much, I really appreciate your kind help.
As the Zen of Python states, "Explicit is better than implicit", and this is a good example.
It's very useful to have the dependencies of a module listed explicitly in the imports and it means that every symbol in a file can be traced to its origin with a simple text search. E.g. if you search for some_identifier in your file, you'll either find a definition in the file, or from some_module import some_identifier. It's even more obvious with direct references to some_module.some_identifier. (This is also one reason why you should not do from module import *.)
One thing you could do, without losing the above property, is to import your three shared modules into a fourth module:
# fourth.py -- gathers the three shared modules in one place
import first
import second
import third
then...
# another.py
import fourth
fourth.first.some_function()   # fully qualified, so the origin stays obvious
# etc.
If you can't stomach that (it does make calls more verbose, after all) then the duplication of three imports is fine, really.
I agree with DrewV; it is perfectly Pythonic to do
# File1.py
import xyz
import abc
...
# File2.py
import xyz
An almost identical question has also been addressed in the following link:
python multiple imports for a common module
As it explains, Python optimises module loading: you can repeat the same import statement across many files without a performance penalty, because each module is executed only once and then cached in sys.modules for every subsequent import. In fact, listing all the imports in each file makes it explicitly clear what each file depends on.
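A quick, self-contained way to see that caching in action (my illustration, not from the linked question):

import sys

import math                          # first import executes the module's code
import math                          # repeat imports are cheap cache lookups

print('math' in sys.modules)         # True: the module object is cached
print(sys.modules['math'] is math)   # True: every import yields the same object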
And for a discussion of how imports interact with namespaces, see:
Python imports across modules and global variables

How to maintain different versions of a Python module?

I have a core Python module we use in our facility called mfxLib. I need to be able to keep different versions of this module without breaking all the other modules/plugins that import it.
My solution was to keep duplicates of the module, renamed mfxLib01 and mfxLib02, and then
to replace the original mfxLib module with an empty package containing only an __init__.py file that imports the latest version:
# content of mfxLib/__init__.py
from mfxLib02 import *
This seems logical and seems to work, but I was wondering: is there a common practice for doing this? Guidelines to follow? Etc.
Thanks
You can import a module under another name. Commonly people use this to save typing a long module name, for example:
import numpy as np        # alias the module to a shorter name
np.array([1, 2, 3, 4])
Hence you could do:
import mfxLib01 as mfxLib
or
import mfxLib02 as mfxLib
then your code uses mfxLib everywhere.
That might help...
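Extending that idea (my sketch, not part of the original answer), the alias also lets code fall back between versions at import time:

# prefer the newest version; fall back to the older one if it's absent
try:
    import mfxLib02 as mfxLib
except ImportError:
    import mfxLib01 as mfxLib
# everything below refers only to the single name mfxLib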
If you have different scripts requiring different versions, your current approach is probably the best, but I'd also suggest using a version control system such as Git or SVN. That would allow you to commit and revert to earlier versions easily, as well as share the module with other users.
Version control will almost certainly make your life easier. In addition to Petterson's recommendations, consider Mercurial. Like git and SVN, it's free. It's also written in Python and should run without difficulty on any of your systems.
Spacedman's recommendations are also useful, especially if the differences between the versions represent customizations for particular systems and the customizations are relatively stable. Note that you can use that approach in combination with a version control system.
Finally, it's always worthwhile to make a strong effort to write your module so that it can work without modification everywhere. Often you can accomplish this by adding a few optional arguments to key functions to handle the different requirements. Python is convenient in that regard because parameters with default values are optional for callers, so you can easily preserve the existing behaviour by giving new parameters suitable defaults:
def foo(oldarg1, oldarg2, newarg1=None):
    if newarg1 is not None:
        pass  # behave differently (new, optional behaviour)
    else:
        pass  # behave as usual (unchanged for existing callers)
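Hypothetical calls (names as in the sketch above) showing that existing call sites are unaffected:

foo(1, 2)                # old callers: unchanged behaviour
foo(1, 2, newarg1='x')   # new callers opt in explicitly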
