Break Python Functions into Other Files

To keep it simple: I want to be able to reference the function in c.py from a.py, and I'm not sure how to do this in Python without directly referencing c, which I don't want to do.
a.py references b.py
b.py references c.py
c.py has a function in it called foo
I want to call foo from a.py but for abstraction purposes I want to only reference b.py
I tried to set up a simple example like the one above before attempting this on my actual codebase, but it seems I can't see the member of c.py from a.py.

Not the most ideal solution, but it appears that if I create a member for each sub-file in b.py, it allows me to access it from a.py.
It would look something like this:
c.py
def call_me_c():
    print("It works from c")
b.py
import c
cfile = c
a.py
import b
b.cfile.call_me_c()
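A lighter-weight alternative to the cfile member above is to have b.py re-export c's function directly, so a.py never mentions c. A minimal sketch, using a throwaway temporary directory to stand in for the real project layout (the function returns instead of prints here so the result is easy to check):

```python
import sys
import tempfile
from pathlib import Path

# Build c.py and b.py in a scratch directory; b.py re-exports c's function.
root = Path(tempfile.mkdtemp())
(root / "c.py").write_text("def call_me_c():\n    return 'It works from c'\n")
(root / "b.py").write_text("from c import call_me_c  # re-export c's function\n")

sys.path.insert(0, str(root))
import b  # a.py's view of the world: only b is imported

print(b.call_me_c())  # It works from c
```

With `from c import call_me_c` inside b.py, the name is bound in b's namespace, so callers can write `b.call_me_c()` without knowing c exists.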

Related

Imports in __init__.py

This is my project structure (Python 3.5.1):
a
├── b.py
└── __init__.py
Case 1
File b.py is empty.
File __init__.py is:
print(b)
If we run import a, the output is:
NameError: name 'b' is not defined
Case 2
File b.py is empty.
File __init__.py is:
import a.b
print(b)
If we run import a, the output is:
<module 'a.b' from '/tmp/a/b.py'>
Question
Why doesn't the program fail in Case 2?
Usually if we run import a.b then we can only reference it by a.b, not b. Hopefully somebody can help explain what's happening to the namespace in Case 2.
Python adds modules as globals to the parent package after import.
So when you imported a.b, the name b was added as a global to the a module, created by a/__init__.py.
From the Python 3 import system documentation:
When a submodule is loaded using any mechanism (e.g. importlib APIs, the import or import-from statements, or built-in __import__()) a binding is placed in the parent module’s namespace to the submodule object. For example, if package spam has a submodule foo, after importing spam.foo, spam will have an attribute foo which is bound to the submodule.
Bold emphasis mine. Note that the same applies to Python 2, but Python 3 made the process more explicit.
An import statement brings a module into scope. You imported b, so there it is, a module object.
Read the documentation for import:
The basic import statement (no from clause) is executed in two steps:
1. find a module, loading and initializing it if necessary
2. define a name or names in the local namespace for the scope where the import statement occurs.
You didn't import b in the first case.
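The quoted rule can be demonstrated without touching the real filesystem layout of the question. The sketch below builds a throwaway package (the names pkg and sub are made up for the demo) and shows the binding appear in the parent package only after the submodule is imported:

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Create a scratch package "pkg" with an empty __init__.py and a submodule.
root = Path(tempfile.mkdtemp())
pkg_dir = root / "pkg"
pkg_dir.mkdir()
(pkg_dir / "__init__.py").write_text("")
(pkg_dir / "sub.py").write_text("value = 42\n")

sys.path.insert(0, str(root))
import pkg
print(hasattr(pkg, "sub"))   # False: the submodule has not been imported yet

importlib.import_module("pkg.sub")
print(hasattr(pkg, "sub"))   # True: a binding was placed in pkg's namespace
print(pkg.sub.value)         # 42
```

This is exactly what happens in Case 2: `import a.b` places the name b into the a module, so `print(b)` inside a/__init__.py succeeds.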

how do imports work for a module used as a singleton?

I'm confused about what happens if you treat a module as a singleton.
Say I have a module conf.py, that contains some configuration parameters which need to be accessed by multiple other files. In conf.py, I might have this piece of code (and nothing else):
myOption = 'foo' + 'bar'
If I now import it first in a.py, and then in b.py, my understanding is that the first time it is imported (in a.py), the string concatenation will be executed. But the second time it is imported (in b.py), conf.myOption already has its value, so no string concatenation will be executed. Is this correct?
If after doing these two imports, I then execute the following in b.py
conf.myOption = 'aDifferentFoobar'
then obviously b.py would now see this new value. Would a.py see the same value, or would it still see 'foobar'?
I believe (but correct me if I'm wrong) that imports are always referred to by reference, not by value? And I'm guessing that's what the above questions boil down to.
Try it and see:
mod.py:
def foo():
    print("in foo()")
    return "foo"
bar = foo()
opt = "initial"
b.py:
import mod
mod.opt = "changed"
a.py:
import mod
import b
print(mod.bar)
print(mod.opt)
Execute a.py:
$ python3.4 a.py
Output:
in foo()
foo
changed
We learn:
foo() is only executed once
mod.opt is changed by b.py
a.py sees the changed value of mod.opt
bonus: the order of imports in a.py does not matter
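One related pitfall worth knowing: the sharing above works because both files access the attribute through the module object (`mod.opt`). A `from mod import opt` copies the current binding into the importer's namespace and does not follow later changes. A minimal sketch, using a synthetic in-memory module (the name conf is a stand-in for the conf.py of the question):

```python
import sys
import types

# Fabricate a module "conf" in memory so the demo is self-contained.
conf = types.ModuleType("conf")
conf.opt = "initial"
sys.modules["conf"] = conf

from conf import opt  # snapshot of the binding at import time
conf.opt = "changed"

print(opt)       # initial -- the copied name did not follow the change
print(conf.opt)  # changed -- attribute access sees the live value
```

So "imports are by reference" is true of the module object itself; individual names pulled out with `from ... import` are independent bindings.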

Calling isinstance in main Python module

There is strange behavior of isinstance when it is used in the __main__ namespace.
Consider the following code
a.py:
class A(object):
    pass

if __name__ == "__main__":
    from b import B
    b = B()
    print(isinstance(b, A))
b.py
from a import A
class B(A):
    pass
main.py
from a import A
from b import B
b = B()
print(isinstance(b, A))
When I run main.py, I get True, as expected, but when I run a.py, I am getting False. It looks like the name of A is getting the prefix __main__ there.
How can I get a consistent behavior?
I need this trick of importing B inside a.py in order to run doctests on a.py.
So what happens when you run a.py is that Python reads a.py and executes it. While doing so, it imports module b, which in turn imports module a, but it does not reuse the definitions from the file it is already executing. So now you have two copies of the definitions inside a.py, known as the modules __main__ and a, and thus two distinct classes __main__.A and a.A.
In general you should avoid importing modules that you are also executing as scripts. Instead, create a new file for running doctests and use something like
import a
import doctest
doctest.testmod(a)
and remove the __main__ part from module a.
Chain of events:
1. The a.py script is called.
2. The class A statement is executed, creating A in the __main__ namespace.
3. b is imported.
4. In b.py, a is imported.
5. The class A statement is executed again, creating A in the a namespace.
This A in the a namespace has no relation to the A in the __main__ namespace other than having been generated by the same code. They are different objects.
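This chain of events can be reproduced end to end with throwaway files, written to a temporary directory for the demo and executed in a subprocess:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Recreate the question's a.py and b.py in a scratch directory.
root = Path(tempfile.mkdtemp())
(root / "a.py").write_text(
    "class A(object):\n"
    "    pass\n"
    "if __name__ == '__main__':\n"
    "    from b import B\n"
    "    print(isinstance(B(), A))\n"
)
(root / "b.py").write_text(
    "from a import A\n"
    "class B(A):\n"
    "    pass\n"
)

# Running a.py as a script makes __main__.A, while b imports the separate a.A,
# so the isinstance check compares two different class objects.
result = subprocess.run(
    [sys.executable, "a.py"], cwd=root, capture_output=True, text=True
)
print(result.stdout.strip())  # False
```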
I agree with Helmut that it is best to avoid such circular imports. However, if you wish to fix your code with minimal changes, you could do the following:
Let's rename b.py --> bmodule.py so we can distinguish b the module from b the variable (hopefully in your real code these names are already distinct):
class A(object):
    pass

if __name__ == "__main__":
    import bmodule
    b = bmodule.B()
    print(isinstance(b, bmodule.A))
prints
True

How to share globals between imported modules?

I have two modules, a.py and b.py. I want the globals from a.py to be available in b.py like this:
a.py:
#!/usr/bin/env python
var = "this is global"
import b
b.foo()
b.py:
#!/usr/bin/env python
var = "this is global"
def foo():
    print var
Currently, I re-declare the globals in each module. There must be an easier way.
Create a settings module that has shared globals if that's what you want. That way you're only importing and referencing each global one time, and you're keeping them isolated within the namespace of the settings module. It's a good thing.
#settings.py
var = 'this is global'
# a.py
import settings
import b
b.foo()
# b.py
import settings
def foo():
    print settings.var
By making b.py require globals from a.py, you have created modules that depend on each other, which is bad design.
If you have static variables that need to be shared, consider creating c.py which both a.py and b.py can import and reference.
If you have dynamic variables that need to be shared, consider creating a settings class that can be instantiated and passed between the modules.
Define your globals in c.py and import them into a.py and b.py
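For the "dynamic variables" case, a minimal sketch of the settings-object alternative mentioned above (class and function names here are made up for illustration):

```python
# One Settings instance is created and passed explicitly, instead of each
# module re-declaring or importing globals.
class Settings:
    def __init__(self, var):
        self.var = var

def foo(settings):
    # b.py's function, rewritten to take its configuration explicitly
    return settings.var

shared = Settings("this is global")
print(foo(shared))  # this is global
```

Passing the object explicitly makes the dependency visible in the function signature, which is easier to test than a shared module-level global.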

passing value to other module python

I have two scripts, A.py and B.py.
I want to know how to send a value from A.py to B.py.
In more detail: when A.py finishes running, at the end of the script, A.py calls B.py.
My question is that I have to send some values from A.py to B.py.
Can anybody help me with how to send a value from A.py to B.py, so I can use that value in B.py?
"Do I assume correctly that you want B.py to use all the variables with values that exist when A.py finishes?"
That is exactly what I want. I uploaded my A.py and B.py to pastebin:
http://elca.pastebin.com/m618fa852 <- A.py
http://elca.pastebin.com/m50e7d527 <- B.py
I want to use B.py's xx value; the xx value comes from A.py.
Sorry for my English.
Your question isn't quite clear.
import B
B.methodToExecute(argument)
Do I assume correctly that you want B.py to use all the variables with values that exist when A.py finishes?
[edit]
OK, the problem is that you cannot easily do this without any variable assignments. What you'd like to achieve is an import statement done backwards. Normally, if you import a module in another module or script, the importer (in this example, A) can access the importee's (B) variables, methods, etc., but not the other way around.
In A.py:
a = 2
print "A.a:", a
import B
print "B.b:", B.b
from B import *
print "b in our namespace:", b
In B.py:
b = 3
This will print:
A.a: 2
B.b: 3
b in our namespace: 3
If you are 100% sure you want this (why wouldn't you create some related classes with methods, or just put the methods in one big module?), you can import module A from module B, so if you modify B.py:
In B.py:
b = 3
from A import *
print "a in B's namespace:", a
... and run it again, you'll see some weird output with duplicated lines plus the desired a in B's namespace: 2 line (so it's a 50% success). The key is that whenever Python imports something, the imported script gets executed inside the Python VM, with the necessary objects and references created along the way. So if you are importing simple scripts without functions and/or module/class declarations, everything in them runs at import time, and you can run into trouble with circular imports like the one done above. Use modules or classes and you'll be better off, because in those only the parts after
if __name__ == "__main__":
...
will be skipped on import (they run only when the file is executed directly as a script).
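The guard's effect can be shown with a single throwaway file (the name guarded is made up for the demo): the guarded line prints when the file runs as a script, but not when the file is imported.

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Write a small script whose last line is protected by the __main__ guard.
root = Path(tempfile.mkdtemp())
(root / "guarded.py").write_text(
    "print('runs on import and on execution')\n"
    "if __name__ == '__main__':\n"
    "    print('runs only as a script')\n"
)

# Executed directly: both lines print.
script_out = subprocess.run(
    [sys.executable, "guarded.py"], cwd=root, capture_output=True, text=True
).stdout

# Imported: only the unguarded line prints.
sys.path.insert(0, str(root))
import guarded

print("runs only as a script" in script_out)  # True
```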
Or, another good idea in this thread is to call a subprocess, but then you need to check for the variables on the command line, not directly in your script's namespace.
Your question might have been phrased too abstractly for me to really know what you need.
Generally, what you would do is this: after you've done everything in A and are ready for B, have import B at the top of A and call a function in B, passing in all the values you want to hand over. If B currently uses hard-coded globals that you want to override, refactor it to use functions.
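A condensed sketch of that refactor, with both files' contents shown in one block for the demo (the names process and xx are made-up stand-ins for whatever A actually computes):

```python
# --- what B.py would contain: a function instead of reading globals ---
def process(xx):
    # Use the value A handed over; return it so the caller can check it.
    return "B received: " + str(xx)

# --- what the end of A.py would look like ---
# (in real code: import B, then call B.process(xx))
xx = "some value computed by A"
print(process(xx))  # B received: some value computed by A
```

Passing values as function arguments keeps each script's namespace separate and avoids the backwards-import contortions described above.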
