Use function from another file python

I have 2 files, 1 file with the classes and 1 file with the 'interface'
In the first file I have:
from second_file import *

class Catalog:
    def ListOfBooks(self):
        more_20 = input()
        # If the user presses 1, show books 20 through 40
        if more_20 == '1':
            for item in C[20:41:1]:
                print("ID:", item['ID'], "Title:", item['title'], " Author: ", item['author'])
        elif more_20 == '2':
            return librarian_option()

test = Catalog()
test.ListOfBooks()
What I'm trying to achieve: when the user presses 2, I want to go back to the function in my other file.
Second file:
def librarian_option():
    .......
I don't want to use globals. I have read that librarian_option() is in the scope of the second file and that's why I can't call it directly, but I can't find a solution to this.
I get the following error:
NameError: name 'librarian_option' is not defined

Have you tried the following? It's best practice to be explicit instead of using the * wildcard:
from second_file import librarian_option
Make sure second_file.py is in the same directory.
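For example, the top of the first file would then look like this (a sketch built from the question's own code, trimmed to the relevant part):

# first_file.py
from second_file import librarian_option   # explicit import instead of *

class Catalog:
    def ListOfBooks(self):
        more_20 = input()
        if more_20 == '2':
            return librarian_option()

test = Catalog()
test.ListOfBooks()

With an explicit import, a missing or misnamed function fails immediately at the import line, which makes this kind of NameError much easier to track down.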

Note: this is not an answer, but an example to help improve the question with more details.
You need to reduce your problem to the minimum amount of code (and actions) necessary to reproduce it. You also need to provide how exactly you are running your script, and what versions (of Python and your OS) you are using.
For example, I have created the following two scripts (named exactly as shown):
first_file.py:
from second_file import *

class Catalog:
    def ListOfBooks(self):
        return librarian_option()

test = Catalog()
a = test.ListOfBooks()
print(a)
second_file.py:
def librarian_option():
    return 1
These two files are located in the same, random, directory on my computer (MacOS). I run this as follows:
python3.7 first_file.py
and my output is
1
Hence, I can't reproduce your problem.
See if you can still produce your problem with such simplified scripts (i.e., no extra functions or classes, no __init__.py file, etc). You probably want to do this in a temporary directory elsewhere on your system.
If your problem goes away, slowly build back up to where it reappears again. By then, possibly, you've discovered the actual problem as well. If you then don't understand why this (last) change you made caused the problem, feel free to update your question with all the new information.

Related

Common Python scripts are getting called multiple times

I have the following project structure.
- README.rst
- LICENSE
- setup.py
- requirements.txt
- common_prj/__init__.py
- common_prj/common_lib.py
- common_prj/config.xml
- Project1/__init__.py
- Project1/some_app_m1.py
- Project1/some_app_m2.py
- Project1/some_app.py
- Project2/some_app1.py
I keep all my common classes in the common_prj/common_lib.py file. Each module in Project1 imports common_prj/common_lib.py to access the common classes and set up the project environment using config.xml.
#import in some_app_m1.py
from common_prj import common_lib
#import in some_app_m2.py
from common_prj import common_lib
Imports in some_app.py
from Project1 import some_app_m1
from Project1 import some_app_m2
from common_prj import common_lib
###
<some functions>
<some functions>
###
if __name__ == '__main__':
With the above three imports it seems common_lib.py is getting executed multiple times, but if I keep common_lib.py in Project1 then I don't see this issue.
Please let me know how I can keep common_lib.py in the common package and call it from the Project1 scripts without executing it multiple times. The purpose of common_lib.py is to share common classes with the scripts in Project2.
I have the below code in common_lib.py, which runs repeatedly when the classes are used from each module after import.
self.env = dict_env.get(int(input("Choose Database ENV for this execution : \n" + str(dict_env) + "\nSelect the numeric value => ")))
self.app = dict_app.get(int(input("Choose application for this execution : \n" + str(dict_app) + "\nSelect the numeric value => ")))
My question is why I am not facing this issue if I keep common_lib.py in Project1 rather than in common_prj. I added the above code in common_lib.py because I didn't want to repeat these lines and the ENV setup in all my app code. These are global env settings for all application script code inside Project1.
output
Choose Database ENV for this execution :
{1: 'DEV', 2: 'SIT', 3: 'UAT', 4: 'PROD'}
Select the numeric value => 1
Choose application for this execution :
{1: 'app1', 2: 'app2', 3: 'app3', 4: 'app4', 5: 'app5', 6: 'app6'}
Select the numeric value => 1
Choose Database ENV for this execution : ### Here it is repeating again from common_lib.py
{1: 'DEV', 2: 'SIT', 3: 'UAT', 4: 'PROD'}
Select the numeric value => 1
Choose application for this execution : ### Here it is repeating again from common_lib.py
{1: 'app1', 2: 'app2', 3: 'app3', 4: 'app4', 5: 'app5', 6: 'app6'}
Select the numeric value => 1
Note: This is my first answer where I go in depth on the inner workings of Python. If I have said anything incorrect, please let me know or edit my answer (with a comment letting me know why). Please feel free to ask for clarification too.
As the comments have discussed, if __name__ == '__main__': is exactly the way to go. You can check out the answers on the SO post for a full description or check out the Python docs.
In essence, __name__ is a variable set for each Python file. Its behavior includes:
When a file is run directly, its __name__ is set to '__main__'.
When a file imports another module (import foo), foo.__name__ is 'foo'.
Now, when importing a file, Python executes that file's top-level code completely. It defines all the functions, classes, variables, etc., and if it comes across a call at the top level (e.g. hello_world()) it will run it. This happens whether or not you are only trying to import a specific part of the module.
Now, these two things are very important. If you have some trouble understanding the logic, you can take a look at some sample code I created.
Since you are importing from common_prj three times (from the some_app_m* files and some_app.py), the top-level code in common_lib can end up running more than once.
Although I'm unable to see your complete code, I'm assuming that common_lib.py runs this code at module level (at no indentation level). This means that the two lines:
self.env = dict_env.get(int(input("Choose Database ENV for this execution : \n" + str(dict_env) + "\nSelect the numeric value => ")))
self.app = dict_app.get(int(input("Choose application for this execution : \n" + str(dict_app) + "\nSelect the numeric value => ")))
were also called.
So, in conclusion:
You imported common_prj 3 times
Which ran common_lib multiple times as well.
Solution:
This is where the __name__ part I talked about earlier comes into play
Take all top-level calls that shouldn't run during importing, and place them under the conditional if __name__ == '__main__':
This will ensure that the condition only passes when the file is run directly, not when it is imported.
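As a sketch of what that could look like in common_lib.py (choose_env_and_app is a made-up name, and the sketch assumes dict_env and dict_app are defined at module level as in the question):

# common_lib.py
def choose_env_and_app():
    env = dict_env.get(int(input("Choose Database ENV for this execution : \n" + str(dict_env) + "\nSelect the numeric value => ")))
    app = dict_app.get(int(input("Choose application for this execution : \n" + str(dict_app) + "\nSelect the numeric value => ")))
    return env, app

if __name__ == '__main__':
    # runs only when common_lib.py is executed directly, never on import
    print(choose_env_and_app())

The importing scripts can then call common_lib.choose_env_and_app() exactly once, at the point where they actually need the values.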
Again, if you didn't understand what I said, please feel free to ask or look at my sample code!
Edit: You can also just remove these top-level calls. Files that are meant to be imported sometimes include them only to check that the functions work as intended. If you look at any of the packages that you import daily, you'll find that such calls are either placed under the condition OR not in the file at all.
This is more of a question of library design, rather than the detailed mechanics of Python.
A library module (which is what common_prj is) should be designed to do nothing on import, only define functions, classes, constants etc. It can do things like pre-calculating constants, but it should be restricted to things that don't depend on anything external, produce the same results each time and don't involve much heavy computation. At that point, the code running three times will no longer be a problem.
If the library needs parameters, the caller should supply those, ideally in a way that allows multiple different sets of parameters to be used within the same program. Depending on the situation, the parameters can be a main object that manages the rest of the interaction, or parameters that are passed into each object constructor or function call, or a configuration object that's passed in. The library can provide a function to gather these parameters from the environment or user input, but it shouldn't be required; it should always be possible for the program to supply the parameters directly. This will also make tests easier to write.
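A minimal sketch of that idea, with illustrative names (CommonLib and ask_user_for_config are not from the question's code):

# common_lib.py -- only definitions at module level, nothing runs on import
class CommonLib:
    def __init__(self, env, app):
        self.env = env
        self.app = app

def ask_user_for_config(dict_env, dict_app):
    # optional convenience helper; callers may instead pass env/app directly
    env = dict_env.get(int(input("Choose Database ENV " + str(dict_env) + " => ")))
    app = dict_app.get(int(input("Choose application " + str(dict_app) + " => ")))
    return env, app

A script in Project1 or Project2 then decides when (and whether) to prompt, e.g. lib = CommonLib(*ask_user_for_config(dict_env, dict_app)), or builds the object directly from a config file or test values.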
Side notes:
The name "common" is a bit generic; consider renaming the library after what it does, or splitting it into several libraries each of which does one thing?
Python will try to cache imported modules so it doesn't re-run them multiple times; this is probably why it only ran the code once in one situation but three times in another. This is another reason to have only definitions happen on import, so the details of the caching don't affect functionality.
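You can see that caching at work with a tiny check (an illustration only, unrelated to the question's code):

import sys
import json            # first import runs json's module-level code
import json            # second import is a cache hit: nothing re-runs
print('json' in sys.modules)   # True -- the loaded module object lives in this cache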
It seems that after commenting out the line below in
Project1/__init__.py
the issue has been resolved. I was importing the same common_lib from __init__.py as well, which was causing the problem. Removing the below entry resolved the issue.
#from common_prj import commonlib

Running Fortran executable within Python script

I am writing a Python script with the following objectives:
Starting from current working directory, change directory to child directory 'A'
Make slight adjustments to a fort.4 file
Run a Fortran binary file (the syntax of which is ../../../../ continuing until I hit the folder containing the binary); return to 2. until my particular objective is complete, then
Back out of child directory to parent, then enter another child directory and return to 2. until I have iterated through all the folders in question.
The code is coming along well. I am having to rely heavily upon Python's os module for the directory work. However, I have never had any experience a) making minor adjustments to a file using Python or b) running an executable. Could you give me some ideas on Python modules, point me to a similar Stack Overflow question, or suggest ways this can be accomplished? I understand this is a vague question, so please ask if you do not understand what I am asking and I will elaborate. Also, the changes I have to make to this fort.4 file are repetitive in nature; they all happen at the same position in the file.
Cheers
EDIT::
entire fort.4 file:
file_name
movie1.dat !name of a general file the binary reads
nbr_box ! line3-8 is general info
2
this_box
1
lrdf_bead
.true.
beadid1
C1 !this is the line I must change
beadid2
F4 !This is a second line I must change
lrdf_com
.false.
bin_width
0.04
rcut
7
So really, I need to change "C1" to "C2", for example. The changes are very small to make, but I must emphasize that the main Fortran executable reads this fort.4 file, as well as the movie1.dat file that I have already created. Hope this helps.
OK, so there are a few important things here. First we need to be able to manage our cwd; for that we will use the os module:
import os
Whenever a method operates on a folder, it is important to change into that folder and back to the parent afterwards. This can also be achieved with the os module:
def operateOnFolder(folder):
    os.chdir(folder)
    ...
    os.chdir("..")
Now we need to do that for each subdirectory:
for k in os.listdir("."):
    if os.path.isdir(k):
        operateOnFolder(k)
Finally, in order to modify a preexisting file we can use the built-in file operations:
fileSource = open("someFile.f", "r")
fileText = fileSource.read()
fileSource.close()

fileLines = fileText.split("\n")
# change a line in the file with -> fileLines[42] = "the 42nd line"
fileText = "\n".join(fileLines)

fileOutput = open("someFile.f", "w")
fileOutput.write(fileText)
fileOutput.close()
You can compile and run your executable output.fx from source.f90 with the subprocess module:
import subprocess
subprocess.call(["gfortran", "-o", "output.fx", "source.f90"])  # compile
subprocess.call(["./output.fx"])  # execute
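Putting the pieces together, a rough sketch of the whole loop described in the question might look like this (the line index of the bead id and the relative path to the binary are assumptions; index 9, counting from zero, matches the C1 line of the sample fort.4 above, but double-check against your real file):

import os
import subprocess

def run_case(folder, bead_id):
    # Enter the folder, edit fort.4, run the binary, return to the parent.
    os.chdir(folder)

    # Rewrite the bead id line of fort.4 (index 9 is an assumption, see above).
    with open("fort.4", "r") as f:
        lines = f.read().split("\n")
    lines[9] = bead_id
    with open("fort.4", "w") as f:
        f.write("\n".join(lines))

    # Hypothetical relative path to the Fortran binary; replace with your own.
    subprocess.call(["../../../../bin/my_fortran_binary"])

    os.chdir("..")

for folder in os.listdir("."):
    if os.path.isdir(folder):
        run_case(folder, "C2")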

Python - How to save functions

I'm starting out in Python. I have four functions and they are working OK. What I want to do is to save them so that I can call them whenever I want in Python.
Here's the code for my four functions:
import numpy as ui

def simulate_prizedoor(nsim):
    sims = ui.random.choice(3, nsim)
    return sims

def simulate_guess(nsim):
    guesses = ui.random.choice(3, nsim)
    return guesses

def goat_door(prizedoors, guesses):
    result = ui.random.randint(0, 3, prizedoors.size)
    while True:
        bad = (result == prizedoors) | (result == guesses)
        if not bad.any():
            return result
        result[bad] = ui.random.randint(0, 3, bad.sum())

def switch_guesses(guesses, goatdoors):
    result = ui.random.randint(0, 3, guesses.size)
    while True:
        bad = (result == guesses) | (result == goatdoors)
        if not bad.any():
            return result
        result[bad] = ui.random.randint(0, 3, bad.sum())
What you want to do is to take your Python file, and use it as a module or a library.
There's no way to make those four functions automatically available, no matter what, 100% of the time, but you can do something very close.
For example, at the top of your file, you imported numpy. numpy is a module or library which has been set up so it's available any time you run python, as long as you import it.
You want to do the same thing -- save those 4 functions into a file, and import them whenever you want them.
For example, if you copy and paste those four functions into a file named foobar.py, then you can simply do from foobar import *. However, this will only work if you're running Python in the same folder where you saved your code.
If you want to make your module available system-wide, you have to save it somewhere on the PYTHONPATH. Usually, saving it to C:\Python27\Lib\site-packages will work (assuming you're running Windows).
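If you're unsure where Python actually looks for modules on your machine, you can print the search path (a quick generic check, not specific to this question):

import sys
print(sys.path)                          # directories searched on import
sys.path.append(r"C:\my_python_utils")   # hypothetical folder, added for this session only

Anything appended to sys.path only lasts for the current interpreter session; for a permanent location use site-packages or the PYTHONPATH environment variable as described above.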
If you decide to put them in a package folder inside your project, don't forget to create a blank __init__.py file so Python can see them. A fuller explanation is provided here: http://docs.python.org/2/tutorial/modules.html
Save them in a file - this makes them a module.
If you put them in a file called mymod.py, you can load them in Python as follows:
from mymod import *
simulate_prizedoor(23)
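If you prefer to avoid the * wildcard (as suggested in the first question above), the explicit forms work just as well:

import mymod
mymod.simulate_prizedoor(23)

# or pull in only the names you need
from mymod import simulate_prizedoor, simulate_guess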
Quick solution, without having to explicitly create a file - relies on IPython and its storemagic
IPython 4.0.1 -- An enhanced Interactive Python.
In [1]: def func(a):
   ...:     print a
   ...:
In [2]: func = _i #gets the previous input
In [3]: store func #store(magic) the input
#(auto-magic enabled or would need '%store')
Stored 'func' (unicode)
In [4]: exit
IPython 4.0.1 -- An enhanced Interactive Python.
In [1]: store -r func #retrieve stored string
In [2]: exec func #execute string as python code
In [3]: func(10)
10
Once you have stored all your functions (just once), you can restore them all with store -r, and then exec func once for each function, in each new session.
(Came across this question while looking for a solution for 'quick saving' functions (most convenient way) while in an interactive python session - adding my current best solution for future readers)

Common variables in modules

I have three python files, let's call them master.py, slave1.py and slave2.py. Now slave1.py and slave2.py do not have any functions, but are required to do two different things using the same input (say the variable inp).
What I'd like to do is to call both the slave programs from master, and specify the one input variable inp in master, so I don't have to do it twice. Also so I can change the outputs of both slaves in one master program etc.
I'd like to keep the code of both slave1.py and slave2.py separate, so I can debug them individually if required, but when I try to do
#! /usr/bin/python
# This is master.py
import slave1
import slave2
inp = #some input
both slave1 and slave2 run before I can change the input. As I understand it, the way python imports modules is to execute them first. But is there some way to delay executing them so I can specify the common input? Or any other way to specify the input for both files from one place?
EDIT: slave1 and slave2 perform two different simulations being given a particular initial condition. Since the output of the two are the same, I'd like to display them in a similar manner, as well as have control over which files to write the simulated data to. So I figured importing both of them into a master file was the easiest way to do that.
Write the code in your slave modules as functions, import the functions, then call the functions from master with whatever input you need. If you need to have more stateful information, consider constructing an object.
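For the "constructing an object" part, a minimal sketch (the Simulation class and its body are made up for illustration) could be:

# slave1.py
class Simulation:
    def __init__(self, inp):
        self.inp = inp          # the shared input, set once by master.py
        self.output = None

    def run(self):
        # placeholder for the real simulation
        self.output = "slave1 processed %r" % (self.inp,)
        return self.output

master.py would then do from slave1 import Simulation and call Simulation(inp).run(), keeping all state on the object instead of at module level.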
You can do imports at any time:
inp = #some input
import slave1
import slave2
Note that this is generally considered bad design - you would be better off making the modules contain a function, rather than just having it happen when you import the module.
It looks like the architecture of your program is not really optimal. I think you have two files that execute immediately when you run them with python slave1.py. That is nice for scripting, but when you import them you run into trouble, as you have experienced.
Best is to wrap the code in your slave files in a function (as suggested by @sr2222) and call it explicitly from master.py:
slave1.py / slave2.py:
def run(inp):
    # your code
master.py
import slave1, slave2
inp = "foo"
slave1.run(inp)
slave2.run(inp)
If you still want to be able to run the slaves independently you could add something like this at the end:
if __name__ == "__main__":
    inp = "foo"
    run(inp)

Python: how to input python-code part like \input in Tex?

I want to input code in Python like \input{Sources/file.tex}. How can I do it in Python?
[added]
Suppose I want to input data to: print("My data is here"+<input data here>).
Data
1, 3, 5, 5, 6
The built-in execfile function does what you ask, for example:
filename = "Sources/file.py"
execfile( filename )
This will execute the code from Sources/file.py almost as if that code were embedded in the current file, and is thus very similar to #include in C or \input in LaTeX.
Note that execfile also permits two optional arguments allowing you to specify the globals and locals dicts that the code should be executed with respect to, but in most cases this is not necessary. See pydoc execfile for details.
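Note also that execfile only exists in Python 2. If you are on Python 3, a rough equivalent (a sketch, not a drop-in replacement for every use of the globals/locals arguments) is:

filename = "Sources/file.py"
with open(filename) as f:
    exec(compile(f.read(), filename, "exec"))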
There are occasional legitimate reasons to want to use execfile. However, for the purpose of structuring large Python programs, it is conventional to separate your code into modules placed somewhere in the PYTHONPATH and to load them using the import statement rather than executing them with execfile. The advantages of import over execfile include:
Imported functions get qualified with the name of the module, e.g. module.myfunction instead of just myfunction.
Your code doesn't need to hard-code where in the filesystem the file is located.
You can't do that in Python. You can import objects from other modules.
otherfile.py:
def print_hello():
    print "Hello World!"
main.py
import otherfile
otherfile.print_hello() # prints Hello World!
See the python tutorial
Say you have code in "my_file.py". Any line which is not inside a function or class WILL get executed when you do:
import my_file
So for example if my_file.py has the following code in it:
print "hello"
Then in the interpreter you type:
import my_file
You will see "hello".
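If you want part of that code to run only when the file is executed directly, and not on import, the usual pattern is the __main__ guard (the same idea discussed in the second question above; shown here in the Python 2 style of this answer):

# my_file.py
def main():
    print "hello"

if __name__ == "__main__":
    main()   # runs with "python my_file.py", but not on "import my_file"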
My question was clearly too broad, as the variety of replies hints; none of them fully attacks the question. jchl's answer targets the scenario where you get Python code to be executed. THC4k's addresses the situation where you want to use objects from other modules. muckabout's reply is bad practice, as Xavier Ho mentioned: why on earth use import when you could use exec as well, throwing the principle of least privilege to the dogs. One thing is still missing, probably because of the conflict between the term python-code in the title and the addition of data containing integers: it is hard to claim that data is python-code, but the code below shows how to input data, evaluations and executable code.
#!/usr/bin/python
#
# Description: it works like the input -thing in Tex,
# you can fetch outside executable code, data or anything you like.
# Sorry I don't know precisely how input(things) works, maybe misusing terms
# or exaggerating.
#
# The reason why I wanted input -style thing is because I wanted to use more
# Python to write my lab-reports. Now, I don't need to mess data with
# executions and evalutions and data can be in clean files.
#TRIAL 1: Execution and Evaluation not from a file
executeMe="print('hello'); a = 'If you see me, it works'";
exec( executeMe )
print(a);
#TRIAL 2: printing file content
#
# and now with files
#
# $ cat IwillPrint007fromFile
# 007
f = open('./IwillPrint007fromFile', 'r');
msg = f.read()
print("If 007 == " + msg + " it works!");
# TRIAL 3: Evaluation from a file
#
# $cat IwillEvaluateSthing.py
# #!/usr/bin/python
# #
# # Description:
#
#
# evaluateMe = "If you see me again, you are breaking the rules of Sky."
f = open('./IwillEvaluateSthing.py', 'r');
exec(f.read());
print(evaluateMe);
