Import Lib not working with exec function? - python

I have written the code string below and am trying to execute it through exec. It runs fine as long as I pass only a globals dict.
codeRule = """import math
def fun(n):
    data = n
    data = data * math.pi
    print(data)
    return data
dd = fun(n)"""
codeObject = compile(codeRule, 'sumstring', 'exec')
exec(codeObject, dict(n=10))
But my use case needs the value of dd outside of exec (to use it in another dataframe), so I passed a locals dict as the extra parameter below to get dd back:
loc = {}
exec(codeObject, dict(n=10), loc)
dd = loc["dd"]
But as soon as I pass the locals dict, it starts giving me an error about the imported module:
File "<stdin>", line 1, in <module>
File "sumstring", line 7, in <module>
File "sumstring", line 4, in fun
NameError: name 'math' is not defined
Can someone please help me solve this problem?
I have checked the answer to the question below, but I don't know how to apply it to my use case.
Why doesn't an import in an exec in a function work?

Finally, I got the solution.
I was missing one point about exec. Below is the solution I found, and I hope it will work for my actual use case:
codeRule = """import math
def fun(n):
    data = n
    data = data * math.pi
    return data
"""
# export the function into the current globals
exec(codeRule, globals())
dd = fun(10)
dd
31.41592653589793
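To spell out what was going wrong before: when exec is given two separate mappings, the top-level names (the imported math, fun, dd) get bound in the locals dict, while the body of fun resolves math against the globals dict, where it was never stored. A minimal repro, reduced from the snippet in the question:
g, loc = {'n': 10}, {}
try:
    exec("import math\ndef fun(n):\n    return n * math.pi\ndd = fun(n)", g, loc)
except NameError as e:
    print(e)                       # name 'math' is not defined
print('math' in loc, 'math' in g)  # True False -> the import landed in loc, not in g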

Your answer looks good. Here's another more-convoluted approach if you need a fallback for your use case:
codeRule = """\
import math
def fun(n):
    data = n
    data = data * math.pi
    return data
loc['dd'] = fun(n)
"""
codeObject = compile(codeRule, 'sumstring', 'exec')
loc = {}
exec(codeObject, dict(n=10, loc=loc))
print(f"{loc['dd']=}") # -> loc['dd']=31.41592653589793
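A note on the design choice: if you don't want to touch the caller's globals() and don't need a separate loc dict either, you can pass a single dict as the only namespace; exec then uses it for both globals and locals, so the import, fun and dd all land in the same place. A minimal sketch using the original codeRule from the question (the version ending in dd = fun(n)):
codeRule = """import math
def fun(n):
    data = n
    data = data * math.pi
    return data
dd = fun(n)"""
ns = dict(n=10)
exec(compile(codeRule, 'sumstring', 'exec'), ns)
print(ns['dd'])  # 31.41592653589793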

Related

How can I use a calculated value of a function in another function in another Python file?

I have written the following code in derivation.py:
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import medfilt
from scipy.interpolate import interp1d

def Interpolation(ableitungWinkel, x_values):
    z = medfilt(derivation, 3)
    diff = abs(derivation - z)
    new_smootheddata = np.where(diff > 3, z, derivation)
    x = np.arange(0, len(x_values[:-2]))
    f = interp1d(x, new_smootheddata, kind="linear")
    xnew = np.arange(0, len(x_values[:-3]), 0.01)
    ynew = f(xnew)
    s = plt.plot(x, z, "o", xnew, ynew, "-")
    return s
In my project there is also integration.py. In that file I need the value of z that is calculated inside the function Interpolation, for this calculation:
import math as m

def horizontalAcceleration(strideData):
    resultsHorizontal = list()
    for i in range(len(strideData)):
        yAngle = z
        xAcceleration = strideData.to_numpy()[i, 4]
        yAcceleration = strideData.to_numpy()[i, 5]
        a = ((m.cos(m.radians(yAngle))) * yAcceleration) - ((m.sin(m.radians(yAngle))) * xAcceleration)
        resultsHorizontal.append(a)
    resultsHorizontal.insert(0, 0)
    return resultsHorizontal
As you can see I have already added z to the function def horizontalAcceleration at the place where it should go.
To use z there, I tried the following: from derivation import z
But that doesn't work. Because then I get the error: ImportError: cannot import name 'z' from 'derivation'
Does anybody have an idea how I can solve this problem? Thanks for helping me.
I think your misunderstanding comes from treating a function like a script that has been run and has modified global state. That's not what a function is. A function is a series of actions performed on its inputs (ignoring closures for a minute) which returns some results. You can call it many times, but until you call it, it never executes, and once it stops executing all its variables go out of scope.
You can import and call a function though. So you can change the return type of Interpolation to return everything you need somewhere else. E.g.
def Interpolation(...):
    ...
    return {'z': z, 's': s}
Then somewhere you import that function, call it, get back all the data you need, then pass that to your other function.
from derivation import Interpolation

# get z and s back in a dict called result
result = Interpolation(...)
# pass z along with the other argument to your other function
horizontalAcceleration(strideData, result['z'])
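If it helps, here is a tiny self-contained sketch of the same pattern in one file (all names here are made up for illustration, not the asker's real functions): the first function returns the data it computes, and the second takes that data as a parameter instead of trying to import a local variable.
import math as m

def interpolation(samples):
    # stand-in for the real smoothing/interpolation; it returns the values
    # the caller needs (the role "z" plays in the original code)
    z = [s * 0.5 for s in samples]
    return {'z': z}

def horizontal_acceleration(accelerations, y_angles):
    results = []
    for a, angle in zip(accelerations, y_angles):
        results.append(m.cos(m.radians(angle)) * a)
    return results

result = interpolation([10, 20, 30])
print(horizontal_acceleration([1.0, 2.0, 3.0], result['z']))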

How do I run a function from my .py file in the command?

I have a .py file with a function that calculates the gradient of a function at a point and returns the value of that gradient at the point. The function takes a np.array([2,]) as input and outputs another np.array([2,]). I am confused as to how I can call the function from the cmd line and run the function with a specified input.
Here is a code snippet:
import numpy as np

def grad(x):
    x_1 = x[0]
    x_2 = x[1]
    df_dx_1 = 6*x_1
    df_dx_2 = 8*x_2
    df_dx = np.array([df_dx_1, df_dx_2])
    return np.transpose(df_dx)
I would really appreciate your help!
EDIT: This question differs from the popular command line thread because I have a specific issue of not being able to recognise the numpy input
First change the script to the following (it uses if __name__ == '__main__' to check whether the file is being run as a script, then imports sys and passes the first command-line argument, sys.argv[1], to the function):
import numpy as np

def grad(x):
    x_1 = x[0]
    x_2 = x[1]
    df_dx_1 = 6*x_1
    df_dx_2 = 8*x_2
    df_dx = np.array([df_dx_1, df_dx_2])
    return np.transpose(df_dx)

if __name__ == '__main__':
    import sys
    grad(sys.argv[1])
And call it like:
python "YOURSCRIPTPATH.py" argument_1
You can have more than one command line argument:
import sys
import numpy as np

def grad(x):
    # your grad function here
    ...

arr = np.array([int(sys.argv[1]), int(sys.argv[2])])
print(grad(arr))
Usage:
python gradient.py 10 5
You could also just run something like this from the command line:
$ python -c 'from YOURFILE import grad; print(grad(your_argument))'
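Another option (not from the answers above, just a sketch): argparse makes the two coordinates explicit and gives you a usage message for free. The argument names and script name here are my own choices.
import argparse
import numpy as np

def grad(x):
    x_1, x_2 = x[0], x[1]
    return np.array([6 * x_1, 8 * x_2])

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Evaluate the gradient at a point.')
    parser.add_argument('x1', type=float)
    parser.add_argument('x2', type=float)
    args = parser.parse_args()
    print(grad(np.array([args.x1, args.x2])))
# usage: python gradient_argparse.py 10 5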

Returning values from Python function running in MATLAB

I am testing calling Python functions from MATLAB, but I am not getting what I expected. I figured I could call a Python function within MATLAB and have whatever that function returns assigned to a MATLAB variable, as follows:
Python script (saved as test_for_mlab.py):
def out_func(arg_1):
    print 'This print statement is in Python'
    a = int(arg_1 * 10)
    return a
MATLAB part:
val = py.test_for_mlab.out_func(33);
I was expecting val to have a value of 330 (int). Instead, the message I get in MATLAB is val = Python NoneType with no properties. None.
How can I get my desired results?
My latest tests in MATLAB 2022 suggest that returning values only works with scripts, not with functions.
A workaround is to write a simple wrapper script in Python, like so:
return_script.py:
from return_value_test import main
import sys
a = main(sys.argv)
return_value_test.py:
import sys
def main(argv):
    print(argv)
    x = int(argv[1])
    a = x
    b = a + 1
    return b

if __name__ == "__main__":
    main(sys.argv[1:])
Call the script from MATLAB with a command-line parameter of 2; the value returned in a should then be 3 (the original a + 1):
a = pyrunfile("return_script.py 2", "a")
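If you want to sanity-check the Python side before involving MATLAB, you can call main directly (assuming return_value_test.py shown above is on the Python path); the argument list passed here mirrors the shape of sys.argv:
from return_value_test import main

b = main(['return_value_test.py', '2'])
print(b)  # -> 3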

def function gives invalid syntax

As far as I checked, the indentation is correct, no brackets are missing, and I have only imported packages in the previous lines, but I still get an invalid syntax error.
#!/usr/bin/python
import bpy
import mathutils
import numpy as np
from math import radians
from mathutils import Vector
from math import radians
from mathutils import Matrix
from bpy import context

def transform_mesh('parent', 'obj_to_be_transformed', (translate_x, translate_y, translate_z), (rot_x,rot_y,rot_z)):
    obj = bpy.data.objects[parent]
    obj1 = bpy.data.objects[obj_to_be_transformed]
    initial_mat = obj1.matrix_world
    ...some code
    (x,y,z) = (translate_x, translate_y, translate_z)
    orig_loc_mat = Matrix.Translation(orig_loc + mathutils.Vector((x,y,z)))
    ...some more code
    eul = mathutils.Euler((radians(rot_x), radians(rot_y), radians(rot_z)), 'XYZ')
    rot_mat = eul.to_matrix().to_4x4()
    obj.matrix_world = orig_loc_mat * rot_mat * orig_rot_mat * orig_scale_mat
    bpy.context.scene.update()
    return [initial_loc, initial_rot, initial_scale, loc, rot, scale]

transform_result = transform_mesh('Armature', 'Coil', (5,0,0), (0,0,1))
print(transform_result)
And error is:
Error: File "D:\users\gayathri\Gayathri\Synthetic_data_generation\Final\HMI_Depth_coilA_final_final.blend\Untitled", line 18
def transform_mesh('parent', 'obj_to_be_transformed', (translate_x, translate_y, translate_z), (rot_x,rot_y,rot_z)):
^
SyntaxError: invalid syntax
location: <unknown location>:-1
def transform_mesh('parent', 'obj_to_be_transformed',
should be
def transform_mesh(parent, obj_to_be_transformed,
surely?
1- Remove the quotes from the arguments
2- Remove the tuples from the arguments and unpack them inside the function (it might be useful to add some checks)
So, here you are:
def transform_mesh(parent, obj_to_be_transformed, translate, rot):
    translate_x, translate_y, translate_z = translate
    rot_x, rot_y, rot_z = rot
    # etc

transform_result = transform_mesh('Armature', 'Coil', (5,0,0), (0,0,1))
print(transform_result)
Tuple parameters are not supported in Python 3, but you can pass the tuple as a single variable and unpack it inside the function.
def transform_mesh(translate_xyz):
    translate_x, translate_y, translate_z = translate_xyz
You need to provide variables as arguments to the function.
try something like:
def transform_mesh(parent, obj_to_be_transformed, t1, t2):
Although in the code you have shared, you are always using t1 and t2 as tuples. But in case you want to use x, y and z separately, you can do it by referencing the index:
x = t1[0]
y = t1[1]
In this line the function parameters are passed in an incorrect way:
def transform_mesh('parent', 'obj_to_be_transformed', (translate_x, translate_y, translate_z), (rot_x,rot_y,rot_z)):
A correct signature takes plain names and receives each tuple as a single parameter:
def transform_mesh(parent, obj_to_be_transformed, translate_xyz, rot_xyz):  # translate_xyz and rot_xyz are passed as tuples

How to troubleshoot an "AttributeError: __exit__" in multiproccesing in Python?

I tried to rewrite some csv-reading code to be able to run it on multiple cores in Python 3.2.2. I tried to use the Pool object of multiprocessing, which I adapted from working examples (and already worked for me for another part of my project). I ran into an error message I found hard to decipher and troubleshoot.
The error:
Traceback (most recent call last):
File "parser5_nodots_parallel.py", line 256, in <module>
MG,ppl = csv2graph(r)
File "parser5_nodots_parallel.py", line 245, in csv2graph
node_chunks)
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/multiprocessing/pool.py", line 251, in map
return self.map_async(func, iterable, chunksize).get()
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/multiprocessing/pool.py", line 552, in get
raise self._value
AttributeError: __exit__
The relevant code:
import csv
import time
import datetime
import re
from operator import itemgetter
from multiprocessing import Pool
import itertools
import networkx as nx
def chunks(l, n):
    """Divide a list of nodes `l` in `n` chunks"""
    l_c = iter(l)
    while 1:
        x = tuple(itertools.islice(l_c, n))
        if not x:
            return
        yield x

def csv2nodes(r):
    strptime = time.strptime
    mktime = time.mktime
    l = []
    ppl = set()
    pattern = re.compile(r"""[A-Za-z0-9"/]+?(?=[,\n])""")
    for row in r:
        with pattern.findall(row) as f:
            cell = int(f[3])
            id = int(f[2])
            st = mktime(strptime(f[0],'%d/%m/%Y'))
            ed = mktime(strptime(f[1],'%d/%m/%Y'))
            # collect list
            l.append([(id,cell,{1:st,2: ed})])
            # collect separate sets
            ppl.add(id)
    return (l, ppl)

def csv2graph(source):
    MG = nx.MultiGraph()
    # Remember that I use integers for edge attributes, to save space! Dic above.
    # start: 1
    # end: 2
    p = Pool()
    node_divisor = len(p._pool)
    node_chunks = list(chunks(source, int(len(source)/int(node_divisor))))
    num_chunks = len(node_chunks)
    pedgelists = p.map(csv2nodes, node_chunks)
    ll = []
    ppl = set()
    for l in pedgelists:
        ll.append(l[0])
        ppl.update(l[1])
    MG.add_edges_from(ll)
    return (MG, ppl)

with open('/Users/laszlosandor/Dropbox/peers_prisons/python/codetenus_test.txt','r') as source:
    r = source.readlines()
MG, ppl = csv2graph(r)
What's a good way to troubleshoot this?
The problem is in this line:
with pattern.findall(row) as f:
You are using the with statement, which requires an object with __enter__ and __exit__ methods. But pattern.findall returns a plain list; the with statement tries to look up its __exit__ method, can't find it, and raises the error. Just use
f = pattern.findall(row)
instead.
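For context, a minimal sketch of what the with statement expects: an object exposing __enter__ and __exit__. A plain list (which is what findall returns) has neither, hence the AttributeError.
class Demo:
    def __enter__(self):
        print('entering')
        return self
    def __exit__(self, exc_type, exc, tb):
        print('exiting')
        return False  # do not swallow exceptions

with Demo() as d:
    print('inside the block')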
It is not the asker's problem in this instance but the first troubleshooting step for a generic "AttributeError: __exit__" should be making sure the brackets are there, e.g.
with SomeContextManager() as foo:
    # works because a new object is referenced...
not
with SomeContextManager as foo:
    # AttributeError because the class is referenced
Catches me out from time to time and I end up here -__-
The error also happens when trying to use
with multiprocessing.Pool() as pool:
    # ...
with a Python version that is too old (like Python 2.x) and does not support using with together with multiprocessing pools.
(See this answer https://stackoverflow.com/a/25968716/1426569 to another question for more details)
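If you are stuck on an interpreter where Pool is not yet a context manager (with-support for pools arrived in Python 3.3), a common workaround is contextlib.closing or an explicit try/finally. A minimal sketch:
from contextlib import closing
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == '__main__':
    # closing() calls pool.close() when the block exits
    with closing(Pool()) as pool:
        print(pool.map(square, [1, 2, 3]))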
The reason behind this error can also be the following: the Flask app is already running and hasn't shut down, and in the middle of that we try to start another instance with:
with app.app_context():
    # code
Before using this with statement, we need to make sure that the context of the previously running app has been closed.
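For comparison, a minimal self-contained example of pushing and cleanly leaving an application context; the with block pops the context again when it exits:
from flask import Flask, current_app

app = Flask(__name__)

with app.app_context():
    print(current_app.name)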
