How can I send a value that lives in one process to another process? For example, I have something like the code below. I want to print value inside xFunc. Can someone explain how I can do it? Thank you.
import functools
from multiprocessing import Pool

def yFunc():
    value = 5

def xFunc():
    print(value)

def smap(f):
    return f()

def main():
    f_x = functools.partial(xFunc)
    f_y = functools.partial(yFunc)
    with Pool() as pool:
        res = pool.map(smap, [f_x, f_y])

if __name__ == '__main__':
    main()
Edit: value is not a constant number; it changes continuously.
Edit 2: I found a way to solve my problem. Here is the solution:
https://stackoverflow.com/a/58208695/16660763
There are several ways.
One way would be to use the global keyword, like this:
def yFunc():
    global value
    value = 5

def xFunc():
    print(value + 1)

yFunc()
xFunc()
Or like this:
value = 5

def yFunc():
    global value
    value = value + 1

def xFunc():
    print(value)

yFunc()
xFunc()
Another way would be to pass the variable to the function like this:
def yFunc():
    value = 5
    return value

def xFunc(x):
    print(x)

xFunc(yFunc())
Hope that's what you were asking.
Python provides some tools for this: consider Pipes, Queues, Value, and Managers from the multiprocessing module.
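For the original multiprocessing question (a value that keeps changing in one process and must be visible in another), a minimal sketch using multiprocessing.Value could look like this; the function names writer and reader are made up for illustration:

from multiprocessing import Process, Value
import time

def writer(shared):
    # Update the shared integer a few times.
    for i in range(5):
        with shared.get_lock():
            shared.value = i
        time.sleep(0.1)

def reader(shared):
    # Read the shared integer while the writer updates it.
    for _ in range(5):
        with shared.get_lock():
            print(shared.value)
        time.sleep(0.1)

if __name__ == '__main__':
    shared = Value('i', 0)   # a shared integer, initially 0
    p1 = Process(target=writer, args=(shared,))
    p2 = Process(target=reader, args=(shared,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()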
Here's the problem I'm trying to solve:
I have a first function to which I pass arguments. Then, later on, I have a second function from which I want to use, as a variable, that argument of the parent function. So it goes like this:
def parent_function(argument=x):
    if statement:
        child_function()
    else:
        ...
    return result

def child_function():
    x = x + 5
    return x
If I run such code, I get an error in the child function saying name 'x' is not defined.
However, if I fix my code to make x global in the parent function, like this:
def parent_function(argument=x):
    global x
    if statement:
        child_function()
    else:
        ...
    return result

def child_function():
    x = x + 5
    return x
I get the error name 'x' is parameter and global
I need to import both functions in another file and I can't "dismantle" the child function inside the parent function.
Thanks very much for any help!
Don't use global variables. Every function needs its own arguments:
def parent_function(x):
    if statement:
        x = child_function(x)
    else:
        ...
    return result

def child_function(x):
    x = x + 5
    return x
name 'x' is parameter and global means you can't declare the parameter x as global as well. To fix this, use another variable y, like this:
def parent_function(argument=x):
    global y
    y = x
    if statement:
        child_function()
    else:
        ...
    return result

def child_function():
    global y  # y must be declared global here too, or y = y + 5 raises UnboundLocalError
    y = y + 5
    return y
This error happens because you are trying to give global scope to a parameter whose scope is local to the function. The problem is that parameters defined in a function's signature are, by definition, local variables. To better illustrate this problem, you can simply try to run this piece of code:
def parent_function(argument="Hello"):
    global argument
    return argument
You will see that it will fail to run for the same reason that I have explained. I hope I have been clear in my explanation. Good luck.
The first thing you need to change is this:
def parent_function(argument=x):
If you search for how to set a default argument in a function, you will find something like this: https://www.geeksforgeeks.org/default-arguments-in-python/. This means instead of x you need to have some default value, for example:
def parent_function(argument=5):
This means that if you do not pass the argument called argument to the function, the value 5 will be used.
On the other hand, it seems that you want x to be an argument, which means the def line should look like this:
def parent_function(x=5):
Second, the global keyword needs to be used in child_function, since x has not been made global in parent_function. This leads to this:
def parent_function(x=5):
    if statement:
        child_function()
    else:
        ...
    return result

def child_function():
    global x
    x = x + 5
    return x
To make all this work, there must be at least two more lines: one to set x and another to call parent_function, like this:
x = 6
parent_function(4)
But, to make things even more confusing, the x from the arguments of parent_function and the x used in child_function are not the same thing, as you can see for yourself in this example, which is similar to your code but fully executable:
def parent_function(x=5):
    if True:
        print(child_function())
    else:
        print("else branch")
    return True

def child_function():
    global x
    x = x + 5
    return x

x = 6
parent_function(4)
This prints out 11 even though you might think it would print out 9!
This is due to the fact that the global keyword refers to (as the word says) the global variable declared outside of the functions, the variable with the value 6. Usually, local and global variables should have different names, so either the argument x in parent_function or the global variable x needs to be renamed.
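A minimal sketch of the renamed version, assuming the global counter is called total (a name made up here for clarity):

def parent_function(x=5):
    # x is just a local parameter; it does not touch the global below
    if True:
        print(child_function())
    else:
        print("else branch")
    return True

def child_function():
    global total          # refers to the module-level variable
    total = total + 5
    return total

total = 6
parent_function(4)        # prints 11 (6 + 5); the argument 4 plays no role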
IDK if this helps, but you will learn something from this, for sure!
I have to execute the following code wherein I will be calling the function main again and again.
So here, since I need to use i = i + 1, I need to declare and initialize i in the first place, right? But when I call the main function it defines i = 0 again, and the whole purpose of i = i + 1 is lost.
How can I solve this?
I have given the condition just as an example.
Basically, what I want is for i to be initialized only once, no matter how many times main is called.
def main():
    i = 0
    if 0 < 1:
        i = i + 1
        y = i
There are a couple of ways to do this that don't involve globals. One is to capture the value of i in a closure and return a new function that increments it. You will need to call the initial function once to get the returned function:
def main():
    i = 0
    def inner():
        nonlocal i
        i += 1
        return i
    return inner

f = main()
f()
# 1
f()
# 2
You can also create a generator, which is a more Pythonic way to do this. The generator can be iterated over (although use caution, since it iterates forever), or you can get a single value by passing it to next():
def main():
    i = 1
    while True:
        yield i
        i += 1

f = main()
next(f)
# 1
next(f)
# 2
You can also use itertools.count.
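For example, the same endless counter can come straight from the standard library:

import itertools

f = itertools.count(1)   # yields 1, 2, 3, ... forever
next(f)
# 1
next(f)
# 2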
So you haven't declared i as a global variable. Do something like this:
global i
i = 0

def main():
    if 0 < 1:
        global i
        i = i + 1
        y = i
The reason for this is that inside a function all variables are local, meaning they only exist inside the function while it is being called. So if you want a function to be able to change a variable for the whole program, you need to declare it as global so Python knows to change its value for the entire program.
I'm not sure exactly what you are trying to do, but I believe there is an easier way to do whatever it is you are doing.
It looks like you want to maintain state across function calls, which is a good reason to convert it to a class.
class MyClass:
    def __init__(self):
        self.i = 0

    def main(self):
        self.i += 1
        y = self.i

myclass = MyClass()
myclass.main()
myclass.main()
print(myclass.i)
I'm only starting to get into Python, coming from C#, and I have a question that I wasn't able to find an answer to; maybe I wasn't able to phrase the question right.
I need this to create two lists when calling load(positives) and load(negatives), where positives is a path to a file. From C# I'm used to using this kind of structure to avoid copying the same code again with just another variable, e.g. what if I needed 5 lists? With this code I'm only able to access the self.dictionary variable, but not self.positives and self.negatives.
I get the error AttributeError: 'Analyzer' object has no attribute 'positives' at the line for p in self.positives:
MAIN QUESTION: how do I make self.dictionary = [] create list attributes from the argument name, i.e. self.positives and self.negatives, which I need later in the code?
def load(self, dictionary):
    i = 0
    self.dictionary = []
    with open(dictionary) as lines:
        for line in lines:
            # some more code
            self.dictionary.append(0)
            self.dictionary[i] = line
            i += 1

# later in code
for p in self.positives:
    if text == p:
        score += 1
for p in self.negatives:
    if text == p:
        score -= 1

# structure of a program:
class Analyzer():
    def load()
    def init()
        load(positives)
        load(negatives)
    def analyze()
        for p in self.positives
You cannot write self.dictionary and expect Python to convert it to self.positives or self.negatives.
Instead, pass self.positives and self.negatives into the function and use those.
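A minimal sketch of that idea, where load receives both the list to fill and the path to read (the parameter names target and path are made up here):

class Analyzer:
    def __init__(self, positives_path, negatives_path):
        self.positives = []
        self.negatives = []
        # Fill each list by passing it to the same loader.
        self.load(self.positives, positives_path)
        self.load(self.negatives, negatives_path)

    def load(self, target, path):
        # Append every line of the file to whichever list was passed in.
        with open(path) as lines:
            for line in lines:
                target.append(line.strip())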
Took long enough to figure it out:
All it took was to return self.dictionary from load and assign it in init as self.positives = self.load(positives):
# structure of a program:
class Analyzer():
    def load()
        return self.dictionary
    def init()
        self.positives = self.load(positives)
        self.negatives = self.load(negatives)
    def analyze()
        for p in self.positives
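Filled in as runnable code, that structure could look roughly like this (a sketch only: the file handling and the scoring in analyze follow the snippets above, not the exact original program):

class Analyzer:
    def __init__(self, positives, negatives):
        # positives and negatives are paths to the word-list files
        self.positives = self.load(positives)
        self.negatives = self.load(negatives)

    def load(self, dictionary):
        # Build a fresh list from the file and return it;
        # the caller decides which attribute it is stored in.
        self.dictionary = []
        with open(dictionary) as lines:
            for line in lines:
                self.dictionary.append(line.strip())
        return self.dictionary

    def analyze(self, text):
        score = 0
        for p in self.positives:
            if text == p:
                score += 1
        for p in self.negatives:
            if text == p:
                score -= 1
        return score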
From what I understood of the question, you are trying to create two lists. You first have to declare them like this:
FirstList = [ ]
SecondList = [ ]
Then take whatever value you want to add to the list and append it like this:
SecondList.append("The thing you want in the list")
By the end of the code, your lists should be filled with what you want.
First off, I am sure this is a repeat question, so I'm sorry, but I couldn't find anything. Also, keep in mind that I am very new to coding in general, hence the quite dumb question.
So if I have something like:
a = 1

def fun():
    a = a + 1

fun()
Is there a way to make it so that if I run this, a would be equal to 2?
Use global. Like this:
a = 1

def fun():
    # make a a global variable here
    global a
    a = a + 1

fun()
print(a)
OUTPUT:
2
You are dealing with the global variable a.
a = 1

def fun():
    global a
    a = a + 1
What is the correct way to use data from functions in Python scripts?
Using print, like:
var1 = 'This is var1'

def func1():
    print(var1)

func1()
Or, with return:
var1 = 'This is var1'

def func1():
    return var1

print(func1())
Both give the same result:
$ ./func_var_print_return.py
This is var1
You should return a value when the purpose of the function is to produce a value. This is so functions can use other functions. For example:
def add(x, y):
    return x + y

def multiply(a, b):
    product = 0
    for i in range(b):
        product = add(product, a)  # note I am calling the add function
    return product
Testing
>>> multiply(5,4)
20
Note that I used the return value from add within my multiply function. If I only printed the value from add, I would have been unable to do that.
It depends on the situation. In general, I would return it so you can print it if you want, but if your code changes at some point you can also perform other operations with the value.
You should always try to return the value.
Think of unit tests: how would you verify the value if it's not returned?
And as mentioned by "meto" before, if your code changes you can still perform other operations with the value.
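To make the unit-test point concrete, a returned value can be checked directly with an assert, while printed output cannot (reusing the add function from the earlier answer):

def add(x, y):
    return x + y

def test_add():
    # The return value can be verified; output printed inside add could not be.
    assert add(2, 3) == 5

test_add()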