Multiprocessing with Python and Arguments

Related to my last post (which somehow got marked off and closed):
I wrote some code that creates a process for the command handler of my Python TCP listener. What basically happens is that I send in some data and it goes to the TCP connector. The TCP connector then creates another process and passes the data it received to a function in the command handler. I do not know what is going on. Please help!
import socket
import sys
import errno
from multiprocessing import Process, Queue  # #UnresolvedImport
import CommandHandler

class tcpconnection:
    def tcp(self):
        data = ''
        q = Queue()
        p = Process(target=CommandHandler.CommandHandler.commands(), args=(self, data))
        #
        HOST = ''    # Symbolic name meaning all available interfaces
        PORT = 9999  # Arbitrary non-privileged port
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.bind((HOST, PORT))
        s.listen(1)
        conn, addr = s.accept()
        print('Connected by', addr)
        while True:
            data = conn.recv(1024)
            p.start()
            p.join()
            if not data: break
            conn.send(data)
        conn.close()
It fails with:
p = Process(target=CommandHandler.CommandHandler.commands(), args=(self, data))
TypeError: unbound method commands() must be called with CommandHandler instance as first argument (got nothing instead)

target=CommandHandler.CommandHandler.commands() sets the target argument to the result of calling the commands method. What you probably intended was to use that method itself as the target, so you should write:
p = Process(target=CommandHandler.CommandHandler.commands, args=(self, data))
Also the error tells you that you're trying to call an unbound method on a class, but you need an object to call it on, probably:
p = Process(target=CommandHandler.CommandHandler().commands, args=(self, data))
but that's not your only issue:
while True:
    data = conn.recv(1024)
    p.start()
    p.join()
This loop will also fail after the first iteration, because you cannot restart a process that has already been started.
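A minimal sketch of one way to address both issues, assuming a hypothetical stand-in for the real CommandHandler class: pass the bound method without calling it, and create a fresh Process for each received chunk, since a Process object can only be started once.

import socket
from multiprocessing import Process

class CommandHandler:  # hypothetical stand-in for the real CommandHandler module/class
    def commands(self, data):
        print('handling:', data)

def serve(host='', port=9999):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind((host, port))
    s.listen(1)
    conn, addr = s.accept()
    print('Connected by', addr)
    handler = CommandHandler()
    while True:
        data = conn.recv(1024)
        if not data:
            break
        # a new Process per chunk; note the target is the bound method, not its result
        p = Process(target=handler.commands, args=(data,))
        p.start()
        p.join()
        conn.send(data)
    conn.close()

if __name__ == '__main__':
    serve()

Joining immediately after starting makes each chunk block until it is handled; a pool of worker processes would avoid that, but this keeps the sketch close to the original code.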

Related

How can I wait until I receive data using a Python socket?

I am creating a socket client and trying to obtain some data. To do so, I need to connect to a web server via a socket; the server then creates another socket which listens and waits for the data, and afterwards sends it back to the client.
The problem I have with the code below is that my socket client does not wait for the incoming data from the server and just accepts empty data.
How can I wait for non-empty data from the server using Python sockets?
My code:
import sys
import json
import socketIO_client
import time

host = 'https://SOME_URL'
socketIO = socketIO_client.SocketIO(host, params={"email": "edmund#gmail.com"})

def on_connect(*args):
    print "socket.io connected"

def on_disconnect(*args):
    print "socketIO disconnected"

socketIO.on('connect', on_connect)
socketIO.on('disconnect', on_disconnect)

def on_response_state(*args):
    print args  # Prints ()

socketIO.emit('receive_state', on_response_state)
socketIO.wait_for_callbacks(seconds=3)
Here's an example using socket. The call to s.accept() blocks until a client connects, so the while loop that receives data only starts once a connection has been made. This should help with your problem.
import socket

def receiver():
    PORT = 123
    CHUNK_SIZE = 1024
    data = []
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(('0.0.0.0', PORT))
    s.listen(1)
    conn, address = s.accept()  # accept() blocks until a new client connects
    while True:
        datachunk = conn.recv(CHUNK_SIZE)  # read from the socket in chunks until recv() returns an empty string
        if not datachunk:
            break  # no more data coming in, so break out of the while loop
        data.append(datachunk)  # add the chunk to the already collected data
    conn.close()
    print(data)
    return

receiver()
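For completeness, a minimal sender to exercise the receiver above; send_once is a made-up name, and the port is the one used in the answer (binding to a port below 1024 usually requires elevated privileges, so picking a higher port may be more convenient):

import socket

def send_once(payload=b'hello receiver', port=123):
    # connect, send one payload, then close so the receiver's recv() returns an empty string and its loop ends
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(('127.0.0.1', port))
    s.sendall(payload)
    s.close()

send_once()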
Put the recv call in a loop running in a separate thread, like this:
def rec(self):
    while 1:
        time.sleep(0.01)
        rdata = self.clientsocket.recv(self.buffsize)
        print("rec from server: ", rdata.decode('utf8'), '\n', 'press enter to continue')
....
t2 = threading.Thread(target=y.rec, name="rec")
t2.start()
Since you're using the SocketIO library to include parameters (achieved using requests), and want to emit a message, you can wait indefinitely for a response by not specifying a wait time.
with SocketIO(host, params={"email": "edmund#gmail.com"}) as socketIO:
    def on_response_state(*args):
        print args  # Prints ()
    socketIO.emit('receive_state', on_response_state)
    socketIO.wait()

When combining TCPServer with ThreadingMixIn, it blocks

The server-side code (tcp_server.py):
from SocketServer import TCPServer, ThreadingMixIn, StreamRequestHandler

class Server(ThreadingMixIn, TCPServer):
    pass

class Handler(StreamRequestHandler):
    def handle(self):
        print 'got a connection from: ', self.request.getpeername()
        print self.rfile.read(1024)
        msg = 'hello'
        self.wfile.write(msg)

server = Server(('127.0.0.1', 8888), Handler)
server.serve_forever()
The client-side code (tcp_client.py):
from socket import *
import threading

def do_connection():
    s = socket(AF_INET, SOCK_STREAM)
    s.connect(('127.0.0.1', 8888))
    s.sendall('this is client')
    print s.recv(1024)

ts = []
for x in xrange(100):
    print x
    t = threading.Thread(target=do_connection())
    t.daemon = True
    ts.append(t)

for t in ts:
    t.start()
I ran tcp_server.py, and then tcp_client.py. tcp_client.py should have finished quickly. However, tcp_client.py seemed to run only one thread and then block, and tcp_server.py got only one connection. When I interrupted tcp_client.py, tcp_server.py received one message: this is client.
Is there any mistake in my code?
This line:
t = threading.Thread(target=do_connection())
Should be
t = threading.Thread(target=do_connection)
When you use do_connection(), you end up executing do_connection in the main thread, and then pass the return value of that call to the Thread object. What you want to do is pass the do_connection function object to Thread, so that the Thread object can execute do_connection in a new thread when you call t.start().
Also, note that starting 100 threads concurrently to connect to your server may not perform very well. You may want to consider starting with fewer threads, and working your way up to a higher number once you know things are working properly.
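A short illustration of the difference (Python 2, to match the question; the work function is just a stand-in):

import threading, time

def work():
    time.sleep(0.5)
    print 'done in', threading.current_thread().name

# Wrong: work() runs here, in the main thread, and Thread receives
# its return value (None) as the target.
t_wrong = threading.Thread(target=work())

# Right: the function object itself is handed over; it only runs
# in the new thread once t_right.start() is called.
t_right = threading.Thread(target=work)
t_right.start()
t_right.join()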
Because the server is blocked by the first request, I tried changing read(1024) to readline() in server.py and adding a '\n' to the content sent from the client, and it works.
It seems that rfile.read(1024) blocks the whole handler, so the better approach is to use readline() or self.request.recv(1024).
server.py
from SocketServer import TCPServer, ThreadingMixIn, StreamRequestHandler

class Server(ThreadingMixIn, TCPServer):
    pass

class Handler(StreamRequestHandler):
    def handle(self):
        print 'got a connection from: ', self.request.getpeername()
        print self.rfile.readline()
        #print self.request.recv(1024).strip()
        msg = 'hello'
        self.wfile.write(msg)

# Create the server, binding to localhost on port 8888
server = Server(("127.0.0.1", 8888), Handler)
server.serve_forever()
client.py
from socket import *
import threading

def do_connection():
    print "start"
    s = socket(AF_INET, SOCK_STREAM)
    s.connect(('127.0.0.1', 8888))
    s.sendall('this is client\n')
    print s.recv(1024)

ts = []
for x in xrange(100):
    print x
    t = threading.Thread(target=do_connection)
    ts.append(t)

for t in ts:
    print "start t"
    t.start()

Python: terminate multiprocessing and GUI processes gracefully

I am using Glade for my GUI and creating a process to run the GUI in. This app opens a socket when 'on' is clicked. When I press 'send', it sends whatever is in a text field to the socket. The socket receives this data and sends it back. The problem is that after I send data to the socket, the process doesn't terminate. Also, after I close my GUI it calls sys.exit() but still leaves a process behind that doesn't terminate. I believe the error is in how I am implementing my processes, or my processing in general. Can anyone shed some light on this? It also relates to my last post. Thanks
main.py
# Main thread that creates a new process for the GUI and displays it
import socket, thread, gtk, Handler, sys, os, multiprocessing
sys.setrecursionlimit(10000)

if __name__ == '__main__':
    builder = gtk.Builder()
    # 32-bit: template.glade, 64-bit: template-2.22
    # #todo add switching between architectures
    builder.add_from_file("template/template-2.22.glade")
    builder.connect_signals(Handler.Handler(builder))
    window = builder.get_object("window1")
    window.show_all()
    try:
        p = multiprocessing.Process(target=gtk.main())
        p.start()
    except:
        print "Error Starting new Thread"
handler.py
# Handler for gtk/Glade signals; creates new processes and handles buttons and such
import thread, threading, os, server, client, multiprocessing, time
import sys, gtk

class Handler(object):
    '''
    classdocs
    '''
    myobject = ''

    def __init__(self, object1):
        '''
        Constructor
        '''
        # Getting the glade builder
        self.myobject = object1

    def clickme(self, value):
        myserver = server.Server()
        try:
            p = multiprocessing.Process(target=myserver.run)
            p.start()
        except:
            pass

    def sendmessage(self, value):
        text = self.myobject.get_object('entry1').get_text()
        print text
        msg = client.MyClass()
        p = multiprocessing.Process(target=msg.run, args=([text]))
        p.start()
server.py
# Opens a socket, listens for incoming data, and sends it back
import socket, multiprocessing, gtk, sys

class Server:
    '''
    classdocs
    '''
    def __init__(self):
        '''
        Constructor
        '''

    def run(self):
        try:
            while 1:
                HOST = 'localhost'  # Symbolic name meaning the local host
                PORT = 50006        # Arbitrary non-privileged port
                s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
                s.bind((HOST, PORT))
                s.listen(5)
                conn, addr = s.accept()
                print 'Connected by', addr
                while True:
                    data = conn.recv(1024)
                    if not data:
                        conn.close()
                        sys.exit()
                        break
                    elif data != '':
                        conn.sendall(data)
                        break
                print "Closing"
                #conn.close()
        finally:
            print "End"
            pass
client.py
# Sends whatever is inside the text area to the socket
import time

class MyClass:
    '''
    classdocs
    '''
    def __init__(self):
        '''
        Constructor
        '''

    def run(self, text):
        try:
            import socket
            HOST = 'localhost'  # The localhost
            PORT = 50006        # The same port as used by the server
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.connect((HOST, PORT))
            s.send(text)
            data = s.recv(1024)
            while 1:
                if data != '':
                    print 'Received', repr(data)
                    break
        finally:
            pass
This is just wrong:
p = multiprocessing.Process(target=gtk.main())
p.start()
First, you can't start the gtk main loop in a subprocess, even if you did it right. Fortunately the process never really tries to start main, because you call gtk.main() yourself: that call blocks until the main loop exits and then returns None. So what you're actually doing is:
gtk.main()
p = multiprocessing.Process(target=None)
p.start()
Throughout the rest of your code you keep creating new processes and then forgetting about them. If you kept a reference to them, you could at least send them the TERM signal to shut them down (using Process.terminate, or by setting the daemon flag). If you want to shut a subprocess down cleanly, you either need to handle that signal in the subprocess, or use another IPC mechanism to get it to shut down cleanly (like multiprocessing.Event, ...).
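A minimal sketch of keeping references to worker processes and shutting them down via a multiprocessing.Event; this is a generic worker loop, not the GTK code from the question:

import time
import multiprocessing

def worker(stop_event):
    # poll the shared event and exit cleanly when it is set
    while not stop_event.is_set():
        time.sleep(0.1)
    print('worker exiting cleanly')

if __name__ == '__main__':
    stop_event = multiprocessing.Event()
    workers = [multiprocessing.Process(target=worker, args=(stop_event,))
               for _ in range(3)]
    for p in workers:
        p.start()
    time.sleep(1)
    stop_event.set()        # ask all workers to finish
    for p in workers:
        p.join(timeout=2)
        if p.is_alive():    # fall back to a hard stop if a worker hangs
            p.terminate()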
Then there is this:
while True:
    data = conn.recv(1024)
    if not data:
        conn.close()
        sys.exit()
        break
    elif data != '':
        conn.sendall(data)
        break
This while loop will never loop (unless recv magically returns something other than a string). The first execution path ends with sys.exit() (taking the whole server down; the break is unreachable), the second ends with break, so the loop is useless.
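A sketch of what the inner loop probably wants to do instead: echo until the client hangs up, then close only this connection so the outer accept loop can continue (echo_until_closed is a made-up name; conn is the accepted connection from the question's server.py):

def echo_until_closed(conn):
    # echo everything back until the client closes its end,
    # then close this connection only, instead of exiting the whole server
    while True:
        data = conn.recv(1024)
        if not data:
            break
        conn.sendall(data)
    conn.close()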
A few lines below you have the exact opposite:
data = s.recv(1024)
while 1:
    if data != '':
        print 'Received', repr(data)
        break
If data was '' in the first line, this will be an endless loop, as data's value won't change anymore (and if it wasn't, the loop body runs exactly once, so the loop is pointless either way).
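A sketch of a receive loop that actually waits for data: call recv() inside the loop so data can change, and stop when the server closes the connection (recv_loop is a made-up name; s is the connected client socket from the question's client.py; Python 2 print syntax to match):

def recv_loop(s):
    # recv() blocks until data arrives or the peer closes the connection
    while 1:
        data = s.recv(1024)
        if not data:
            break  # server closed the connection
        print 'Received', repr(data)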
Generally you don't really need multiprocessing for most of this. Starting a server in a different process may be OK if it has to do a lot of work, but spawning a subprocess just to send some data is overkill. Sending and receiving over sockets is I/O bound, so using threading here would be more reasonable.
You have two classes (Server and MyClass) which have only two methods, one of which is __init__, and the other is only used as the target for a subprocess:
myserver = server.Server()
try:
    p = multiprocessing.Process(target=myserver.run)
and:
msg = client.MyClass()
p = multiprocessing.Process(target=msg.run,args=([text]))
That's a sign that these shouldn't be classes but functions.
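A sketch of what that refactor could look like, combined with the earlier point about threads: the server and the send logic become plain functions, and the send runs in a threading.Thread instead of a new process. The names run_server and send_text are made up for the illustration, and the code is Python 2 to match the question:

import socket
import threading
import time

def run_server(host='localhost', port=50006):
    # plain function replacing Server.run: echo one connection at a time
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((host, port))
    s.listen(5)
    while True:
        conn, addr = s.accept()
        data = conn.recv(1024)
        if data:
            conn.sendall(data)
        conn.close()

def send_text(text, host='localhost', port=50006):
    # plain function replacing MyClass.run
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, port))
    s.sendall(text)
    print 'Received', repr(s.recv(1024))
    s.close()

if __name__ == '__main__':
    server_thread = threading.Thread(target=run_server)
    server_thread.daemon = True   # don't keep the script alive after the send finishes
    server_thread.start()
    time.sleep(0.5)               # give the server a moment to bind and listen
    send_thread = threading.Thread(target=send_text, args=('hello',))
    send_thread.start()
    send_thread.join()

In the GUI handler, sendmessage would then just start a short-lived thread with the text from the entry widget instead of spawning a process.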

Can socket objects be shared with Python's multiprocessing? socket.close() does not seem to be working

I'm writing a server which uses multiprocessing.Process for each client. socket.accept() is being called in a parent process and the connection object is given as an argument to the Process.
The problem is that when calling socket.close() the socket does not seem to be closing. The client's recv() should return immediately after close() has been called on the server. This is the case when using threading.Thread or when handling the requests in the main thread; however, when using multiprocessing, the client's recv() seems to hang forever.
Some sources indicate that socket objects should be shared as handles with multiprocessing.Pipe and multiprocessing.reduction, but it does not seem to make a difference.
EDIT: I am using Python 2.7.4 on 64-bit Linux.
Below is a sample implementation demonstrating the issue.
server.py
import socket
from multiprocessing import Process
#from threading import Thread as Process

s = socket.socket()
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(('', 5001))
s.listen(5)

def process(s):
    print "accepted"
    s.close()
    print "closed"

while True:
    print "accepting"
    c, _ = s.accept()
    p = Process(target=process, args=(c,))
    p.start()
    print "started process"
client.py
import socket

s = socket.socket()
s.connect(('', 5001))
print "connected"
buf = s.recv(1024)
print "buf: '" + buf + "'"
s.close()
The problem is that the socket is not closed in the parent process. Therefore it remains open, and causes the symptom you are observing.
Immediately after forking off the child process to handle the connection, you should close the parent process' copy of the socket, like so:
while True:
    print "accepting"
    c, _ = s.accept()
    p = Process(target=process, args=(c,))
    p.start()
    print "started process"
    c.close()

Make a multiprocess UDP server with Python, one process for listening to one port

I want to make a multiprocess UDP server with Python, with each process listening on one port, using a class:
processListener.py:
import multiprocessing
import socket

class processListener(multiprocessing.Process):
    def __init__(self):
        multiprocessing.Process.__init__(self)
        self.data = None

    def run(self):
        self.startServer()
        return

    def startServer(self):
        udpSocket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        address = ('', self.port)
        udpSocket.bind(address)
        while 1:
            data, client = udpSocket.recvfrom(1024)
            print self.data, '>>>', data.strip()
            self.data = data.strip()
            udpSocket.sendto('ACK', client)
        return
and my main file is server.py:
from processListener import *

# Variable Definition
port = 4000

# Server Initialization
if __name__ == '__main__':
    process = processListener()
    process.port = port
    process.start()
    while True:
        command = raw_input()
        if command == 'showdata':
            print 'Last Data is:', process.data
When the server is running and I send data to localhost:4000 over UDP:
shell$
None >>> Test Data
But the problem starts when I use the command showdata
shell$
None >>> Test Data
showdata
Last Data is: None
Multiple processes do not share state by default.
You are accessing the processListener instance from the server.py (parent) process. The child process started by process.start() has its own copy of that instance, and it is that copy whose data attribute becomes non-None.
To demonstrate this, replace multiprocessing.Process with threading.Thread. Threads share objects by default, so you should then see non-None data.
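If you want to keep using processes, one option is to give the child a shared object it can write into, such as a multiprocessing.Manager dict or a Queue. A small sketch (Python 2 print syntax to match the question; it only shows the sharing mechanism, not the full UDP server):

import multiprocessing

class processListener(multiprocessing.Process):
    def __init__(self, shared):
        multiprocessing.Process.__init__(self)
        self.shared = shared  # Manager dict visible to both parent and child

    def run(self):
        # stand-in for the recvfrom() loop: record the last received datum
        self.shared['last_data'] = 'Test Data'

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    shared = manager.dict()
    p = processListener(shared)
    p.start()
    p.join()
    print 'Last Data is:', shared.get('last_data')  # prints the value set in the child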
