Building an error checking function in a Python FTP script

I'm working on a script that connects several "client" computers to a "server" computer, which then uses those clients to process several files, using FTP (ftplib and pyftpdlib) to transfer files and results.
The script works by creating three folders on the server: Files, Processing and Results. The clients connect to the server over FTP, access the "Files" folder, get a file to process, and move it to the "Processing" folder while working on it. When processing finishes, the client deletes the file from the "Processing" folder and copies the results to the "Results" folder.
This is working correctly on both the server and the client side. The problem I'm having is that, if one of the clients disconnects midway without generating an error (the PC gets unplugged from the network, a power outage), the server treats this as if the client were still processing the file, and the file stays in the "Processing" folder. What I want is an error-checking mechanism so that, when this happens, the file in the "Processing" folder is returned to the "Files" folder.
Here is the server FTP code:
import os

from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer

# `port` is defined elsewhere in the full script (the FTP listening port)

def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('client', 'password', '.', perm='elradfmwM')
    authorizer.add_anonymous(os.getcwd())
    handler = FTPHandler
    handler.authorizer = authorizer
    handler.banner = "FTP Server."
    address = ('', port)
    server = FTPServer(address, handler)
    server.max_cons = 256
    server.max_cons_per_ip = 50
    server.serve_forever()

if __name__ == '__main__':
    main()
And here is the Client FTP code:
import ftplib
import json
import os
import subprocess

# `arguments` comes from the script's own command-line parsing (not shown here)

while True:
    ftp = ftplib.FTP()
    ftp.connect(arguments.host_ip, arguments.host_port)
    ftp.login("client", "password")
    print ftp.getwelcome()
    ftp.retrlines('LIST')
    ftp.retrbinary('RETR Output.txt', open('Output.txt', 'wb').write)
    ftp.retrbinary('RETR dicionario.json', open('dicionario.json', 'wb').write)
    with open('dicionario.json') as json_file:
        json_data = json.load(json_file)
    receptor_file = json_data['--receptor']
    print 'Retrieving receptor file ' + receptor_file
    ftp.retrbinary('RETR ' + receptor_file, open(receptor_file, 'wb').write)
    ftp.cwd('Files')
    ftp.retrlines('LIST')
    filename = ftp.nlst()[0]
    print 'Getting ' + filename
    ftp.retrbinary('RETR ' + filename, open(filename, 'wb').write)
    with open("Output.txt", "a") as input_file:
        input_file.write('ligand = %s' % filename)
    ftp.delete(filename)
    ftp.cwd('../Processing')
    ftp.storbinary('STOR ' + filename, open(filename, 'rb'))
    ftp.quit()
    print "Processing"
    # `processing_command` stands in for the external program invoked in the original script
    return_code = subprocess.call(processing_command)
    if return_code == 0:
        print "Done!"
        ftp.connect(arguments.host_ip, arguments.host_port)
        ftp.login("client", "password")
        ftp.cwd('Results')
        ftp.storbinary('STOR ' + os.path.splitext(filename)[0] + '_out.pdbqt',
                       open(os.path.splitext(filename)[0] + '_out.pdbqt', 'rb'))
        ftp.cwd('../Processing')
        ftp.delete(filename)
        ftp.quit()
    else:
        print "Something is technically wrong..."
        ftp.connect(arguments.host_ip, arguments.host_port)
        ftp.login("client", "password")
        ftp.cwd('Files')
        ftp.storbinary('STOR ' + filename, open(filename, 'rb'))
        ftp.cwd('../Processing')
        ftp.delete(filename)
        ftp.quit()
Thanks for the help!

So, after half a month of fiddling with this code, I finally made it work when a client drops the connection.
First I had to give the server a way to identify each client. Instead of making them all log in with a single user, I created a specific user for each connection, using two functions:
import random
import string

def handler_generation(size=9, chars=string.ascii_uppercase + string.digits):
    return ''.join(random.choice(chars) for i in range(size))
This generates a 9-character login and password.
Then I created a custom handler in pyftpdlib and used the on_login callback:
class MyHandler(FTPHandler):
    def on_login(self, username):
        if username == "client":
            user_login = handler_generation()
            user_password = handler_generation()
            global authorizer
            authorizer.add_user(user_login, user_password, '.', perm='elradfmwM')
            credentials = open("Credentials.txt", 'w')
            credentials.write(user_login)
            credentials.write("\n")
            credentials.write(user_password)
            credentials.close()
        else:
            pass
So, when a client connects with the "client" login, the server generates a 9-character login and password and sends them to the client in the "Credentials.txt" file. On the client side, it does this:
ftp.login("client", "password")
ftp.retrbinary('RETR Credentials.txt', open('Credentials.txt', 'wb').write)
ftp.quit()
with open('Credentials.txt') as credential_file:
    lines = credential_file.readlines()
    credential_login = lines[0].split("\n")[0]
    credential_password = lines[1].split("\n")[0]
ftp.connect(arguments.host_ip, arguments.host_port)
ftp.login(credential_login, credential_password)
So now every client connects with its own specific login. On the client side, I made it so that for each completed task the client uploads a file named after its specific login. I also made the client prepend its login to the name of the file it is processing, to make it easy for the server to find that file:
ftp.rename(filename, credential_login + filename)
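The "task completed" marker file mentioned above is not shown in the original post; here is a minimal sketch of how the client could upload it, assuming it is simply an empty file named exactly after the client's generated login (which is what the on_disconnect code below checks with os.path.isfile):
import io

# Hypothetical sketch: upload an empty marker file whose name is this client's
# generated login, signalling to the server that the task finished cleanly.
ftp.storbinary('STOR ' + credential_login, io.BytesIO(b''))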
Then I used another callback of the handler class, on_disconnect:
def on_disconnect(self):
    if self.username == "client":
        pass
    else:
        if os.path.isfile(self.username):
            pass
        else:
            for fname in os.listdir("Processing"):
                if fname.startswith(self.username):
                    shutil.move("Processing/" + fname, "Files")
                    os.rename("Files/" + fname, "Files/" + fname[9:])
    print self.remote_ip, self.remote_port, self.username, "disconnected"
Now, whenever a client disconnects, the server checks whether that client uploaded its marker file. If the file is not there, the server moves the client's file from the "Processing" folder back to the "Files" folder, which is the folder holding the files that are yet to be processed.
To make a failed client disconnect from the server even though it never sends a QUIT command, I used pyftpdlib's connection timeout. To make sure an active client does not accidentally time out, I implemented a thread on the client that does something with the server every N seconds:
from threading import Timer

class perpetualTimer():
    def __init__(self, t, hFunction):
        self.t = t
        self.hFunction = hFunction
        self.thread = Timer(self.t, self.handle_function)

    def handle_function(self):
        self.hFunction()
        self.thread = Timer(self.t, self.handle_function)
        self.thread.start()

    def start(self):
        self.thread.start()

    def cancel(self):
        self.thread.cancel()

def NotIdle():
    pass  # do something with the server here (placeholder in the original answer)

t = perpetualTimer(10, NotIdle)
t.start()
(I copied this particular timer class straight from another answer here.)
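For reference, the connection timeout mentioned above is just an attribute of the pyftpdlib handler; a minimal server-side sketch (the 30-second value is only an example and should be larger than the keep-alive interval used by perpetualTimer):
# Sketch: make pyftpdlib drop connections that stay idle too long,
# so a dead client eventually triggers on_disconnect.
handler = MyHandler
handler.timeout = 30  # seconds; example value only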
And voilà. Now both the server and the client work and have their own error-checking logic.
I'm putting this answer here in case someone runs into a similar problem.
Thanks!

Related

Windows Task Scheduler Running Executable without reading File

I'm trying to have Task Scheduler run an executable on Windows startup. The executable is a simple Python script that reads a list of IPs from a .txt file, pings them, and repeats after a set interval, like a basic heartbeat.
The executable was created successfully with PyInstaller and runs perfectly when launched manually.
However, when I have Task Scheduler run the same executable in the same directory, it does so without reading the .txt file and immediately closes.
The following is the code:
import os
import time
import smtplib
from email.message import EmailMessage  # allows for the email system to work
from datetime import datetime           # allows the text files to have the date & time
from win10toast import ToastNotifier    # allows for desktop notifications to appear on Windows devices
import schedule                         # automatically schedules when the script executes

# Scans the IPs in the txt file
def notif():
    with open(r'sydvlan.txt') as file:
        dump = file.read()  # reads the lines of the sydvlan.txt file
        dump = dump.splitlines()
        # creates a new log file and makes the title the current date and time
        cdString = datetime.now().strftime("%d_%m_%Y %H_%M")
        report = open(r'HeartbeatResults_{0}.txt'.format(cdString), 'w')  # creates a log with the date & time
        for line in dump:
            lineList = line.split(":")
            lineText = lineList[0]  # makes sure that only the IP is being read from sydvlan.txt
            IP = lineList[1].strip()
            print("Currently Pinging {} on {}".format(lineText, IP))
            print("------------------" * 3)
            # Get Date and Time at time of ping.
            currentDate = datetime.now()
            cdString = currentDate.strftime("%d_%m_%Y %H:%M:%S")
            # pings the IPs from the txt
            response = os.popen(f"ping {IP} -n 4").read()  # pings the device
            print("------------------" * 3)
            # If the os.popen() returns 0, it means the operation completed without any errors,
            # so if it returns 0 it is successful.
            if "Received >= 1" and "Approximate" in response:
                report.write("UP {0} Successful {1}".format(lineText, cdString) + "\n")
            else:
                report.write("DOWN {0} UnSuccessful {1}".format(lineText, cdString) + "\n")
            if "Received = 0" or "unreachable" in response:  # Sends an email to IT staff if the ping fails
                # composes the email message
                # email_alert("Issue with {0}".format(lineText, cdString), "The Hearbeat Notification System Works :)")
                toaster = ToastNotifier()
                toaster.show_toast("Issue with {0}: {1} on {2}".format(lineText, IP, cdString),
                                   "Please Fix Now", duration=10, icon_path='Warning.ico')
                time.sleep(1)
        report.write("Hearbeat Protocol Complete" + "\n")
    file.close()

# email notification setup
# def email_alert(subject, body):
#     mailListFile = open(r'XXXXX.txt')
#     emailList = (mailListFile.read()).splitlines()
#     msg = EmailMessage()
#     msg.set_content(body)
#     msg['subject'] = subject
#     msg['to'] = ', '.join(emailList)
#     user = "XXXXX"
#     msg['from'] = user
#     password = "XXXXXX"
#     server = smtplib.SMTP("smtp.gmail.com", 587)
#     server.starttls()
#     server.login(user, password)
#     server.send_message(msg)
#     server.quit()

# allows for the entire script to run every 300 seconds (or whatever is specified)
if __name__ == '__main__':
    notif()
    while True:
        schedule.run_pending()
        time.sleep(1)
    schedule.every(5).minutes.do(notif)
The only argument I used when creating the executable was --onefile
Thank you for your time.
Your filename r'sydvlan.txt' is a relative path, so its location depends on the current working directory when the program is invoked.
Either use absolute paths, e.g. r'C:\path\to\file\sydvlan.txt' (do this for r'HeartbeatResults_{0}.txt' too), or set the "Start in (optional)" field of the Windows Task Scheduler task to the folder where your txt file is located.
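If you prefer to keep the file names relative, another option is to resolve them against the executable's own folder; a rough sketch (for a PyInstaller --onefile build, sys.frozen is set and sys.executable points at the .exe itself):
import os
import sys

# Sketch: resolve data files relative to the frozen .exe (or to the .py file when
# running unfrozen), so the working directory chosen by Task Scheduler no longer matters.
if getattr(sys, 'frozen', False):  # set by PyInstaller at runtime
    base_dir = os.path.dirname(sys.executable)
else:
    base_dir = os.path.dirname(os.path.abspath(__file__))

vlan_path = os.path.join(base_dir, 'sydvlan.txt')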

Receive files and save them on socket Python

My server sends a series of files with names like file_1, file_2, and so on.
The sending socket works well and I've checked that those files are all correct.
Server - sending files
f = open(new_filename, 'rb')
start_ts = ts
seconds += 1
try:
    print('Sending %s' % new_filename)
    conn.sendall(f.read(99999999))
    f.flush()
    f.close()
except socket.error:
    if errno == errno.ECONNREFUSED:
        print(os.strerror(socket.error.errno))
    else:
        raise
    print('Send failed')
    sys.exit()
Here, on the client side, I want to receive those files and save them the same way they were named on the server (save_1, save_2, ...) on my computer.
Client- receive files and try to save them
try:
    client.send(b'Receiving Data...\n')
    while True:
        save_filename = 'savefolder/save_%i.pcap' % file_index
        f = open(save_filename, 'wb')
        data = client.recv(99999999)
        f.write(data)
        reply = b'Message Received.\n'
        if not data:
            break
        client.sendall(reply)
        f.close()
        file_index += 1
except socket.timeout:
    print('Done receiving.', end=' ')
    client.close()
    print('Client socket is closed')
But the saved files look weird. When the server sends one 3,755 KB file, the client writes two files of 192 KB and 3,563 KB, and the bigger one is broken. Is there a fix for this? I don't know why it happens with my code.
Is passing the huge value 99999999 to recv the only way to get the entire file without it being cut?
And why is my client not sending the "Message Received" reply to the server, even though I wrote code for it?
TCP is not a message-based protocol but a byte-stream protocol. There is no fixed relation between how much data was sent with send or sendall and how much a single recv returns.
In order to send multiple messages (files) over the same connection you have to define an application protocol which clearly marks where each message starts and ends. An alternative would be to use a new TCP connection for each file and read until recv returns '', i.e. until it indicates that the other side has closed the connection.
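For illustration, a minimal sketch of such an application protocol over a single connection: every file is sent as an 8-byte big-endian length followed by the payload, and the receiver reads exactly that many bytes. The helper names below are made up for the example.
import struct

def send_file(sock, path):
    # Send "<8-byte length><payload>" so the receiver knows exactly where
    # this file ends and the next one begins.
    with open(path, 'rb') as f:
        payload = f.read()
    sock.sendall(struct.pack('>Q', len(payload)) + payload)

def recv_exact(sock, n):
    # recv() may return fewer bytes than requested, so loop until we have n bytes.
    chunks = []
    while n > 0:
        chunk = sock.recv(min(n, 65536))
        if not chunk:
            raise ConnectionError('peer closed the connection mid-message')
        chunks.append(chunk)
        n -= len(chunk)
    return b''.join(chunks)

def recv_file(sock, path):
    (length,) = struct.unpack('>Q', recv_exact(sock, 8))
    with open(path, 'wb') as f:
        f.write(recv_exact(sock, length))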

http proxy server only working for https sites

I am trying to use this code to create an HTTP proxy cache server. When I run the code it starts up, binds to a port (for example 55555) and listens fine, but when I try to connect from the browser it only partly works: typing localhost:52523/www.google.com works fine, but other sites, especially plain HTTP ones such as localhost:52523/www.microcenter.com or just localhost:52523/google.com, only display "localhost didn't send any data" (ERR_EMPTY_RESPONSE) and an exception shows up in the console, although the cache file is still created on my computer.
I would like to find out how to edit the code so that I can access any website through the proxy just as I would normally do in the browser. It should be able to work with www.microcenter.com.
import socket
import sys
import urllib
from urlparse import urlparse

Serv_Sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # socket.socket function creates a socket.
port = Serv_Sock.getsockname()[1]

# Server socket created, bound and starting to listen
Serv_Sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # socket.socket function creates a socket.
Serv_Sock.bind(('', port))
Serv_Sock.listen(5)
port = Serv_Sock.getsockname()[1]

# Prepare a server socket
print ("starting server on port %s...," % (port))

def caching_object(splitMessage, Cli_Sock):
    # this method is responsible for caching
    Req_Type = splitMessage[0]
    Req_path = splitMessage[1]
    Req_path = Req_path[1:]
    print "Request is ", Req_Type, " to URL : ", Req_path
    # Searching available cache if file exists
    url = urlparse(Req_path)
    file_to_use = "/" + Req_path
    print file_to_use
    try:
        file = open(file_to_use[5:], "r")
        data = file.readlines()
        print "File Present in Cache\n"
        # Proxy Server Will Send A Response Message
        #Cli_Sock.send("HTTP/1.0 200 OK\r\n")
        #Cli_Sock.send("Content-Type:text/html")
        #Cli_Sock.send("\r\n")
        # Proxy Server Will Send Data
        for i in range(0, len(data)):
            print (data[i])
            Cli_Sock.send(data[i])
        print "Reading file from cache\n"
    except IOError:
        print "File Doesn't Exists In Cache\n fetching file from server \n creating cache"
        serv_proxy = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        host_name = Req_path
        print "HOST NAME:", host_name
        try:
            serv_proxy.connect((url.host_name, 80))
            print 'Socket connected to port 80 of the host'
            fileobj = serv_proxy.makefile('r', 0)
            fileobj.write("GET " + "http://" + Req_path + " HTTP/1.0\n\n")
            # Read the response into buffer
            buffer = fileobj.readlines()
            # Create a new file in the cache for the requested file.
            # Also send the response in the buffer to client socket
            # and the corresponding file in the cache
            tmpFile = open(file_to_use, "wb")
            for data in buffer:
                tmpFile.write(data)
                tcpCliSock.send(data)
        except:
            print 'Illegal Request'
    Cli_Sock.close()

while True:
    # Start receiving data from the client
    print 'Initiating server... \n Accepting connection\n'
    Cli_Sock, addr = Serv_Sock.accept()  # Accept a connection from client
    #print addr
    print ' connection received from: ', addr
    message = Cli_Sock.recv(1024)  # Receives data from socket
    splitMessage = message.split()
    if len(splitMessage) <= 1:
        continue
    caching_object(splitMessage, Cli_Sock)
Your errors are not related to the URI scheme (http or https) but to how you use files and sockets.
When you try to open a file with:
file = open(file_to_use[1:], "r")
you are passing an illegal file path (http://ebay.com/ in your example).
Since you are working with URIs, you could use a parser like urlparse, so you can handle the scheme, hostname, etc. more easily.
For example:
url = urlparse(Req_path)
file_to_use = url.hostname
file = open(file_to_use, "r")
and use only the hostname as a file name.
Another problem is the use of sockets. The connect function should receive a hostname, not a hostname with the scheme attached, which is what you are doing. Again, with the help of the parser:
serv_proxy.connect((url.hostname, 80))
Besides that, you do not call listen on a client socket (see the examples in the docs), so you can remove that line.
Finally, again to create the new file, use the hostname:
tmpFile = open(file_to_use, "wb")
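For context, here is what Python 2's urlparse returns for a full URL; note that hostname is only populated when the string actually starts with a scheme such as http://:
>>> from urlparse import urlparse
>>> url = urlparse('http://www.microcenter.com/some/page')
>>> url.scheme
'http'
>>> url.hostname
'www.microcenter.com'
>>> url.path
'/some/page'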

Cache Proxy Server Returning 404 with www.google.com

I have a homework assignment which involves implementing a proxy cache server in Python for web pages. Here is my implementation of it
from socket import *
import sys

def main():
    # Create a server socket, bind it to a port and start listening
    tcpSerSock = socket(AF_INET, SOCK_STREAM)  # Initializing socket
    tcpSerSock.bind(("", 8030))  # Binding socket to port
    tcpSerSock.listen(5)  # Listening for page requests
    while True:
        # Start receiving data from the client
        print 'Ready to serve...'
        tcpCliSock, addr = tcpSerSock.accept()
        print 'Received a connection from:', addr
        message = tcpCliSock.recv(1024)
        print message
        # Extract the filename from the given message
        filename = ""
        try:
            filename = message.split()[1].partition("/")[2].replace("/", "")
        except:
            continue
        fileExist = False
        try:  # Check whether the file exists in the cache
            f = open(filename, "r")
            outputdata = f.readlines()
            fileExist = True
            # ProxyServer finds a cache hit and generates a response message
            tcpCliSock.send("HTTP/1.0 200 OK\r\n")
            tcpCliSock.send("Content-Type:text/html\r\n")
            for data in outputdata:
                tcpCliSock.send(data)
            print 'Read from cache'
        except IOError:  # Error handling for file not found in cache
            if fileExist == False:
                c = socket(AF_INET, SOCK_STREAM)  # Create a socket on the proxyserver
                try:
                    srv = getaddrinfo(filename, 80)
                    c.connect((filename, 80))  # https://docs.python.org/2/library/socket.html
                    # Create a temporary file on this socket and ask port 80 for
                    # the file requested by the client
                    fileobj = c.makefile('r', 0)
                    fileobj.write("GET " + "http://" + filename + " HTTP/1.0\r\n")
                    # Read the response into buffer
                    buffr = fileobj.readlines()
                    # Create a new file in the cache for the requested file.
                    # Also send the response in the buffer to client socket and the
                    # corresponding file in the cache
                    tmpFile = open(filename, "wb")
                    for data in buffr:
                        tmpFile.write(data)
                        tcpCliSock.send(data)
                except:
                    print "Illegal request"
            else:  # File not found
                print "404: File Not Found"
        tcpCliSock.close()  # Close the client and the server sockets

main()
I configured my browsers to use my proxy server accordingly (the screenshot of the proxy settings is omitted here).
But my problem when I run it is that, no matter what web page I try to access, it returns a 404 error on the initial connection and then a connection reset error on subsequent connections. I have no idea why, so any help would be greatly appreciated, thanks!
There are quite a number of issues with your code.
Your URL parser is quite cumbersome. Instead of the line
filename = message.split()[1].partition("/")[2].replace("/", "")
I would use
import re
parsed_url = re.match(r'GET\s+http://(([^/]+)(.*))\sHTTP/1.*$', message)
local_path = parsed_url.group(3)
host_name = parsed_url.group(2)
filename = parsed_url.group(1)
If you catch an exception there, you should probably throw an error because it is a request your proxy doesn't understand (e.g. a POST).
When you assemble your request to the destination server, you then use
fileobj.write("GET {object} HTTP/1.0\n".format(object=local_path))
fileobj.write("Host: {host}\n\n".format(host=host_name))
You should also include some of the header lines from the original request because they can make a major difference to the returned content.
Furthermore, you currently cache the entire response with all header lines, so you should not add your own when serving from cache.
What you have doesn't work, anyway, because there is no guarantee that you will get a 200 and text/html content. You should check the response code and only cache if you did indeed get a 200.
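A rough sketch of that check, reusing the buffr list from the code above: look at the status line before deciding whether to create the cache file, and relay the response to the client either way.
# Sketch: only write the response to the cache when the origin server answered
# with a 200 status; everything is still forwarded to the client unchanged.
status_line = buffr[0] if buffr else ''
cacheable = status_line.startswith('HTTP/') and ' 200 ' in status_line
tmpFile = open(filename, "wb") if cacheable else None
for data in buffr:
    if tmpFile:
        tmpFile.write(data)
    tcpCliSock.send(data)
if tmpFile:
    tmpFile.close()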

Transfer A Text File Via Socket In Python

I have a project in Python that involves two machines (A, B):
1) Machine A sends a request to B to list a directory (in my code I set it to the current directory).
2) In a second request, machine A downloads the text file it chose from that directory (a text file that was put in machine B's directory).
3) After that, machine A changes the text file and sends it back to machine B.
4) At the end, machine A sends two numbers and machine B sends back the result.
It works up to step 2, but after that nothing happens; it behaves like an endless while True loop and I can't understand why!
Here is my Code
Machine A (Client):
# -*- coding: UTF-8 -*-
import os
import socket

PORT = 9000
HOST = 'localhost'
socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
socket.connect((HOST, PORT))

store = []
direc = raw_input("Enter The Directory to List : ")
socket.sendall(direc)
len_data = socket.recv(2048)
print int(len_data)
for i in range(int(len_data)):
    data = socket.recv(2048)
    store.append(data)
print("List of Files:")
for i in store:
    print(i)

txt_file = raw_input("Please Choose a TEXT file :")
if store.count(txt_file) is 0:
    print("There no such a TXT file!!")
else:
    socket.sendall(txt_file)

def write_file(name):
    fname = open(name, 'ab')
    while True:
        string = socket.recv(2048)
        if string:
            fname.write(string)
        else:
            fname.write("changed")
            fname.close()
            break

def read_file(name):
    fileToSend = open(name, 'rb')
    while True:
        data = fileToSend.readline()
        if data:
            socket.send(data)
        else:
            fileToSend.close()
            break

write_file(txt_file)
read_file(txt_file)

x = raw_input("Enter The First Num: ")
socket.send(x)
y = raw_input("Enter The Second Num: ")
socket.send(y)
result = socket.recv(1024)
print result
raw_input()
socket.sendall('')
socket.close()
exit()
and the Machine B (Server):
import os, sys, socket

PORT = 9000
HOST = 'localhost'
tcpsocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_address = (HOST, PORT)
print >>sys.stderr, 'starting up on %s port %s' % server_address
socket.bind((HOST, PORT))
socket.listen(1)
conn, addr = socket.accept()

directory = conn.recv(2048)
if os.listdir(os.curdir):
    data = os.listdir(os.curdir)
    len_data = data.__len__()
    print(len_data)
    if len_data:
        conn.send(str(len_data))
        for i in data:
            if i:
                print >>sys.stderr, 'sending data back to the client'
                conn.send(i)
            else:
                break

txt_file_name = conn.recv(2048)

def write_file(name):
    with open(name, 'wb') as fname:
        while True:
            string = conn.recv(2048)
            if string:
                fname.write(string)
            else:
                fname.close()
                break

def read_file(name):
    with open(name, 'rb') as fileToSend:
        while True:
            data = fileToSend.readline()
            if data:
                conn.send(data)
            else:
                fileToSend.close()
                break

def add(x, y):
    return str(x + y)

read_file(txt_file_name)
write_file(txt_file_name)

x = conn.recv(1024)
y = conn.recv(1024)
conn.send(add(x, y))
conn.sendall('')
conn.close()
exit()
I am fascinated by your problem and looked into it. While it can be solved with raw sockets, I lean toward the HTTP protocol for several reasons:
You don't have to make up your own "handshake". The HTTP protocol has provisions for requesting a file, uploading a file, and doing some processing (your step #4).
You can test your server with a web browser.
Web services are very popular now. This is a baby step toward learning about web services.
Here is the server code (server.py):
from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler
import os

class MyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        global running
        if self.path == '/':
            self.list_files()
        elif self.path.startswith('/calculation'):
            self.send_calculation()
        elif self.path.startswith('/quit'):
            self.send_response(200)
            running = False
        else:
            self.send_file(self.path[1:])

    def do_POST(self):
        filename = self.path[1:]  # Remove the / from the path
        filesize = int(self.headers['Content-Length'])
        contents = self.rfile.read(filesize)
        with open(filename, 'w') as f:
            f.write(contents.decode())
        self.send_response(200)

    def send_file(self, filename):
        # Check to see if file exists and is a file, not directory
        if os.path.isfile(filename):
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            # Read and send the contents of the file
            with open(filename) as f:
                contents = f.read()
            self.wfile.write(contents)
        else:
            self.send_response(404)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            self.wfile.write('Dude! File not found')

    def send_calculation(self):
        empty, operation, number1, number2 = self.path.split('/')
        result = int(number1) + int(number2)
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.end_headers()
        self.wfile.write(result)

    def list_files(self):
        file_list = os.listdir(os.curdir)
        if file_list:
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            for filename in file_list:
                self.wfile.write('{}\n'.format(filename))

#
# Main
#
running = True
server = HTTPServer(('', 9000), MyHandler)
print 'Server started on host:{}, port:{}'.format(*server.server_address)
while running:
    server.handle_request()
And here is the client code (client.py):
import urllib2
import urlparse

def make_url(server, port, path, scheme='http'):
    netloc = '{}:{}'.format(server, port)
    url = urlparse.urlunsplit((scheme, netloc, path, '', ''))
    return url

#
# Main
#
server = '10.0.0.5'
port = 9000

# 1 - Request directory listing
url = make_url(server, port, '/')
file_list = urllib2.urlopen(url).read()
print 'Files from server:'
for filename in file_list.splitlines():
    print '- {}'.format(filename)

# 2 - Request contents of a file
filename = raw_input('Type a file name: ')
url = make_url(server, port, filename)
contents = urllib2.urlopen(url).read()
print 'Contents:'
print contents

# 3 - Upload a file to the server
contents = 'hello, world.\nThe End'
filename = 'foo.txt'
url = make_url(server, port, filename)
f = urllib2.urlopen(url, data=contents)

# 4 - Do some calculation
n1 = 19
n2 = 5
path = '/calculation/{}/{}'.format(n1, n2)
url = make_url(server, port, path)
result = int(urllib2.urlopen(url).read())
print '{} + {} = {}'.format(n1, n2, result)

# Send quit signal
url = make_url(server, port, '/quit')
urllib2.urlopen(url).read()
Web Service
The server is really a web service, which provides the following services:
Get a directory listing
GET http://server:port/
This service will return a list of files in the current directory.
Get contents of a file
GET http://server:port/filename
Returns the contents of a file in plain text format.
Upload a file
POST http://server:port/filename
Copy a file from the client to the server. If the file already exists on the server, overwrite it.
Do some calculation
GET http://server:port/calculation/x/y
Returns x + y
Shut down the server
GET http://server:port/quit
Tells the server to quit.
Error Handling
For the sake of brevity and clarity, I did not add any error handling to the code. Here are a few error conditions I can think of (a sketch of one of them follows the list):
Retrieve a non-existing file, or a directory (server)
Upload failed because of the lack of file write permission (server)
In the calculation service, the parameters are not numbers (server)
The server has not started, wrong port, wrong server (client)
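As an illustration of the third item, here is a minimal sketch of guarding the calculation service against non-numeric parameters; only the try/except around int() is new, the rest mirrors send_calculation above (the method is meant to sit inside MyHandler):
    def send_calculation(self):
        empty, operation, number1, number2 = self.path.split('/')
        try:
            result = int(number1) + int(number2)
        except ValueError:
            # Parameters are not numbers: report a client error instead of crashing.
            self.send_response(400)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            self.wfile.write('calculation parameters must be integers')
            return
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.end_headers()
        self.wfile.write(result)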
Other Discussions
In general, GET means data flows from the server to the client, and POST means data flows in the opposite direction.
To test the GET actions of the server, you can use your browser. For example, to retrieve the directory contents from 192.168.1.5, port 9000, point your web browser to:
http://192.168.1.5:9000/
Testing POST is trickier; see the upload section of the client code for an idea of how to use POST.
In the server code, the do_GET() function handles all the GET requests, and the do_POST() function handles all the POST requests.
