Reading POST data while uploading - python

I'm about to finish a jQuery plugin that provides a progress bar for file uploads using AJAX and a Python CGI script.
My plugin is inspired by this piece of software.
The idea is to read (with the Python CGI script) the raw POST data on the server while the browser is uploading, parse it, write the file to disk in chunks, and at the same time use AJAX to ask the server for the size of the file written so far.
The plugin works on my server and on some other web hosting servers, but on Mochahost (running Apache) it does not work properly. That server calls my Python CGI script only after the whole upload is done, not while the file is uploading (as expected). I verified the moment of the call by adding, at the very beginning of the Python script, a command that writes a test file; that command fires as soon as Python is invoked and does not involve reading sys.stdin.
Even on that server the Python CGI script works (almost) as intended: it reads sys.stdin in chunks, parses it, and writes the file to disk. The problem is timing. The server does not feed Python the input stream in real time; instead, after the entire POST from the browser has been received, it invokes Python and populates sys.stdin with the raw POST data that has already been saved somewhere.
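For reference, the chunked read-and-write loop described above can be sketched like this (a minimal sketch only; the function name, chunk size, and flush-for-polling detail are my assumptions, not the plugin's actual code):

```python
import os
import sys

CHUNK = 64 * 1024  # read the POST body 64 KiB at a time

def stream_upload(dest_path, content_length, stdin=None):
    # Read the raw POST body from stdin as it arrives and append it to
    # disk, so a parallel AJAX request can poll the file size for progress.
    stdin = stdin or sys.stdin.buffer
    remaining = content_length
    with open(dest_path, "wb") as out:
        while remaining > 0:
            chunk = stdin.read(min(CHUNK, remaining))
            if not chunk:  # client disconnected early
                break
            out.write(chunk)
            out.flush()  # make the current size visible to the poller
            remaining -= len(chunk)
    return os.path.getsize(dest_path)
```

On a server that streams CGI input in real time, the file grows while the browser is still uploading; on the problematic host, the same loop runs only after the whole body has been buffered.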
I don't know if it is relevant, but I've noticed that their Apache version is 2.0.x and SERVER_PROTOCOL is HTTP/1.0. On the other web hosting servers SERVER_PROTOCOL is HTTP/1.1.
Can anybody say what could make a server, or specifically an Apache server, call a CGI script only after the upload is finished rather than as soon as the browser makes the request?

Related

Python HTTP Server to download files remotely

I want to make a Python HTTP server that runs on my machine; it will listen for me to send it URLs and then download each URL to the computer.
Example:
PC1 has Downloader.py running with a simple interface, an input and a submit button.
I send "https://mega.nz/file/XXXXXXXX#XXXXXXXXXX_XXXXXXXXXXXXX-XXXXX-XXX" to PC1 from my phone.
Downloader.py receives the mega url and starts downloading the file in the background.
Ideally this would be non-blocking so if I send 2 links one after the other it wouldn't wait for the first to finish before downloading the second.
I already have the download functions, but I'm not sure how to put them together with an HTTP server.
subprocess? threading? multiprocessing?
Thanks.
If you're familiar with Python, this can easily be done with Django and Celery.
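Alternatively, since the question mentions threading, here is a minimal stdlib-only sketch (no Django/Celery): an http.server handler that accepts a URL in the POST body and downloads it in a background thread, so a second submission doesn't wait for the first. The endpoint shape, port, and file-naming scheme are assumptions:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def download(url, dest):
    # Stream the remote file to disk in chunks so large files never
    # have to fit in memory.
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while True:
            chunk = resp.read(64 * 1024)
            if not chunk:
                break
            out.write(chunk)

class SubmitHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The request body is assumed to be a bare URL (hypothetical
        # protocol between the phone and PC1).
        length = int(self.headers.get("Content-Length", 0))
        url = self.rfile.read(length).decode().strip()
        dest = url.rsplit("/", 1)[-1] or "download.bin"
        # Fire-and-forget thread: the HTTP response returns immediately,
        # so a second URL doesn't wait for the first download to finish.
        threading.Thread(target=download, args=(url, dest), daemon=True).start()
        self.send_response(202)
        self.end_headers()
        self.wfile.write(b"queued\n")

def serve(port=8080):
    # Blocks; this is what Downloader.py on PC1 would run.
    ThreadingHTTPServer(("", port), SubmitHandler).serve_forever()
```

Note that plain urllib will not handle Mega's encrypted links; for that you would still need a dedicated client library, but the submit-and-download-in-background structure stays the same.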

How to handle Python files with Apache

So I want to get into React Native development and have decided on Python for my backend, but for some reason I cannot configure Apache correctly. The only way to successfully get the result from the request is to include the path to python.exe at the start of the file, like so:
#!C:\Users\Name\PycharmProjects\AppName\venv\Scripts\python.exe
But the problem is that the file is then executed by the Python console, and if I try to access it from a mobile phone I get this error:
The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there was an error in a CGI script.
So my question is:
Is there any way to configure Apache to execute a file without requiring the Python console, so the request can be handled by devices that don't have Python installed?
If you connect from a mobile device to http://192.168.1.3/HelloWorld.py on a server with CGI, then the code should be executed on the server, not on the mobile device. If CGI doesn't work, the server may send the code as a normal file, and the mobile device may try to run it locally, but that is wrong: a CGI server should run the code on the server.
To start, I would put the code in the subfolder cgi-bin and run it as http://192.168.1.3/cgi-bin/HelloWorld.py, because most CGI servers by default run code only from this subfolder.
On Linux the script would need the shebang
#!/usr/bin/env python
in the first line, and it should be executable:
chmod a+x script.py
CGI also has some rules for generating the data it sends to the client. The response may need extra HTTP information at the start, so using only print("Hello World") may generate invalid output that the server has problems sending. Any tutorial on CGI scripts covers this; see the module cgi.
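For example, a minimal CGI response could look like this (a sketch only; the helper name respond is mine):

```python
#!/usr/bin/env python3
import sys

def respond(body):
    # A CGI script must emit HTTP headers first, then a blank line, then
    # the body; without the blank line Apache reports a malformed header.
    sys.stdout.write("Content-Type: text/html\r\n\r\n")
    sys.stdout.write(body)

if __name__ == "__main__":
    respond("Hello World")
```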
To run Python code, Apache needs the module mod_cgi, mod_fcgid, or mod_python.
mod_cgi and mod_fcgid can run scripts in different languages - Python, Perl, Ruby, etc. - and even Bash, PHP, or C/C++/Java.
Python 3 has the standard module http, which can also be used as a simple server:
python3 -m http.server --cgi
and it will serve all files in the folder in which you run it, and execute scripts from the subfolder cgi-bin/ - see the docs: http

Communicate with long-running Python program

I have a long-running Python program on a server that already listens for messages on one serial port and forwards them out another serial port.
What do I need to do to allow that program to accept data from a web server (that ultimately gets that data from a web browser on a laptop)?
The options I've seen so far are:
Flask. The solution at "Communicating with python program running on server" doesn't seem to work for me, because (I may be doing this wrong) the long-running Python program can't seem to grab port 80, I guess because the web server is already running on port 80 (serving other pages).
Have a CGI script that writes the data to a file, and have the long-running script read the data from that file. I'm a little reluctant to do this on a system where flash wear-out may be a concern.
Somehow (?) convert the long-running script to a FastCGI script that includes everything it used to do, plus new code to accept data from the web server.
Somehow (?) convert the long-running script to a WSGI script that includes everything it used to do, plus new code to accept data from the web server.
Write a brief web script that the web server starts up, which communicates with the long-running script using asynchat / asyncore / sockets / twisted. These seem designed for communication between two different computers, and so feel like overkill for talking between a long-running Python script and a web server (perhaps with a short-lived CGI or FastCGI script between them) running on the same server.
Perhaps some other option?
Is there a standard "pythonic" way for a web server to hand off data to a Python program that is already up and running? (Rather than the much more common case of a web server starting a Python program and handing off data to that freshly started program.)
(Details that I suspect aren't relevant: my server runs Lighttpd on Ubuntu Linux running on a Beaglebone Black).
(Perhaps this question should be moved to https://softwareengineering.stackexchange.com/ ?)
You could set up your Python process to use any other port (e.g. 8091), then configure your web server to forward certain (or all) requests to that port using ProxyPass. Example for Apache:
<VirtualHost yourdomain.for.python.thread>
ServerName localhost
ServerAdmin webmaster@example.com
ProxyRequests Off
ProxyPass / http://127.0.0.1:8091/
</VirtualHost>
I've done this before to quickly get a Django server in development mode to show pages via a web server. If you actually want to serve HTML content, though, this is not the most efficient way to go.
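If the long-running program should itself accept the proxied requests, one option is to run a small HTTP listener in a background thread of that program; the port, the queue-based handoff, and the handler details below are assumptions, not a prescribed design:

```python
import queue
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Messages received over HTTP land in this queue; the existing
# serial-forwarding loop can poll it without blocking.
inbox = queue.Queue()

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        inbox.put(self.rfile.read(length))
        self.send_response(204)  # accepted, nothing to return
        self.end_headers()

def start_http_listener(port=8091):
    # Bind to localhost only; the front-end web server (Lighttpd here)
    # proxies external requests to this port.
    server = ThreadingHTTPServer(("127.0.0.1", port), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

The existing serial loop would then call inbox.get_nowait() (or get() with a short timeout) between serial reads to pick up data posted from the browser.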

Remote website trigger a local action

I have large video files (~100GB) that are local on my machine. I have a non-local website where I enter information about the video file. In addition, I need to get the checksum of the video file (and I do not want to manually trigger the script locally and copy and paste the value). To get a checksum of the video file, I have a script I can run as $ checksum.py <video file>.
How would I trigger the local script through the web interface? In other words, I want to be able to enter the PATH of the video file, click Submit on the web app, and have it trigger the script locally and (after the checksum has finished) insert that value into the web app/database. What would be the best way to do this?
You cannot trigger anything unless your local script is continuously listening for some kind of data feed (like a fixed URL serving an XML/JSON feed of paths) which is, IMHO, over-complicating your system.
You could also use a Java applet run locally instead of a remote website, but you'd have to sign it to be able to read local files, and it might not be what you're looking for.
Think of it: it's all about security. Would you like any web server to trigger scripts in your local machine? I certainly wouldn't.
IMHO the best solution is to trigger the script manually and have it send the data to your web server.
In general browsers run in a sandbox that has very limited access to the OS. In particular you can't run shell scripts from a browser. As I see it you have two options:
Adapt your checksum.py script to send the checksum info directly to your website using urllib2 calls, or pipe the results to a "curl" command. (No browser involved.)
Rewrite checksum.py in JavaScript using the FileReader class. This will probably be convoluted and slow, and it won't work in Internet Explorer.
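A sketch of the first option (the answer mentions urllib2, which is Python 2; this uses the Python 3 equivalent urllib.request, and the endpoint URL and JSON field names are made up for illustration):

```python
import hashlib
import json
import urllib.request

def file_sha256(path, chunk_size=1 << 20):
    # Hash the file in 1 MiB chunks so a ~100 GB video never has to
    # fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def post_checksum(endpoint, path):
    # POST the result to the web app as JSON; the web app inserts it
    # into the database on receipt.
    payload = json.dumps({"path": path, "sha256": file_sha256(path)}).encode()
    req = urllib.request.Request(
        endpoint, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Invoked as, say, post_checksum("https://example.com/api/checksums", "/videos/big.mov") at the end of the existing checksum run, no browser is involved at all.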

Batch execution of SAS using a Telnet connection in Python

I have been interested in finding an alternative to the SAS UI for quite some time now. We license SAS on our server instead of our desktops, so on top of that we have to launch a remote desktop application to execute code.
I was able to use a Telnet connection instead to connect to the server remotely and batch execute SAS programs. Then I became interested in whether a Python script could be made to connect remotely and batch execute code, and whether that script could be run in jEdit as a BeanShell script.
So far, I have Python code which successfully opens and closes the Telnet connection. It can do basic shell functions like calling "dir". However, when I pass the exact same line I use to execute SAS from the command prompt on the remote server through the Telnet connection in Python, nothing happens.
Is it possible the server is preventing me from executing code from a script? I use a "read_until" statement to wait for the prompt before running any code.
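The question doesn't include the code, but the read_until pattern it describes can be sketched over a plain socket, since a Telnet session without option negotiation is just a TCP byte stream (the prompt string and the SAS command line below are assumptions; note also that Python's telnetlib module was deprecated and removed in 3.13):

```python
import socket

def read_until(sock, marker, timeout=10.0):
    # Accumulate bytes until the marker (e.g. the shell prompt) appears,
    # mirroring telnetlib's Telnet.read_until().
    sock.settimeout(timeout)
    data = b""
    while marker not in data:
        chunk = sock.recv(4096)
        if not chunk:  # connection closed before the marker appeared
            break
        data += chunk
    return data

def run_command(sock, command, prompt=b">"):
    # Wait for the prompt, send the command, then wait for the next
    # prompt so we know the command has actually finished, rather than
    # closing the connection while SAS is still running.
    read_until(sock, prompt)
    sock.sendall(command + b"\r\n")
    return read_until(sock, prompt)
```

Waiting for the prompt to come back after sending the command is the key step: if the script disconnects immediately after sendall, the remote shell may never get to run the SAS job, which would look exactly like "nothing happens".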
Here's a few ideas...
The issue you are having may be related to Local Security Policy settings in Windows (if it is running on a Windows server). I'm far from an expert on that stuff, but I remember older SAS/IntrNet installations required some rummaging around in there to get them working.
As an alternative to the approach you are trying, you could also set up a SAS session on the server that listens for incoming socket requests, as per this article:
http://analytics.ncsu.edu/sesug/2000/p-1003.pdf
And finally... not sure if this helps, but I remotely execute SAS jobs using PsExec. A description of how I set it all up can be found here:
http://www.runsubmit.com/questions/260/hide-sas-batch-jobs-winxp
Good luck
This paper outlines how you can use a Python script to connect to a Unix server over SSH, copy a locally written SAS program onto the server, batch submit it, and download the results back to your local machine, all from a BeanShell macro script in jEdit.
