I have a task written in Python with a 1-2 minute runtime that I want to run on demand. The requests would come in very small volumes from a Django server on Linux. The return would be a file.
Usually I'd use a queue system like Celery, but this task can only be run on Windows.
What is the best way to make this happen?
Remotely execute the task by establishing an SSH session?
Still use Celery, go through a lot of workarounds to get it to work on Windows (seems messy)?
I can think of 5 solutions which do not require SSH.
I haven't covered authentication in these solutions; you should implement something appropriate for whichever solution you choose.
Solution 1:
write a simple Flask/Django app for the Windows server which runs the task and returns the response
from your Linux server, send a request to Windows and get the data
the Linux server can make this call using Celery, so you don't have to worry about the 1-2 minute wait (a minimal sketch follows)
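A minimal sketch of the Windows side, assuming Flask; run_task() is a placeholder for however the Windows-only work is invoked:

import subprocess  # placeholder: however the Windows-only task is launched
from flask import Flask, send_file

app = Flask(__name__)

def run_task():
    # Placeholder for the Windows-only work; assume it writes result.bin.
    subprocess.call(["task.exe"])
    return "result.bin"

@app.route("/run", methods=["POST"])
def run():
    path = run_task()       # blocks for the 1-2 minute run
    return send_file(path)  # stream the result file back to the caller

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)

On the Linux side, the Celery task just does requests.post("http://windows-host:8000/run") with a generous timeout and saves response.content.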
Solution 2:
write a simple Flask/Django app for the Windows server which calls a Celery task in the background
this app should return the URL of the result file
the Celery task creates a file which contains the result
serve this file with nginx (or a Windows-based static file server; I don't know Windows)
send a request from the Linux server to Windows to get the result; if the file doesn't exist, the result is not ready yet (a minimal sketch follows)
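A sketch of the Windows app for this variant; produce_result is a hypothetical Celery task that writes its output into the statically served folder:

import uuid
from flask import Flask, jsonify
from tasks import produce_result  # hypothetical Celery task

app = Flask(__name__)

@app.route("/run", methods=["POST"])
def run():
    task_id = uuid.uuid4().hex
    produce_result.delay(task_id)  # writes <task_id>.bin into the static dir
    return jsonify(result_url=f"/static/{task_id}.bin")

The Linux server then polls result_url; a 404 means the result is not ready yet.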
Solution 3:
write a simple Flask/Django app for the Windows server which calls a Celery task in the background
this app returns a random id for the given request
from your Linux server, send requests to the Windows app with the task id
when the task is finished, the Windows app returns the result (a minimal sketch follows)
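A sketch of the polling variant; produce_result is again a hypothetical Celery task, and Celery's own task id serves as the random id:

from flask import Flask, jsonify
from tasks import produce_result  # hypothetical Celery task

app = Flask(__name__)

@app.route("/run", methods=["POST"])
def run():
    result = produce_result.delay()
    return jsonify(task_id=result.id)

@app.route("/result/<task_id>")
def result(task_id):
    res = produce_result.AsyncResult(task_id)
    if not res.ready():
        return jsonify(status="pending"), 202
    return jsonify(status="done", data=res.get())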
Solution 4:
write a simple Flask/Django app for the Windows server which calls a Celery task in the background
add an endpoint to your Linux Django app for uploading data
when the Windows app finishes processing, it uploads the data to the Linux Django app (a sketch of the upload step follows)
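A sketch of the upload step, run at the end of the Windows Celery task; the URL and field names are placeholders:

import requests

def upload_result(task_id, path):
    # Push the finished result to the hypothetical Django upload endpoint.
    with open(path, "rb") as f:
        requests.post(
            "http://linux-host/api/results/",
            data={"task_id": task_id},
            files={"result": f},
        )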
Solution 5:
solution 4, but your Linux Django app is not for uploading data; it only sets a boolean which means the task is done
if the task is done, the Linux server sends a request to the Windows server to get the data (this request contains the task id); a sketch of the flag endpoint follows
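A sketch of the Linux-side "done" flag as a Django view; TaskRecord is a hypothetical model with task_id and done fields:

from django.http import JsonResponse
from .models import TaskRecord  # hypothetical model

def task_done(request, task_id):
    # The Windows app calls this when its Celery task finishes.
    TaskRecord.objects.filter(task_id=task_id).update(done=True)
    return JsonResponse({"ok": True})

A periodic Celery task on the Linux side can then fetch the data from the Windows server for any record marked done.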
Related
So I have a server and built an API for it so that I can upload patch files to my server. However, when I update some batch files on the server, I always have to stop the server and then run it again to see the changes. I was wondering what I can do so that my server restarts itself.
Yes, you can.
Make the API accept a JSON payload like {'do': 'refresh_server'};
when it arrives, exit and run the file again using the os module.
Edit: This is a solution for Windows.
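A minimal sketch of that restart step, assuming a Flask-style endpoint (the route name is a placeholder); os.execv replaces the running process with a fresh copy of the same script:

import os
import sys
from flask import Flask, request

app = Flask(__name__)

@app.route("/control", methods=["POST"])
def control():
    payload = request.get_json(silent=True) or {}
    if payload.get("do") == "refresh_server":
        # Re-exec the current interpreter on the same script and arguments.
        os.execv(sys.executable, [sys.executable] + sys.argv)
    return "ok"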
I have a Rails app. This app takes parameters from the users and sends them to what I think would be called a slave application server that runs heavy calculations in Python. This slave server is located on the same physical box and runs a Python "SimpleHTTPServer" (just a basic web server).
I set up the slave to receive commands through POST requests and run the calculations. Is it appropriate for the Python server to receive these requests through GET/POST even though it is on the same box, or should I be using another protocol?
Note: I have looked into rubypython and other direct connectors, but I need a separate app server to run the calculations.
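A minimal sketch of the described slave setup, using the standard library's http.server (the Python 3 home of SimpleHTTPServer's machinery); the calculation itself is a placeholder:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class CalcHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        params = json.loads(self.rfile.read(length) or b"{}")
        result = {"answer": sum(params.get("values", []))}  # placeholder calculation
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Bind to loopback only, since the Rails app is on the same box.
    HTTPServer(("127.0.0.1", 8001), CalcHandler).serve_forever()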
The idea:
There is a Node.JS server which sends a request to the IIS server, which is running Django/Python. It will send two files to the server, which need to be converted with a program that has to run in foreground mode.
So I already looked around in pretty much everything related to IIS and the running of executables here on SO, but still haven't got it working.
I got the following code to run the application from Django:
import subprocess

subprocess.call(r"C:\example.exe")  # raw string keeps the backslash literal
There will probably be some serious security issues with the following setup (although the server is only reachable from the local network), but here it is:
I'm running a Django application on IIS.
I've set the Application Pool Identity to my local user.
I've given "Full Control" permission to "Everyone".
When the subprocess call gets executed, it adds the program to my Background Processes with the USER set to my local user.
Questions:
How do I make the program start in Desktop Mode?
Should I perhaps add another step (start another service) which will then start the program?
Edit:
Could I perhaps make a file watcher which watches whether files get stored on the Windows server and then triggers an executable based on that?
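A sketch of that file-watcher idea using the third-party watchdog package (pip install watchdog); the folder and executable paths are placeholders:

import subprocess
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCH_DIR = r"C:\incoming"     # placeholder drop folder
CONVERTER = r"C:\example.exe"  # the program to trigger

class NewFileHandler(FileSystemEventHandler):
    def on_created(self, event):
        if not event.is_directory:
            # Run the converter against the newly stored file.
            subprocess.call([CONVERTER, event.src_path])

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(NewFileHandler(), WATCH_DIR, recursive=False)
    observer.start()
    observer.join()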
I have a Python middleware application which polls a site for content, and then writes/updates a JSON file for a client-side web application.
I've investigated several ways of hosting this Python application on a Windows Server 2003 machine. I've set the application up to run every minute under the Windows Task Scheduler, and have also looked at installing the application as a Windows Service (via NSSM).
Ideally, I'd like the application run continuously as an NSSM-managed Windows Service, with an internal task scheduler function to poll the various required sites every X seconds.
Would this be an acceptable solution? Or should I remove all polling components of the Python application, and use Windows Task Scheduling as the polling mechanism?
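A minimal sketch of the NSSM-managed variant: one long-running loop that NSSM supervises as a service, with the interval and poll_sites() as placeholders:

import json
import time

POLL_SECONDS = 60  # placeholder interval

def poll_sites():
    # Placeholder: fetch the required sites and build the JSON payload.
    return {"updated": time.time()}

def main():
    while True:
        with open("output.json", "w") as f:
            json.dump(poll_sites(), f)
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    main()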
I need to install on one of my Windows PCs some software that will periodically send a short HTTP POST request to my remote development server. The request is always the same and should be sent every minute.
What would you recommend as the best approach for that?
The things I considered are:
1. Creating a Windows service
2. Using a Python script (I have Cygwin installed)
3. A scheduled task running a batch file (although I don't want the black cmd window to pop up in my face every minute)
Thanks for any additional ideas or hints on how to best implement it.
import time
import urllib.request  # urllib.urlopen is Python 2 only; Python 3 moved it to urllib.request

url = "http://example.com/ping"  # placeholder: your development server's URL
post_data = b"status=alive"      # placeholder payload; passing data makes this a POST

while True:
    urllib.request.urlopen(url, post_data)
    time.sleep(60)
If you have Cygwin, you probably have cron: run a Python script from your crontab.
This is trivially easy with a scheduled task, which is the native Windows way to schedule tasks! There's no need for Cygwin or Python or anything like that.
I have such a task running on my machine which pokes my WordPress blog every few hours. The script is just a .bat file which calls wget. The task is configured to "Run whether user is logged on or not", which ensures that it runs even when I'm not logged on. There's no "black cmd window".
You didn't say which version of Windows you're on; if you're on XP (unlucky for you if you are), the configuration is probably different, since the scheduled-task interface changed quite a bit when Vista came out.