I'm trying to demo a simple website using the Python SimpleHTTPServer module. However, the server is terminated almost immediately. I'm SSH'd into the box and type:
python -m SimpleHTTPServer 8000
the webserver starts:
Serving HTTP on 0.0.0.0 port 8000 ...
and then displays Terminated and shuts down within a few seconds.
I'm at a loss as to what to look for. I'm using this same command on macOS with no problems; this is on a Linux box running Python 2.7.9.
I am trying to run a Python .py file in PyScript. To do that, I need to serve the Python file from a server, because PyScript cannot access local files:
<py-script src="./greet.py"></py-script>
So I tried to create a local server by entering this command in the terminal, in the folder of my project:
python -m http.server
It has been over 30 minutes now and the server is still not up. The terminal shows that the command is still running. It should take no more than a minute to be done. I have also tried specifying a port with:
python -m http.server 80
but it still doesn't work.
What should I do?
You might also try binding the server to a specific address, such as the loopback address, using something like python -m http.server --bind 127.0.0.1 8000. The page should then appear at http://127.0.0.1:8000.
This was necessary for me on Windows - even after granting Python permission to access the network. Not sure if it's Windows-related, or Python not identifying the correct NIC to bind to, or what.
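For reference, the full command and the startup line it prints look roughly like this (the exact wording of the banner varies a little between Python versions):

python -m http.server --bind 127.0.0.1 8000
Serving HTTP on 127.0.0.1 port 8000 (http://127.0.0.1:8000/) ...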
I have a Django AWS server that I need to keep running over the weekend for it to be graded. I typically start it from an SSH using PuTTY with:
python manage.py runserver 0.0.0.0:8000
I was originally thinking of making a bash script to handle starting the server, monitoring it, and restarting it when needed using the steps below, but was told it wasn't going to work. Why?
1) Start the server using python manage.py runserver 0.0.0.0:8000 & to send it to the background
2) After <some integer length 'x'> minutes of sleeping, check whether the server is still up using ss -tulw and grep the result for the port the server should be running on.
3) Based on the result from step (2), either sleep for 'x' minutes again, or restart the server (and possibly fully stop anything left running beforehand); a rough sketch of such a loop is shown below.
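A rough, untested sketch of such a loop (assuming bash; the sleep interval is a placeholder and the port is the 8000 used above):

#!/usr/bin/env bash
# Naive watchdog loop: start the dev server, then re-check every x minutes.
PORT=8000
SLEEP_MINUTES=5                                   # placeholder interval 'x'

python manage.py runserver 0.0.0.0:$PORT &        # step 1: run in the background

while true; do
    sleep "${SLEEP_MINUTES}m"                     # step 2: wait x minutes
    if ! ss -tulw | grep -q ":$PORT"; then        # is anything still listening on the port?
        pkill -f "manage.py runserver" || true    # step 3: stop any leftovers...
        python manage.py runserver 0.0.0.0:$PORT &   # ...and restart
    fi
done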
Originally, I thought it was a pretty decent idea, as we can't always be monitoring the server.
EDIT: I checked that ss -tulw | grep 8000 correctly picks up the server while it is running.
If I understand you correctly, this is a non-production Django app, so you can run it with Django's development server (python manage.py runserver 0.0.0.0:8000) as you did.
Things like monit (https://mmonit.com/monit/) or supervisord (http://supervisord.org/) are meant to do exactly what you described - monitoring a process and restarting it if necessary - but you could also just use a cron job that runs, say, every minute (see the sketch after this list). In the cron job, you:
Check whether your process is still running and/or still listening on port 8000.
Abort if it is already running.
Restart it if it has stopped or is not listening on port 8000.
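As a rough illustration (assuming bash; the project path, script name and log file are placeholders), the script that cron invokes every minute might look something like this:

#!/usr/bin/env bash
# check_runserver.sh - invoked by cron, e.g.:  * * * * * /path/to/check_runserver.sh
PORT=8000
PROJECT_DIR=/path/to/project          # placeholder path to the Django project

# Abort if something is already listening on the port.
if ss -tulw | grep -q ":$PORT"; then
    exit 0
fi

# Otherwise (re)start the development server in the background.
cd "$PROJECT_DIR" || exit 1
nohup python manage.py runserver 0.0.0.0:$PORT >> runserver.log 2>&1 &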
I have an Ubuntu instance on Google Cloud Platform (GCP). I want to use it as an HTTP server to access files. I simply use this Python command, typed in bash:
python3 -m http.server 8000
This runs the http.server module as a script, constructing a simple HTTP server that listens on port 8000.
The problem is that, since I use a GCP instance, I must connect to it remotely (for example, via the SSH shell provided by GCP). When I close the SSH shell, the Python HTTP server stops. So what should I do to make sure that the server keeps running after I close the shell?
I did search on Google, and I tried to use
nohup python3 -m http.server 8000 &
This command, so I read, will run the program in the background and keep it running after exiting bash. But it seems that this doesn't work in my situation.
Can anybody help?
Try the screen command. I think it's easier to use and also more flexible than nohup, as you can reattach to processes after detaching them. See this answer for details.
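For example, a typical screen workflow might look like this (the session name httpserver is just an example):

screen -S httpserver            # start a new named screen session
python3 -m http.server 8000     # run the server inside the session
# press Ctrl-A then D to detach; the server keeps running
screen -r httpserver            # later, reattach to the session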
The http.server module is not meant to be a full-fledged webserver.
You'll want to set up something like Apache instead, see Running a basic Apache web server.
I need to execute a Python script on a remote server (accessed through PuTTY), but I don't have a stable Internet connection, and every time I execute the script I run into problems after several minutes because my Internet connection drops.
How do I remotely execute the script without staying connected to the server?
(e.g. I connect to the server, run the script, and can log out while it is executing)
You can use a Linux screen session; it opens a background terminal and keeps a shell active even through network disruptions.
Open a screen by typing $ screen in your terminal and execute your script there; even if you lose the connection, it won't kill the process.
Here you will find a well-explained how-to for this program. I use it for my regular day-to-day work on remote servers.
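If you'd rather start the script already detached, something like the following should also work (myjob and script.py are placeholder names):

screen -dmS myjob python script.py    # start script.py in a detached session called myjob
screen -ls                            # list sessions to confirm it is running
screen -r myjob                       # reattach later to check on it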
Try this:
nohup your_script >/dev/null 2>&1 &
The program will keep running in the background.
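If you want to keep the output and be able to check on or stop the process later, a variation along these lines (file names are placeholders) can help:

nohup python your_script.py > run.log 2>&1 &   # keep stdout/stderr in run.log instead of discarding them
echo $! > run.pid                              # remember the PID of the background process
tail -f run.log                                # watch the output
kill "$(cat run.pid)"                          # stop it later if needed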
I'm running a basic Python SimpleHTTPServer to check out some files in the browser. Once SimpleHTTPServer is running in one directory, how do you stop it and use a different directory, or just have it switch to the new one?
Currently using in terminal:
python -m SimpleHTTPServer 8008
Then if I try to run it in another directory, it says the port is already in use. So basically, how do I stop a given instance of SimpleHTTPServer?
Then if I try to run it in another directory, it says the port is already in use
You can only bind to each port once with SimpleHTTPServer. If you are already running the server on port 8008, you cannot run another instance listening on that same port.
You should stop the server and launch it from the other directory, or choose a different port number for the second instance.
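If the server is running in the foreground, pressing Ctrl+C in its terminal stops it. If it was started in the background, you can look up and kill the process listening on the port, for example (8008 is the port used above, and the PID is whatever lsof reports):

lsof -i :8008     # find the PID of the process listening on port 8008
kill 12345        # replace 12345 with the PID reported above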