I need to start a python program when the system boots. It must run in the background (forever) such that opening a terminal session and closing it does not affect the program.
I have demonstrated that by using tmux this can be done manually from a terminal session. Can the equivalent be done from a script that is run at bootup?
Then where does one put that script so that it will be run on bootup?
Create a startup script that runs on boot and launches the desired Python program in the background.
Here are the steps:
Create a shell script that launches the Python program in the background:
#!/bin/sh
python /path/to/your/python/program.py &
Make the shell script executable:
chmod +x /path/to/your/script.sh
Add the script to the startup applications:
On Ubuntu, this can be done by going to the Startup Applications program and adding the script.
On other systems, you may need to add the script to the appropriate startup folder, such as /etc/rc.d/ or /etc/init.d/.
After these steps, the Python program should start automatically on boot and run in the background.
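The steps above can be sketched as a single launcher script. The program path here is an assumed example; nohup detaches the process from the terminal, and redirecting output gives you a log to check later:

```shell
#!/bin/sh
# Hypothetical path -- replace with your actual program.
PROG=/path/to/your/python/program.py
# nohup keeps the process alive after the launching shell exits;
# redirecting stdout/stderr captures its output in a log file.
nohup python "$PROG" >/tmp/myprogram.log 2>&1 &
PID=$!
echo "started $PROG with PID $PID"
```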
It appears that in addition to putting a script that starts the program in /etc/init.d, one also has to put a link to it in /etc/rc2.d. From within /etc/rc2.d:
sudo ln -s /etc/init.d/scriptname.sh
sudo mv scriptname.sh S01scriptname.sh
The S01 prefix was just copied from all the other files in /etc/rc2.d.
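The ln + mv pair above amounts to creating the link directly under its S01 name. A sketch of the same idea, demonstrated in a scratch directory since the real thing under /etc needs root:

```shell
#!/bin/sh
# Same effect as the ln + mv pair, in one step.
# Scratch directories stand in for /etc/init.d and /etc/rc2.d.
mkdir -p /tmp/rcdemo/init.d /tmp/rcdemo/rc2.d
touch /tmp/rcdemo/init.d/scriptname.sh
# S01 prefix = Start, order 01, mirroring the other files in rc2.d
ln -sf /tmp/rcdemo/init.d/scriptname.sh /tmp/rcdemo/rc2.d/S01scriptname.sh
ls -l /tmp/rcdemo/rc2.d
```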
I'm very new to coding and software, so please stick with me. I am trying to execute a command in my Raspberry Pi terminal via a Python script, and I want to be able to run this script from the Desktop. The command to execute is:
(rpi-deep-pantilt-env) pi@raspberrypi:~/rpi-deep-pantilt $ rpi-deep-pantilt detect
So as you can see, I need to cd into rpi-deep-pantilt, activate my virtual environment, then run the command, all via the Python script.
A simple shell script to do what you ask:
#!/bin/sh
cd "$HOME"/rpi-deep-pantilt
. ./rpi-deep-pantilt-env/bin/activate
rpi-deep-pantilt detect "$@"
Most or all of this is probably unnecessary. I guess you could run
#!/bin/sh
d="$HOME"/rpi-deep-pantilt
exec "$d"/rpi-deep-pantilt-env/bin/rpi-deep-pantilt detect "$@"
though if your Python script has hardcoded file paths which require it to run in a specific directory, that's a bug which will prevent this from working.
The "$@" says to pass on any command-line arguments, so if you saved this script as pant, running pant blind mice will pass on the arguments blind and mice to your Python script. (Of course, if it doesn't accept additional command-line arguments after detect, this is unimportant, but I'd still pass them on so it can generate an error message, rather than have them be silently ignored as if they were not there.)
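A small sketch of what "$@" does; pant here is a stand-in function, not the real wrapper script:

```shell
#!/bin/sh
# "$@" expands to all of the caller's arguments, each preserved as its own word.
pant() {
    echo "detect $@"   # stand-in for: rpi-deep-pantilt detect "$@"
}
out=$(pant blind mice)
echo "$out"
```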
I have a bash script that invokes a python script in a loop. It works fine (in-order execution) when I run it normally in the foreground, or if I run it in the background using '&' and don't close the terminal window. I am running this on a Mac.
My script looks like below:
for file in */List*.bin; do
    newFile=${file%.bin}.txt
    ./Prog1 "$file" > "$newFile"
    wait
    python PyProg.py "$newFile" >> Report.txt
    wait
done
This is how I run the script:
(sudo ./Script > log.txt) &
But if I close the terminal and check with the 'top' command from a different terminal, it only shows 'Prog1' and not python.
I don't know if python is not being run at all or the system does some out-of-order execution.
Anyway, I cannot see any new lines being added to Report.txt.
I tried disown -h %1 after running this to detach the script from the terminal. But the result is the same.
I have a very basic Python program which alarms you at certain intervals to take a break. I want to turn it into a piece of software so that I don't need to run the program every time I start my computer, and I want it for both Windows and Linux. Also, is there a universal way to create executable files from source code in any language?
For Linux, you can point a cron job at your .py script; cron can run the script periodically on whatever schedule you set up.
Open the crontab editor:
# crontab -e
And add following entry if you want a scheduled job:
0 6 * * * /usr/bin/python your_script.py
So this will run every day at 6 am. For more options refer to http://www.adminschoice.com/crontab-quick-reference
Or if you want it to run at startup, add the following entry instead:
@reboot /usr/bin/python your_script.py &
For Windows, create a batch file and add it to your startup folder, so it will run whenever you boot up your system.
The batch file can look like this (given you have a Python version installed on your system):
@echo off
python your_script.py
PAUSE
Save it as some_name.bat.
Create a shortcut to the file, and paste the shortcut into the startup folder (which can be opened using Run > shell:startup).
For scheduled run for python script in windows refer to https://www.esri.com/arcgis-blog/products/product/analytics/scheduling-a-python-script-or-model-to-run-at-a-prescribed-time/
I want to run two Python applications in Docker on different ports.
My shell script, serverRun.sh, is below:
exec python __server_code.py &
exec python server_time_test.py &
In the Dockerfile, I am trying to run these two Python applications:
RUN ["chmod", "+x", "./serverRun.sh"]
It did not work. Any idea?
You must have one process in the foreground, so remove the final & and don't use exec:
cd /rfk-thrift/nlp_search
python __server_code.py &
python server_time_test.py
And this in Dockerfile:
RUN chmod +x ./serverRun.sh
CMD ./serverRun.sh
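The pattern in the corrected script can be sketched with stand-in commands; here sleep takes the place of the two Python servers (a sketch, not the actual services):

```shell
#!/bin/sh
# First process in the background, last one in the foreground,
# so the script (the container's main process) stays alive.
# `sleep` stands in for the two Python servers here.
sleep 1 &            # python __server_code.py &
sleep 1              # python server_time_test.py  (foreground, no &)
finished=yes
```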
Looks like your RUN instruction just sets the file permissions; it doesn't actually execute the script. Perhaps change the RUN command.
The RUN instruction is only referenced during build. It won't be running when you launch the container. You want to use ENTRYPOINT or CMD
Your shell script must not terminate, so keep it running, for example in a while loop with a long sleep.
Docker detects the termination of the entrypoint, and ends the container.
You might eventually want to resort to an entrypoint (bash or another scripting language) that is able to detect application crashes, so your container quits on failure.
I published for you a fully-featured Bash entrypoint here which does exactly this, and more.
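The "never terminate" idea can be sketched as below. The loop is bounded here so it can run as a demo; a real entrypoint would use while true so the container stays up indefinitely:

```shell
#!/bin/sh
# Bounded demo of a keep-alive loop; a real entrypoint would loop forever.
i=0
while [ "$i" -lt 3 ]; do
  sleep 0.2   # in a real entrypoint, a longer sleep keeps CPU usage negligible
  i=$((i + 1))
done
```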
I can't seem to figure out how to get this bash script working.
#!/bin/bash
export WORKON_HOME=~/.envs
source /usr/local/bin/virtualenvwrapper.sh
workon staging_env
It uses virtualenv and virtualenvwrapper in order to work in a Python virtual environment.
Typing these commands in the shell works perfectly fine, but running them as a bash script does not.
Any ideas?
When you run a script, it starts its own instance of the shell (bash, in this case). Because of this, any changes the script makes to the environment are lost when the script ends and its shell exits.
To make the changes stick, you'll have to source the script instead of running it.
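The difference can be demonstrated with a tiny script that only sets an environment variable (env_demo.sh is a hypothetical stand-in for the virtualenvwrapper setup):

```shell
#!/bin/sh
# Demo: environment changes made by a script are lost unless it is sourced.
unset WORKON_HOME
cat > /tmp/env_demo.sh <<'EOF'
export WORKON_HOME="$HOME/.envs"
EOF
sh /tmp/env_demo.sh             # runs in a child shell: change is lost
echo "after running:  '${WORKON_HOME:-unset}'"
. /tmp/env_demo.sh              # sourced: runs in the current shell, so it sticks
echo "after sourcing: '$WORKON_HOME'"
```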