How to run two Python applications in Docker - python

I want to run two Python applications in Docker on different ports.
My shell script, named serverRun.sh, is below:
exec python __server_code.py &
exec python server_time_test.py &
In my Dockerfile, I am trying to run these two Python applications with:
RUN ["chmod", "+x", "./serverRun.sh"]
It did not work. Any idea?

You must have one process in the foreground, so remove the final &, and don't use exec:
cd /rfk-thrift/nlp_search
python __server_code.py &
python server_time_test.py
And this in the Dockerfile:
RUN chmod +x ./serverRun.sh
CMD ./serverRun.sh

It looks like your RUN command just sets the file permissions; it doesn't actually execute the script. Perhaps change the RUN command.

The RUN instruction is only executed during the image build; it won't be running when you launch the container. You want ENTRYPOINT or CMD instead.
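A minimal sketch of the difference, reusing the serverRun.sh from the answer above:
# RUN executes once, while the image is being built:
RUN chmod +x ./serverRun.sh
# CMD executes each time the container starts (ENTRYPOINT behaves similarly):
CMD ./serverRun.sh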

Your shell script must not terminate, so keep it alive, for example with a while loop that sleeps.
Docker detects the termination of the entrypoint and ends the container.
Eventually you might want an entrypoint (Bash or another scripting language) that can detect application crashes, so that your container quits on failure.
I published a fully-featured Bash entrypoint here which does exactly this, and more.
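For example, a minimal serverRun.sh sketch along these lines, using the script names from the question:
#!/bin/sh
python __server_code.py &
python server_time_test.py &
# keep the entrypoint process alive; Docker stops the container when it exits
while true; do sleep 60; done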

Related

ubuntu run python program in background on startup

I need to start a Python program when the system boots. It must run in the background (forever) so that opening and closing a terminal session does not affect it.
I have demonstrated with tmux that this can be done manually from a terminal session. Can the equivalent be done from a script that is run at bootup?
And where does one put that script so that it runs on bootup?
Create a startup script that runs on boot and launches the desired Python program in the background.
Here are the steps:
Create a shell script that launches the Python program in the background:
#!/bin/sh
python /path/to/your/python/program.py &
Make the shell script executable:
chmod +x /path/to/your/script.sh
Add the script to the startup applications:
On Ubuntu, this can be done by going to the Startup Applications program and adding the script.
On other systems, you may need to add the script to the appropriate startup folder, such as /etc/rc.d/ or /etc/init.d/.
After these steps, the Python program should start automatically on boot and run in the background.
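Since the question mentions tmux, a variant of the step-1 script that also keeps the program inspectable after boot (the session name myprog is just an example):
#!/bin/sh
# run the program in a detached tmux session; re-attach later with: tmux attach -t myprog
tmux new-session -d -s myprog 'python /path/to/your/python/program.py'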
It appears that in addition to putting a script that starts the program in /etc/init.d, one also has to create a link in /etc/rc2.d:
cd /etc/rc2.d
sudo ln -s /etc/init.d/scriptname.sh
sudo mv scriptname.sh S01scriptname.sh
The S01 prefix was just copied from all the other files in /etc/rc2.d.

Calling command in virtual environment from py script

I'm very new to coding and software, so please stick with me. I am trying to execute a command in my Raspberry Pi terminal via a Python script, and I want to be able to run this script from the Desktop. The command to execute is:
(rpi-deep-pantilt-env) pi@raspberrypi:~/rpi-deep-pantilt $ rpi-deep-pantilt detect
So as you can see, I need to cd into rpi-deep-pantilt, then activate my virtual environment, then run the command, all via the Python script.
A simple shell script to do what you ask:
#!/bin/sh
cd "$HOME"/rpi-deep-pantilt
. ./rpi-deep-pantilt-env/bin/activate
rpi-deep-pantilt detect "$@"
Most or all of this is probably unnecessary. I guess you could run
#!/bin/sh
d="$HOME"/rpi-deep-pantilt
exec "$d"/rpi-deep-pantilt-env/bin/python "$d"/rpi-deep-pantilt detect "$#"
though if your Python script has hardcoded file paths which require it to run in a specific directory, that's a bug which will prevent this from working.
The "$#" says to pass on any command-line arguments, so if you saved this script as pant, running pant blind mice will pass on the arguments blind and mice to your Python script. (Of course, if it doesn't accept additional command-line arguments after detect, this is unimportant, but I'd still pass them on so you can generate an error message, rather than have them be ignored as if they were not there.)

How to run a command inside virtual environment using Python

I have the virtualenv created and installed. I have also installed the jsnapy tool inside my virtual env.
This is the script that we are using:
Filename : venv.py
import os
os.system('/bin/bash --rcfile ~/TestAutomation/End2EndAutomation/bin/activate')
os.system('End2EndAutomation/bin/jsnapy')
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@sdno-server:~/TestAutomation$ ^C
What we need to know is how we can get into the virtualenv, run a command, and deactivate it using a Python script.
[EDIT1]
I used the code given in the comment. It just enters the virtual env; when I issue exit, it runs the jsnapy command.
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@server:~/TestAutomation$ exit
exit
usage:
This tool enables you to capture and audit runtime environment of
networked devices running the Junos operating system (Junos OS)
Tool to capture snapshots and compare them
It supports four subcommands:
--snap, --check, --snapcheck, --diff
1. Take snapshot:
jsnapy --snap pre_snapfile -f main_configfil
Each call to os.system() creates a new bash instance, which terminates when the call returns. To run all the commands in one bash instance, you could put all your commands inside a single bash script and call that from os.system():
run.sh
source ~/TestAutomation/End2EndAutomation/bin/activate
End2EndAutomation/bin/jsnapy
deactivate
Python
os.system('bash run.sh')  # sh, which os.system() uses, has no 'source' builtin
Alternatively, you could write a multiline bash command, as long as it's all in one os.system() call.
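A sketch of that single-call variant, using the paths from the question:
import os

# both commands run in one bash instance, so the activation is still in
# effect when jsnapy starts
os.system('bash -c "source ~/TestAutomation/End2EndAutomation/bin/activate && End2EndAutomation/bin/jsnapy"')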
Two successive calls to os.system() will create two independent processes, one after the other. The second will run when the first finishes. Any effects of commands executed in the first process will have been forgotten and flushed when the second runs.
You want to run the activation and the command which needs to be run in the virtualenv in the same process, i.e. the same single shell instance.
To do that, you can use bash -c '...' to run a sequence of commands. See below.
However, a better solution is to simply activate the virtual environment from within Python itself.
import os
import subprocess

p = os.path.expanduser('~/TestAutomation/End2EndAutomation/bin/activate_this.py')
execfile(p, dict(__file__=p))  # Python 2; activate_this.py is provided by virtualenv
subprocess.check_call(['./End2EndAutomation/bin/jsnapy'])
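On Python 3, where execfile no longer exists, an equivalent sketch would be (note that activate_this.py is shipped by virtualenv, not by the standard-library venv):
import os
import subprocess

p = os.path.expanduser('~/TestAutomation/End2EndAutomation/bin/activate_this.py')
with open(p) as f:
    exec(f.read(), dict(__file__=p))  # same effect as execfile() on Python 2
subprocess.check_call(['./End2EndAutomation/bin/jsnapy'])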
For completeness, here is the Bash solution:
import subprocess
subprocess.check_call(['bash', '-c', """
. ~/TestAutomation/End2EndAutomation/bin/activate
./End2EndAutomation/bin/jsnapy"""])
Preferring subprocess over os.system is recommended even in the os.system documentation.
There is no need to explicitly deactivate; when the bash command finishes, that will implicitly also deactivate the virtual environment.
The --rcfile trick is a nice idea, but it doesn't work when the shell you are calling isn't interactive.

Running a shell script through Fabric which wants to run a background running command

I am a newbie with Fabric and want to run one command in the background. It is written in a shell script, and I have to run that command via Fabric. So let's assume I have a shell script like this:
#!/bin/bash
java &
Consider this to be a file named myfile.sh.
Now in Fabric I am using this code to run my script as:
put('myfile.sh', '/root/temp/')
sudo('sh /root/temp/myfile.sh')
Now this should start the Java process in the background, but when I log in to the machine and check the jobs using the jobs command, nothing shows up.
Where is the problem? Please shed some light.
Use it with
run('nohup PATH_TO_JMETER/Jmetercommand & sleep 5; exit 0')
Maybe the process exits before you return: when you type java on its own, it normally shows a help message and exits. Try a sleep statement or something that lingers. And if you want to run it in the background, you could also append & to the sudo call.
I use run("screen -d -m sh /root/temp/myfile.sh",pty=False). This starts a new screen session in detached mode, which will continue running after the connection is lost. I use the pty=False option because I found that when connecting to several hosts, the process would not be started in all of them without this option.

This bash script is not working - Linux/Python

I can't seem to figure out how to get this bash script working.
#!/bin/bash
export WORKON_HOME=~/.envs
source /usr/local/bin/virtualenvwrapper.sh
workon staging_env
It uses virtualenv and virtualenvwrapper to work in a Python virtual environment.
Typing these commands in the shell works perfectly fine, but running them as a bash script does not.
Any ideas?
When you run a script, it creates its own instance of the shell (bash, in this case). Because of this, the changes are lost when the script ends and the script's shell is closed.
To make the changes stick, you'll have to source the script instead of running it.
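For example, if the script above is saved as staging.sh (the filename is just an example):
. ./staging.sh    # or: source ./staging.sh
Afterwards the workon staging_env environment remains active in your current shell.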
