I am trying to build a monitoring app that continuously reads the feed from the docker stats API. I quickly noticed that whenever I try to run something like docker stats 857ff7a0403b from within Python, it does not capture stdout and waits forever. The example Python code is below.
commands.getoutput('docker stats 857ff7a0403b')
The above code works for commands like docker ps and docker images, but it does not work for docker stats.
Is there a way in Python to quickly grab the results and terminate the utility so that it does not wait forever?
There is a docker option, --no-stream, that grabs only one sample and prints it to standard out.
docker stats --no-stream 857ff7a0403b
See https://docs.docker.com/engine/reference/commandline/stats/ for more details.
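For example, a minimal sketch of calling this from Python with subprocess (the helper names are illustrative, not part of any library):

```python
import subprocess

def stats_command(container_id):
    # --no-stream makes `docker stats` print a single snapshot and exit
    # instead of streaming updates forever.
    return ["docker", "stats", "--no-stream", container_id]

def get_stats(container_id):
    # Returns the snapshot text; raises CalledProcessError if docker fails.
    result = subprocess.run(stats_command(container_id),
                            capture_output=True, text=True, check=True)
    return result.stdout
```

Because the command exits after one sample, a monitoring app can call get_stats in a loop without the process hanging.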
In your Python file, write the command below and run it:
import os
os.system("docker stats 857ff7a0403b")
In a terminal, when you type docker stats container-id and hit Enter, it shows you the stats of that specific container.
In Python, the os module lets you run terminal commands from a program exactly as you would type them in the terminal (this is for when you want to access it through a program). Note that os.system only returns the exit status; it does not capture the command's output.
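To see the difference, here is a small sketch using echo as a harmless stand-in for the docker command: os.system returns only the exit status, while subprocess.getoutput captures the text.

```python
import os
import subprocess

# os.system prints straight to the terminal and returns the exit status.
status = os.system("echo hello")

# subprocess.getoutput captures the command's output as a string instead.
text = subprocess.getoutput("echo hello")
```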
Related
There is a Python script in my repo that I would like to run whenever I call an API. This Python script merely transfers data from one database to another. The Jenkins server for the project is currently used for builds/pipelines/running tests, and I was wondering if I could use this Jenkins service to run the script when I call an API, since I found that Jenkins allows you to remotely trigger scripts via REST.
I was wondering if I could use Jenkins's remote-trigger feature to run this Python script in my repo when I need to. The Python script is built using a Python image in the Dockerfile, so Docker sets up the dependencies/Python needed to run the script. The commands Jenkins runs are something like docker build and docker run.
Yes you can.
Just set up a pipeline that:
Runs in Docker (with your image). Have a look at this
Does a git clone of your repository
Runs your Python script with something like: sh "python <your script>"
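To then call the job from code, here is a minimal stdlib-only sketch; the job name "run-db-transfer" and token "MY_TOKEN" are placeholders for whatever you configure under the job's "Trigger builds remotely" option:

```python
import base64
import urllib.parse
import urllib.request

def build_trigger_url(base_url, job, token):
    # Jenkins' remote-trigger endpoint: POST /job/<name>/build?token=<token>
    return (f"{base_url}/job/{urllib.parse.quote(job)}"
            f"/build?token={urllib.parse.quote(token)}")

def trigger_job(base_url, job, token, user, api_token):
    # Authenticate with a Jenkins user name and API token via HTTP basic auth.
    req = urllib.request.Request(build_trigger_url(base_url, job, token),
                                 method="POST")
    creds = base64.b64encode(f"{user}:{api_token}".encode()).decode()
    req.add_header("Authorization", "Basic " + creds)
    with urllib.request.urlopen(req) as resp:
        return resp.status  # Jenkins answers 201 when the build is queued
```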
I know similar questions have been asked, but I couldn't get them working, or they were not specific enough for me since I am fairly new to Docker. My question is similar to this thread: How to move Docker containers between different hosts? But I don't fully understand the answer, or I can't get it working.
My problem: I am using Docker Desktop to run a Python script locally in a container, but I want this script to be able to run on a Windows Server 2016 machine. The script is a short web scraper which creates a csv file.
I am aware that I need to install some sort of Docker on the server, export my container, and be able to load the container on the server.
The thread referred to above says I need to use docker commit psscrape, but when I try it
I get: "Error response from daemon: No such container: psscraper." This is probably because the container ran and then stopped, since the program runs for only a few seconds. psscraper is in the 'docker ps -a' list but not in the 'docker ps' list; I guess it has something to do with that.
psscraper is the name of the python file.
Is there anyone who could enlighten me on how to proceed?
Consider situation:
I have an Ubuntu server with Python, TensorFlow and other libs installed.
My code is a Python script that loads several models: some of them pretrained vectors (.bin), some files from server folders, etc.
When I run the script in a terminal, it launches an interactive session where I input some text and the script outputs an answer back (like a chatbot). While answering, it calls my AI models (TensorFlow, Keras).
Question: how do I access this running session from another Python script? I mean, I want to use it as a function: send text and receive the answer back.
And of course I need to run this terminal session in the background for a long time.
I read this and similar answers, but I'm not sure it's the right solution (it seems incomplete):
In Linux, how to prevent a background process from being stopped after closing SSH client
What I am asking about is commonly done with a REST server that exposes an API, which is then called from external code. But no API is working here: TensorFlow throws errors when run via Flask (I was not able to fix this).
If you want your script to stay up after closing the SSH session, add & disown at the end of your execution command and it will run in the background.
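As an alternative to Flask, here is a minimal sketch of wrapping the model in a plain-stdlib HTTP server, so other scripts can send text and get the answer back. The answer() function is a placeholder for your real model call; the models are loaded once and stay in memory for the lifetime of the server process:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def answer(text):
    # Placeholder: replace with the call into your loaded
    # TensorFlow/Keras models.
    return "echo: " + text

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request body as the input text, reply with the answer.
        length = int(self.headers.get("Content-Length", 0))
        text = self.rfile.read(length).decode("utf-8")
        reply = answer(text).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

def serve(port=8000):
    # Start with e.g. `nohup python server.py &` (or & disown)
    # so it keeps running after the SSH session closes.
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()
```

A client is then a one-liner: urllib.request.urlopen(url, data=text.encode()).read().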
I have a Python program which runs perfectly as a standalone program.
Time taken: 5 days.
I dockerized the program and executed it with 10% of the dataset.
Docker runs and the program executes successfully.
When I use the full dataset (108K records) and build and run the new Docker image:
The container starts running and logs the steps perfectly for 4 hours.
After 4 hours, no logging is done.
When I inspect with htop, no resources are being used.
htop image - sys resource use
docker stats shows it is not using any resources.
docker stats image
docker ps shows the container is still running.
docker ps image
Kindly let me know what I am doing wrong.
Does Docker have any limits on running a program or logging data?
Are you running Docker directly on Linux, or are you using OSX/Windows? If the latter, you might be hitting memory limits.
If running in the cloud (AWS...), check that the machine has no expiry or anything like that. I recommend trying to run it locally first.
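If memory is the suspect, a small sketch you could call between steps of the program, so the container's logs show memory growth before a potential OOM kill (the tag string is just a label):

```python
import resource

def log_memory(tag=""):
    # ru_maxrss is the peak resident set size: kilobytes on Linux,
    # bytes on macOS. flush=True so the line survives a sudden kill.
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"[mem] {tag} peak RSS: {peak}", flush=True)
    return peak
```

Comparing the reported peak against docker stats, and on Docker Desktop against the memory cap in its settings, usually confirms or rules out an OOM kill.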
So I am using the micro-servicing Python package nameko, which runs a service using eventlet and calls eventlet.monkey_patch() on import.
I have worked out that it is this piece of code that blocks any debug attempts via ipdb. The ipdb console shows in the terminal, but I cannot type anything into it, and I have to close the entire terminal session in order to quit the process.
How can I use ipdb with this function?
EDIT: This issue only seems to happen when within a docker container.
Sorry, there is no convenient solution; for now your best option is to skip Docker when using ipdb (you can extract the filesystem image from Docker and run it in another virtualisation, such as QEMU, VirtualBox, or systemd-nspawn). See https://github.com/larsks/undocker for help.
Other things to try (may not work, please share results):
update eventlet to github master
pip install https://github.com/eventlet/eventlet/archive/master.zip
This issue is cross-posted at https://github.com/eventlet/eventlet/issues/361.
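Another workaround people use for debuggers inside containers is a pdb that talks over a TCP socket instead of the (possibly monkey-patched) terminal. This is a hypothetical stdlib-only sketch of that idea, similar in spirit to the third-party remote-pdb package:

```python
import pdb
import socket

class RemotePdb(pdb.Pdb):
    """Sketch of a pdb that reads and writes over a TCP connection,
    so it does not depend on the container terminal's stdin/stdout."""

    def __init__(self, host="127.0.0.1", port=4444):
        self._listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self._listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self._listener.bind((host, port))
        self._listener.listen(1)
        conn, _ = self._listener.accept()  # blocks until a client attaches
        handle = conn.makefile("rw")
        super().__init__(stdin=handle, stdout=handle)
```

Usage would be RemotePdb(port=4444).set_trace() at the breakpoint, publishing the port with docker run -p 4444:4444 and attaching with nc localhost 4444.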