Fabric - Run a command locally before and after all tasks complete - python

I'm attempting to announce deployment start and end in my fabric script. Feels like this should be easy, but for the life of me I can't figure out how to do it.
env.hosts = ['www1', 'www2', 'www3', 'www4']

def announce_start():
    # code to connect to irc server and announce deployment begins
    pass

def announce_finish():
    # code to connect to irc server and announce deployment finishes
    pass

def deploy():
    # actual deployment code here
    pass
Here's what I've tried:
If I make my deploy task call announce_start() and announce_finish(), it will attempt to run all of those tasks on each server:
def deploy():
    announce_start()
    # actual deployment code here
    announce_finish()
If I decorate announce_start() and announce_finish() with @hosts('localhost'), they run on localhost, but still four times, once for each host.
As I was typing this, I finally got it to work by decorating announce_start/announce_finish with @hosts('localhost') and using the fab command:
fab announce_start deploy announce_finish
But this seems a bit hacky. I'd like it all wrapped in a single deploy command. Is there a way to do this?

You can use fabric.api.execute, e.g.
from fabric.api import execute, hosts

def announce_start():
    # code to connect to irc server and announce deployment begins
    pass

def announce_finish():
    # code to connect to irc server and announce deployment finishes
    pass

@hosts(...)
def deploy_machine1():
    pass

@hosts(...)
def deploy_machine2():
    pass

def deploy():
    announce_start()
    execute(deploy_machine1)
    execute(deploy_machine2)
    announce_finish()
and then just invoke fab deploy
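To see why this keeps the announcements from firing four times, the control flow can be sketched in plain Python (all names here are hypothetical stand-ins, no Fabric required): the wrapper task's own body runs once, and only the execute() calls fan out over the host list.

```python
hosts = ['www1', 'www2', 'www3', 'www4']
log = []

def announce_start():
    log.append('start')

def deploy_on(host):
    log.append('deploy:' + host)

def announce_finish():
    log.append('finish')

def deploy():
    announce_start()
    for host in hosts:   # roughly what execute(task) does with the task's host list
        deploy_on(host)
    announce_finish()

deploy()
print(log)  # 'start' once, one deploy per host, 'finish' once
```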

Related

how to run an external python script as celery task by taking script name using flask server

I am using Celery with Flask for queuing and monitoring tasks. I have four or five scripts, and I want each to run as a Celery task by passing the script name through the Flask server and then monitoring its status.
Here is the code I have written so far:
@app.route('/script_path/<script_name>')  # flask server
def taking_script_name(script_name):
    calling_script.delay(script_name)
    return 'i have sent an async script request'

@celery.task
def calling_script(script_name):
    result = script_name
    return {'result': result}
I want the status of the script to be part of the result returned by the Celery task.
If anybody has another suggestion for how to run an external script as a Celery task, I'd welcome it.
Thanks in advance.
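One way to get a real status back is to have the Celery task actually execute the script with subprocess and return its exit code and output; Celery stores whatever dict the task returns in its result backend. A sketch, assuming each script_name is a path to a Python script runnable with the same interpreter (run_script is a hypothetical helper, not part of Celery or Flask):

```python
import subprocess
import sys

def run_script(script_path):
    """Run an external Python script and report its exit status and output."""
    proc = subprocess.run(
        [sys.executable, script_path],
        capture_output=True,
        text=True,
    )
    return {
        'script': script_path,
        'returncode': proc.returncode,  # 0 means the script exited successfully
        'stdout': proc.stdout,
        'stderr': proc.stderr,
    }

# inside the Celery task it would be called as:
# @celery.task
# def calling_script(script_name):
#     return run_script(script_name)
```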

Can't reach Locust WebInterface "ERR_CONNECTION_REFUSED"

I wanted to test Locust for my project on Windows 10.
The script seems to run properly (no errors in CMD), but I can't connect to the web interface at http://127.0.0.1:8089 (ERR_CONNECTION_REFUSED).
I am guessing that this has to do with browser/Windows configuration, but I can't find the problem. I have no proxy set up in LAN settings, I get my IP from DNS, and I have no changes in my hosts file.
locustfile.py
from locust import HttpLocust, TaskSet, task

class UserBehavior(TaskSet):
    def on_start(self):
        """ on_start is called when a Locust starts, before any task is scheduled """
        self.login()

    def on_stop(self):
        """ on_stop is called when the TaskSet is stopping """
        self.logout()

    def login(self):
        self.client.post("/login", {"username": "tester", "password": "abcd1234"})

    def logout(self):
        self.client.post("/logout", {"username": "ellen_key", "password": "education"})

    @task(2)
    def index(self):
        self.client.get("/")

    @task(1)
    def profile(self):
        self.client.get("/profile")

class WebsiteUser(HttpLocust):
    task_set = UserBehavior
    min_wait = 5000
    max_wait = 9000
    host = "http://google.com"
CMD
D:\workspace\WebTesting>locust
CMD result :
[2019-05-13 09:49:45,069] LB6-001-DTMH/INFO/locust.main: Starting web monitor at *:8089
[2019-05-13 09:49:45,070] LB6-001-DTMH/INFO/locust.main: Starting Locust 0.11.0
When I interrupt the script on the command line, I get the "KeyboardInterrupt" message and some statistics without data.
python -m http.server 8089 seems to work
Try going to http://localhost:8089/
I am not quite sure why, but I can't reach the Locust web interface through http://127.0.0.1 either (other web servers run fine locally at this address), but going to localhost does work for me.
I faced a similar issue. Instead of using the IP address, try using localhost:
http://localhost:portnumber -> http://localhost:8089. This solved the issue for me.
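One reason 127.0.0.1 and localhost can behave differently is name resolution: localhost may resolve to the IPv6 loopback ::1 (via the hosts file) while the server only listens on IPv4, or the other way around. A quick stdlib check of what each name resolves to on your machine:

```python
import socket

def loopback_addresses(name, port=8089):
    """Return the distinct addresses a hostname resolves to for TCP connections."""
    infos = socket.getaddrinfo(name, port, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

print(loopback_addresses('127.0.0.1'))  # always ['127.0.0.1']
print(loopback_addresses('localhost'))  # may also include '::1', depending on the hosts file
```

If localhost resolves to ::1 but the server is bound only to the IPv4 address (or vice versa), one of the two URLs will be refused while the other works.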

How to launch command on localhost with fabric2?

Here is my script :
from fabric2 import Connection

c = Connection('127.0.0.1')
with c.cd('/home/bussiere/'):
    c.run('ls -l')
But I get this error:
paramiko.ssh_exception.AuthenticationException: Authentication failed.
So how do I run a command on localhost ?
In Fabric2, the Connection object has a local() method.
Have a look at this object's documentation here.
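Under the hood, local execution involves no SSH at all, which is why no paramiko authentication can fail. A stdlib sketch of a purely local command run, roughly the effect local() gives you (the command here is a stand-in):

```python
import subprocess
import sys

# Run a command on the local machine only: no SSH connection is opened,
# so no AuthenticationException can occur.
result = subprocess.run(
    [sys.executable, '-c', "print('ran locally')"],
    capture_output=True,
    text=True,
)
print(result.returncode, result.stdout.strip())  # 0 ran locally
```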
As of July 2020, with fabric2, if you don't pass a hosts argument to your task decorator, you are on the local machine by default.
For example, the following will run on your local machine (localhost):
Example 1 : Only on local
# python3
# fabfile.py
from fabric import task, Connection

c = Connection('remote_user@remote_server.com')

@task
def DetailList(c):
    c.run('ls -l')  # will run on the local machine because the @task decorator does not contain the hosts parameter
You then would run this on your machine with
fab DetailList
If you want to mix code that should run on a remote server with code that runs locally, you should pass the hosts to the @task decorator as a parameter.
Example 2: on local and on remote (but different functions)
# python3
# fabfile.py

# imports
from fabric import task, Connection

# variables
list_of_hosts = ['user@yourserver.com']  # you should already have configured ssh access
c = Connection(list_of_hosts[0])
working_dir = '/var/www/yourproject'

# will run on remote
@task(hosts=list_of_hosts)
def Update(c):
    c.run('sudo apt-get update')            # runs on the remote server because hosts are passed to the task decorator
    c.run(f'cd {working_dir} && git pull')  # same: runs on the remote server
    c.run('sudo service apache2 restart')   # same: runs on the remote server

# will run on local because you do not specify a host
@task
def DetailsList(c):
    c.run('ls -l')  # runs on the local machine because hosts are NOT passed to the task decorator
As mentioned by Ismaïl, there is also a local() method that can be used when the hosts parameter is passed: local() will run on the local machine even though you have specified hosts on the task decorator. Be careful though: you cannot use local() if you didn't specify any hosts parameter; use run() instead, as shown in examples 1 and 2.
Example 3: use both the remote and local machines in the same function. Note that we are not decorating the functions called inside UpdateAndRestart.
# python3
# fabfile.py

# imports
from fabric import task, Connection

# variables
list_of_hosts = ['www.yourserver.com']  # you should already have configured ssh access
c = Connection(list_of_hosts[0])
working_dir = '/var/www/yourproject'

def UpdateServer(c):
    c.run('sudo apt-get update')                      # runs on the remote server
    c.local('echo the remote server is now updated')  # runs on the local machine

def PullFromGit(c):
    c.run(f'cd {working_dir} && git pull')  # runs on the remote server
    c.local('echo Git repo is now pulled')  # runs on the local machine

def RestartServer(c):
    c.run('sudo service apache2 restart')     # runs on the remote server
    c.local('echo Apache2 is now restarted')  # runs on the local machine

@task(hosts=list_of_hosts)
def UpdateAndRestart(c):
    UpdateServer(c)
    PullFromGit(c)
    RestartServer(c)
    c.local('echo you have updated, pulled and restarted Apache2')  # runs on the local machine
You will be able to run the entire stack with :
fab UpdateAndRestart

Using gevent and Flask to implement websocket, how to achieve concurrency?

So I'm using Flask_Socket to try to implement a websocket on Flask. Using this I hope to notify all connected clients whenever a piece of data has changed. Here's a simplification of my routes/index.py. The issue that I have is that when a websocket connection is opened, it will stay in the notify_change loop until the socket is closed, and in the meantime, other routes like /users can't be accessed.
from flask_sockets import Sockets

sockets = Sockets(app)

@app.route('/users', methods=['GET'])
def users():
    return json.dumps(get_users())

data = "Some value"  # the piece of data to push
is_dirty = False     # a flag which is set by the code that changes the data

@sockets.route("/notifyChange")
def notify_change(ws):
    global is_dirty, data
    while not ws.closed:
        if is_dirty:
            is_dirty = False
            ws.send(data)
This seems a normal consequence of what is essentially a while True: loop. However, I've been looking online for a way around this while still using flask_sockets and haven't had any luck. I'm running the server on Gunicorn:
flask/bin/gunicorn -b '0.0.0.0:8000' -k flask_sockets.worker app:app
I tried adding threads with --threads 12, but no luck.
Note: I'm only serving up to 4 users at a time, so scaling is not a requirement, and the solution doesn't need to be ultra-optimized.
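For what it's worth, the blocking comes from the loop never yielding to gevent's scheduler, so the handler greenlet monopolizes the worker. The usual remedy (an assumption about this setup, since flask_sockets runs each handler in a greenlet) is to call gevent.sleep() inside the loop. The loop body can be factored so it is testable without gevent or a real browser; pump and FakeWS below are hypothetical names:

```python
import time

def pump(ws, state, sleep=time.sleep):
    """Send state['data'] whenever state['dirty'] is set.

    In the real handler, pass gevent.sleep so the greenlet yields between
    checks and other routes (like /users) stay responsive; a tight loop
    never gives the scheduler a chance to run them.
    """
    while not ws.closed:
        if state['dirty']:
            state['dirty'] = False
            ws.send(state['data'])
        sleep(0.05)  # yield point; gevent.sleep(0.05) in production

# a fake socket to exercise the loop without a browser
class FakeWS:
    def __init__(self):
        self.sent = []
        self.closed = False

    def send(self, msg):
        self.sent.append(msg)
        self.closed = True  # close after the first push so the loop ends

ws = FakeWS()
pump(ws, {'dirty': True, 'data': 'payload'}, sleep=lambda s: None)
print(ws.sent)  # ['payload']
```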

Fabric - possible to have local and remote task execution in one method?

Is it possible to have local and remote tasks execute from within the same task method?
e.g., I want to do something like the following:
@fabric.api.task
def Deploy():
    PrepareDeploy()
    PushDeploy()
    execute(Extract)
    execute(Start)
Where PrepareDeploy and PushDeploy are local tasks (executing only locally, via the fabric.api.local() method):
@fabric.api.task
@fabric.decorators.runs_once
def PrepareDeploy():
    ...

@fabric.api.task
@fabric.decorators.runs_once
def PushDeploy():
    ...
And Extract/Start are methods that should be run on the remote hosts themselves:
@fabric.api.task
def Extract():
    ...

@fabric.api.task
def Start():
    ...
However, when I try to do fab Deploy, I get something like:
[remote1.serv.com] Executing task 'Deploy'
[localhost] local: find . -name "*.java" > sources.txt
...
The first line seems wrong to me (and in fact, causes errors).
You can spawn a new task and define which hosts it should run on. For example, here is how to create a RabbitMQ cluster when all hosts are provisioned with Puppet using the same Erlang cookie - see around line 114, where tasks are executed on specific hosts:
https://gist.github.com/nvtkaszpir/17d2e2180771abd93c46
I hope this helps.
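The reason the first line shows Deploy executing on remote1.serv.com is that Fabric runs every task once per host, so the wrapper itself is dispatched to each remote. Decorating Deploy with @fabric.decorators.runs_once (or routing the remote work through execute with an explicit hosts list) restricts the wrapper to a single run. The per-host dispatch can be sketched in plain Python, with a minimal stand-in for runs_once (hypothetical names, no Fabric required):

```python
import functools

hosts = ['remote1.serv.com', 'remote2.serv.com']
calls = []

def runs_once(func):
    """Minimal stand-in for fabric.decorators.runs_once: skip repeat calls."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if not getattr(wrapper, 'has_run', False):
            wrapper.has_run = True
            return func(*args, **kwargs)
    return wrapper

@runs_once
def Deploy():
    calls.append('deploy')

for host in hosts:  # Fabric invokes each task once per host in env.hosts
    Deploy()

print(calls)  # ['deploy'] -- the body ran only once despite two hosts
```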
