Setting fabric hosts list from an external hosts file - python

I need to get fabric to set its hosts list by opening and reading a file to get the hosts.
Setting it this way allows me to have a huge list of hosts without needing to edit my fabfile for this data each time.
I tried this:
def set_hosts():
    env.hosts = [line.split(',') for line in open("host_file")]

def uname():
    run("uname -a")
and
def set_hosts():
    env.hosts = open('hosts_file', 'r').readlines

def uname():
    run("uname -a")
I get the following error each time I try to use the function set_hosts:
fab set_hosts uname
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/fabric/main.py", line 712, in main
*args, **kwargs
File "/usr/local/lib/python2.7/dist-packages/fabric/tasks.py", line 264, in execute
my_env['all_hosts'] = task.get_hosts(hosts, roles, exclude_hosts, state.env)
File "/usr/local/lib/python2.7/dist-packages/fabric/tasks.py", line 74, in get_hosts
return merge(*env_vars)
File "/usr/local/lib/python2.7/dist-packages/fabric/task_utils.py", line 57, in merge
cleaned_hosts = [x.strip() for x in list(hosts) + list(role_hosts)]
AttributeError: 'builtin_function_or_method' object has no attribute 'strip'

The problem you're hitting is that you're setting env.hosts to a function object, not a list or other iterable. You need the parentheses after readlines to actually call it:
def set_hosts():
    env.hosts = open('hosts_file', 'r').readlines()
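If the hosts file has one host per line, a slightly more defensive version is possible. This is just a sketch, assuming the same hosts_file name: it strips newlines and skips blank lines so stray whitespace never ends up in env.hosts.
def set_hosts():
    # one host per line; drop surrounding whitespace and empty lines
    with open('hosts_file') as f:
        env.hosts = [line.strip() for line in f if line.strip()]

def uname():
    run("uname -a")
Run it the same way as before: fab set_hosts uname.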

Related

How to specify a scapy startup file?

I use scapy with the -c command line option to load a startup file:
# liquidsoap debug
streamerIP = "192.168.0.53"
dump = []

def filterStreamer(pkt):
    if pkt.src == streamerIP or pkt.dst == streamerIP:
        dump.append(pkt)

sniff(prn=filterStreamer)
ls(dump)
it gives:
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/scapy/main.py", line 30, in _read_config_file
execfile(cf)
File "icecast-debug.py", line 9, in <module>
sniff(prn=filterStreamer)
File "/usr/lib/python2.7/dist-packages/scapy/sendrecv.py", line 586, in sniff
r = prn(p)
File "icecast-debug.py", line 6, in filterStreamer
if (pkt.src == streamerIP or pkt.dst == streamerIP):
NameError: global name 'streamerIP' is not defined
Welcome to Scapy (2.2.0)
In the console I see neither streamerIP nor dump, and, strangest of all, the filterStreamer function is not defined either.
However, if I do not pass filterStreamer to sniff, it starts sniffing. So it seems to interpret the startup file line by line and clear the scope after each line is interpreted.
You have to use the global keyword. Also, use a PacketList() rather than a list. And ls() will not work against a list, but if you use a PacketList(), you have the .summary() method.
streamerIP = "192.168.0.53"
dump = PacketList()
def filterStreamer(pkt):
global streamerIP, dump
if pkt.src == streamerIP or pkt.dst == streamerIP:
dump.append(pkt)
dump.summary()
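As an alternative sketch (assuming a tcpdump-style BPF filter is usable on your capture interface), you can also let sniff() do the filtering and collect the result directly, since sniff() already returns a PacketList:
# capture only traffic to or from the streamer, no callback needed
dump = sniff(filter="host 192.168.0.53")
dump.summary()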

Python - Fabric - Getting files

I am trying to write a simple Python script with Fabric to transfer a file from one host to another using the get() function, but I keep getting this error message:
MacBook-Pro-3:PythonsScripts$ fab get:'/tmp/test','/tmp/test'
[hostname] Executing task 'get'
Traceback (most recent call last):
File "/Library/Python/2.7/site-packages/fabric/main.py", line 743, in main
*args, **kwargs
File "/Library/Python/2.7/site-packages/fabric/tasks.py", line 387, in execute
multiprocessing
File "/Library/Python/2.7/site-packages/fabric/tasks.py", line 277, in _execute
return task.run(*args, **kwargs)
File "/Library/Python/2.7/site-packages/fabric/tasks.py", line 174, in run
return self.wrapped(*args, **kwargs)
File "/Users/e0126914/Desktop/PYTHON/PythonsScripts/fabfile.py", line 128, in get
get('/tmp/test','/tmp/test') ***This line repeats many times then last error below***
RuntimeError: maximum recursion depth exceeded
My current code is:
from fabric.api import *
from getpass import getpass
from fabric.decorators import runs_once

env.hosts = ['hostname']
env.port = '22'
env.user = 'parallels'
env.password = "password"

def abc(remote_path, local_path):
    abc('/tmp/test', '/tmp/')
Any help would be appreciated!
fabric.api.get is already a function. When you do from fabric.api import *, you import Fabric's get, so you should rename your own get function to avoid the conflict.
From inside the abc function, you need to call Fabric's get:
def abc(p1, p2):
    get(p1, p2)
EDIT:
When executing tasks through Fabric, the arguments are passed on the command line, e.g.:
$ fab abc:string1,string2
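Putting the two points together, a minimal fabfile sketch might look like the following; the task name fetch and the host details are placeholders, not anything from the original question:
from fabric.api import env, get

env.hosts = ['hostname']      # placeholder host
env.user = 'parallels'
env.password = 'password'

def fetch(remote_path, local_path):
    # Fabric's own get() copies remote_path from each host to local_path
    get(remote_path, local_path)
Invoked as: fab fetch:/tmp/test,/tmp/test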

getting iis worker processes from wmi in python

I'm trying to display the process IDs and application pool names of IIS worker processes in Python.
Here is my python code:
import wmi
c = wmi.WMI('.', namespace="root/WebAdministration")
c.query("select ProcessId from WorkerProcess")
it fails:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Python27\lib\site-packages\wmi.py", line 1009, in query
return [ _wmi_object (obj, instance_of, fields) for obj in self._raw_query(wql) ]
File "C:\Python27\lib\site-packages\win32com\client\util.py", line 84, in next
return _get_good_object_(self._iter_.next(), resultCLSID = self.resultCLSID)
pywintypes.com_error: (-2147217389, 'OLE error 0x80041013', None, None)
I also tried:
for p in c.WorkerProcess:
    print p.ProcessId
which does not work either.
Now here is a very similar VBScript that works fine:
Set oWebAdmin = GetObject("winmgmts:root\WebAdministration")
Set processes = oWebAdmin.InstancesOf("WorkerProcess")
For Each w In processes
    WScript.Echo w.ProcessId
    WScript.Echo w.AppPoolName
Next
The documentation is here:
http://msdn.microsoft.com/en-us/library/microsoft.web.administration.workerprocess(v=vs.90).aspx
It looks like I'm supposed to instantiate something, but I cannot figure out how.
Any ideas how to get this working in Python?
Actually my code was correct. I just needed to run it with admin privileges.
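For completeness, a small sketch that prints both fields the question asks for, run from an elevated (administrator) Python prompt and assuming the root/WebAdministration namespace is present (i.e. IIS with its WMI provider installed):
import wmi

c = wmi.WMI('.', namespace="root/WebAdministration")
# each result row exposes the properties selected in the WQL query
for wp in c.query("select ProcessId, AppPoolName from WorkerProcess"):
    print wp.ProcessId, wp.AppPoolName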

Why can't I call a method on org.freedesktop.NetworkManager with dbus in Python?

I tried the code below in an interactive Python shell and got the following error at line 3. Using D-Feet I can see that the path and interface exist on the bus, and with the dbus-send command I am able to get the devices (see the end of this message). Why doesn't this code work in Python? PS: I'm using Ubuntu 12.04, and I tried Ubuntu 11 as well, with the same problem.
Code:
import dbus
bus = dbus.SessionBus()
obj = bus.get_object('org.freedesktop.NetworkManager', '/org/freedesktop/NetworkManager')
t = dbus.Interface(obj, "org.freedesktop.NetworkManager")
t.GetDevices()
Error output when entering line 3 of the code:
Traceback (most recent call last):
File "<input>", line 1, in <module>
File "/usr/lib/pymodules/python2.7/dbus/bus.py", line 244, in get_object
follow_name_owner_changes=follow_name_owner_changes)
File "/usr/lib/pymodules/python2.7/dbus/proxies.py", line 241, in __init__
self._named_service = conn.activate_name_owner(bus_name)
File "/usr/lib/pymodules/python2.7/dbus/bus.py", line 183, in activate_name_owner
self.start_service_by_name(bus_name)
File "/usr/lib/pymodules/python2.7/dbus/bus.py", line 281, in start_service_by_name
'su', (bus_name, flags)))
File "/usr/lib/pymodules/python2.7/dbus/connection.py", line 630, in call_blocking
message, timeout)
DBusException: org.freedesktop.DBus.Error.ServiceUnknown: The name org.freedesktop.NetworkManager was not provided
by any .service files
Shell command that works:
dbus-send --system --print-reply --reply-timeout=2000 --type=method_call --dest=org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager.GetDevices
Output:
method return sender=:1.2 -> dest=:1.69 reply_serial=2
array [
object path "/org/freedesktop/NetworkManager/Devices/0"
]
This is just an example; I want to know why it doesn't work. If I change line 3 to this (note the D-Bus name in the first parameter):
obj = bus.get_object('org.freedesktop.DBus', '/org/freedesktop/NetworkManager')
the error doesn't occur, but on this interface the GetDevices method doesn't exist.
In your command line example you're asking for the system bus:
dbus-send --system ...
In your Python code, you're asking for the session bus:
bus = dbus.SessionBus()
If you try the request over the system bus, I think you'll find that it works:
>>> import dbus
>>> bus = dbus.SystemBus()
>>> obj = bus.get_object('org.freedesktop.NetworkManager', '/org/freedesktop/NetworkManager')
>>> t = dbus.Interface(obj, "org.freedesktop.NetworkManager")
>>> t.GetDevices()
dbus.Array([dbus.ObjectPath('/org/freedesktop/NetworkManager/Devices/0'), dbus.ObjectPath('/org/freedesktop/NetworkManager/Devices/1')], signature=dbus.Signature('o'))
>>>

Why doesn't this code work in Parallel Python?

I tried to use pp (Parallel Python) like this:
import glob
import subprocess
import pp

def run(cmd):
    print cmd
    subprocess.call(cmd, shell=True)

job_server = pp.Server()
job_server.set_ncpus(8)
jobs = []
for a_file in glob.glob("./*"):
    cmd = "ls"
    jobs.append(job_server.submit(run, (cmd,)))
for j in jobs:
    j()
But I encountered an error saying that the name subprocess is not defined:
An error has occured during the function execution
Traceback (most recent call last):
File "/Library/Python/2.7/site-packages/pp-1.6.1-py2.7.egg/ppworker.py", line 90, in run
__result = __f(*__args)
File "<string>", line 3, in run
NameError: global name 'subprocess' is not defined
I've imported subprocess, why can't it be used here?
According to abarnert's suggestion, I changed my code to this:
import glob
import pp

def run(cmd):
    print cmd
    subprocess.call(cmd, shell=True)

job_server = pp.Server()
job_server.set_ncpus(8)
jobs = []
for a_file in glob.glob("./*"):
    cmd = "ls"
    jobs.append(job_server.submit(run, (cmd,), modules=("subprocess",)))
for j in jobs:
    j()
But it still doesn't work, it complains like this:
Traceback (most recent call last):
File "/usr/lib/python2.6/threading.py", line 532, in __bootstrap_inner
self.run()
File "/usr/lib/python2.6/threading.py", line 484, in run
self.__target(*self.__args, **self.__kwargs)
File "/usr/local/lib/python2.6/dist-packages/pp-1.6.1-py2.6.egg/pp.py", line 721, in _run_local
job.finalize(sresult)
UnboundLocalError: local variable 'sresult' referenced before assignment
The documentation explains this pretty well, and each example shows you how to deal with it.
Among the parameters of the submit method is "modules - tuple with module names to import". Any module you want to be available in the submitted job has to be listed there.
So, you can do this:
jobs.append(job_server.submit(run, (cmd,), (), ('subprocess',)))
Or this:
jobs.append(job_server.submit(run, (cmd,), modules=('subprocess',)))
Sorry, untested, but did you try:
from subprocess import call
inside the run function, and then using call instead of subprocess.call? That would make call local to the function but still accessible.
