I'm currently using python-inotify to monitor local directories for changes, and run scripts when they happen.
Now though, I need functionality to monitor a remote directory for changes. The remote directory will be either a git or svn repo, on a server I have root ssh access to. I know about git hooks, but they only get run on commit/push/rebase etc, not on generic changes.
Is there an existing python library I might be able to use for this? Or can I just open an ssh connection in Python and then carry on using python-inotify?
You need file-system-level access for inotify to work, so the easiest approach, since you have ssh, is to run the monitor script on the remote system itself.
You can then use something like Twisted to communicate the changes from one system to another over the network.
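As a rough sketch of that setup (assuming the pyinotify binding on the remote side, which you can swap for whichever python-inotify flavour you already use; the repository path and hostname are placeholders), the remote script prints each changed path, and the local side simply reads that stream over the existing ssh connection rather than a full Twisted setup:
# watch.py -- runs on the remote server
import sys
import pyinotify

class Handler(pyinotify.ProcessEvent):
    def process_default(self, event):
        # Print each changed path so a process reading our stdout sees it.
        print(event.pathname)
        sys.stdout.flush()

wm = pyinotify.WatchManager()
mask = pyinotify.IN_CREATE | pyinotify.IN_MODIFY | pyinotify.IN_DELETE
wm.add_watch('/path/to/repo', mask, rec=True)
pyinotify.Notifier(wm, Handler()).loop()

# local side -- stream the remote watcher's output over ssh
import subprocess
proc = subprocess.Popen(['ssh', 'root@server', 'python watch.py'],
                        stdout=subprocess.PIPE)
for line in proc.stdout:
    changed_path = line.decode().strip()
    # run your existing local handler for the changed path here
    print(changed_path)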
I have a Python script inside a container that needs to continuously read changing values inside a file located on the host file system. Using a volume to mount the file won't work because that only captures a snapshot of the values in the file at that moment. I know it's possible since the node_exporter container is able to read files on the host filesystem using custom methods in Golang. Does anyone know a general method to accomplish this?
I have a Python script [...] that needs to continuously read changing values inside a file located on the host file system.
Just run it. Most Linux systems have Python preinstalled. You don't need Docker here. You can use tools like Python virtual environments if your application has Python library dependencies that need to be installed.
Is it possible to get a Docker container to read a file from host file system not using Volumes?
You need some kind of mount, most likely a bind mount: docker run -v /host/path:/container/path image-name. Make sure not to mount over the application's code in the image when you do this, since the mount will completely hide anything at that path in the underlying image.
Without a bind mount, you can't access the host filesystem at all. This filesystem isolation is a key feature of Docker.
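As a minimal sketch of the reading side, assuming the file was bind-mounted with something like docker run -v /host/path:/container/path image-name (the paths and file name here are made up), the script inside the container just re-reads the mounted path in a loop and sees the host's updates:
import time

PATH = '/container/path/values.txt'  # hypothetical bind-mount target inside the container

while True:
    # Re-open and re-read on every pass; the bind mount reflects live changes on the host.
    with open(PATH) as f:
        value = f.read().strip()
    # act on the current value here
    print(value)
    time.sleep(1)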
...the [Prometheus] node_exporter container...
Reading from the GitHub link in the question, "It's not recommended to deploy it as a Docker container because it requires access to the host system." The docker run example there uses a bind mount to access the entire host filesystem, circumventing Docker's filesystem isolation.
I have a Python script located on a remote server with SSH enabled. That script displays a lot of debug messages while executing. I want to trigger this script using another Python script on my local system and, depending on the output of the remote script, proceed further. While doing all this, I want the messages displayed on the remote server to show up on my local system as well. Basically, I want to view whatever output the remote script produces during its run, on my local system. I am able to trigger the script using paramiko, but I am neither able to check whether the script on the remote server is running nor able to view its output. Is there any way to do it? I already tried conn.recv(65535), but to no avail.
In my experience, the Python fabric module is easier to use than paramiko. If you want to execute a local script on a remote machine using fabric, you just need to upload it with put() and then call the run() API.
http://docs.fabfile.org/en/1.14/api/core/operations.html#fabric.operations.put
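A rough sketch using the fabric 1.x API linked above (hostname, username, and paths are placeholders); run() streams the remote script's output to your local terminal as it is produced, which also covers seeing the debug messages:
from fabric.api import env, put, run, execute

env.hosts = ['user@remote.example.com']  # placeholder host

def upload_and_run():
    # Upload the local script, then execute it remotely; run() echoes the
    # remote stdout/stderr locally and returns the captured output.
    put('local_script.py', '/tmp/local_script.py')
    result = run('python /tmp/local_script.py')
    if result.failed:
        print('remote script failed')

execute(upload_and_run)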
I'm trying to understand all the methods available to execute remote commands on Windows through the impacket scripts:
https://www.coresecurity.com/corelabs-research/open-source-tools/impacket
https://github.com/CoreSecurity/impacket
I understand the high-level explanation of psexec.py and smbexec.py, how they create a service on the remote end and run commands through cmd.exe /c, but I can't understand how you can create a service on a remote Windows host through SMB. Wasn't SMB supposed to be mainly for file transfers and printer sharing? Reading the source code, I see in the notes that they use DCERPC to create these services; is this part of the SMB protocol? All the resources on DCERPC I've found were kind of confusing and not focused on its service-creating capabilities. Looking at the source code of atexec.py, it says that it interacts with the Task Scheduler service of the Windows host, also through DCERPC. Can it be used to interact with all services running on the remote box?
Thanks!
DCERPC (https://en.wikipedia.org/wiki/DCE/RPC) : the initial protocol, which was used as a template for MSRPC (https://en.wikipedia.org/wiki/Microsoft_RPC).
MSRPC is a way to execute functions on the remote end and to transfer data (parameters to these functions). It is not a way to directly execute remote OS commands on the remote side.
SMB (https://en.wikipedia.org/wiki/Server_Message_Block) is the file-sharing protocol mainly used to access files on Windows file servers. In addition, it provides Named Pipes (https://msdn.microsoft.com/en-us/library/cc239733.aspx), a way to transfer data between a local process and a remote process.
One common way to use MSRPC is via Named Pipes over SMB, which has the advantage that the security layer provided by SMB is directly available to MSRPC.
In fact, MSRPC is one of the most important, yet least known, protocols in the Windows world.
Neither MSRPC nor SMB, on its own, has anything to do with remote execution of shell commands.
One common way to execute remote commands is:
Copy files (via SMB) to the remote side (Windows service EXE)
Create registry entries on the remote side (so that the copied Windows Service is installed and startable)
Start the Windows service.
The started Windows service can use any network protocol (e.g. MSRPC) to receive commands and to execute them.
After the work is done, the Windows service can be uninstalled (remove registry entries and delete the files).
In fact, this is what PSEXEC does.
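To give a sense of what that looks like in impacket itself, here is a rough sketch of the MSRPC service-control calls (the host, credentials, command, and service name are placeholders, and helper names may differ slightly between impacket versions); the transport is the \pipe\svcctl named pipe over SMB, which is exactly the MSRPC-over-SMB layering described above:
from impacket.dcerpc.v5 import transport, scmr

# Placeholder target and credentials.
binding = r'ncacn_np:192.0.2.10[\pipe\svcctl]'
rpc = transport.DCERPCTransportFactory(binding)
rpc.set_credentials('admin', 'password')

dce = rpc.get_dce_rpc()
dce.connect()                    # SMB session + named pipe
dce.bind(scmr.MSRPC_UUID_SCMR)   # bind to the Service Control Manager interface

# Open the remote SCM, create a service whose binary is an arbitrary command,
# start it, then clean up -- essentially the PSEXEC workflow in miniature.
resp = scmr.hROpenSCManagerW(dce)
sc_handle = resp['lpScHandle']
resp = scmr.hRCreateServiceW(dce, sc_handle, 'DemoSvc\x00', 'DemoSvc\x00',
                             lpBinaryPathName='cmd.exe /c whoami > C:\\out.txt\x00')
svc_handle = resp['lpServiceHandle']
scmr.hRStartServiceW(dce, svc_handle)
scmr.hRDeleteService(dce, svc_handle)
scmr.hRCloseServiceHandle(dce, svc_handle)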
All the resources on DCERPC I've found were kind of confusing, and not focused on its service-creating capabilities.
Yes, it's just a remote procedure call protocol. But it can be used to start a procedure on the remote side, which can do just about anything, e.g. creating a service.
Looking at the source code of atexec.py, it says that it interacts with the Task Scheduler service of the Windows host, also through DCERPC. Can it be used to interact with all services running on the remote box?
There are some MSRPC commands which handle Task Scheduler, and others which handle generic service start and stop commands.
A few final words:
SMB / CIFS and the surrounding protocols are really complex and hard to understand. It is fine to try to understand how to deal with, e.g., remote service control, but this can be a very long journey.
Perhaps this page (which uses Java to try to control a Windows service) may also help with understanding:
https://dev.c-ware.de/confluence/pages/viewpage.action?pageId=15007754
I normally use a bash script to grab all the files onto my local machine and use glob to process them. I'm just wondering what would be the best way to use Python (instead of another bash script) to ssh into each server and process those files?
My current program runs as
import glob

for filename in glob.glob('*-err.txt'):
    with open(filename, 'rb') as input_open:
        for line in input_open:
            # do something with each line
            pass
My files all end with -err.txt, and the directories where they reside on the remote servers all have the same name, /documents/err/. I am not able to install third-party libraries as I don't have permission.
UPDATE
I am trying not to scp the files from the server but to read them on the remote server instead.
I want to use a local python script LOCALLY to read in files on remote server.
The simplest way to do it is to use paramiko's SCP support to copy the files from the remote server over SSH (see: How to scp in python?).
If you are not allowed to download any libraries, you can create an SSH key pair so that connecting to the server does not require a password (https://www.debian.org/devel/passwordlessssh). You can then do, for each file:
import os

# Copy the file from the remote machine to the local machine over SSH.
os.system('scp user@host:/path/to/file/on/remote/machine /path/to/local/file')
Note that using os.system is usually considered less portable than using libraries. If you give a script that uses system('scp ...') to copy the files to someone who does not have an SSH key pair set up, they will run into problems.
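If the goal is really to read the files in place rather than copy them, one standard-library-only sketch (the host is a placeholder, and it assumes the passwordless key pair above) is to let ssh cat the remote files and process the stream locally:
import subprocess

HOST = 'user@remote.example.com'            # placeholder host
REMOTE_GLOB = '/documents/err/*-err.txt'    # remote files to read in place

# Stream the remote file contents over ssh instead of copying them locally.
proc = subprocess.Popen(['ssh', HOST, 'cat {}'.format(REMOTE_GLOB)],
                        stdout=subprocess.PIPE)
for line in proc.stdout:
    # do something with each line, as in the local glob loop above
    pass
proc.wait()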
Looks like you want to use a local Python script remotely. This has been answered here.
I'm connected to a VM on a private network at address 'abc.def.com' using ssh, and on that VM there's an application that hosts a Python web app (IPython Notebook) that I can access by pointing my local browser to 'abc.def.com:7777'.
From that web app I can call shell commands by preceding them with '!', for example !ls -lt will list the files in the VM's current working directory. But since I'm using my own laptop's browser, I think I should be able to run shell commands on my local files as well. How would I do that?
If that's not possible, what Python/shell command can I run from within the web app to automatically get my laptop's IP address to use things like scp? I know how to get my IP address, but I'd like to create a program that will automatically enable scp for whoever uses it.
You have SSH access, so you could write a Python function that lets you transfer files via scp, the secure copy command, which uses SSH to communicate. If you exchange keys with the server, you won't have to enter a password, so I see no problem from that standpoint. The issue is whether you have an address at which your local machine can be reached from the server.
I work on various remotes from my laptop all day, and from my laptop to the server I could have this function:
import subprocess

def scp_to_server(address, local_file, remote_file):
    subprocess.call(['scp', local_file, "myusername@{}:{}".format(address, remote_file)])
that would copy a file from my local machine to the remote provided the paths were correct, I have permissions to copy the files, and my local machine's id_rsa.pub key is in the ~/.ssh/authorized_keys file on the remote.
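For example, called from the laptop with a made-up hostname and paths:
scp_to_server('where.i.work.edu', '/home/me/results.csv', '/scratch/results.csv')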
I have no way to initiate a secure copy from the remote to my local machine however because I don't have an address to access the local machine from that I can "see" on the remote.
If I open the terminal on my laptop and run hostname, I see mylaptop.local, and on the remote I see remoteserver@where.i.work.edu. The first is a local address: I can see it from other machines on my LAN at home (because I have configured that), but I can't see mylaptop.local from the remote. I know there is a way to configure that so I could find my laptop at home from anywhere, but I never had the need to do it (since I bring the laptop with me), so I can't help you there. I think there are a few more hurdles to get over than you would like.
You could implement the function above on your local machine and transfer the files that way though.