HTTP/REST based file synchronization - Python

I would like to synchronize folders and files between a server and a client. Because the client side is restricted by firewalls and a proxy server, I'm forced to use an HTTP-based solution.
Is there any HTTP/REST based library (both server and client side) optimized for file synchronization?
(Python- or C-based solutions would be nice.)
PS: the server side has to run on Linux.

You could try WebDAV; since it is plain HTTP, it usually passes through firewalls and proxies.
Python has some libraries to handle it.
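Because WebDAV operations are just HTTP methods, you can even drive a WebDAV server with a generic HTTP client. A minimal sketch using the requests library; the endpoint URL and credentials are placeholders:

    import requests

    BASE = "https://example.com/dav"   # hypothetical WebDAV endpoint
    AUTH = ("user", "password")        # placeholder credentials

    # Upload a file (WebDAV uses a plain HTTP PUT)
    with open("report.txt", "rb") as f:
        r = requests.put(BASE + "/report.txt", data=f, auth=AUTH)
        r.raise_for_status()

    # Download it again with GET
    r = requests.get(BASE + "/report.txt", auth=AUTH)
    r.raise_for_status()
    content = r.content

    # List a collection with PROPFIND (Depth: 1 = direct children only)
    r = requests.request("PROPFIND", BASE + "/", auth=AUTH,
                         headers={"Depth": "1"})
    r.raise_for_status()
    print(r.text)  # multistatus XML describing the folder contents

A dedicated WebDAV client library (or a sync tool built on one) would add the actual change detection on top of these primitives.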

Seafile is a very interesting server and client synchronization tool.
http://www.seafile.com/
It's open source and written in Python and C.
There are clients for many platforms: mobile (Android and iOS), Linux and Windows.
The server can run on both Linux (including the Raspberry Pi) and Windows.
The software is built around the concept of libraries, which can be shared between users and encrypted both client-side and server-side.
It also uses a deduplication algorithm to optimize bandwidth and performance.

Understanding SMB and DCERPC for remote command execution capabilities

I'm trying to understand all the methods available to execute remote commands on Windows through the impacket scripts:
https://www.coresecurity.com/corelabs-research/open-source-tools/impacket
https://github.com/CoreSecurity/impacket
I understand the high-level explanation of psexec.py and smbexec.py (they create a service on the remote end and run commands through cmd.exe /c), but I can't understand how you can create a service on a remote Windows host through SMB. Wasn't SMB supposed to be mainly for file transfers and printer sharing? Reading the source code, I see in the notes that they use DCERPC to create these services; is this part of the SMB protocol? All the resources on DCERPC I've found were kind of confusing and not focused on its service-creating capabilities. Looking at the source code of atexec.py, it says that it interacts with the Task Scheduler service of the Windows host, also through DCERPC. Can DCERPC be used to interact with all services running on the remote box?
Thanks!
DCERPC (https://en.wikipedia.org/wiki/DCE/RPC): the initial protocol, which was used as a template for MSRPC (https://en.wikipedia.org/wiki/Microsoft_RPC).
MSRPC is a way to execute functions on the remote end and to transfer data (the parameters to those functions). It is not a way to directly execute OS commands on the remote side.
SMB (https://en.wikipedia.org/wiki/Server_Message_Block) is the file-sharing protocol mainly used to access files on Windows file servers. In addition, it provides Named Pipes (https://msdn.microsoft.com/en-us/library/cc239733.aspx), a way to transfer data between a local process and a remote process.
One common way to use MSRPC is via Named Pipes over SMB, which has the advantage that the security layer provided by SMB is reused directly by MSRPC.
In fact, MSRPC is one of the most important, yet least known, protocols in the Windows world.
Neither MSRPC nor SMB by itself has anything to do with remote execution of shell commands.
One common way to execute remote commands is:
Copy files (via SMB) to the remote side (a Windows service EXE).
Create registry entries on the remote side (so that the copied Windows service is installed and can be started).
Start the Windows service.
The started Windows service can use any network protocol (e.g. MSRPC) to receive commands and execute them.
After the work is done, the Windows service can be uninstalled (remove the registry entries and delete the files).
In fact, this is what PSEXEC does.
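As an illustration, here is a rough sketch of the service-creation part of that flow using impacket's scmr module; the host, credentials, service name and EXE path are placeholders, and error handling is omitted:

    from impacket.dcerpc.v5 import transport, scmr

    host = "192.168.1.10"  # placeholder target

    # Bind to the Service Control Manager over the \pipe\svcctl
    # named pipe, i.e. MSRPC carried inside an SMB session
    rpctransport = transport.DCERPCTransportFactory(
        r"ncacn_np:%s[\pipe\svcctl]" % host)
    rpctransport.set_credentials("user", "password", "DOMAIN")
    dce = rpctransport.get_dce_rpc()
    dce.connect()
    dce.bind(scmr.MSRPC_UUID_SCMR)

    # Open the SCM, create a service pointing at a previously copied
    # EXE, start it, then clean up again
    sc_handle = scmr.hROpenSCManagerW(dce)['lpScHandle']
    resp = scmr.hRCreateServiceW(
        dce, sc_handle, 'demosvc\x00', 'demosvc\x00',
        lpBinaryPathName='C:\\Windows\\demosvc.exe\x00')
    svc_handle = resp['lpServiceHandle']
    scmr.hRStartServiceW(dce, svc_handle)
    scmr.hRDeleteService(dce, svc_handle)
    scmr.hRCloseServiceHandle(dce, svc_handle)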
All the resources on DCERPC I've found were kind of confusing, and not focused on its service-creating capabilities.
Yes, it's just a remote procedure call protocol. But it can be used to start a procedure on the remote side which can do just about anything, e.g. create a service.
Looking at the source code of atexec.py, it says that it interacts with the Task Scheduler service of the Windows host, also through DCERPC. Can it be used to interact with all services running on the remote box?
There are some MSRPC calls which handle the Task Scheduler, and others which handle generic service start and stop commands.
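For example, stopping an existing service goes through the same SCM interface as above; continuing the earlier sketch (the service name here is hypothetical):

    # Reusing the bound `dce` and `sc_handle` from the sketch above
    svc = scmr.hROpenServiceW(dce, sc_handle, 'Spooler\x00')['lpServiceHandle']
    scmr.hRControlService(dce, svc, scmr.SERVICE_CONTROL_STOP)
    scmr.hRCloseServiceHandle(dce, svc)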
A few final words: SMB/CIFS and the protocols around it are really complex and hard to understand. It seems reasonable to try to understand how to deal with e.g. remote service control, but this can be a very long journey.
Perhaps this page (which uses Java to try to control a Windows service) may also help:
https://dev.c-ware.de/confluence/pages/viewpage.action?pageId=15007754

Defining a Proxy used by the Azure IoT Hub Client (Python)

I am using the Azure IoT Hub Client SDK for Python. I am using a slightly modified version of the sample script from the github repo to upload files to the IoT Hub. Everything works fine as long as I do not have to use a proxy for outgoing connections.
I tried to understand how to configure a proxy for this, but I did not find anything for the Python SDK. I also searched the other SDKs and found some ProxySettings in iothub_client_options.h of the C SDK, but I do not know how to set these settings from the Python client (assuming they actually work).
I also found an issue saying that connections over WebSockets need a special format for the Linux environment variables, but I do not use WebSockets.
I have tried to run my script in both Windows and Linux environments where the proxy system settings are configured correctly (Windows: Internet settings, Linux: environment variables).
Is there any documentation on this topic? Does anybody know how to configure a proxy on either Windows or Linux?
In my experience, a Python script using the Azure IoT Hub Client SDK can communicate with Azure IoT Hub without any explicit proxy settings, provided the OS proxy is configured correctly.
However, there are some caveats depending on which protocol the proxy server supports (HTTP, SOCKS, etc.):
If the proxy server only supports the HTTP protocol, the script will work in HTTP mode, but not in AMQP/MQTT mode.
If the proxy server supports a SOCKS protocol (SOCKS4/SOCKS5), the script will work in any mode, because SOCKS simply relays the traffic without inspecting the application protocol.
So please check which protocols your proxy server supports, and then either use HTTP mode or configure a SOCKS proxy to make the script work.
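For what it's worth, the newer azure-iot-device Python SDK (not the iothub_client SDK the question is about) exposes explicit proxy settings; if switching SDKs is an option, a minimal sketch of its documented ProxyOptions mechanism looks like this (address, port and connection string are placeholders):

    import socks  # PySocks supplies the proxy type constants
    from azure.iot.device import IoTHubDeviceClient, ProxyOptions

    proxy = ProxyOptions(proxy_type=socks.SOCKS5,
                         proxy_addr="127.0.0.1",
                         proxy_port=1080)

    client = IoTHubDeviceClient.create_from_connection_string(
        "<device connection string>",
        websockets=True,   # MQTT over WebSockets, which proxies can tunnel
        proxy_options=proxy,
    )
    client.connect()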

Python communicate into VM windows app

How can I set up a virtualized Ubuntu on real Windows so I can have two apps communicating simple messages between them? The VM can be offline, with no internet access. The real system will probably be offline too.
Host<->VM communication on a Windows host can be implemented in several ways, independently of the hypervisor you are using:
Host-only network: just assign static IPs to the host and the VM, and use the sockets API to transfer your data over the virtual network (as sketched below). Very good for large amounts of data, but it requires a bit of configuration time.
Virtual COM ports: if you don't want to use the sockets API and would rather write data to files (on the Linux VM) / named pipes (on the Windows host). This can be simpler because it requires almost zero configuration, but it will not work very well with large amounts of data.
Choose whichever fits your needs.
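For the host-only network option, plain TCP sockets are enough; a minimal sketch, assuming 192.168.56.1 is the static IP you assigned to the host-only adapter:

    import socket

    HOST_IP = "192.168.56.1"   # hypothetical static IP of the host-only adapter
    PORT = 5000

    # On the Windows host: listen for messages from the VM
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((HOST_IP, PORT))
    server.listen(1)
    conn, addr = server.accept()
    print("message from VM:", conn.recv(1024).decode())
    conn.close()

    # On the Ubuntu VM, the other side is just:
    #   client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    #   client.connect((HOST_IP, PORT))
    #   client.sendall(b"hello from the guest")
    #   client.close()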

Python distributed application architecture on constrained corporate environment

This is my scenario: I developed a Python desktop application which I use to probe the status of services/DBs on the very same machine it is running on.
My need is to monitor, using my application, two "brother" Windows Server 2003 hosts (the Python version is 2.5 on both). One of the hosts lies in my own LAN; the other lies in another LAN which is reachable via VPN.
The application is composed of:
A graphical user interface (gui.py), which provides widgets to collect user inputs and launches the...
...business-logic script (console.py), which in turn invokes slave Python scripts that check the system's services and DB usage/account status/etc. The textual output of those checks is then returned to the GUI.
I used to execute the application directly on each of the two machines, but it would be great to turn it into a client/server application, so that:
users would just run gui.py locally
gui.py would communicate parameters to server remakes of console.py running on both of the Windows hosts
the servers would then execute the system checks and report the results back to the client GUIs, which would display them.
I thought about two possible solutions:
Create a Windows service on each of the Windows hosts, basically executing console.py's code and waiting for incoming requests from the clients
Open SSH connections from any LAN host to the chosen Windows host and directly run console.py on it.
I am working in a corporate environment which has some network and host constraints: many network protocols (like SSH) are filtered by our corporate firewall. Furthermore, I don't have administration privileges on the Windows hosts, so I can't install system services on them... this is frustrating!
I just wanted to ask if there is any other way to make gui.py and console.py communicate over the network which I have not taken into account. Does anyone have any suggestions? Please note that, if possible, I would rather not ask the ICT department for administration privileges on the Windows hosts!
Thanks in advance!
Answering myself: I found one possible solution.
I'm lucky, because the console.py script actually invokes many slave Python scripts, each of which performs one single system check via standard third-party command-line tools that can also be run against remote hosts.
So what I did was modify gui.py and console.py so that users can specify, as a parameter, the Windows host on which the checks must be carried out (see the sketch below).
In this way, I obtain a distributed application... but I've been lucky: what if one or more of the third-party command-line tools did not support checking features on remote hosts?
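As an illustration of the approach, each slave script just points its command-line tool at a remote host and captures the output. A hypothetical sketch using the built-in sc.exe to query a service (host and service names are placeholders; written with subprocess.Popen so it also runs on the Python 2.5 mentioned above):

    import subprocess

    def check_service(host, service):
        # Query a Windows service on a remote host via the built-in
        # sc.exe tool; requires sufficient rights on the remote SCM
        proc = subprocess.Popen(
            ["sc", r"\\%s" % host, "query", service],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        out, err = proc.communicate()
        return out

    # Hypothetical usage: the host name is passed down from gui.py
    print(check_service("REMOTEHOST01", "Spooler"))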

Determine user connecting a local socket with Python

In Python, if you are developing a system service that communicates with user applications through sockets, and you want to treat sockets connected by different users differently, how would you go about that?
If I know that all connecting sockets will be from localhost, is there a way to look up through the OS (either on Windows or Linux) which user is making the connection request?
On Linux and other unixy systems, you can use the ident service.
I'm not sure whether Windows offers something similar.
Unfortunately, at this point in time the Python libraries don't support the usual SCM_CREDENTIALS method of passing credentials along a Unix socket.
You'll need to use an "ugly" method, as described in another answer, to find it.
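That said, on Linux the same information is exposed through the SO_PEERCRED socket option (a plain getsockopt call, distinct from SCM_CREDENTIALS ancillary messages), which Python can use directly where the constant is available. A minimal sketch with a hypothetical socket path:

    import os
    import socket
    import struct

    SOCK_PATH = "/tmp/demo.sock"   # hypothetical socket path

    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)

    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(SOCK_PATH)
    server.listen(1)
    conn, _ = server.accept()

    # SO_PEERCRED yields the connecting process's pid, uid and gid
    # as three native ints (Linux only)
    creds = conn.getsockopt(socket.SOL_SOCKET, socket.SO_PEERCRED,
                            struct.calcsize("3i"))
    pid, uid, gid = struct.unpack("3i", creds)
    print("peer pid=%d uid=%d gid=%d" % (pid, uid, gid))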
On Linux you can get the source (i.e. client-side) port of the socket and parse the output of the lsof(8) utility, searching for who is using that port.
Here's the manual page.
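A rough sketch of that technique: read the client's port from the accepted socket, then scan lsof's output for the owning user (lsof's column layout is assumed here and may vary between versions):

    import subprocess

    def user_of_local_client(conn):
        # conn is an accepted TCP socket from 127.0.0.1; its peer port
        # identifies the client process in lsof's listing
        _, client_port = conn.getpeername()
        out = subprocess.check_output(
            ["lsof", "-n", "-i", "tcp:%d" % client_port],
            universal_newlines=True)
        for line in out.splitlines()[1:]:      # skip the header row
            fields = line.split()
            # The client's line shows the local port before the "->"
            if (":%d->" % client_port) in line:
                return fields[2]               # the USER column
        return None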
