I have written a Python TCP/IP server for internal use, using win32serviceutil/py2exe to create a Windows service.
I installed it on a computer running Windows XP Pro SP3. However, I can't connect to it when it's running as a service. I can confirm that it's binding to the address/port, because I get a conflict when I try to bind to that address/port with another application. Further, I have checked the Windows Firewall settings and have added appropriate exceptions. If I run the server as a simple console application, everything works as expected. However, when I run it as a service, it doesn't work.
I vaguely remember running into this problem before, but for the life of me can't remember any of the details.
Suggestions, anyone?
Possibly the program is being terminated right after initialization. Check whether it is still listening for requests:
netstat -an | find /i "listening"
Also check the command line passed to the program; you can use Process Explorer (procexp) for that.
First of all, whenever you implement a Windows service, be sure to add proper logging.
My worker threads were terminating because of the exception, "The socket operation could not complete without blocking."
The solution was to simply call sock.setblocking(1) after accepting the connection.
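For reference, here is a minimal sketch of that fix; the port, log file path and echo handling are illustrative, not the original service's code:

import logging
import socket

logging.basicConfig(filename=r'C:\logs\myservice.log', level=logging.INFO)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('0.0.0.0', 9000))
server.listen(5)

while True:
    conn, addr = server.accept()
    # Under the service framework the accepted socket came back non-blocking,
    # so recv()/send() raised "The socket operation could not complete without
    # blocking". Forcing blocking mode avoids that.
    conn.setblocking(1)
    logging.info('accepted connection from %s', addr)
    try:
        data = conn.recv(4096)
        conn.sendall(data)  # echo back, just to have something to do
    except socket.error:
        logging.exception('worker failed')
    finally:
        conn.close()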
Check that the service is running under the Network Service account and not the Local System account. The latter doesn't have network access and is the default user to run services under. You can check this by opening the Services applet under Administrative Tools in the Start menu and looking for your service. If you right-click the service, you can go to Properties and change the user it runs under.
I'm trying to understand all the methods available to execute remote commands on Windows through the impacket scripts:
https://www.coresecurity.com/corelabs-research/open-source-tools/impacket
https://github.com/CoreSecurity/impacket
I understand the high-level explanation of psexec.py and smbexec.py, how they create a service on the remote end and run commands through cmd.exe /c, but I can't understand how you can create a service on a remote Windows host through SMB. Wasn't SMB supposed to be mainly for file transfers and printer sharing? Reading the source code, I see in the notes that they use DCERPC to create these services; is this part of the SMB protocol? All the resources on DCERPC I've found were kind of confusing, and not focused on its service-creating capabilities. Looking at the source code of atexec.py, it says that it interacts with the Task Scheduler service of the Windows host, also through DCERPC. Can it be used to interact with all services running on the remote box?
Thanks!
DCERPC (https://en.wikipedia.org/wiki/DCE/RPC): the original protocol, which was used as the template for MSRPC (https://en.wikipedia.org/wiki/Microsoft_RPC).
MSRPC is a way to execute functions on the remote end and to transfer data (parameters to these functions). It is not a way to directly execute remote OS commands on the remote side.
SMB (https://en.wikipedia.org/wiki/Server_Message_Block) is the file sharing protocol mainly used to access files on Windows file servers. In addition, it provides Named Pipes (https://msdn.microsoft.com/en-us/library/cc239733.aspx), a way to transfer data between a local process and a remote process.
One common way to use MSRPC is via Named Pipes over SMB, which has the advantage that the security layer provided by SMB is reused directly for MSRPC.
In fact, MSRPC is one of the most important, yet least known, protocols in the Windows world.
Neither MSRPC nor SMB, by itself, has anything to do with remote execution of shell commands.
One common way to execute remote commands is:
Copy files (via SMB) to the remote side (Windows service EXE)
Create registry entries on the remote side (so that the copied Windows Service is installed and startable)
Start the Windows service.
The started Windows service can use any network protocol (e.g. MSRPC) to receive commands and to execute them.
After the work is done, the Windows service can be uninstalled (remove registry entries and delete the files).
In fact, this is what PSEXEC does.
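To make that concrete, here is a rough sketch of how a psexec-style tool drives the remote Service Control Manager: MS-SCMR calls carried over the \pipe\svcctl named pipe, which itself rides on SMB. It follows impacket's dcerpc.v5 modules; the host, credentials, service name and binary path are placeholders, and helper names may vary between impacket versions:

from impacket.dcerpc.v5 import transport, scmr

target = '192.168.1.10'  # placeholder host
stringbinding = r'ncacn_np:%s[\pipe\svcctl]' % target
rpctransport = transport.DCERPCTransportFactory(stringbinding)
rpctransport.set_credentials('administrator', 'Passw0rd!')  # placeholder credentials

dce = rpctransport.get_dce_rpc()
dce.connect()
dce.bind(scmr.MSRPC_UUID_SCMR)

# Open the Service Control Manager, create a service pointing at a binary
# previously copied over SMB, start it, then clean up afterwards.
sc_handle = scmr.hROpenSCManagerW(dce)['lpScHandle']
resp = scmr.hRCreateServiceW(dce, sc_handle, 'DemoSvc\x00', 'DemoSvc\x00',
                             lpBinaryPathName='C:\\Windows\\Temp\\payload.exe\x00')
svc_handle = resp['lpServiceHandle']
scmr.hRStartServiceW(dce, svc_handle)
scmr.hRDeleteService(dce, svc_handle)
scmr.hRCloseServiceHandle(dce, svc_handle)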
All the resources on DCERPC I've found were kind of confusing, and not focused on its service-creating capabilities.
Yes, it's just a remote procedure call protocol. But it can be used to start a procedure on the remote side which can do just about anything, e.g. create a service.
Looking at the source code of atexec.py, it says that it interacts with the Task Scheduler service of the Windows host, also through DCERPC. Can it be used to interact with all services running on the remote box?
There are some MSRPC commands which handle Task Scheduler, and others which handle generic service start and stop commands.
A few final words:
SMB/CIFS and the protocols around it are really complex and hard to understand. Trying to understand how to deal with, e.g., remote service control is fine, but it can be a very long journey.
Perhaps this page (which uses Java to try to control a Windows service) may also help:
https://dev.c-ware.de/confluence/pages/viewpage.action?pageId=15007754
I tried to do this with WMI, but interactive processes cannot be started with it (as stated in the Microsoft documentation). I see the processes in Task Manager, but their windows do not show.
I tried with Paramiko: same thing. The process is visible in Task Manager, but no window appears (Notepad, for example).
I tried with PsExec, but the only case where a window appears on the remote machine is when you specify -i, and even then it does not show normally, only through a message box saying something like "a message arrived, do you want to see it".
Do you know a way to start a program remotely, and have its interface behave like it would if you manually started it?
Thanks.
Normally, SSH servers on Windows run as a Windows service.
Windows services run in a separate Windows session (google "Session 0 isolation"). They cannot access the interactive (user) Windows sessions.
Also note that there can be multiple user sessions (multiple logged-in users) in Windows. How would the SSH server know what user session to display the GUI on (even if it could)?
The message you are getting is thanks to the "Interactive Services Detection" service that detects that a service is trying to show a GUI on an invisible Session 0 and allows you to replicate the GUI on the user session.
You can run the SSH server in an interactive Windows session instead of as a service. It has its limitations, though.
In general, all this (running GUI application on Windows remotely through SSH) does not look like a good idea to me.
Also, this question is more about the specific SSH server than about the SSH client you are using. If you include details about your SSH server, you may get better answers.
OK, I found a way: using subprocess with schtasks (the Windows task scheduler). For whatever reason, when I launch a remote process with it, it starts as if I had double-clicked the EXE myself. For it to start with no delay, create the task with an old date like 2012 using schtasks /Create /F, then run the named task with schtasks /Run; that does the trick.
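A sketch of what that looks like from Python; the host, credentials, task name and the /SD date format (which is locale-dependent) are placeholders:

import subprocess

host, user, password = 'REMOTE-PC', 'Administrator', 'Passw0rd!'  # placeholders
task = 'LaunchNotepad'

# Create a one-shot task dated in the past, overwriting any existing one (/F).
subprocess.check_call([
    'schtasks', '/Create', '/F',
    '/S', host, '/U', user, '/P', password,
    '/TN', task,
    '/TR', r'C:\Windows\System32\notepad.exe',
    '/SC', 'ONCE', '/ST', '00:00', '/SD', '01/01/2012',
])

# Running the task starts the program in the logged-on user's interactive
# session, as if it had been double-clicked.
subprocess.check_call([
    'schtasks', '/Run',
    '/S', host, '/U', user, '/P', password,
    '/TN', task,
])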
I have a remote server through Blue Host that's intended to run a server based on Twisted for Python. The only access I have to it is over SSH, so to keep Python running after I log out I tried using nohup python server.py & and screen -dm python server.py, getting the same results for each. Everything works fine until I log out of SSH - even though Python is running in the background as expected, once I've logged out, my client can no longer communicate with the server. The strange part is that if I log back in over SSH and check the running processes with ps aux, I see Python running and my client can successfully communicate with the server again. Even if I don't type anything at all once I log back in, everything works as expected. But, of course, as soon as I log back out, it's as if the server is gone.
I've contacted support for the hosting service in case this is some oddity on their end, but hopefully this is something that can be resolved on my end instead.
Edit: Looks like Blue Host doesn't want me doing server-y stuff without buying the VPS upgrade, so it looks like that's the big problem.
Edit 2: Okay, so in case anybody ends up having a similar problem, here's what the main issue turned out to be. I was mistaken in my original description; I was able to connect to the server but I was getting kicked off immediately for what turned out to be a MySQL error. I guess trying to connect to a localhost database with no active connection somehow causes problems, so instead I changed the MySQL connection command to connect to my site's IP address instead, even though it was the same IP as the server. That seemed to do the trick in terms of my main issue.
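If it helps anyone, the change amounted to something like this (assuming the MySQLdb driver; the address and credentials are placeholders):

import MySQLdb

# Connecting to 'localhost' caused the daemonized server to drop clients,
# so point the connection at the site's IP address instead.
db = MySQLdb.connect(host='203.0.113.25',  # was 'localhost'
                     user='dbuser', passwd='dbpass', db='mydb')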
Don't use this method to keep the server process running. Instead, try using supervisor (apt-get install supervisor). It lets you daemonize your process and gives you the ability to stop/restart it, etc.
Here's a sample config entry (/etc/supervisor/supervisord.conf):
[program:my_server]
command=python /path/to/server/server.py
directory=/path/to/server/
autostart=true
autorestart=true
stdout_logfile=/var/log/server.log
stderr_logfile=/var/log/server_error.log
user=your_linux_user_name
After you edit your config, do
sudo service supervisor stop
sudo service supervisor start #need to do this - doing a `restart` doesn't reload the config file!
Your server should now be running properly. You can manage its lifecycle via sudo supervisorctl.
This is my scenario: I developed a Python desktop application which I use to probe the status of services/DBs on the very same machine it is running on.
I need to use my application to monitor two "twin" Windows Server 2003 hosts (Python 2.5 on both). One of the hosts is on my own LAN; the other is on another LAN, reachable via VPN.
The application is composed by:
A Graphical User Interface (gui.py), which provides widgets to collect user inputs and launches the...
...business-logic script (console.py), which in turn invokes slave Python scripts that check the system's services, DB usage, account status, etc. The textual output of those checks is then returned to the GUI.
I used to execute the application directly on each of the two machines, but it would be great to turn it into a client/server application, so that:
users will just run gui.py locally
gui.py will communicate parameters to server versions of console.py running on both of the Windows hosts
the servers will then execute the system checks and report the results back to the client GUIs, which will display them.
I thought about two possible solutions:
Create a Windows service on each of the Windows hosts, basically executing console.py's code and waiting for incoming requests from the clients
Open SSH connections from any LAN host to the chosen Windows host and directly run console.py on it.
I am working in a corporate environment, which has some network and host constraints: many network protocols (like SSH) are filtered by our corporate firewall. Furthermore, I don't have administration privileges on the Windows hosts, so I can't install system services on them... this is frustrating!
I just wanted to ask if there is any other way, which I did not take into account, to make gui.py and console.py communicate over the network. Does anyone have any suggestions? Please note that, if possible, I'd rather not ask the ICT department for administration privileges on the Windows hosts!
Thanks in advance!
Answer to myself: I found one possible solution.
I'm lucky, because console.py actually invokes many slave Python scripts, each performing a single system check via standard third-party command-line tools that can also be run against remote hosts.
So what I did was modify gui.py and console.py so that users can specify, as a parameter, the Windows host on which the checks must be carried out.
In this way I get a distributed application... but I've been lucky; what if one or more of the third-party command-line tools did not support checking features on remote hosts?
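For illustration, one such parametrized slave check might look like the sketch below; the tool name ("checktool") and its -host flag are hypothetical, the point is only that the target host becomes a parameter passed down from gui.py/console.py instead of implicitly being the local machine:

import subprocess

def check_services(target_host):
    # Run a (hypothetical) third-party command-line tool against a remote host
    # and hand its textual output back to console.py / gui.py.
    proc = subprocess.Popen(['checktool', '-host', target_host, '-list-services'],
                            stdout=subprocess.PIPE)
    output = proc.communicate()[0]
    return output

if __name__ == '__main__':
    print check_services('WINSRV-03')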
I am running a Python script as a Windows service, but it seems to be failing whenever I set it to auto-start. I believe this may be because the service uses network resources that are not yet mounted when the service starts. Is there a way I can get it to wait until startup is complete before running?
Configure your Windows Service so that it has the Workstation Service as a dependency.
This means Windows won't attempt to start your service until the appropriate resources are available.
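If the service is built on pywin32's win32serviceutil (as the py2exe-based service earlier on this page is), the dependency can be declared on the service class so the installer registers it. A sketch, with the service and class names made up:

import win32event
import win32service
import win32serviceutil


class MyNetworkService(win32serviceutil.ServiceFramework):
    _svc_name_ = 'MyNetworkService'
    _svc_display_name_ = 'My Network Service'
    # Declare the Workstation service as a dependency so Windows waits for it
    # before starting this service.
    _svc_deps_ = ['LanmanWorkstation']

    def __init__(self, args):
        win32serviceutil.ServiceFramework.__init__(self, args)
        self.stop_event = win32event.CreateEvent(None, 0, 0, None)

    def SvcStop(self):
        self.ReportServiceStatus(win32service.SERVICE_STOP_PENDING)
        win32event.SetEvent(self.stop_event)

    def SvcDoRun(self):
        # The real network work would go here; this skeleton just waits for stop.
        win32event.WaitForSingleObject(self.stop_event, win32event.INFINITE)


if __name__ == '__main__':
    win32serviceutil.HandleCommandLine(MyNetworkService)

For a service that is already installed, running sc config <ServiceName> depend= LanmanWorkstation from an elevated prompt achieves the same thing (note the required space after depend=).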
Add a wait in your script until the resources it needs are available, or redesign the script so it doesn't exit when it has no connection: wait one second and try again if the connection failed.
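A sketch of that wait-and-retry approach; the host, port and retry interval are placeholders:

import socket
import time

def connect_with_retry(host, port, delay=1.0):
    # Keep trying the network resource instead of exiting on the first failure,
    # e.g. while shares or databases are still coming up after boot.
    while True:
        try:
            return socket.create_connection((host, port))
        except socket.error:
            time.sleep(delay)

conn = connect_with_retry('fileserver.example.local', 445)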