I have a pretty simple task: using ssh, I want to create a tunnel that forwards traffic from my local machine to a specific port on a remote machine. I can do this from the command line:
ssh -N -L 123:127.0.0.1:456 user@remotehost
Then if I run:
telnet localhost 123
it's the equivalent of logging into remotehost and running
telnet 127.0.0.1 456
I've managed to do this with something along the lines of:
subprocess.Popen(['ssh', '-N', '-L', '%i:127.0.0.1:%i' % (new_port, old_port), ssh_user + '@' + ip_addr])
But now I want to move away from that and use only Python - no external processes.
I've tried using fabric.context_managers.remote_tunnel, but unless I've misunderstood, it is meant for creating a tunnel that starts at a remote location, not from the local machine. That is, it is the equivalent of SSHing into a remote machine and creating an SSH tunnel from there, which is pointless for my purpose. I suppose I could set the remote host to actually be the local machine, but this seems inefficient, and honestly I don't even understand how to do that.
I've also tried paramiko's forward.py demo, and it doesn't work because my private key is encrypted. I'd like to modify the script to handle that, and also just simplify it for my needs, but both the script and the paramiko library are daunting and I don't know where to begin.
Surely there's an easy way to do this? I seem to be so close yet so far.
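For the encrypted-key problem specifically, paramiko can load a passphrase-protected private key directly; a minimal sketch, where the key path, host, and user are placeholders:
import getpass
import paramiko

# Load the encrypted private key by supplying its passphrase (the path is a
# placeholder), then hand the resulting key object to SSHClient.connect().
key = paramiko.RSAKey.from_private_key_file(
    '/home/me/.ssh/id_rsa',
    password=getpass.getpass('Key passphrase: '),
)

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('remotehost', username='user', pkey=key)
# client.get_transport() can then be handed to forward.py-style forwarding
# code in place of the connection setup it does itself.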
What do you mean by "pure Python"? The subprocess module is bundled with the standard Python installation.
subprocess and Fabric are designed for such tasks; why would you want to move away from them?
If you only have minimal tasks to perform remotely (e.g., checking memory, hostname, etc.), you can go ahead with subprocess. However, if you have bigger requirements, I would suggest going with Fabric.
For your purpose, where you have to work from the same machine, why not use subprocess with check_call or Popen (a sketch follows below)? As an alternative, you can change your code altogether to work at the Unix level and achieve what the Linux commands do.
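If you do stay with subprocess, a check_call version of the Popen line from the question might look like this (a rough sketch; the variables stand in for the question's values):
import subprocess

# Placeholder values standing in for the question's variables.
new_port, old_port = 123, 456
ssh_user, ip_addr = 'user', 'remotehost'

# check_call blocks until ssh exits and raises CalledProcessError on a
# non-zero exit status; with -N the call lasts for the tunnel's lifetime,
# so run it in a thread or keep using Popen if you need it in the background.
subprocess.check_call([
    'ssh', '-N', '-L', '%i:127.0.0.1:%i' % (new_port, old_port),
    '%s@%s' % (ssh_user, ip_addr),
])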
Using Fabric 2.4, I am trying to set up SSH keys to remotely connect to various Linux servers.
Python is new to me; I've followed the examples on this site and read the Python docs, but it's still unclear to me.
I'm currently running Python on Windows, and my script is able to connect to remote Linux servers because I have the connection string defined as follows:
ssh_connect = Connection(host='servername', user='user1', connect_kwargs={'password': 'blahblah'})
I am running the Python script from my Windows server, and instead of putting the password in the connection string, I would like it to use an SSH key. I have the id_rsa.pub file from the Linux server. How do I set this up on my Windows box and have the script use the key for connections?
You can always just pass a key to Fabric through its command-line arguments. This is admittedly a vague question, especially since it involves running Fabric on Windows, which to my knowledge is not officially supported, though it does work.
fab -i /Path/to/key
Source: fab --help and the official documentation.
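If you would rather keep the key in the script instead of on the fab command line, Fabric's Connection passes extra arguments to paramiko via connect_kwargs. A sketch with a hypothetical key path (note that the script needs your private key, id_rsa; the id_rsa.pub you copied belongs in the server's authorized_keys file):
from fabric import Connection

# key_filename must point at the *private* key stored on the Windows box.
ssh_connect = Connection(
    host='servername',
    user='user1',
    connect_kwargs={'key_filename': 'C:/Users/user1/.ssh/id_rsa'},
)
result = ssh_connect.run('hostname', hide=True)
print(result.stdout.strip())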
I'm trying to understand all the methods available to execute remote commands on Windows through the impacket scripts:
https://www.coresecurity.com/corelabs-research/open-source-tools/impacket
https://github.com/CoreSecurity/impacket
I understand the high-level explanation of psexec.py and smbexec.py, how they create a service on the remote end and run commands through cmd.exe /c, but I can't understand how you can create a service on a remote Windows host through SMB. Wasn't SMB supposed to be mainly for file transfers and printer sharing?
Reading the source code, I see in the notes that they use DCERPC to create these services. Is this part of the SMB protocol? All the resources on DCERPC I've found were kind of confusing and not focused on its service-creation capabilities.
Looking at the source code of atexec.py, it says that it interacts with the task scheduler service of the Windows host, also through DCERPC. Can it be used to interact with all services running on the remote box?
Thanks!
DCERPC (https://en.wikipedia.org/wiki/DCE/RPC): the initial protocol, which was used as a template for MSRPC (https://en.wikipedia.org/wiki/Microsoft_RPC).
MSRPC is a way to execute functions on the remote end and to transfer data (parameters to these functions). It is not a way to directly execute remote OS commands on the remote side.
SMB (https://en.wikipedia.org/wiki/Server_Message_Block) is the file-sharing protocol mainly used to access files on Windows file servers. In addition, it provides Named Pipes (https://msdn.microsoft.com/en-us/library/cc239733.aspx), a way to transfer data between a local process and a remote process.
One common way to use MSRPC is via Named Pipes over SMB, which has the advantage that the security layer provided by SMB is reused directly for MSRPC.
In fact, MSRPC is one of the most important, yet least-known, protocols in the Windows world.
Neither MSRPC nor SMB by itself has anything to do with remote execution of shell commands.
One common way to execute remote commands is:
Copy files (via SMB) to the remote side (Windows service EXE)
Create registry entries on the remote side (so that the copied Windows Service is installed and startable)
Start the Windows service.
The started Windows service can use any network protocol (e.g. MSRPC) to receive commands and to execute them.
After the work is done, the Windows service can be uninstalled (remove registry entries and delete the files).
In fact, this is what PSEXEC does.
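As a rough illustration of that flow with impacket (the target, credentials, service name, and EXE path below are all hypothetical, and error handling is omitted):
from impacket.dcerpc.v5 import transport, scmr

target, username, password = '192.0.2.10', 'administrator', 'secret'  # placeholders

# MS-SCMR (remote service control) is reached via the \pipe\svcctl named
# pipe over SMB -- MSRPC carried inside the file-sharing protocol.
rpctransport = transport.DCERPCTransportFactory(r'ncacn_np:%s[\pipe\svcctl]' % target)
rpctransport.set_credentials(username, password)
dce = rpctransport.get_dce_rpc()
dce.connect()
dce.bind(scmr.MSRPC_UUID_SCMR)

# Open the remote Service Control Manager, register a service pointing at an
# EXE previously copied over SMB (e.g. to ADMIN$), and start it.
sc_handle = scmr.hROpenSCManagerW(dce)['lpScHandle']
resp = scmr.hRCreateServiceW(dce, sc_handle, 'DemoSvc\x00', 'DemoSvc\x00',
                             lpBinaryPathName='%systemroot%\\demo_service.exe\x00')
svc_handle = resp['lpServiceHandle']
scmr.hRStartServiceW(dce, svc_handle)

# Cleanup: delete the service again and close the handles.
scmr.hRDeleteService(dce, svc_handle)
scmr.hRCloseServiceHandle(dce, svc_handle)
scmr.hRCloseServiceHandle(dce, sc_handle)
dce.disconnect()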
All the resources on DCERPC I've found were kind of confusing and not focused on its service-creation capabilities.
Yes, it's just a remote procedure call protocol. But it can be used to start a procedure on the remote side, which can do just about anything, e.g. creating a service.
Looking at the source code of atexec.py, it says that it interacts with the task scheduler service of the Windows host, also through DCERPC. Can it be used to interact with all services running on the remote box?
There are some MSRPC commands which handle Task Scheduler, and others which handle generic service start and stop commands.
A few final words:
SMB / CIFS and the protocols around them are really complex and hard to understand. It is fine to try to understand how to deal with, e.g., remote service control, but this can be a very long journey.
Perhaps this page (which uses Java to try to control a Windows service) may also help with understanding:
https://dev.c-ware.de/confluence/pages/viewpage.action?pageId=15007754
I am running TensorFlow code remotely on an SSH server (e.g., ssh -X account@server.address).
On the remote server, it says "You can navigate to http://0.0.0.0:6006".
In this case, how can I check TensorBoard? How can I navigate to that address on the remote machine?
I tried to search, but there seems to be no useful information.
0.0.0.0 is the wildcard address. Thus, you can use any of the machine's addresses for this purpose, unless the system's firewall is implementing something more restrictive.
That said, let's assume that it is implementing firewall-based restrictions (if it weren't, you could just access http://server.address:6006/ -- but so could anyone else). In that case:
ssh -L 16006:127.0.0.1:6006 account@server.address
...and then refer to http://127.0.0.1:16006/ in a local browser.
I'm using Paramiko to connect to remote Cisco routers and switches. When connecting to these devices, I'd like to be able to turn off echo when entering "configuration" mode. That way, I can issue commands to the remote system and avoid seeing them come back (and thereby concentrate only on looking for error messages).
I'm performing the following commands to get a shell with the Cisco device:
self.chan = self.transport.open_session()
self.chan.get_pty()
self.chan.invoke_shell()
Now, I'd like to be able to take paramiko's file descriptor for the pty and issue something like the following:
fd = self.chan.fileno()
old = termios.tcgetattr(fd)
old[3] = old[3] & ~termios.ECHO  # clear the ECHO bit in the local-modes flags
termios.tcsetattr(fd, termios.TCSADRAIN, old)
However, termios chokes on the file-descriptor returned by chan.fileno().
Most suggestions for turning off echo that I have seen require issuing a bash command like "stty -echo" on the remote box, but a Cisco router is not running bash.
After spending a lot of time with this, I ended up going back to the pxssh library. This library explicitly has a way to turn echo off:
connection.setecho(False)
...which was exactly what I needed. It also (via the parent module, pexpect) has a way to deal with telnet using the exact same library infrastructure (which unfortunately is still necessary often in the Cisco world), so you can have a connection object that uses telnet or ssh and it works in exactly the same way.
While Paramiko seems like a much cleaner, better-maintained library, the consensus in the Paramiko community seems to be that if you want to stop echo, you need to tell the remote system not to echo. But when the remote system is not a Linux/bash process, that becomes difficult or impossible. pxssh is the library you need for more fine-grained control of your SSH session.
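For completeness, a minimal pxssh sketch (the host and credentials are made up; the prompt regex is set by hand because a Cisco device cannot run the shell commands pxssh normally uses to set a unique prompt):
from pexpect import pxssh

session = pxssh.pxssh()
# A Cisco device has no bash, so skip pxssh's automatic prompt reset and
# match the usual "hostname>" / "hostname#" prompts ourselves.
session.PROMPT = r'\S+[>#] ?$'
session.login('router.example.com', 'admin', 'secret',
              original_prompt=r'[>#]', auto_prompt_reset=False)
session.setecho(False)              # stop our own input being echoed back
session.sendline('show version')
session.prompt()                    # wait for the device prompt to return
print(session.before.decode())      # only the command's output
session.logout()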
I've been using Paramiko today to work with a Python SSH connection, and it is useful.
However, one thing I'd really like to be able to do over SSH is to use some Pythonic sugar. As far as I can tell, I can only use the built-in Paramiko functions, and if I want to do anything using Python on the remote side, I would need to use a script which I have placed there and call it.
Is there a way I can send Python commands over the SSH connection rather than having to make do only with the limitations of the Paramiko SSH connection? Since I am running the SSH connection through Paramiko within a Python script, it would only seem right that I could, but I can't see a way to do so.
RPyC could be what you're looking for. It gives you access to another machine's Python environment directly from the local script.
>>> import rpyc
>>> conn = rpyc.classic.connect("someremotehost.com")
>>> conn.modules.sys.path
['D:\\projects\\rpyc\\servers', 'd:\\projects', .....]
To establish a connection over SSL or SSH, see:
http://rpyc.sourceforge.net/docs/secure-connection.html#ssl
Well, that is what SSH was created for: to be a secure shell, where the commands are executed on the remote machine (you can think of it as if you were sitting at the remote computer itself; that alone doesn't mean you can execute Python commands in a shell, even though you're physically interacting with the machine).
You can't send Python commands, simply because Python doesn't have commands; it executes Python scripts.
So all you can do is write something that performs the following steps (see the sketch after this list):
Wrap a piece of Python code into a file.
scp it to the remote machine.
Execute it there.
Remove the script (or cache it for further execution).
Basically, shell commands are themselves programs on the remote machine, so you can think of these scripts as shell extensions (e.g., Python programs with command-line parameters).
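A rough sketch of those steps with Paramiko (the host, credentials, and paths are placeholders):
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('someremotehost.com', username='user', password='secret')

sftp = client.open_sftp()
sftp.put('job.py', '/tmp/job.py')            # 2. copy the script over

stdin, stdout, stderr = client.exec_command('python3 /tmp/job.py')  # 3. run it
print(stdout.read().decode())

sftp.remove('/tmp/job.py')                   # 4. remove it again (or keep it cached)
sftp.close()
client.close()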