PuTTY Windows pywinauto background - python

I wrote a script on Windows using PuTTY:
import time
from pywinauto.application import Application
app = Application().Start(cmd_line=r'C:\Program Files (x86)\PuTTY\putty.exe -l user -pw **pwd** -load Proxy_10.153.1.250 ' + ip + ' -ssh')
putty = app.PuTTY
putty.Wait('ready')
time.sleep(7)
cmd1 = "show log " + "{ENTER}"
This script will be executed for many switches, but while it runs I cannot do anything else in Windows, otherwise the script gets interrupted. Is it possible to run it in the background?

You need a proper tool for CLI automation. Just run subprocess.call('ssh user@host <the rest of cmd>') or use Paramiko to run the remote SSH command.
BTW, the pywinauto code is incomplete; I don't see .type_keys(cmd1). You may try .send_chars(cmd1) instead and call putty.minimize() first, but send_chars is experimental and not guaranteed to work with every app, so you can only try.
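For reference, here is a minimal Paramiko sketch of the same task running fully in the background, with no PuTTY window to keep in the foreground. The host, credentials and the "show log" command are taken from the question; everything else (function name, timeout) is illustrative:

import paramiko  # pip install paramiko

def show_log(ip, user, password):
    # Connect directly over SSH instead of driving the PuTTY GUI.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(ip, username=user, password=password, timeout=10)
    try:
        # Run the command from the question; no window is involved,
        # so the desktop stays free while this runs.
        stdin, stdout, stderr = client.exec_command('show log')
        return stdout.read().decode(errors='replace')
    finally:
        client.close()

Note that some switches only accept commands over an interactive shell; in that case invoke_shell() plus send()/recv() would be needed instead of exec_command().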

Is there a python edge software update tool?

So I have a simple app written in Python and running on a set of 10 Raspberry Pis.
It is a folder with one runnable script.
I want to have, on my external server with a public IP, some kind of CI/CD-like service that would deploy updates to all edge nodes and restart my app on them.
Internet access is rare on the edge devices, so I want to push updates when I press a button on the server.
Is there such a thing for Python programs that are meant to run on edge devices?
As I understand it, the main problem is to update and run a script on multiple Raspberry Pi boards, correct?
There are a lot of ready-made solutions like dokku or piku. Both allow you to do git push deployments to your own servers (manually).
Or you can develop your own solution, using GitHub webhooks or some HTML form (for manual pushes) and a Flask web server that performs the CI/CD steps internally.
You'll need to run a script like the one below on each node/board, and configure a webhook with a URL similar to http://your-domain-or-IP.com:8000/deploy-webhook, but with a different port per node.
Or you can open that page manually from a browser, or create a separate page that triggers all nodes asynchronously, as you wish (a server-side trigger sketch follows the Flask code below).
from flask import Flask
import subprocess

app = Flask(__name__)
script_proc = None
src_path = '~/project/src/'

def bash(cmd):
    # shell=True so '~' and the composed command line are expanded; wait so steps run in order
    subprocess.Popen(cmd, shell=True).wait()

def pull_code(path):
    bash('git -C {path} reset --hard'.format(path=path))
    bash('git -C {path} clean -df'.format(path=path))
    bash('git -C {path} pull -f'.format(path=path))
    # or, if you just need to copy files to the remote machine
    # (example of remote path: "pi@192.168.1.1:~/project/src/"):
    # bash('scp -r {src_path} {dst_path}'.format(src_path=src_path, dst_path=path))

def installation(python_path):
    bash('{python_path} -m pip install -r requirements.txt'.format(python_path=python_path))

def stop_script():
    global script_proc
    if script_proc:
        script_proc.terminate()

def start_script(python_path, script_path, args=None):
    global script_proc
    # argument list form, so terminate() stops the script process itself
    script_proc = subprocess.Popen([python_path, script_path] + (args or []))

@app.route('/deploy-webhook')
def deploy_webhook():
    project_path = '~/project/some_project_path'
    script_path = 'script1.py'
    python_path = 'venv/bin/python'
    pull_code(project_path)
    installation(python_path)
    stop_script()
    start_script(python_path, script_path)
    return 'Deployed'

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8000)
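On the server side, the "button" can be a small script that simply calls each node's webhook in turn. This is only a sketch; the node URLs and ports below are placeholders you would replace with your own:

import requests  # pip install requests

# One entry per Raspberry Pi: same path, different port per node (placeholders).
node_urls = [
    'http://your-domain-or-IP.com:8000/deploy-webhook',
    'http://your-domain-or-IP.com:8001/deploy-webhook',
]

def deploy_all():
    for url in node_urls:
        try:
            resp = requests.get(url, timeout=60)
            print(url, resp.status_code, resp.text)
        except requests.RequestException as exc:
            print(url, 'failed:', exc)

if __name__ == '__main__':
    deploy_all()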
If you don't need a user interface and are on Linux, I would suggest using a bash script.
I wrote a simple bash script to push an update and restart the app on a set of Raspberry Pis. Please configure key-based (passwordless) SSH login first.
#!/bin/bash
listOfIps=(
    192.168.1.100
    192.168.1.101
    192.168.1.102
    192.168.1.103
)
username="pi"
destDir="work/"
pythonScriptName="fooScript.py"

for i in "${listOfIps[@]}"
do
    echo "will copy folder \"$1\" with content to ip: ${i} and perform"
    echo "scp -r $1 ${username}@${i}:${destDir}"
    scp -r "$1" ${username}@${i}:${destDir}
    echo "will kill all python scripts unfriendly"
    ssh ${username}@${i} "pkill python"
    echo "will restart my python script ${pythonScriptName} in dir ${destDir}"
    # nohup + redirection so the script keeps running after the ssh session ends
    ssh ${username}@${i} "nohup python3 ${destDir}/${pythonScriptName} > /dev/null 2>&1 &"
done
exit 0
Save the code in a file copyToAll.sh, edit username, destDir and your script name, and make it executable:
chmod 755 copyToAll.sh
Then call it:
./copyToAll.sh myFileToSend

Calling Matlab scripts from Django with Python's Popen class

I'm developing a Django app which runs Matlab scripts with Python's Popen class. The Python script that calls the Matlab scripts lives in the main folder of my Django app (next to views.py). When I call the script from the command line, it runs like a charm, but when I make a request from the client to run the corresponding Python script, I receive the following warning:
"< M A T L A B (R) > Copyright 1984-2018 The MathWorks, Inc. R2018a (9.4.0.813654) 64-bit (glnxa64) February 23, 2018 To get started, type one of these: helpwin, helpdesk, or demo. For product information, visit www.mathworks.com. >> [Warning: Unable to create preferences folder in /var/www/.matlab/R2018a. Preferences folder location must be writable. Using a temporary preferences folder for this MATLAB session. See the preferences documentation for more details.] >>
My app uses a Python virtual environment and it is being deployed with Apache web server.
Here is my python script that calls Matlab scripts:
import os
import subprocess as sp
import pymat_config

def pymat_run():
    pwd = pymat_config.pwd_config['pwd']
    cmd1 = "-r \"Arg_in = '/path/to/my/main/folder/input.txt'; Arg_out = '/path/to/my/main/folder/file.txt'; matlab_script1\""
    baseCmd1 = ['/usr/local/MATLAB/R2018a/bin/matlab', '-nodesktop', '-nosplash', '-nodisplay', '-nojvm', cmd1]
    os.chdir('/path/to/matlab_script1')
    # pipe the password into sudo -S
    sudo_cmd = sp.Popen(['echo', pwd], stdout=sp.PIPE)
    exec1 = sp.Popen(['sudo', '-S'] + baseCmd1, stdin=sudo_cmd.stdout, stdout=sp.PIPE, stderr=sp.PIPE)
    out, err = exec1.communicate()
    return out
Any suggestions?
Finally I managed to solve this issue by myself. The problem came from which user was calling the Matlab script. When I ran the above script from a Python interpreter or from the shell, it ran as my own user (with my user's password), whereas when the script was called from the client, it ran as the web server's user: www-data.
So at first to avoid the above warning I gave permissions to www-data user to the /var/www folder with the following command:
sudo chown -R www-data /var/www/
After that, the warning disappeared, but the script still didn't run because sudo was internally asking for www-data's password while being fed my own user's password from the pymat_config file.
To solve this, I edited the /etc/sudoers file so that www-data can call the Matlab binary without being asked for a password, adding the following line:
www-data ALL=(ALL) NOPASSWD: /usr/local/MATLAB/R2018a/bin/matlab
and now it runs like a charm!
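As a side note, once that NOPASSWD rule is in place the echo-the-password pipe from the question is no longer needed. A simplified call could look like the sketch below (paths and arguments as in the question; the -n flag just makes sudo fail instead of hanging on a password prompt):

import subprocess as sp

def pymat_run_nopasswd():
    cmd1 = "-r \"Arg_in = '/path/to/my/main/folder/input.txt'; Arg_out = '/path/to/my/main/folder/file.txt'; matlab_script1\""
    base_cmd = ['sudo', '-n',  # no password prompt; covered by the sudoers rule above
                '/usr/local/MATLAB/R2018a/bin/matlab',
                '-nodesktop', '-nosplash', '-nodisplay', '-nojvm', cmd1]
    proc = sp.Popen(base_cmd, cwd='/path/to/matlab_script1', stdout=sp.PIPE, stderr=sp.PIPE)
    out, err = proc.communicate()
    return out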

Running sudo command via CGI (Python)

I am writing a test suite for a web application using Selenium.
In the course of which I need to test behaviour of the app in case a certain service is running or not.
I wanted to create a cgi call to a Python script turning that service on and off.
I know that the CGI call runs in the context of the web server (Apache); however, I thought that issuing sudo calls like so:
import subprocess
import os
command = 'sudo -S launchctl unload /Library/LaunchAgents/com.my.daemon.plist'
pwd = 'pwd123'
test1 = subprocess.Popen( command, shell=True, stdin=subprocess.PIPE)
test1.communicate(input=pwd)
test2 = os.system( 'echo %s|%s' % (pwd,command) )
would do the trick. Well, they don't; I get return code 256.
What can I do to have this call executed without touching the context in which Apache runs?
As for security: this will only run on a test machine.
The user that Apache runs as needs to be in the /etc/sudoers file, or belong to the sudo group, which I guess it usually doesn't. You also need to configure it not to ask for a password, which is also done in /etc/sudoers.
For Ubuntu, check these out: https://askubuntu.com/questions/7477/how-can-i-add-a-new-user-as-sudoer-using-the-command-line
https://askubuntu.com/questions/147241/execute-sudo-without-password
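Once the Apache user has a NOPASSWD rule for that specific command, the CGI side no longer needs to pipe a password at all. A minimal sketch (the launchctl command is the one from the question; -n makes sudo fail instead of hanging on a prompt):

import subprocess

command = ['sudo', '-n', 'launchctl', 'unload',
           '/Library/LaunchAgents/com.my.daemon.plist']

# A return code of 0 means success; a non-zero code usually means the
# sudoers rule for the Apache user is missing or too restrictive.
returncode = subprocess.call(command)
print('launchctl unload returned', returncode)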
It could potentially be a pathing issue.
Have you tried writing out the full path like this:
command = '/usr/bin/sudo -S launchctl unload /Library/LaunchAgents/com.my.daemon.plist'
command should be a list, not a string. Try with:
command = ['sudo', '-S', 'launchctl', 'unload', '/Library/LaunchAgents/com.my.daemon.plist']
You can't run sudo this way; sudo needs a controlling terminal to run.

Python ssh tunneling over multiple machines with agent

A little context is in order for this question: I am making an application that copies files/folders from one machine to another in python. The connection must be able to go through multiple machines. I quite literally have the machines connected in serial so I have to hop through them until I get to the correct one.
Currently, I am using python's subprocess module (Popen). As a very simplistic example I have
import subprocess
# need to set strict host checking to no since we connect to different
# machines over localhost
tunnel_string = "ssh -oStrictHostKeyChecking=no -L9999:127.0.0.1:9999 -ACt machine1 ssh -L9999:127.0.0.1:22 -ACt -N machineN"
proc = subprocess.Popen(tunnel_string.split())
# Do work, copy files etc. over ssh on localhost with port 9999
proc.terminate()
My question:
When doing it like this, I cannot seem to get agent forwarding to work, which is essential in something like this. Is there a way to do this?
I tried using the shell=True keyword in Popen like so
tunnel_string = "eval `ssh-agent` && ssh-add && ssh -oStrictHostKeyChecking=no -L9999:127.0.0.1:9999 -ACt machine1 ssh -L9999:127.0.0.1:22 -ACt -N machineN"
proc = subprocess.Popen(tunnel_string, shell=True)
# etc
The problem with this is that the names of the machines are given by user input, meaning they could easily inject malicious shell code. A second problem is that I then have a new ssh-agent process running every time I make a connection.
I have a nice function in my bashrc which identifies already-running ssh-agents, sets the appropriate environment variables and adds my SSH key, but of course subprocess cannot reference functions defined in my bashrc. I tried setting executable="/bin/bash" together with shell=True in Popen, to no avail.
You should give Fabric a try.
It provides a basic suite of operations for executing local or remote
shell commands (normally or via sudo) and uploading/downloading files,
as well as auxiliary functionality such as prompting the running user
for input, or aborting execution.
The program below will give you a test run.
First install Fabric with pip install fabric (note that this code uses the Fabric 1.x API), then save the code below in fabfile.py.
# note: this example uses the Fabric 1.x API (fabric.api)
from fabric.api import *

env.hosts = ['server url/IP']  # change to your server
env.user = 'username'          # username for the server
env.password = 'password'      # password

def run_interactive():
    with settings(warn_only=True):
        cmd = 'clear'
        while cmd != 'stop fabric':
            run(cmd)
            cmd = raw_input('Command to run on server: ')
Change to the directory containing your fabfile and run fab run_interactive; each command you enter will then be run on the server.
I tested your first simplistic example and agent forwarding worked. The only thing that I can see that might cause problems is that the environment variables SSH_AGENT_PID and SSH_AUTH_SOCK are not set correctly in the shell that you execute your script from. You might use ssh -v to get a better idea of where things are breaking down.
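If the agent variables are the culprit, one option is to hand them to the subprocess explicitly instead of relying on the inherited shell environment. A sketch, assuming an agent is already running and you know its socket path (the path below is a placeholder):

import os
import subprocess

tunnel_cmd = ["ssh", "-oStrictHostKeyChecking=no",
              "-L9999:127.0.0.1:9999", "-ACt", "machine1",
              "ssh", "-L9999:127.0.0.1:22", "-ACt", "-N", "machineN"]

# -A only forwards keys if SSH_AUTH_SOCK points at a live agent socket.
env = dict(os.environ)
env.setdefault("SSH_AUTH_SOCK", "/path/to/existing/agent.sock")  # placeholder

proc = subprocess.Popen(tunnel_cmd, env=env)
# ... do work over localhost:9999 ...
proc.terminate()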
Try setting up a SSH config file: https://linuxize.com/post/using-the-ssh-config-file/
I frequently am required to tunnel through a bastion server and I use a configuration like so in my ~/.ssh/config file. Just change the host and user names. This also presumes that you have entries for these host names in your hosts (/etc/hosts) file.
Host my-bastion-server
    Hostname my-bastion-server
    User user123
    AddKeysToAgent yes
    UseKeychain yes
    ForwardAgent yes

Host my-target-host
    HostName my-target-host
    User user123
    AddKeysToAgent yes
    UseKeychain yes
I then gain access with syntax like:
ssh my-bastion-server -At 'ssh my-target-host -At'
And I issue commands against my-target-host like:
ssh my-bastion-server -AT 'ssh my-target-host -AT "ls -la"'

converting the psftp command written on windows on to linux in python

I am working in Python; I am trying to connect to an SFTP server and download some files after connecting.
The process which I followed on Windows is below:
import os
psftpCmd='psftp sftp.example.com -l user -pw pass -b client_configurations\lifebridge.scr -batch'
os.system(psftpCmd)
The code in lifebridge.scr is:
cd lifebridge
lcd feeds\lifebridge
get jobs.xml
bye
I am able to fetch the file successfully on Windows. I want to do the same on a Linux (Fedora) machine, and I tried the following:
import os
psftpCmd='psftp sftp.example.com -l user -pw pass -b client_configurations\lifebridge.scr -batch'
os.system(psftpCmd)
Result:
sh: psftp: command not found
I expect this is because psftp is a PuTTY command, so I need to do something else on Linux for the same task. Can anyone let me know how to write the same command on Linux?
On Linux, the command is sftp.
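For example, a rough equivalent of the Windows call is sketched below. Note that OpenSSH's sftp has no -pw option, so this assumes key-based authentication is already set up (otherwise a helper such as sshpass would be needed), and that the batch file uses forward slashes for the local path:

import os

# lifebridge.scr works as an sftp batch file too:
#     cd lifebridge
#     lcd feeds/lifebridge
#     get jobs.xml
#     bye
sftpCmd = 'sftp -b client_configurations/lifebridge.scr user@sftp.example.com'
os.system(sftpCmd)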
