I am trying to write data to a disk on a remote machine via ssh using a Python script. However, it fails with the error dd: /dev/xbd2d: Device not configured.
import argparse
import os
import time
parser = argparse.ArgumentParser(description='basis')
parser.add_argument("-g", default=1, help="")
args = parser.parse_args()
volume = args.g
instance_ip = '10.1.12.3'
cmd_ssh = 'ssh -tt -i basis.pem root@' + instance_ip + ''' "date | dd of=/dev/xbd2d"'''
os.system(cmd_ssh)
What is quite unusual is that if I use the command:
ssh -tt -i basis.pem root@10.1.12.3 "date | dd of=/dev/xbd2d"
in a terminal, it executes correctly without any problem and writes the data to the disk. I wrote the same script in C++ and it worked fine, but for some reason Python gives me dd: /dev/xbd2d: Device not configured.
Checking the quotes around the variable, and using \" instead of the triple quotes, solved my problem, as recommended by Eran:
import argparse
import os
import time
parser = argparse.ArgumentParser(description='basis')
parser.add_argument("-g", default=1, help="")
args = parser.parse_args()
volume = args.g
instance_ip = '10.1.12.3'
cmd_ssh = "ssh -tt -i basis.pem root@" + instance_ip + " \"date | dd of=/dev/xbd2d\""
os.system(cmd_ssh)
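Quoting issues like this disappear entirely if the local command is built as an argument list instead of one shell string; a possible sketch (same key file and device as above, not tested against a real host):

```python
import subprocess

instance_ip = '10.1.12.3'

# The remote command stays a single string: ssh joins its trailing
# arguments and hands them to the remote shell, which runs the pipe.
remote_cmd = 'date | dd of=/dev/xbd2d'
argv = ['ssh', '-tt', '-i', 'basis.pem', 'root@' + instance_ip, remote_cmd]

# subprocess.run(argv, check=True)  # uncomment to run against the host
```

With a list there is no local shell involved, so no local quoting layer to get wrong.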
I am currently working on a small Python Application which turns a Raspberry Pi 4B into a 48 Channel Audio-Recorder. Basics work, but during Recording, I need a log file which tells me when recording started, which ALSA warnings occurred and when recording stopped.
The recorder can be started with this terminal command:
pi@raspberrypi:~ $ rec -q -t caf --endian little --buffer 96000 -c 48 -b 24 /home/pi/myssd-one/Aufnahmen/test.caf 2>&1 | tee /home/pi/myssd-one/Aufnahmen/logging.log
this records audio in the test.caf file and writes ALSA warnings to logging.log
So far so good.
The Python program (which should run on a touchscreen with a GUI so recording can easily be started and stopped) takes care of variable audio filenames (date-time stamp) and controls an LED to show that recording is running.
This part of the code takes care of switching on and off:
#!/usr/bin/env python
from tkinter import *
import shlex
import os
import subprocess
import tkinter.font
import datetime
from gpiozero import LED
import RPi.GPIO as GPIO
GPIO.setmode(GPIO.BCM)
GPIO.setup(11, GPIO.OUT)
# note: `led` and `ledButton` are created in the GUI setup, which is omitted here
def ledToggle():
    if led.is_lit:
        led.off()
        my_env = os.environ.copy()
        my_env['AUDIODRIVER'] = 'alsa'
        my_env['AUDIODEV'] = 'hw:KTUSB,0'
        ledButton["text"] = "Turn Recorder on"
        print("recorder stops")
        subprocess.Popen(['sudo', 'pkill', '-SIGINT', 'rec'], env=my_env, shell=False, stdout=subprocess.PIPE)
    else:
        led.on()
        my_env = os.environ.copy()
        my_env['AUDIODRIVER'] = 'alsa'
        my_env['AUDIODEV'] = 'hw:KTUSB,0'
        ledButton["text"] = "Turn Recorder off"
        print("recorder starts")
        ## reference command line: "rec -q -t caf --endian little --buffer 96000 -c 48 -b 24 /home/pi/myssd-one/Aufnahmen/test.caf 2>&1 | tee /home/pi/myssd-one/Aufnahmen/logging.log"
        command_line = shlex.split("rec '-q' '-t' 'caf' '--buffer' '96000' '-c 48' '-b 24' '/home/pi/myssd-one/Aufnahmen/test.caf' '2>&1 | tee' '/home/pi/myssd-one/Aufnahmen/logging.log'")
        p1 = subprocess.Popen(command_line, env=my_env, shell=False, stdout=subprocess.PIPE)
I am trying to move the original command-line statement into the subprocess.Popen call, with no success yet. The part that routes output to the log file fails: it looks as if the initiating SoX application 'rec' tries to interpret it as part of its own parameter list, instead of as a redirection of stdout and stderr to the log file. I'd appreciate some guidance on this issue.
Variable Filenames for audio files is already done, but for simplicity taken out of this code snippet.
Thanks Mark, I dug into this command line following your hint that it can only run with shell=True, which implies it has to be written as one full statement without separating commas and escaped quotes. Now it works. Actually, the shlex.split() becomes obsolete.
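For reference, a shell=True version along those lines might look like this (paths and ALSA device taken from the question): the whole pipeline is one string, so the shell handles the redirection and the pipe into tee, rather than rec seeing them as arguments.

```python
import os
import subprocess

my_env = os.environ.copy()
my_env['AUDIODRIVER'] = 'alsa'
my_env['AUDIODEV'] = 'hw:KTUSB,0'

# One full string, no shlex.split(): with shell=True the shell parses
# the pipeline, so 2>&1 and | tee are redirection/pipe, not rec arguments.
cmd = ('rec -q -t caf --endian little --buffer 96000 -c 48 -b 24 '
       '/home/pi/myssd-one/Aufnahmen/test.caf '
       '2>&1 | tee /home/pi/myssd-one/Aufnahmen/logging.log')
# p1 = subprocess.Popen(cmd, env=my_env, shell=True)
```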
I am reading data from InfluxDB into pandas with the code below:
import pandas as pd
from influxdb import InfluxDBClient
client = InfluxDBClient(host='10.0.0.0', port=7076, username='abcdefghi',
                        password='dsdsd', ssl=False, verify_ssl=False)
client.switch_database('test')
q = 'SELECT * FROM "abc"'
df = pd.DataFrame(client.query(
    q, chunked=False, chunk_size=10000).get_points())
df
Then I am doing some processing.
Now, I have 50 domains, and for some domains the InfluxDB client is different. I want the script to take input from argparse so that I can pass the host on the command line.
I want to keep one common connection string as the default, and if --influx_host is passed on the command line it should be used instead. I am new to this kind of programming in Python. I was able to pass variables, but not to build the connection string. I will appreciate any help.
Basically, if I can do something like below, or anything better:
python test.py --influx_host 10.0.0.10 --port 7076
You should get the arguments first:
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('--influx_host', default='10.0.0.10')
parser.add_argument('--port', default=7076, type=int)
args = parser.parse_args() # it will use values from `sys.argv`
and use args.influx_host and args.port in code
client = InfluxDBClient(host=args.influx_host, port=args.port, ...)
And now you can run it with arguments
python test.py --influx_host 10.0.0.10 --port 7076
or without arguments
python test.py
or with only some of arguments
python test.py --influx_host 10.0.0.10
python test.py --port 7076
If you run it without --influx_host and/or --port, then it uses the default values from .add_argument().
Not tested because I don't have InfluxDB, but args shows the expected values.
import argparse
import pandas as pd
from influxdb import InfluxDBClient
# --- args ---
parser = argparse.ArgumentParser()
parser.add_argument('--influx_host', default='10.0.0.10')
parser.add_argument('--port', default=7076, type=int)
args = parser.parse_args() # it will use values from `sys.argv`
#args = parser.parse_args(['--influx_host', 'localhost', ]) # for tests
print(args)
# --- rest ---
client = InfluxDBClient(host=args.influx_host, port=args.port, username='abcdefghi',
                        password='dsdsd', ssl=False, verify_ssl=False)
client.switch_database('test')
q = 'SELECT * FROM "abc"'
df = pd.DataFrame(read_client.query(q, chunked=False, chunk_size=10000).get_points())
print(df)
I'm using a shell command in my Python script to retrieve the ID of the last scenario created (only 1 scenario).
I want to retrieve several scenarios by passing an argument to my script, in order to retrieve e.g. the last 3 or 4 scenario IDs.
This is my code:
import argparse
import os
import subprocess
import one_sdk
import time
parser = argparse.ArgumentParser()
parser.add_argument('num', help='Number of scenarios to select', default=1, type=int)
options = parser.parse_args()
process = "path/one-ctl so list kind:one.scenario.SharedScenario | tail -'{num}' | awk '{print $1}'"
retrieveId = subprocess.check_output(process, shell=True).rstrip()
print(retrieveId)
When I execute the script, num is not substituted into the command.
Can anyone help me? Thanks to you all.
try this:
process = "path/one-ctl so list kind:one.scenario.SharedScenario | tail -'%i' | awk '{print $1}'" % options.num
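An f-string works too, as long as the awk braces are doubled so Python does not treat {print $1} as a replacement field; a small sketch, using a stand-in value for options.num:

```python
num = 3  # stands in for options.num from the parsed arguments

# {{ and }} produce literal { and } in the formatted result
process = (f"path/one-ctl so list kind:one.scenario.SharedScenario"
           f" | tail -{num} | awk '{{print $1}}'")
```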
Most of the code samples I've seen are trying to read from stdin without local echo. To do this they modify the "local modes" flag to remove the "Echo input characters" setting. I thought I could just set the "input modes" flag to TIOCSTI, which is for "Insert the given byte in the input queue.". However, even though I run the script as root, it has no effect: anything I write to the fd seems to go to the terminal output rather than the terminal input. Basically, what I want to do is this exact thing, but in pure Python.
"""
termfake.py
Usage: sudo python termfake.py /dev/ttys002
Get the tty device path of a different local terminal by running `tty`
in that terminal.
"""
import sys
import termios
fd = open(sys.argv[1], 'w')
fdno = fd.fileno()
# Returns [iflag, oflag, cflag, lflag, ispeed, ospeed, cc]
tatters = termios.tcgetattr(fdno)
print('original', tatters)
tatters[0] = termios.TIOCSTI
print('TIOCSTI', termios.TIOCSTI)
# Set iflag
termios.tcsetattr(fdno, termios.TCSANOW, tatters)
# Verify setting change
with open('/dev/ttys002', 'w') as fd2:
    print('modified', termios.tcgetattr(fd2.fileno()))
fd.write('This is test\n')
fd.close()
TIOCSTI is an ioctl (documented in tty_ioctl(4)), not a terminal setting, so you can't use tcsetattr() -- you need to feed each character of the fake input to ioctl() instead. Never had to do ioctl's from Python before, but the following seems to work for running an ls in a different terminal (specified as the argument, e.g. /dev/pts/13) that's running Bash:
import fcntl
import sys
import termios
with open(sys.argv[1], 'w') as fd:
    for c in "ls\n":
        fcntl.ioctl(fd, termios.TIOCSTI, c)
TIOCSTI requires root privileges (or CAP_SYS_ADMIN to be more specific, but that's usually the same in practice) by the way -- see capabilities(7).
I took the answer from @Ulfalizer and expanded it a bit into a complete, usable app.
import sys
import fcntl
import termios
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('tty', type=argparse.FileType('w'),
help='full tty path as given by the tty command')
group = parser.add_mutually_exclusive_group()
group.add_argument('-n', action='store_true',
help='prevent sending a trailing newline character')
group.add_argument('--stdin', action='store_true',
help='read input from stdin')
group = parser.add_argument_group()
group.add_argument('cmd', nargs='?',
help='command to run (required if not using --stdin)')
group.add_argument('args', nargs='*',
help='arguments to command')
args, _extra = parser.parse_known_args()
if args.stdin:
    data = sys.stdin.read()
else:
    data = ' '.join([args.cmd] + args.args)
for c in data:
    fcntl.ioctl(args.tty, termios.TIOCSTI, c)
if not args.n and data[-1] != '\n':
    fcntl.ioctl(args.tty, termios.TIOCSTI, '\n')
Here is how you use it:
Terminal #1: do...
$ tty > /tmp/t1
Terminal #2: do...
$ sudo python termfake.py $(cat /tmp/t1) date +%s
Terminal #1: observe...
$ tty > /tmp/t1
$ date +%s
1487276400
I have thoroughly confused myself with Python subprocess syntax!
I would like to decrypt a string using openssl from within a Python script.
Here is the bash script snippet that works:
readable_code=$(echo "$encrypted_code"| openssl enc -aes-128-cbc -a -d -salt -pass pass:$key)
So, in a Python script, I understand that I should use subprocess to run this same bash command.
I need to pipe the echo into the openssl command, and also pass in the encrypted_code and key variables dynamically (it's in a loop).
Does anyone out there know the correct syntax for this?
The snippet below should give the background to what I'm trying to do.
Thank you.
import subprocess
key = "my-secret-key"
file_read = list_of_ips  # format ip:long-encrypted-code
with open(file_read) as f:
    # read in all connection requests
    content = f.readlines()
# create list that will hold all ips whose decrypted codes have passed test
elements = []
for ip_code in content:
    # grab the ip address before the colon
    ip = ip_code.split(':', 1)[0]
    # grab the encrypted code after the colon
    code = ip_code.split(':', 1)[1]
    # here is where I want to run the bash command and assign to a python variable
    decrypted_code = subprocess....using code and key variables
    ...on it goes....
To emulate the shell command:
$ readable_code=$(echo "$encrypted_code"| openssl enc -aes-128-cbc -a -d -salt -pass "pass:$key")
using subprocess module in Python:
from subprocess import Popen, PIPE
cmd = 'openssl enc -aes-128-cbc -a -d -salt -pass'.split()
p = Popen(cmd + ['pass:' + key], stdin=PIPE, stdout=PIPE)
readable_code = p.communicate(encrypted_code)[0]
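On Python 3, communicate() exchanges bytes unless the pipes are opened in text mode; a subprocess.run variant of the same call might look like this (the key and ciphertext values are placeholders):

```python
import subprocess

key = 'my-secret-key'
encrypted_code = 'U2FsdGVkX1...'  # placeholder base64 ciphertext

cmd = ['openssl', 'enc', '-aes-128-cbc', '-a', '-d', '-salt',
       '-pass', 'pass:' + key]
# text=True (Python 3.7+) lets us pass and receive str instead of bytes:
# result = subprocess.run(cmd, input=encrypted_code,
#                         capture_output=True, text=True, check=True)
# readable_code = result.stdout.strip()
```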
I highly recommend the Plumbum Python library for writing shell scripts.
In particular, it has a convenient way to do piping and redirection.
I don't really understand what exact task you are trying to solve, but your code could look approximately like this:
from plumbum.cmd import openssl
with open('file') as f:
    for ip_code in f:
        (openssl['whatever', 'params'] << ip_code)()