I'm developing an application on my Raspberry Pi 3, using gTTS for Python:
from gtts import gTTS
import os
import threading
def greet_thread(word):
    tts_thread = threading.Thread(target=greet, args=[word])
    tts_thread.start()

def greet(word):
    tts = gTTS(text=word, lang='es')
    tts.save("words.mp3")
    print 'Reproduciendo audio'
    os.system("mpg321 -q presilence.mp3")
    os.system("mpg321 -q words.mp3")
This works perfectly if I run the Python script directly from a shell. But if I execute the script in the background using:
python -u script.py > log.txt 2>&1 &
I get this error in my log:
tcgetattr(): Inappropriate ioctl for device
and I don't know why. I think it's related to the way it's called from a background process, but I have no idea how to solve it. Thanks for your attention and help.
The problem is that the program needs to be executed by the same user that runs the GUI. So if you are going to execute it from a command shell, avoid using the 'root' user.
In my case I needed the program to run on startup too, so I solved it by using "autostart" instead of a crontab:
Navigate to ~/.config/lxsession/LXDE-pi
nano autostart
Edit the file:
#lxpanel --profile LXDE-pi
#pcmanfm --desktop --profile LXDE-pi
// add a line here that runs your script.sh
// or: python script.py
#xscreensaver -no-splash
#point-rpi
Save and exit
Reboot
I was having a similar issue calling mpg321 from a Python script that was launched from a bash script via crontab on reboot. I was getting the same vague error: tcgetattr(): Inappropriate ioctl for device (tcgetattr() reads terminal attributes, which fails when the process has no controlling terminal).
After digging through numerous threads and trying everything I could, I switched to omxplayer instead, and that seems to have solved the issue.
As best I can tell, it was some kind of permissions issue with launching from crontab, because I could run it without any problem from a terminal session.
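For what it's worth, the swap amounted to replacing the mpg321 calls with omxplayer ones, roughly like this (a sketch only; the -o local flag, which routes audio to the headphone jack, is my assumption about the desired output):
import os
# omxplayer blocks until playback finishes, then exits on its own
os.system("omxplayer -o local presilence.mp3")
os.system("omxplayer -o local words.mp3")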
I am trying to run a Python 3 script remotely through SSH. First of all, I would like to know if this is even possible when the machine I am trying to run the script on doesn't have a Python 3 interpreter, only a Python 2 version from 2001.
I am also using the following command to run the script, but it's not working:
spawn sh -c "ssh -oPort=$port $ip /usr/bin/env < /home/pythonscript
The pythonscript file contains a command meant to output the connected COM ports; it is the following:
import serial.tools.list_ports
print([comport.device for comport in serial.tools.list_ports.comport()])
The output that I get from this is a bunch of system information belonging to the machine I am connecting to, stuff like HOSTNAME, USER, MACHTYPE, MAIL, SHELL, OSTYPE.
How would I get my intended output from the command that I am executing?
All help appreciated.
Firstly, you will need to install Python 3 on each server you will be running this on. You will also need to install pip3 and use it to install pyserial. You could also use virtualenv if you want, but I'll leave that to you.
Found a small bug in that script. According to the latest version, anyway, it's "comports", not "comport": https://pyserial.readthedocs.io/en/latest/tools.html
Updated Version:
from serial.tools.list_ports import comports
print([comport.device for comport in comports()])
I was able to run that remotely simply by doing the following:
ssh localhost "python3 /home/user/projects/stack_overflow/56900773_remote_python_ssh/pythonscript.py"
Change localhost to whatever server you want and swap my path for the full path to your script.
When I ran that command I just got a blank list, probably because I have no COM ports :)
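As a side note, the original command runs /usr/bin/env with no arguments, which is why it just dumps the remote environment variables (HOSTNAME, USER, and so on). If you would rather not copy the script onto each server, the redirection idea can still work by telling the remote interpreter to read the program from standard input, something like this sketch (it reuses the asker's $port and $ip variables, and python3 - means "read the script from stdin"; pyserial still has to be installed on the remote side):
ssh -oPort=$port $ip "python3 -" < /home/pythonscript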
I have a Python script that should open my Linux terminal, browser, file manager and text editor on system startup. I decided crontab was a suitable way to run the script automatically. Unfortunately, it didn't go well: nothing happened when I rebooted my laptop. So I captured the output of the script to a file in order to get some clues. It seems my script is only partially executed. I use Debian 8 (Jessie), and here's my Python script:
#!/usr/bin/env python3
import subprocess
import webbrowser
def action():
    subprocess.call('gnome-terminal')
    subprocess.call('subl')
    subprocess.call(('xdg-open', '/home/fin/Documents/Learning'))
    webbrowser.open('https://reddit.com/r/python')

if __name__ == '__main__':
    action()
Here's the entry in my crontab file:
@reboot python3 /home/fin/Labs/my-cheatcodes/src/dsktp_startup_script/dsktp_startup_script.py > capture_report.txt
Here's the content of the capture_report.txt file (I trimmed several lines since it's too long; it only prints my folder structure, which seems to come from the 'xdg-open' line in the Python script):
Directory list of /home/fin/Documents/Learning/
Type Format Sort
[Tree ] [Standard] [By Name] [Update]
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
/
... the rest of my directory structure goes here
I have no other clue about what could be going wrong here. I really appreciate your advice, guys. Thanks.
No, cron is not suitable for this. The cron daemon has no connection to your user's desktop session, which will not be running at system startup, anyway.
My recommendation would be to hook into your desktop environment's login scripts, which are responsible for starting various desktop services for you when you log in anyway, and which are easily extended with your own scripts.
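For example, on a freedesktop-compliant desktop (GNOME, LXDE and friends) you can drop a small .desktop file into ~/.config/autostart and it will be launched when you log in. A minimal sketch, with an arbitrary file name such as ~/.config/autostart/startup-apps.desktop and the Exec path taken from the question:
[Desktop Entry]
Type=Application
Name=Desktop startup script
Exec=python3 /home/fin/Labs/my-cheatcodes/src/dsktp_startup_script/dsktp_startup_script.py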
I'd do as tripleee suggested, but your job might also be failing because it requires an X session, since you're trying to open a browser. You should put export DISPLAY=:0; after the schedule in your cronjob, as in:
@reboot export DISPLAY=:0; python3 /home/fin/Labs/my-cheatcodes/src/dsktp_startup_script/dsktp_startup_script.py > capture_report.txt
If this doesn't work, you could try replacing :0 with the output of echo $DISPLAY in a graphical terminal.
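Alternatively, the script itself can set the variable before anything graphical is launched, which avoids editing the crontab line. A minimal sketch (placed near the top of the script, before action() is called; :0 is the usual value, but check echo $DISPLAY as noted above):
import os
# make the X display visible to every child process started below
os.environ.setdefault('DISPLAY', ':0')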
So I am trying to do the following:
I have Cygwin with screen and an SSH daemon enabled on Windows 7.
I create a new screen using the command screen -dmS "my_screen" on my Windows machine.
I ssh to the Windows machine from my Linux machine.
I attach to it from my unix machine using screen -d -r my_screen
Now I try to launch a Windows application, for example notepad.exe.
Now I want to automate this using Python. The objective is to just manually ssh to Windows and then run a Python script which will do the above steps. I have written the following script, but it is not working:
import shlex
import os
import time
import subprocess

# reattach to the detached screen session created earlier
cmdString = "screen -d -r default_screen"
cmdArgs = shlex.split(cmdString)
p = subprocess.Popen(cmdArgs)

# then try to launch a Windows GUI application from it
cmds = "./notepad.exe"
cArgs = shlex.split(cmds)
pp = subprocess.Popen(cArgs)
This is not working. :( Basically, to get the screen I will probably need to import the pty or tty package, but pty and tty are not supported on Windows. I am able to attach to the newly created screen, but then the attempt to launch a Windows program such as notepad fails: it hangs and the Windows GUI is not launched as it would be when done manually.
I am still exploring this, but I will appreciate it if someone can point me to the right way to do it.
I put the screen command in the bash profile script of the Cygwin user. This is working now.
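For reference, the line in question can be as simple as the following in ~/.bash_profile (a sketch; the grep guard is my addition, so a second session isn't spawned on every login):
# start a detached screen session once, if it isn't already running
if ! screen -list | grep -q my_screen; then
    screen -dmS my_screen
fi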
I am a newbie with Fabric and want to run one command in the background. It is written in a shell script, and I have to run that command via Fabric, so let's assume I have a shell script like this:
#!/bin/bash
java &
Consider this to be a file named myfile.sh.
Now in Fabric I am using this code to run my script:
put('myfile.sh', '/root/temp/')
sudo('sh /root/temp/myfile.sh')
Now this should start the Java process in the background, but when I log in to the machine and check with the jobs command, nothing is output.
Where is the problem? Please shed some light.
Use it with:
run('nohup PATH_TO_JMETER/Jmetercommand & sleep 5; exit 0')
Maybe the process exits before you return. When you run java on its own, it normally just prints a help message and exits. Try a sleep statement or something that lingers. And if you want to run it in the background, you could also append & to the sudo call.
I use run("screen -d -m sh /root/temp/myfile.sh",pty=False). This starts a new screen session in detached mode, which will continue running after the connection is lost. I use the pty=False option because I found that when connecting to several hosts, the process would not be started in all of them without this option.
On the remote server, I have a script test.sh like:
#!/bin/bash
echo "I'm here!"
nohup sleep 100&
From my local machine, I run fab runtest to call the remote test.sh:
def runtest():
    run('xxxx/test.sh')
I can get the output "I'm here!", but I cannot find the sleep process on the remote server.
What did I miss?
Thanks!
Is it possible to run the nohup inside the script on the remote machine?
I checked the answer here and the Fabric FAQ, and also got hints from "fabric appears to start apache2 but doesn't"; combining them works for me.
You can keep your test.sh unchanged, and add pty=False together with the related shell redirection (stdin and stdout are both pointed at /dev/null so the SSH channel is not held open):
from fabric.api import *
def runtest():
    run("nohup /tmp/test.sh >& /dev/null < /dev/null &", pty=False)
At least, it works for me.
According to the Fabric FAQ, you can no longer effectively do this. Instead you should use tmux, screen, dtach, or, even better, the python-daemon package:
import daemon
from spam import do_main_program
with daemon.DaemonContext():
    do_main_program()
We ran into this problem and found that you can use nohup in a command, but not in the script itself.
For example, run('nohup xxxx/test.sh') works.