pySerial producing different results depending on the environment - python

I have run into an odd issue: the following code works when typed into an interactive Python shell, but not when run as a Python file.
import serial

# Open the serial port at 19200 baud.
connection = serial.Serial("/dev/ttyACM0", 19200)
connection.write("h\r".encode())                # send the 'h' command
print(connection.read(connection.inWaiting()))  # read whatever is buffered right now
connection.close()
Running it directly in the python3 shell gives the following output (which is correct):
b'halt ack\r\n'
while running it as a Python file gives this output:
b''
I have no idea what is causing the inconsistency; does anyone know? No amount of added delay seems to fix the problem.
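(A minimal sketch of one common approach, not a confirmed fix from this thread: inWaiting() only reports bytes that have already arrived, which can be zero immediately after the write, so the read returns empty. Assuming pyserial 3.x, giving the port a timeout and blocking for the CR/LF terminator avoids that race.)

import serial

# Sketch, assuming pyserial 3.x: block until the device's CR/LF-terminated
# reply arrives (or the timeout expires) instead of reading only the bytes
# that happen to be buffered at that instant.
connection = serial.Serial("/dev/ttyACM0", 19200, timeout=2)
connection.write(b"h\r")
reply = connection.read_until(b"\r\n")  # returns early if the timeout expires
print(reply)
connection.close()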

Related

Problems with redirected stdin and stdout when using py in MATLAB to run Python code

I have some Python code that runs perfectly when run within Python. When I run it from within MATLAB using py, I appear to have problems with the redirection of stdin and stdout.
The problem can be illustrated with the Python code in a file ztest.py:
import os

def ztest():
    os.system('./jonesgrid > /dev/null < zxcvbn-split-08.txt')
When run within Python, all is well:
>>> import ztest
>>> ztest.ztest()
>>>
When this is run from MATLAB with the command
>> py.ztest.ztest()
I get the following output:
At line 283 of file jonesgrid.for (unit = 5, file = 'fort.5')
Fortran runtime error: End of file
Error termination. Backtrace:
#0 0x2aaaaacead1a
<other lines cut>
The files fort.5 and fort.6 have been created. Normally these two units are associated with standard input and standard output respectively, and the files are not created in a normal run. I have also tried using subprocess.run() and get the same problem.
I'm not sure whether this should be posted in a Python forum or a MATLAB one, but I'm guessing the problem lies in the way MATLAB interfaces with Python. Other parts of my code that use os.system() without redirection work fine.
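(A sketch of one possible workaround, assuming the file names from the question: pass the child real file handles instead of relying on shell redirection, so it does not depend on whatever the embedding host has done to Python's standard streams.)

import os
import subprocess

def ztest():
    # Sketch: open the input and output explicitly rather than using
    # '<' and '>' in a shell command, so the Fortran program reads from
    # a real file descriptor even when MATLAB has replaced Python's
    # stdin/stdout.
    with open('zxcvbn-split-08.txt', 'rb') as fin, \
         open(os.devnull, 'wb') as fout:
        subprocess.run(['./jonesgrid'], stdin=fin, stdout=fout, check=True)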

Python script acting differently when run from ~/.bashrc

I made a script on a Raspberry Pi that clones one USB stick to another. It works fine when I run it from the Python shell, but when I try to run it from ~/.bashrc at boot or when opening a terminal, I get the error TypeError: __init__() got an unexpected keyword argument 'text'. I figured out that omitting the text argument fixes this error, but then my code fails to read from stderr.
My code:
import subprocess
from subprocess import PIPE

# Clone /dev/sda to /dev/sdb; dd reports progress on stderr.
comm = 'sudo dd if=/dev/sda of=/dev/sdb status=progress'
cloning = subprocess.Popen(comm, shell=True, stderr=PIPE, text=True)
while True:
    output = cloning.stderr.readline()
    progress_bar(output)
The above code works when run directly in Python. When run via ~/.bashrc, the command does get executed once the 'text' argument is deleted, but the code then gets stuck at .readline().
Connected question: Getting stdout from console
Thanks in advance for any help.
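(A minimal sketch, assuming the cause is that the boot environment picks up a Python older than 3.7, where Popen has no text argument: universal_newlines=True is the pre-3.7 spelling of the same option. progress_bar is stubbed here so the sketch is self-contained.)

import subprocess
from subprocess import PIPE

def progress_bar(line):
    # Placeholder for the questioner's own progress display.
    print(line, end='')

comm = 'sudo dd if=/dev/sda of=/dev/sdb status=progress'
# universal_newlines=True decodes the stream to text, exactly what
# text=True does on Python 3.7+, but it also works on older versions.
cloning = subprocess.Popen(comm, shell=True, stderr=PIPE,
                           universal_newlines=True)
while True:
    output = cloning.stderr.readline()
    if not output and cloning.poll() is not None:
        break  # dd has exited and its stderr is fully drained
    progress_bar(output)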

Unable to read file because of unfinished subprocess: Python

I'm using subprocess to run a script.R file from test.py. My test.py is:
import subprocess
import pandas as pd
subprocess.call(["/usr/bin/Rscript", "--vanilla", "script.R"]) #line 3
df=pd.read_csv("output.csv") #line 4
script.R is:
library(limma)
df <- read.csv("input.csv")
df <- normalizeCyclicLoess(df)
write.csv(df, "output.csv")
When I run the above file (test.py), I get an error:
FileNotFoundError: [Errno 2] File b'output.csv' does not exist: b'output.csv'
I understand that this error occurs because output.csv doesn't exist in my working directory. But I assumed it would be created by script.R; that probably isn't happening because Python moves on to line 4 before the execution of line 3 finishes. We used help from here for this and, as mentioned there, we are using call. What's going wrong then? Thanks...
EDIT: I noticed that if I don't import any libraries in my R code (like limma above), everything works fine: I can write arbitrarily long code and it runs to completion without error. But as soon as I import any library, subprocess.call(....) gives me a non-zero result (zero means the process completed successfully). As a test, I reduced script.R to just library(limma) (and nothing else; I tried other libraries too, with the same result), and it still returned non-zero. So I think there is some problem with loading libraries when R is launched via subprocess. Please note that I'm able to run this R code directly, and nothing is wrong with my code or the library. Please give me some hints on what might be going wrong here...
Apologies, my initial answer was completely wrong.
Your subprocess call:
subprocess.call(["/usr/bin/Rscript", "--vanilla", "script.R"])
will return a number: the exit code of the process. If the script succeeded without error, it will be zero. You could check this to make sure your R code ran correctly.
Have you tried running your R code directly? Does it produce output.csv? And if so, does it produce it in the right place?
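(Building on that advice, a short sketch with the same file names as the question: use subprocess.run to surface R's error message instead of silently ignoring the exit code. capture_output and text require Python 3.7+.)

import subprocess
import pandas as pd

result = subprocess.run(
    ["/usr/bin/Rscript", "--vanilla", "script.R"],
    capture_output=True, text=True,
)
if result.returncode != 0:
    # A non-zero exit status means R failed before writing output.csv,
    # e.g. because library(limma) could not be loaded for the user that
    # Rscript runs as; the captured stderr says why.
    raise RuntimeError("Rscript failed:\n" + result.stderr)

df = pd.read_csv("output.csv")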

Python script hangs without throwing exception/error on Windows

There isn't a particular point at which the script hangs (I've seen it get stuck at random points), so checking the logs didn't yield much insight. It doesn't throw an exception or an error; it just keeps running while stuck.
I'm basically calling this Python script from a PowerShell file (which in turn gets called by Task Scheduler).
$python = "C:\Python34\python.exe"
$python_path = "C:\Source\main.py"
cd (Split-Path $python_path)
while ($true)
{
    & $python $python_path
}
Is there something I need to do to make sure it doesn't get stuck?
You will have to write your path strings like this: path = "c:\\test\\file", because a single backslash starts an escape sequence in Python string literals.
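(A minimal illustration of that point; the path is just an example.)

# '\t' in "c:\test\file" would be read as a tab character, so either
# double the backslashes or use a raw string for Windows paths.
path = "c:\\test\\file"   # escaped backslashes
path = r"c:\test\file"    # raw string, same result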

Subprocess statement works in the Python console but not in a Serverdensity plugin?

In the Python console, the following statement works perfectly fine (I know using eval this way is not really good, but it's just for testing purposes here and will be replaced with proper parsing):
$ python
>>> import subprocess
>>> r = subprocess.Popen(['/pathto/plugin1.rb'], stdout=subprocess.PIPE, close_fds=True).communicate()[0]
>>> data = eval(r)
>>> data
{'test': 1}
When I convert this into a Serverdensity plugin, however, it crashes the agent.py daemon every time the plugin executes. I was able to narrow it down to the subprocess line but could not find out why. Exception catching did not seem to work either.
Here is what the plugin looks like:
class plugin1:
    def run(self):
        r = subprocess.Popen(['/pathto/plugin1.rb'],
                             stdout=subprocess.PIPE,
                             close_fds=True).communicate()[0]
        data = eval(r)
        return data
I'm quite new to working with Python and can't really figure out why this won't work. Thanks a lot for any ideas :)
Do you have subprocess imported in the module? Also, what error are you getting? Could you post the error message?
After switching my dev box (maybe because of the different Python version?) I was finally able to get some proper error output.
Then it was rather simple: I really just needed to import the missing subprocess module.
For anyone interested in the solution:
http://github.com/maxigs/Serverdensity-Wrapper-Plugin/blob/master/ruby_plugin.py
Not quite production-ready yet, but it already works for safe input.
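(A condensed sketch of the fix described above, with one assumption added on top: ast.literal_eval in place of eval, which is safer when the Ruby script prints a plain Python-style dict. That swap is mine, not part of the original answer.)

import ast
import subprocess  # the missing import that crashed the agent

class plugin1:
    def run(self):
        out = subprocess.Popen(['/pathto/plugin1.rb'],
                               stdout=subprocess.PIPE,
                               close_fds=True).communicate()[0]
        # literal_eval parses "{'test': 1}" without executing code.
        return ast.literal_eval(out.decode())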
