Using Python To Plot Live Lidar Data Causing Circular Plots

I am using the RPLidar A1: https://www.adafruit.com/product/4010
My goal is to collect data sets and plot them in order to get a live visual representation of the data.
My current code is:
import numpy as np
import matplotlib.pyplot as plt
from rplidar import RPLidar

def get_data():
    lidar = RPLidar('COM6', baudrate=115200)
    for scan in lidar.iter_scans(max_buf_meas=500):
        break
    lidar.stop()
    return scan
for i in range(1000000):
    if(i%7==0):
        x = []
        y = []
    print(i)
    current_data = get_data()
    for point in current_data:
        if point[0]==15:
            x.append(point[2]*np.sin(point[1]))
            y.append(point[2]*np.cos(point[1]))
    plt.clf()
    plt.scatter(x, y)
    plt.pause(.1)
plt.show()
The above code produces a refreshing graph with changing data.
The issue is that this is not an accurate representation. There is a native application by SLAMTEC called frame_grabber which clearly shows this device giving an accurate, rectangular-shaped representation of my room. Instead I keep getting a circular shape ranging from small to large.
The raw data from the sensor comes in the form of an array containing roughly 100 sets of the following data: (quality, theta, r). My code checks that the quality is good (15 is the maximum), then plots the data sets, clearing the data arrays every seven iterations in order to get rid of old data.
I also checked the raw polar data in Excel, and that data likewise appears to form a circular-ish shape.

After a few days of trying out various plotting libraries and a few other things, I noticed the mistake.
Hopefully this can prevent someone from making the same mistake in the future.
Lidars typically give data in terms of "theta" (in degrees) and "r". On the other hand, NumPy as well as Python's built-in math library expect angles in radians for the cos and sin operations.
Converting the angles from degrees to radians fixed it, and now everything works perfectly.
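For reference, here is a minimal sketch of the fix (not the full script; it just reworks the plotting loop above, converting theta from degrees to radians with np.radians before taking sin and cos):

for point in current_data:
    if point[0] == 15:
        theta = np.radians(point[1])  # RPLidar reports theta in degrees
        x.append(point[2] * np.sin(theta))
        y.append(point[2] * np.cos(theta))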


Convert stl 2 numpy, volume data

Is there a way to convert an STL file to a numpy array?
The numpy array, resolved with x*y*z data points, should contain volumetric information in the sense of "inside" or "outside" the geometry, say as 0 or 1.
To my surprise I didn't find anything on this yet, although numpy2stl seems to be quite popular.
The problem is a complex geometry of porous media, so convex hull conversion does not work either.
import numpy
import stl
from stl import mesh

stl.stl.MAX_COUNT = 1e10
your_mesh = stl.mesh.Mesh.from_file('Data.stl')
print(your_mesh.data)
seems to be able to export triangles only.
In addition, even this usually leads to MemoryError messages; but numpy-stl (usually) works for loading the datapoints into numpy.
Is there a way to convert the stl data into volume data that contains information if every point is inside or outside the geometry?
The resulting 3D array could technically be of binary data type, but this isn't required.
With commercial software this conversion seems to be trivial, but it's neither Python nor free. Implementing a ray-casting algorithm from scratch seems overcomplicated for a file-type conversion.
I do believe that what you want to do is a voxelization of your volume. You can do that with the trimesh package at https://trimsh.org/
import trimesh

mesh = trimesh.load_mesh('path2yourstlfile.stl')
assert mesh.is_watertight  # you cannot build a solid if your volume is not watertight
volume = mesh.voxelized(pitch=0.1)
mat = volume.matrix  # boolean occupancy matrix
You can also check whether a list of points is inside the volume using:
mesh.contains(points)
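For example (a small sketch; the coordinates are arbitrary and assume the mesh loaded above):

import numpy as np

points = np.array([[0.0, 0.0, 0.0],
                   [10.0, 10.0, 10.0]])
inside = mesh.contains(points)  # boolean array, True where a point lies inside the mesh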
Small typo in the answer above: trimesh's mesh object has no matrix attribute; you get it from the VoxelGrid, so
mat = mesh.matrix
should be
mat = volume.matrix
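If you need the 0/1 volume array asked about in the question, the boolean matrix can be cast directly (a small sketch assuming the trimesh code above ran; depending on the trimesh version you may also need to fill the interior, e.g. with the VoxelGrid's fill method, so the voxels represent a solid rather than just the surface):

import numpy as np

occupancy = volume.matrix.astype(np.uint8)  # 1 where a voxel is occupied, 0 elsewhere
print(occupancy.shape)                      # the x*y*z grid at the chosen pitch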

Python 2.7 time series with non-numeric values

I am using Python 2.7 and need to draw a time series using the matplotlib library. My y-axis data is numeric and everything is OK with it.
The problem is my x-axis data, which is not numeric, and matplotlib does not cooperate in this case. It does not draw the time series, even though the non-numeric values should not affect the correctness of the plot: the x-axis data is already arranged in a given order, and its order does not affect anything logically.
For example, let's say the x data is ["i","like","python"] and the y-axis data is [1,2,3].
I did not add my code because I've found that the code is OK; it works if I change the data to all-numeric data.
Please explain how I can use matplotlib to draw the time series without converting the x values to something numeric.
I've based my matplotlib code on following answers: How to plot Time Series using matplotlib Python, Time Series Plot Python.
Matplotlib requires some way of positioning those labels. See the following example:
import matplotlib.pyplot as plt

x = ["i","like","python"]
y = [1,2,3]

plt.plot(y, y)    # y, y because both are numeric (you could also create an xt = [1, 2, 3])
plt.xticks(y, x)  # same here; the second argument holds the labels
plt.show()
This results in a plot with the string labels along the x axis. Notice how I've put the labels there but had to somehow say where they are supposed to be.
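The same idea with an explicit list of positions, in case you prefer not to reuse y for both purposes (a minimal sketch along the same lines):

import matplotlib.pyplot as plt

x = ["i", "like", "python"]
y = [1, 2, 3]

positions = range(len(x))  # numeric positions for the string labels
plt.plot(positions, y)     # plot against the positions
plt.xticks(positions, x)   # put the string labels at those positions
plt.show()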
I also think you should post part of your code so that it's easier for other people to make suggestions.

DSP: audio processing: square or log to leverage FFT?

Context:
I am discovering the vast field of DSP. Yes, I'm a beginner.
My goal:
Apply an FFT to an audio array given by audiolab in order to get the different frequencies of the signal.
Question:
I just cannot figure out what to do with a numpy array that contains audio data obtained through audiolab:
import numpy as np
from scikits.audiolab import Sndfile

f = Sndfile('first.ogg', 'r')

# Sndfile instances can be queried for the audio file meta-data
fs = f.samplerate
nc = f.channels
enc = f.encoding
print(fs, nc, enc)

# Reading is straightforward
data = f.read_frames(10)
print(data)
print(np.fft.fft(data))
Now I have my data.
Readings
I read these two nice articles:
Analyze audio using Fast Fourier Transform (the accepted answer is wonderful)
and
http://www.onlamp.com/pub/a/python/2001/01/31/numerically.html?page=2
Now there are two techniques: apparently one suggests squaring (first link), whereas the other suggests a log, specifically 10*log10(abs(1.10**-20 + value)).
Which one is the best?
SUM UP:
I would like to get the Fourier analysis of my array, but both of those answers seem to only emphasize the signal rather than isolate its components.
I may be wrong, I am still a noob.
What should I really do then?
Thanks,
UPDATE:
I asked this question, which is related to this one:
DSP - get the amplitude of all the frequencies.
Your question seems pretty confused, but you've obviously tried something, which is great. Let me take a step back and suggest an overall route for you:
1. Start by breaking your audio into chunks of some size, say N.
2. Perform the FFT on each chunk of N samples.
3. THEN worry about displaying the data as RMS (the square approach) or dB (the log-based approach).
Really, you can think of those values as scaling factors for display.
If you need help with the FFT itself, my blog post on pitch detection with the FFT may help: http://blog.bjornroche.com/2012/07/frequency-detection-using-fft-aka-pitch.html
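As a rough illustration of those steps (a minimal sketch, not taken from the answer above; the chunk size, the use of rfft and the tiny offset that avoids log(0) are all arbitrary choices):

import numpy as np

def chunked_spectra(samples, chunk_size=1024):
    # Split the signal into non-overlapping chunks and return one
    # dB-scaled magnitude spectrum per chunk.
    spectra = []
    for start in range(0, len(samples) - chunk_size + 1, chunk_size):
        chunk = samples[start:start + chunk_size]
        magnitude = np.abs(np.fft.rfft(chunk))  # positive-frequency magnitudes
        db = 20 * np.log10(magnitude + 1e-20)   # dB scale; the offset avoids log(0)
        spectra.append(db)
    return spectra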
Adding to the answer given by @Bjorn Roche.
Here is a simple piece of code for plotting a frequency spectrum, using a dB scale.
It uses matplotlib for plotting.
import numpy as np
import pylab

# for a real signal
def plotfftspectrum(signal, dt):  # dt is the sample spacing (1/sample rate)
    n = signal.size
    spectrum = np.abs(np.fft.fft(signal))
    spectrum = 20*np.log10(spectrum/spectrum.max())  # dB scale
    frequencies = np.fft.fftfreq(n, dt)
    pylab.plot(frequencies[:n//2], spectrum[:n//2])  # plot only n//2 points due to real-signal symmetry
    pylab.show()
You can use it after reading at least some samples of your data, e.g. 1024:
data = f.read_frames(1024)
plotfftspectrum(data, 1./f.samplerate)
where f.samplerate is the sample rate in Hz, so 1./f.samplerate is the sample spacing.

Python - How to transform counts into m/s using the obspy module

I have a MiniSEED file with a single-channel trace, and I assume the data is in counts (how can I check the units of the trace?). I need to transform this into m/s.
I already checked the ObsPy tutorial, and my main problem is that I don't know how to access the poles, zeros and amplification factor from the MiniSEED file.
Also, do I need the calibration file for this?
Here is my code:
from obspy.core import *
st=read('/Users/guilhermew/Documents/Projecto/Dados sismicos 1 dia/2012_130_DOC01.mseed')
st.plot()
Thanks in advance,
Guilherme
EDIT:
I finally understood how to convert the data. Obspy has different ways to achieve this, but it all comes down to removing the instrument response from the waveform data.
Like @Robert Barsch said, I needed another file to get the instrument response metadata.
So I came up with the following code:
from obspy.xseed import Parser    # import paths may differ between ObsPy versions
from obspy.signal import seisSim

parser = Parser("dir/parser/file")
for tr in stream_aux:
    stream_id = tr.stats.network + '.' + tr.stats.station + '..' + tr.stats.channel
    paz = parser.getPAZ(stream_id, tr.stats.starttime)
    df = tr.stats.sampling_rate
    tr.data = seisSim(tr.data, df, paz_remove=paz)
I'm using the seisSim function to convert the data.
My problem now is that the output doesn't look right (but I can't seem to post the image).
This is clearly a question which should be asked of the seismology community and not on Stack Overflow! How about you write to the ObsPy user mailing list?
Update: I still feel the answer is that he/she should ask directly on the ObsPy mailing list. However, in order to give a proper answer to the actual question: MiniSEED is a data-only format which does not contain any meta information such as poles and zeros or the unit used. So yes, you will need another file such as RESP, SAC PAZ, dataless SEED, full SEED, etc. in order to get the station-specific metadata. To apply your seismometer correction read http://docs.obspy.org/tutorial/code_snippets/seismometer_correction_simulation.html
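For illustration, a minimal sketch of what the linked tutorial describes, assuming a reasonably recent ObsPy and that you have the matching metadata as a StationXML file (the file names here are placeholders):

from obspy import read, read_inventory

st = read("2012_130_DOC01.mseed")
inv = read_inventory("station_metadata.xml")     # StationXML with the poles, zeros and sensitivity

st.remove_response(inventory=inv, output="VEL")  # counts -> velocity in m/s
st.plot()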
To get it in real-life units rather than counts, you need to remove the instrument response. I remove the instrument response using this code:
# Define math defaults
from __future__ import division  # allows real division without rounding

# Retrieve modules needed
from obspy.core import read
import numpy as np
import matplotlib.pyplot as plt

#%% Choose and import data
str1 = read(fileloc)  # fileloc: path to your data file
print(str1)           # show imported data
print(str1[0].stats)  # show stats for trace

#%% Remove instrument response
# create dictionary of poles and zeros
TrillC = {'gain': 800.0,
          'poles': [complex(-3.691000e-02, 3.712000e-02),
                    complex(-3.691000e-02, -3.712000e-02),
                    complex(-3.739000e+02, 4.755000e+02),
                    complex(-3.739000e+02, -4.755000e+02),
                    complex(-5.884000e+02, 1.508000e+03),
                    complex(-5.884000e+02, -1.508000e+03)],
          'sensitivity': 8.184000E+11,
          'zeros': [0, -4.341E+02]}

str1_remres = str1.copy()  # make a copy of the data, so the original isn't changed
str1_remres.simulate(paz_remove=TrillC, paz_simulate=None, water_level=60.0)
print("Instrument Response Removed")

plt.figure()
str1_remres_m = str1_remres.merge()
plt.plot(str1_remres_m[0].data)  # only plots the first trace of the stream
As you can see, I have manually defined the poles and zeros. There is probably a way to input them automatically, but this was the way I found that worked.
Remember that each instrument has different poles and zeros.
The number of zeros you use depends on what you want your output to be. Seismometers normally output velocity (2 zeros):
3 zeros = displacement
2 zeros = velocity
1 zero = acceleration

Reconstructing the original data from detrended data -- Python

I have obtained detrended data from the following Python code:
Detrended_Data = signal.detrend(Original_Data)
Is there a function in Python with which "Original_Data" can be reconstructed using "Detrended_Data" and some "correction factor"?
Are you referring to scipy.signal.detrend? If so, the answer is no -- there is no (and can never be an) un-detrend function. detrend maps many arrays to the same array. For example,
import numpy as np
import scipy.signal as signal
t = np.linspace(0, 5, 100)
assert np.allclose(signal.detrend(t), signal.detrend(2*t))
If there were an undetrend function, it would have to map signal.detrend(t) back to t, and also map signal.detrend(2*t) back to 2*t. That's impossible, since signal.detrend(t) is the same array as signal.detrend(2*t).
I guess you could use numpy to re-fit a trend to your data. That wouldn't properly give you the original data, but it would make it less 'noisy'.
Read this question, as it goes into this in much more detail.
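To make that concrete: if you still have the original data (or know the trend that was removed), you can keep the trend yourself and add it back later. This is only a sketch of that idea; it does not recover the original from the detrended array alone:

import numpy as np
from scipy import signal

t = np.linspace(0, 5, 100)
original = 3.0 * t + np.random.randn(100)  # data with a linear trend plus noise

detrended = signal.detrend(original)       # least-squares linear trend removed
trend = original - detrended               # the trend that was removed

reconstructed = detrended + trend          # exact, but only because we kept the trend
assert np.allclose(reconstructed, original)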
