Issues in OFDM signal transmission - python

I am working on signal transmission and reception using OFDM, and I have run into the following issues:
If a channel model block is used, the receiver recovers what was transmitted, but when transmitting and receiving over a HackRF it reports missing data packets or detected parser values.
The SNR value is always low on both the transmitter and receiver sides.
There is always a shift and duplication of the constellation points after the Repack Bits block.
Does anyone have suggestions or previous experience with something like this?
Any help will be appreciated. Thanks in advance.
Transmitter flow graph:
Constellation plot:
Transmitted signal FFT:

Related

Decode serial strings from recorded voltage signal

I used an Arduino (a Teensyduino) to intermittently print strings over Serial. These strings are basically integers ranging from 1 to 1000,
e.g.
Serial1.print("456");
delay(1000);
Serial1.print("999");
At the same time, I directly record the voltage output from the serial transmission pin, using some data acquisition system sampling at 30000 Hz. The voltage recording occurs over the span of an hour, where multiple strings are printed at random times during the hour. These strings are printed typically several seconds apart.
I have a recording of the entire voltage signal across an hour, which I will analyse offline in Python. Given this vector of 0-5V values across an hour, how do I detect all occurrences of strings printed, and also decode all the strings from the voltage values? e.g. retrieve '456' and '999' from the example above
Okay, if you want to do it from scratch, you're doing this wrong.
The first thing you need to know is the transmission protocol. If you can transmit whatever you want from the Teensy, then you've got yourself what is called an oracle, and you're already halfway to the goal: start transmitting different bit sequences (0xFF, 0xF0, 0x0F, 0x00) and see what gets transmitted along the line, and how. Since the Teensy is almost certainly using straight 9600 8N1, you are now exactly at this stage (you could reproduce the oscilloscope picture from the voltage data if you wanted).
Read those answers and you'll have the rest of the road to working Python code that translates voltage levels into bits and then into characters.
If you don't have an oracle, it gets more complicated. My own preference in that case would be to get a pliable Teensy all for myself and do the first part there. Otherwise, you have to first read the post above, then work it backwards from the data recordings, which will be much more difficult.
In a pinch, in the oracle scenario, you could even send yourself all codes from '0' to '9' - or from 0x00 to 0xFF, or from '0000' to '9999' if that's what it takes - then use a convolution to match those codes against whatever's on the wire, and that would get you the decoded signal without even knowing what protocol was used. (I did it once, and can guarantee that it can be done. It was back in the middle ages and the decoder ran on an 80286, so it took about four or five seconds to decode each character's millisecond-long burst with a C program. Nowadays you could do it in real time, I guess.)
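To make the 9600 8N1 approach concrete, here is a minimal sketch of a sampled-UART decoder for Python, assuming an idle-high 0-5 V line recorded at 30000 Hz (the frame layout is standard 8N1; the function name and threshold are my own choices):

```python
def decode_uart(samples, fs=30000.0, baud=9600.0, thresh=2.5):
    """Decode an idle-high 0-5 V recording of a UART TX line (8N1).

    Finds each start bit (first sample below the threshold), samples the
    eight data bits at their centres (LSB first), checks the stop bit,
    and returns the decoded ASCII text.
    """
    sps = fs / baud                      # samples per bit (~3.125 here)
    high = [v > thresh for v in samples]
    out = bytearray()
    i, n = 0, len(samples)
    frame = int(10 * sps)                # start + 8 data + stop bits
    while i < n - frame:
        if not high[i]:                  # falling edge: start bit begins
            byte = 0
            for b in range(8):           # sample data bit centres, LSB first
                if high[i + int((1.5 + b) * sps)]:
                    byte |= 1 << b
            if high[i + int(9.5 * sps)]: # accept only if stop bit is valid
                out.append(byte)
            i += frame                   # skip past this frame
        else:
            i += 1
    return out.decode("ascii", errors="replace")
```

Note that 30000 Hz against 9600 baud gives only about three samples per bit, so sampling at the bit centres matters; a higher recording rate would make this considerably more robust.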

How to send thousands of data points in a single LoRaWAN packet?

I am currently running a simulation that produces 200k points each second. I want to send this in as close to real time as possible, with minimal delay. The problem is that sending a packet over LoRaWAN has delays, and some packets are not delivered, which is natural.
How can I send these 200k points in a single packet? For example, after 1 second I will send all data (200k points) to the network, in a packet.
BTW, I am using Python.
Your use case is not one for LoRaWAN, which is meant for low-data-rate, low-power applications over wide areas. Even at a single byte per point (and by the name I must assume each point is larger than that), 200k points every second is a data transfer of at least 720 MB per hour. That is way too much.
This is never going to work; you need to move to WiFi/Bluetooth for those kinds of transfers, but your range is going to decrease dramatically.
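The arithmetic behind that estimate can be checked directly; a quick sketch, assuming an optimistic one byte per point and rough LoRaWAN upper bounds (~50 kbit/s for the fastest FSK data rate, ~242-byte maximum payload depending on region and data rate):

```python
points_per_second = 200_000
bytes_per_point = 1                       # optimistic: real samples are larger

payload_rate = points_per_second * bytes_per_point   # bytes per second
per_hour_mb = payload_rate * 3600 / 1e6              # MB transferred per hour

required_kbps = payload_rate * 8 / 1000              # needed link bitrate
lorawan_peak_kbps = 50        # roughly the fastest LoRaWAN data rate (FSK)
max_payload = 242             # bytes; region/data-rate dependent

print(per_hour_mb)                                   # 720.0
print(required_kbps / lorawan_peak_kbps)             # 32.0 (times over budget)
print(-(-points_per_second // max_payload))          # 827 packets every second
```

Even ignoring duty-cycle limits, the stream needs 32 times the peak LoRaWAN bitrate and over 800 maximum-size packets per second, which is why a single-packet approach cannot work.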

StdOut value meanings from ZMQ pub/sub from GnuRadio?

I am working with a remote source to get data from a 1 MHz bandwidth. Once I have isolated a peak in the FFT plot like this, I send that peak data through a ZMQ Pub Sink to a separate computer, where I decode the bytes into a string using this code. When I run the code, I get a series of numbers separated by periods, but I don't know what those output numbers mean. Could anyone help me out with this? Thank you!!
This is a picture of the plain output from the low pass filter block and this is the output when I push the peak detector data.
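For what it's worth, GNU Radio's ZMQ PUB Sink publishes raw sample bytes, not text, so decoding a message as a string produces meaningless digits. A minimal sketch of unpacking a message as little-endian float32 samples instead (the endpoint address in the comments is hypothetical):

```python
import struct

def bytes_to_floats(raw):
    """Reinterpret a raw ZMQ message as little-endian float32 samples,
    the wire format of a GNU Radio float stream. A complex stream is
    the same bytes, with samples arriving in (real, imag) pairs."""
    count = len(raw) // 4
    return list(struct.unpack("<%df" % count, raw[:count * 4]))

# Receiving side (needs pyzmq; the endpoint below is hypothetical):
# import zmq
# ctx = zmq.Context()
# sub = ctx.socket(zmq.SUB)
# sub.connect("tcp://sender-host:5555")
# sub.setsockopt(zmq.SUBSCRIBE, b"")
# samples = bytes_to_floats(sub.recv())
```

The periods in the current output are almost certainly the decimal points of IEEE-754 bytes mangled through a text decode, so unpacking as binary floats should recover the actual peak values.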

Recovering Real PSK

I am trying to transmit and receive a BPSK signal from an Ettus Research N210 to an Ettus Research B200. I run my received signal through gain control, clock sync, and a PLL, then try to demodulate the signal.
Here is my flow graph.
In simulation (passing the signal through a channel block instead of transmitting from one radio to the other), this flowchart works fine. Below are the results of the simulation. As you can see, the receiver sees the rotated constellation and the processing corrects for this. Everything is fine and the packets are successfully decoded.
However, when I transmit and receive from my two real radios, I no longer receive signals that resemble 2-PSK. Instead, the constellation plots of the RX signal just look like blobs.
Here is my flow graph again with the USRP blocks un-commented.
And here are the results of the transmission and receive.
I am very confused by the lack of a constellation pattern in the received signal. Sometimes when I send a packet, the RX constellation takes on a more orderly, oval-looking shape, but it does not look like a line. Unfortunately, I was unable to capture the oval pattern in a screenshot since it returns to the blob pattern very quickly.
I do not think this is a hardware issue because I have successfully used these radios before for UHF GMSK stuff.
Is there something wrong with my timing recovery / processing?
Thanks in advance, y'all, for any and all help.
Found the issue: I had set my sampling rate lower than the USRP's minimum sampling rate. After a day of frustration, I changed the sampling rate to 320k and tweaked a few things in my processing block, and now everything works and I get a nice-looking constellation.
Here are my updated (working) flow graph and plots.
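As a guard against that same mistake, the requested rate can be checked against the device's supported range before streaming; a minimal sketch (the UHD calls in the comments are from the UHD Python API and are shown untested, since they need hardware):

```python
def clamp_sample_rate(requested, hw_min, hw_max):
    """Clamp a requested sample rate into the hardware's supported range,
    reporting the adjustment instead of silently streaming garbage."""
    clamped = min(max(requested, hw_min), hw_max)
    if clamped != requested:
        print("requested %g S/s is outside [%g, %g]; using %g"
              % (requested, hw_min, hw_max, clamped))
    return clamped

# With a real device (hypothetical usage, UHD Python API):
# import uhd
# usrp = uhd.usrp.MultiUSRP()
# rates = usrp.get_rx_rates()                 # meta-range of legal rates
# usrp.set_rx_rate(clamp_sample_rate(200e3, rates.start(), rates.stop()))
```

Requesting a rate below the device minimum is exactly the failure mode described above, and the USRP will otherwise accept it quietly and resample to something you did not ask for.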

Remove noise from Load Cell ADC Reading

I have connected my load cell to a Raspberry Pi through a TI ADS1231 ADC, and I use Python code on the Pi to take readings at 80 samples/sec (80 Hz). As you know, the ADC signal contains spikes and noise, which I want to remove to get a stable reading. I have read many articles regarding the Fourier transform and various other articles on removing noise from a signal.
Can anybody guide me on using a DSP method in Python for my application? I don't know much about DSP or how to implement it. Any Python library or example code would help.
Thanks in advance.
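Since the disturbances are isolated spikes, a sliding median filter is usually the first thing to try: unlike a moving average, it rejects outliers outright instead of smearing them into neighbouring readings. A minimal pure-Python sketch (the function name is my own; at 80 S/s a small window adds negligible lag):

```python
from statistics import median

def despike(samples, window=5):
    """Sliding median filter: replace each reading with the median of the
    surrounding window. An isolated spike never reaches the output,
    because a single outlier cannot become the median of an odd-sized
    window. Edges are padded by repeating the first/last reading."""
    half = window // 2
    padded = [samples[0]] * half + list(samples) + [samples[-1]] * half
    return [median(padded[i:i + window]) for i in range(len(samples))]
```

For the residual low-level noise left after despiking, a simple moving average or a first-order IIR low-pass (`y += alpha * (x - y)`) on top works well, and `scipy.signal.medfilt` does the same job as the sketch above if SciPy is available.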