I'm trying to set up a ROS Action server & client to handle sending images (encoded as base64 strings) between Python and ROS (with the goal of making the image something other scripts can pull from ROS). Being very new to all of this (Python, Ubuntu, Bash, ROS, etc.), I'm having a very difficult time determining HOW exactly to do this. I think part of the reason is that the ROS wiki tutorials/documentation are linear to a fault, and the process just comes across as convoluted and extraordinarily complicated. Does anyone out there know of any non-ROS-wiki tutorials to help me figure this out? Or can you create a concise step-by-step guide to establishing this system? I've been unable to find much of anything relating to this topic specifically, which makes me think it's either a very uncommon use case, or it's super easy and I'm just not at that level yet.
My attempt at a solution is essentially just getting the information flow down. I want Python to be able to read in an image, convert it to bytes (using b64encode), and send it over to ROS to publish as an action. (Thus, a stream of images can be sent without the request/response pauses a service would impose, if I understand correctly.) Anything subscribed to the node (or server, however that works; I'll figure it out when I get there) can then see the images and pull them from the action server.
Now, I'm being told an action is the best way to do this. Personally, I don't see why a service wouldn't suffice (and I've at least gotten one of those to work).
Thanks again for any help you all can provide!
Edit: The end application here is for video streaming. A server will grab live video, convert it to images, change those to byte strings, and stream them to the client, which will then publish them to a ROS Action Server.
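For reference, the read-and-encode step I'm describing is only a few lines (the file name is a placeholder):

```python
import base64

# Read one captured frame from disk; "frame.jpg" is a hypothetical name.
with open("frame.jpg", "rb") as f:
    raw = f.read()

encoded = base64.b64encode(raw)      # bytes -> base64-encoded bytes
as_string = encoded.decode("ascii")  # decode to str if the transport wants text
```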
I think you're overcomplicating it. I wouldn't implement it as an actionlib server, although that is one approach. I've created a few similar systems, and this is how I structured them:
Write your node that streams video by publishing images on a topic. You can implement this as an actionlib server, but that's not required. In my case, I used the pre-existing raspicam_node to read a Raspberry Pi's camera. If you want to implement this in Python, read the tutorial on creating a publisher.
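As a rough sketch of that publisher (this isn't the tutorial's exact code; the topic name and the `grab_jpeg_bytes()` helper are hypothetical):

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import CompressedImage

def publish_frames():
    rospy.init_node('image_streamer')
    pub = rospy.Publisher('/camera/image/compressed', CompressedImage,
                          queue_size=10)
    rate = rospy.Rate(30)  # target ~30 fps
    while not rospy.is_shutdown():
        msg = CompressedImage()
        msg.header.stamp = rospy.Time.now()
        msg.format = 'jpeg'
        msg.data = grab_jpeg_bytes()  # hypothetical: returns one JPEG frame as bytes
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    publish_frames()
```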
Create a node that subscribes to your image topic, and then reads the image from the topic message. Again, the tutorial shows how to create a subscriber. The main difference is that you'd use either CompressedImage or Image from sensor_msgs.msg as your message type.
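And the matching subscriber, again only a sketch, using the same hypothetical topic name:

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import CompressedImage

def on_image(msg):
    # msg.data holds the raw JPEG bytes; hand them to whatever needs them
    rospy.loginfo('got frame: %d bytes', len(msg.data))

rospy.init_node('image_listener')
rospy.Subscriber('/camera/image/compressed', CompressedImage, on_image)
rospy.spin()  # block and let callbacks run
```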
For the subscriber side, here's an example Python ROS node I wrote that implements an MJPEG streamer. It subscribes to a topic, reads the image data, and re-publishes it via a streaming HTTP response. Even though it's "slow Python", I get less than 1 second of latency.
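My node itself isn't reproduced here, but the core of the technique is a multipart/x-mixed-replace HTTP response. A minimal sketch using Flask rather than my original code, where `next_frame()` is a hypothetical hook returning the latest JPEG bytes from the subscriber:

```python
from flask import Flask, Response

app = Flask(__name__)

def mjpeg_generator():
    while True:
        jpeg = next_frame()  # hypothetical: latest JPEG bytes from the ROS subscriber
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + jpeg + b'\r\n')

@app.route('/stream')
def stream():
    # multipart/x-mixed-replace makes the browser replace the image
    # with each new part, which is all MJPEG really is
    return Response(mjpeg_generator(),
                    mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)
```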
I'm working on a project that requires a control interface in the browser. Because some controls are proportional and of very high resolution, it would be best to have a constant stream of data without the need for requests. After some research it appears I need to use a websocket (if there is a better way, that would be an equally useful answer), but the MicroPython websocket module appears completely undocumented and, in fact, not baked into my build. Does anyone know of a better module that has some really simple examples for an idiot like me?
You might be able to use EventSource. It lets your server send a stream of data chunks over a single long-lived connection, which is easy to handle in JavaScript (using the EventSource class).
Check out this example in picoweb.
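As a rough idea of the shape (a sketch, assuming picoweb's start_response accepts a content_type argument, and with `read_controls()` as a hypothetical stand-in for your own data source):

```python
# Server-sent events in picoweb (MicroPython) -- a hedged sketch.
import uasyncio
import picoweb

app = picoweb.WebApp(__name__)

@app.route("/events")
def events(req, resp):
    # text/event-stream is what the browser-side EventSource expects
    yield from picoweb.start_response(resp, content_type="text/event-stream")
    while True:
        # each chunk follows the SSE wire format: "data: <payload>\n\n"
        yield from resp.awrite("data: %s\n\n" % read_controls())
        yield from uasyncio.sleep(1)

app.run(debug=True)
```

On the browser side, `new EventSource("/events")` then fires an onmessage event for each data: chunk.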
PS: EventSource has some "icing on the cake" features, including automatic reconnect and replay. Discovering these is left as an exercise for the reader XD
I'm working on a project where one client needs to take several snapshots from a camera (i.e. it's actually taking a short-duration video, hence a stream of frames), then send all the images to a server, which performs some processing on them and returns a result to the client.
Client and server both run Python 3 code.
The critical part is the image sending one.
Some background first: images are 640×480 JPEG files. JPEG was chosen as a default; lower-quality encodings can be selected as well. The frames are captured in sequence by a camera, so we have roughly 600 frames to send. Frame size is around 110 KiB.
The client is a Raspberry Pi 3 Model B+. It sends the frames via Wi-Fi to a 5c server. Server and client both reside in the same LAN for the prototype version, but future deployments might be different, both in terms of connectivity medium (wired or wireless) and area (LAN or metro).
I've implemented several solutions for this:
Using Python sockets on the server and the client: I either send one frame directly after each frame capture, or I send all images in sequence after the whole stream capture is done (see the sketch after this list).
Using GStreamer: I launch a GStreamer endpoint on my client and send the frames to the server as I stream. I capture the stream on the server side with OpenCV compiled with GStreamer support, then save the frames to disk.
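For illustration, the socket variant boils down to length-prefixing each JPEG so the receiver knows where one frame ends and the next begins; a sketch (host/port are placeholders):

```python
import socket
import struct

def send_frames(frames, host='192.168.0.10', port=5000):
    """Send an iterable of JPEG byte strings, each prefixed with its length."""
    with socket.create_connection((host, port)) as sock:
        for jpeg in frames:
            sock.sendall(struct.pack('>I', len(jpeg)))  # 4-byte big-endian length
            sock.sendall(jpeg)

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('socket closed mid-frame')
        buf += chunk
    return buf

def recv_frame(sock):
    """Server side: read one length-prefixed frame."""
    (length,) = struct.unpack('>I', recv_exact(sock, 4))
    return recv_exact(sock, length)
```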
Now, the issue I'm facing is that even though both solutions work 'well' (they get the 'final' job done, which is to send data to a server and receive a result based on some remote processing), I'm convinced there is a better way to send a large amount of data to a server, using either the Python socket library or other available tools.
All the personal research I've done on the matter led me either to solutions similar to mine, using Python sockets, or to ones that were out of scope (relying on backends other than pure Python).
By a better way, I mean:
A solution that saves bandwidth as much as possible.
A solution that sends all data as fast as possible.
For 1, I slightly modified my first solution to archive and compress all captured frames into a .tgz file that I send over to the server. It does decrease the bandwidth usage, but it also increases the time spent on both ends (due to the compression/decompression steps). This is especially noticeable when the dataset is large.
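The archiving step itself is only a few lines; note that tarfile lets you trade compression ratio for speed via compresslevel, which helps with the time cost mentioned above:

```python
import tarfile

# Pack all captured frames into one gzip-compressed archive before sending.
# 'frames/' and the output name are hypothetical; compresslevel=1 favours
# speed over ratio (JPEG data is already compressed, so gains are modest).
with tarfile.open('capture.tgz', 'w:gz', compresslevel=1) as tar:
    tar.add('frames/', arcname='frames')
```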
For 2, GStreamer gives me a negligible delay between the capture and the reception on my server. However, I get no compression at all, and for the reasons stated above I cannot really use this library for further development.
How can I send a large number of images from one client to one server with minimal bandwidth usage and delay in Python?
If you want to transfer images as frames, you can use an existing app like MJPEG-Streamer, which encodes images from a webcam interface as JPEGs, reducing the image size. If you need a more robust transfer with more advanced encoding, you can use Linux tools like FFmpeg, whose streaming support is documented here.
If you want a lighter implementation and full control of the stream from your own code, you can use a web framework like Flask and transfer your images directly over HTTP. You can find a good example here.
If you don't want to stream, you can convert a whole set of images into a video with an encoded format like h264 and then transfer the bytes over the network. You can use OpenCV to do this.
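A sketch of that OpenCV route (codec availability depends on your OpenCV/FFmpeg build, so 'mp4v' is used here as a widely available fallback; swap in 'avc1' for h264 where supported):

```python
import cv2

def images_to_video(image_paths, out_path='out.mp4', fps=30):
    """Encode a list of image files into one video file."""
    first = cv2.imread(image_paths[0])
    height, width = first.shape[:2]
    fourcc = cv2.VideoWriter_fourcc(*'mp4v')  # try 'avc1' for h264 if available
    writer = cv2.VideoWriter(out_path, fourcc, fps, (width, height))
    for path in image_paths:
        writer.write(cv2.imread(path))
    writer.release()  # finalize the container
```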
There are also some good libraries written in Python, like pyffmpeg.
You can also restream the camera over the network using FFmpeg so that the client can read it either way; this will reduce delays.
I have a program that sniffs network data and stores it in a database using pcapy (based on this). I need to make the data available in real time over a network connection.
Right now, when I run the program, it starts a second thread for the sniffer and a Twisted server on the main thread; however, I have no idea how to get clients to 'tap into' the sniffer running in the background.
The end result should be that a client enters a URL and the connection is kept open until the client disconnects (even when there's nothing to send); whenever the server has network activity, the sniffer sniffs it and sends it to the clients.
I'm a beginner with Python, so I'm quite overwhelmed; if anyone could point me in the right direction, it would be greatly appreciated.
Without more information (a simple code sample that doesn't work as you expect, perhaps) it's tough to give a thorough answer.
However, here are two pointers which may help you:
Twisted Pair, an (unfortunately very rudimentary and poorly documented) low-level/raw-sockets networking library within Twisted itself, which may be able to implement the packet capture directly in a Twisted-friendly way, or
The recently-released Crochet, which will allow you to manage the background Twisted thread and its interactions with your pcapy-based capture code.
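To make the Crochet option concrete, here's a rough, untested sketch of the shape it might take (port and protocol details are placeholders): the reactor runs in a background thread managed by Crochet, and your pcapy loop in the main thread hands each captured packet to a reactor-side broadcast function.

```python
from crochet import setup, run_in_reactor
setup()  # starts the Twisted reactor in a background thread

from twisted.internet import reactor
from twisted.internet.protocol import Protocol, Factory

clients = []  # protocols of currently connected clients

class StreamProtocol(Protocol):
    def connectionMade(self):
        clients.append(self)

    def connectionLost(self, reason):
        clients.remove(self)

@run_in_reactor
def start_server(port=8000):
    reactor.listenTCP(port, Factory.forProtocol(StreamProtocol))

@run_in_reactor
def broadcast(data):
    # runs on the reactor thread, so it's safe to touch the client list here
    for client in clients:
        client.transport.write(data)

start_server()
# ...then, inside your pcapy capture callback:
#     broadcast(packet_bytes)
```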
I want to be able to schedule delivery of a lightweight message from a server to a client. This is new territory to me so I'd appreciate some advice on the possible approaches available.
The client is running on a Raspberry Pi using node.js (because I'm using node libraries to control a piece of attached hardware). Eventually there will be multiple clients like it.
The server could be anything, though I'm most familiar with Python, django and node.
I want to be able to access the server from a browser and cause it to schedule a future message to the client, effectively a push notification with a tiny bit of data.
I'm looking at pub-sub and messaging systems to do this; I started writing a system that uses node on both ends and sockets, but the approach I want is more fire-and-forget occasional messages, not constant real-time data exchange. I'm also not a huge fan of node-cron-style scheduling; I'd like to be able to retrieve and alter scheduled events, and it felt somewhat heavy-handed to layer this on top of a cron system.
My current solution uses Python on the server (so I can write a Django web interface) with Celery and RabbitMQ, using a named queue per client. The client subscribes to that specific queue using node-amqp, and off we go. This also lets me create queues that multiple clients can subscribe to, which is a neat bonus.
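For illustration, the publish step underneath that setup is tiny when done directly against RabbitMQ with pika; this sketch leaves Celery's scheduling out, and the queue naming is hypothetical:

```python
import json
import pika

def notify_client(client_id, payload):
    """Publish one small message to a per-client queue."""
    conn = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = conn.channel()
    queue = 'client.%s' % client_id       # hypothetical naming convention
    channel.queue_declare(queue=queue, durable=True)
    channel.basic_publish(exchange='',
                          routing_key=queue,
                          body=json.dumps(payload))
    conn.close()
```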
This answer makes me think I'm doing the right thing -- but as I'm new to this stuff, it feels like I might be missing something. Are there alternatives I should consider in the world of server-client messaging?
Since you are already using Python, you could take a look at Python Remote Objects (Pyro).
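A minimal Pyro4 sketch of the idea, assuming the client exposes a callable object and the server pushes to it when a message is due (no name server here; the URI is passed by hand):

```python
import Pyro4

# --- client side (e.g. on the Raspberry Pi) ---
@Pyro4.expose
class MessageTarget(object):
    def notify(self, payload):
        # invoked remotely by the server when a scheduled message fires
        print("got message:", payload)

daemon = Pyro4.Daemon(host="0.0.0.0")
uri = daemon.register(MessageTarget())
print("client listening at", uri)  # hand this URI to the server somehow
daemon.requestLoop()

# --- server side, given the client's URI ---
# target = Pyro4.Proxy(uri_from_client)
# target.notify({"msg": "hello"})
```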
I would like to play around with coding an application that could capture a desktop or section of a screen (height and width variables for resolution) and stream those to an RTMP server (rtmp://server.com/live).
I saw something called rtmplite, but the description of this package is:
"This is a python implementation of the Flash RTMP server"
So I would ultimately like to achieve the following, but will implement it in pieces as I go along, without getting overwhelmed at the project scope:
Make a connection to an RTMP server (with authentication where needed) for a channel on ustream.com, justin.tv/twitch.tv, own3d.tv, etc.
Ability to select a height/width region of the desktop (or the entire desktop) and stream it live to that channel, as if I were using Flash Media Live Encoder.
Really I just want to make my own Python-based FMLE or Xsplit application so I can stream live on my own without using those applications.
Any libraries or information you can point me to that explain this FMLE-clone type of process would be helpful! Thanks
I did some RTMP streaming from Python for the wiidiaplayer project: http://wiidiaplayer.org. This is by no means a full solution, but at least some RTMP functionality has been implemented in Python there.
Unfortunately it has been a long time since I touched that code; if you have any questions, feel free to ask them, though I'm not sure how many answers I'll be able to provide.
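If pure Python isn't a hard requirement, a common shortcut is to do only the screen capture in Python and hand the encoding and RTMP handshake to an ffmpeg subprocess. A hedged sketch (mss, the frame size, and the stream key are assumptions, and ffmpeg must be installed separately):

```python
import subprocess
import mss

WIDTH, HEIGHT, FPS = 1280, 720, 30
RTMP_URL = 'rtmp://server.com/live/stream_key'  # stream key is a placeholder

# ffmpeg reads raw BGRA frames on stdin, encodes h264, and pushes RTMP/FLV
ffmpeg = subprocess.Popen([
    'ffmpeg',
    '-f', 'rawvideo', '-pix_fmt', 'bgra',          # mss frames are BGRA
    '-s', '%dx%d' % (WIDTH, HEIGHT), '-r', str(FPS),
    '-i', '-',                                     # read frames from stdin
    '-c:v', 'libx264', '-preset', 'veryfast', '-pix_fmt', 'yuv420p',
    '-f', 'flv', RTMP_URL,
], stdin=subprocess.PIPE)

with mss.mss() as sct:
    region = {'left': 0, 'top': 0, 'width': WIDTH, 'height': HEIGHT}
    while True:
        frame = sct.grab(region)       # one screenshot of the chosen region
        ffmpeg.stdin.write(frame.raw)  # raw BGRA pixel bytes
```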