Computer Vision for Drones using FlytAPIs (Part 1: Video Streaming)
A typical drone carries an onboard camera, and a key requirement is to transmit the live camera feed to ground devices. As a next step, we also want the drone to acquire images into an onboard processing framework and apply computer vision algorithms, or even simple image processing to enhance the raw image, and then send the processed image stream to ground devices as well. Once we have the video stream on our laptop or mobile, why not interact with it and send inputs back to the onboard computer? Welcome to FlytOS!
FlytOS is a ROS-based framework, and ROS enables a clean, modular way to process and share data. The FlytOS image pipeline is as follows:
The basic idea is to get all image data onto the ROS pub-sub bus. An image processing module can then subscribe to the raw images and publish processed images back to the bus. A web video server can subscribe to any of the available image topics and stream it to ground devices over the web as needed.
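The dataflow can be sketched with a minimal in-process pub-sub stand-in. This is plain Python rather than actual ROS nodes, and the topic names follow common ROS conventions but are purely illustrative:

```python
# Toy stand-in for the ROS pub-sub image pipeline: a camera driver publishes
# raw frames, a processing node subscribes to them and republishes processed
# frames, and a streaming node subscribes to whichever topic is requested.
from collections import defaultdict

class Bus:
    """Minimal pub-sub bus standing in for the ROS topic graph."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self.subscribers[topic]:
            cb(msg)

bus = Bus()
received = []

# Processing node: subscribe to raw images, publish processed ones back.
def process(frame):
    bus.publish("/image_processed", frame.upper())  # stand-in "enhancement"

bus.subscribe("/usb_cam/image_raw", process)

# Streaming node: subscribe to the processed topic for ground devices.
bus.subscribe("/image_processed", received.append)

# Camera driver: publish a raw frame onto the bus.
bus.publish("/usb_cam/image_raw", "frame-001")
```

In real ROS the same shape appears as `rospy.Subscriber`/`rospy.Publisher` pairs over `sensor_msgs/Image` messages; the decoupling is what lets the web video server tap any topic without touching the camera or processing code.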
The images are captured from a USB camera attached to the FlytPOD or ODROID and are published on ROS using a camera driver. The ROS-based web video server subscribes to an image topic and provides an MJPEG stream over HTTP. This stream can then be viewed in most modern browsers without any plugin or client software. FlytOS provides options through its web APIs to list the available stream topics and to start/stop a particular stream. You can also get a snapshot on demand. Further, you can interact with the image stream, e.g., select a region of interest and send that information back to the onboard computer. We have tested this setup and it works well, as can be seen in the demo video above.
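For viewing, the standard ROS web_video_server exposes streams at URLs of the form `http://<host>:8080/stream?topic=<image_topic>` (8080 is its default port). A ground-side page can build such a URL and embed it in an `<img>` tag. The drone IP and topic below are placeholders, and FlytOS may proxy these routes differently; see the Video Streaming API reference linked at the end for the exact endpoints:

```python
# Build a web_video_server MJPEG stream URL for a given image topic.
# The drone IP is a placeholder; 8080 is web_video_server's default port.
from urllib.parse import urljoin

DRONE = "http://192.168.0.1:8080/"  # hypothetical onboard address

def stream_url(topic):
    """Return the MJPEG stream URL for the given ROS image topic."""
    return urljoin(DRONE, "stream") + "?topic=" + topic

url = stream_url("/usb_cam/image_raw")
# A ground web page can embed this as <img src="...">; most modern
# browsers render an MJPEG stream without any plugin.
```

The same server also offers single-frame snapshots, which is how an on-demand snapshot API can be backed without interrupting the stream.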
However, there are several constraints to consider for drone video streaming. We should be able to:
- Capture images from the camera into the onboard processing framework (e.g. FlytOS) for image processing, as opposed to treating the camera as a payload that transmits images directly to the ground
- Transmit raw as well as processed images to ground
- Get a low-latency, real-time stream, since stale data may be unhelpful or sometimes even harmful
- Have relatively low power/processing requirements, suitable for an onboard computer
- Have relatively low bandwidth requirement
- Get good image/video quality
- Have adaptive streaming based on network bandwidth
The current setup in FlytOS serves the first two items well and reasonably tackles the next three. It is not yet focused on HD quality and does not have adaptive streaming. Additionally, we wanted to stream the video to a web browser without the need for client software or a plugin, for easy integration with web apps and cross-platform accessibility. A good web-oriented alternative with a real-time focus could be WebRTC; however, it does not yet properly support ARM Linux.
In any case, serving multiple parallel streams directly from the drone to several ground devices is probably not ideal, as it would strain the onboard computer and its network link. An alternative setup is to stream the images from the drone to a single computer on the ground. This ground device would act as a distribution server (preferably in the cloud) that can then stream in parallel to multiple client devices, and could also host a WebRTC setup. FlytOS will continue to improve its solution, and such a streaming architecture could be the next step.
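The distribution-server idea can be sketched as a simple fan-out relay: the drone pushes each frame once over its single uplink, and the relay duplicates it to every connected client. The class and in-memory queues below are purely illustrative, not a FlytOS API:

```python
# Sketch of a ground-side relay: one upstream feed from the drone is
# fanned out to many clients, keeping the load off the drone itself.
from queue import Queue

class Relay:
    def __init__(self):
        self.clients = []

    def connect(self):
        """A newly connected ground client gets its own frame queue."""
        q = Queue()
        self.clients.append(q)
        return q

    def push(self, frame):
        """Called once per frame arriving over the drone's single uplink."""
        for q in self.clients:
            q.put(frame)

relay = Relay()
a, b = relay.connect(), relay.connect()
relay.push("frame-001")
# One upstream frame reaches every client:
assert a.get() == "frame-001" and b.get() == "frame-001"
```

In a real deployment the queues would be WebRTC or HTTP connections, and the relay is where per-client adaptive bitrate could live, since it sees each client's bandwidth while the drone sends only one stream.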
Meanwhile, you can get started with FlytOS right away: test video streaming and try out the sample apps. We would love to hear your feedback and discuss your views on the next steps.
Note: The demo video above shows a sample of object detection and tracking, accomplished using an Object Tracking module in FlytOS. We will talk more about what is available and how it works in the next article.
Download FlytOS: http://flytbase.com/flytos/#download (get updated .deb if already downloaded)
Video Streaming API reference: http://docs.flytbase.com/docs/FlytAPI/REST_WebsocketAPIs.html#video-streaming-apis
Stay tuned for Part 2.