FlytBase Blog

Computer Vision for Drones using FlytAPIs (Part 2: Object Tracking and Following)

This is a follow-up to Part 1 of our computer vision for drones series. Cameras have become an integral part of autonomous drones, helping them navigate as well as providing application-specific insights, such as locating an object of interest. Detecting a desired object in the camera view lets the drone make decisions like following the object or orbiting around it. Such capabilities are useful in applications ranging from photo/video shoots to surveys and search-and-rescue missions.

The previous article (Part 1) gave a glimpse of the onboard object tracking module in FlytOS; this article delves into its details. The module uses relatively simple OpenCV-based algorithms to detect and track an object in the field of view, relying on attributes like color and shape for detection and a Kalman filter for tracking. It also supports the ROS-based OpenTLD library, which needs to be installed separately. Along with tracking, there are APIs to make the drone follow the object being tracked; following uses a PD controller and currently assumes a downward-looking camera.
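To make the color-detection step concrete, here is a minimal sketch of thresholding an HSV image and extracting the object centroid. It uses plain NumPy as a stand-in for the OpenCV calls (`cv2.inRange` plus centroid-from-moments) that the module itself would use; the function name and bounds are illustrative, not FlytOS internals.

```python
import numpy as np

def detect_colored_object(hsv_image, lower, upper):
    """Return the (x, y) centroid of pixels inside an HSV range, or None.

    hsv_image: H x W x 3 array; lower/upper: 3-element HSV bounds.
    Mirrors OpenCV's cv2.inRange followed by a centroid from image moments.
    """
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    # Boolean mask of pixels whose H, S and V all fall inside the range
    mask = np.all((hsv_image >= lower) & (hsv_image <= upper), axis=2)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # object not in view this frame
    return float(xs.mean()), float(ys.mean())
```

A detector like this is cheap enough to run per frame onboard; the raw centroid it returns is what the Kalman filter then smooths.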

[Image: FlytVision web app screenshot]

FlytVision is an onboard sample web app that provides an interface for ground devices to access onboard video streams; it has now been extended to include object tracking and following. It also demonstrates how seamlessly onboard image processing fits into the overall framework and allows for data plumbing with ground devices.

The first step is to stream the processed images from the object tracking module, then select a Detect/Track mode in the app. The currently available modes are:

  1. Color: uses HSV color, heuristics (change in the object's distance and area) and a Kalman filter for tracking
  2. Circle: uses HSV color, Hough circle detection and a Kalman filter for tracking
  3. TLD: uses the ros_open_tld_3d library, modified for integration with the object tracking module
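All three modes smooth the raw detections with a Kalman filter. As a rough illustration of that common step, here is a constant-velocity filter over pixel centroids, written in NumPy as a simplified stand-in for OpenCV's `cv2.KalmanFilter`; the noise values and frame rate are made up for the sketch.

```python
import numpy as np

class CentroidKalman:
    """Constant-velocity Kalman filter over (x, y) pixel centroids.

    State is [x, y, vx, vy]; measurements are raw detections. Noise
    levels q and r are illustrative, not tuned FlytOS values.
    """
    def __init__(self, dt=1.0 / 30, q=1.0, r=4.0):
        self.x = np.zeros(4)                # state estimate
        self.P = np.eye(4) * 100.0          # state covariance
        self.F = np.eye(4)                  # constant-velocity transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)               # we only measure x and y
        self.Q = np.eye(4) * q              # process noise
        self.R = np.eye(2) * r              # measurement noise

    def step(self, z):
        # Predict forward one frame
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        if z is not None:                   # update only when detected
            y = np.asarray(z) - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                   # filtered centroid
```

Because the predict step still runs when `z` is None, the filter coasts through frames where the detector briefly loses the object.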

The object of interest can be selected on the video stream itself. Depending on the selected mode, the corresponding attributes are detected and tracked in subsequent images. To follow the object, its distance from the image center is projected onto the ground plane and position setpoints are generated with a PD controller. The overall workflow:

[Image: object tracking and following workflow diagram]
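The follow step described above can be sketched as follows: project the pixel offset from the image center to metres on the ground (pinhole model, downward-looking camera), then run a PD law on that error. The field-of-view projection and the gains here are assumptions for the sketch, not the values FlytOS ships with.

```python
import math

def pixel_offset_to_setpoint(cx, cy, img_w, img_h, altitude,
                             hfov_deg, prev_err, dt,
                             kp=0.8, kd=0.2):
    """Project the object's offset from the image center to a ground-plane
    error (metres) and run a PD law to get a position setpoint step.

    Assumes a downward-looking camera with horizontal FOV hfov_deg;
    kp/kd are illustrative gains.
    """
    # Pixel offset from image center (+x right, +y down in the image)
    dx_px = cx - img_w / 2.0
    dy_px = cy - img_h / 2.0
    # Metres per pixel on the ground at this altitude (pinhole model)
    m_per_px = (2.0 * altitude *
                math.tan(math.radians(hfov_deg) / 2.0)) / img_w
    ex, ey = dx_px * m_per_px, dy_px * m_per_px
    # PD on the ground-plane error
    dex = (ex - prev_err[0]) / dt
    dey = (ey - prev_err[1]) / dt
    sx = kp * ex + kd * dex
    sy = kp * ey + kd * dey
    return (sx, sy), (ex, ey)   # setpoint step, and error for the next call
```

A real implementation would also rotate the error into the drone's local frame and apply the attitude compensation mentioned later, since without a gimbal the camera tilts with the airframe.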

The onboard modules and web app are first tested in simulation using FlytSim. While the parameters may need to be retuned for a real drone, simulation helps validate the algorithm and overall functionality before the drone even takes off.

[Video: FlytVision object tracking demo]

Several parameters of the onboard object tracking module have been exposed so that they can be tuned from the ground app for a given setup. These include HSV color ranges, Hough circle parameters, TLD parameters, controller gains, and options to toggle attitude compensation, tracking and follow modes on and off.
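Tuning one of these exposed parameters from a ground device goes through the FlytOS REST APIs. The sketch below only builds the request; the endpoint path, namespace and payload schema are assumptions for illustration, so consult the FlytAPI reference for the exact format.

```python
import json

def make_param_set_request(namespace, param_name, param_value):
    """Build a hypothetical param-set REST request for FlytOS.

    The URL path and payload field names here are illustrative; check
    the FlytAPI documentation for the real endpoint and schema.
    """
    # "<flytos-ip>" is a placeholder for the drone's address
    url = "http://<flytos-ip>/ros/{}/param/param_set".format(namespace)
    payload = {"param_info": {"param_name": param_name,
                              "param_value": str(param_value)}}
    return url, json.dumps(payload)
```

Sending it would then be a single `requests.post(url, data=body, headers={"Content-Type": "application/json"})` from the ground app.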

Besides parameters, custom data sharing is needed to indicate the region of interest selected by the user in the video stream. This is achieved by publishing a new topic from the app and subscribing to it in the onboard object tracking module. Whenever a user selects a region by drawing a rectangle on the video stream, the corresponding coordinates are published on this topic.
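Over the FlytOS WebSocket connection that kind of publish is typically a rosbridge-style JSON message. A minimal sketch of packaging the drawn rectangle, with the topic and field names chosen for illustration (not the names the module actually uses):

```python
import json

def make_roi_publish_msg(x, y, width, height,
                         topic="/object_tracking/roi"):
    """Build a rosbridge-style publish message carrying the rectangle a
    user drew on the video stream. Topic and field names are placeholders."""
    return json.dumps({
        "op": "publish",
        "topic": topic,
        "msg": {"x": x, "y": y, "width": width, "height": height},
    })
```

The ground app would send this string over the open WebSocket whenever the user finishes drawing a selection box.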

The Inspect section in the app shows the object's centroid position, the drone's position and the setpoints being sent. These data streams are obtained by subscribing to them using FlytOS WebSocket APIs. The object tracking features can be accessed from your own custom app using the object tracking FlytAPIs.
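Subscribing to such a stream over the WebSocket again follows the rosbridge message shape. The topic and message type below are hypothetical examples, not the actual FlytOS topic names:

```python
import json

def make_subscribe_msg(topic, msg_type):
    """rosbridge-style subscription request, as used over the FlytOS
    WebSocket APIs; the topic/type strings passed in are placeholders."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})
```

After sending this, each incoming WebSocket frame for that topic arrives as JSON and can be decoded with `json.loads` to drive the Inspect display.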

In the demo video above, we used an SJCam-4000 camera plugged into a FlytPOD and flew it on a hex-550 frame. Images are captured at 30 fps at 320 x 240 resolution; the onboard object tracking module ran in color mode at ~25 fps with roughly 75% utilization of one CPU core. The camera is rigidly attached to the frame without a gimbal, which requires onboard attitude compensation for setpoint correction. The onboard video feed is still a bit shaky; smoother operation is possible with a gimbal.

We are exploring more capabilities to add, such as AprilTag recognition, gimbal control to keep an object at the image center while following, and the possibility of accelerated vision processing.

We are currently running the FlytPOD Beta program. Sign up here – http://flytbase.com/flytpod/#beta

To learn more about the Flyt platform, please go through – www.flytbase.com
