HACKER Q&A
📣 devlop

How to track flying objects?


What technology can be used to track multiple flying objects in a space like a football field and up to 100 ft above it? Attaching something lightweight to each object is fine. I would like to visualise an FPV drone race with computer graphics in real time.

Any ideas are welcome. Thanks!


  👤 jcims Accepted Answer ✓
There are systems specifically for this purpose, typically called local positioning systems. Decawave has a product that might work for you, though you’ll have to check its range limitations.

In general you’re going to want a time-difference-of-arrival (TDOA) system, as it works one-way and so supports many simultaneous tags. These systems generally require that you set up an array of anchors or base stations that send out synchronized ultra-wideband radio pulses. The “tags” are the individual receivers, which calculate their position from when those pulses arrive. The RF behavior is different, but it’s ultimately a similar type of system to GPS.
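The core geometry here can be sketched in a few lines. This is a toy 2D illustration (made-up anchor layout, noise-free timings, and a coarse grid search in place of the closed-form hyperbolic solvers real systems use): the tag measures only *differences* in arrival time, and we find the position whose predicted differences match.

```python
import math

# Hypothetical 2D TDOA sketch: four anchors with synchronized clocks and a
# tag at an unknown position. We recover the position by minimizing the
# mismatch between measured and predicted time-differences of arrival.

C = 299_792_458.0  # speed of light, m/s

anchors = [(0.0, 0.0), (100.0, 0.0), (0.0, 50.0), (100.0, 50.0)]

def arrival_times(pos):
    """Time of flight from pos to each anchor."""
    return [math.dist(pos, a) / C for a in anchors]

def tdoa(times):
    """Differences relative to the first anchor -- what a real tag measures."""
    return [t - times[0] for t in times[1:]]

def locate(measured_tdoa, step=0.5):
    """Coarse grid search over the field for the best-matching position."""
    best, best_err = None, float("inf")
    x = 0.0
    while x <= 100.0:
        y = 0.0
        while y <= 50.0:
            err = sum((m - p) ** 2
                      for m, p in zip(measured_tdoa,
                                      tdoa(arrival_times((x, y)))))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

# Simulate a tag at a known spot and recover it from TDOA alone:
true_pos = (30.0, 20.0)
est = locate(tdoa(arrival_times(true_pos)))
```

With noisy timestamps the residual surface flattens out, which is why real deployments care so much about anchor geometry and clock synchronization.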

Two-way ranging is a different technique that can work with fewer anchors and be more precise, but won’t scale nearly as well. Most commercial products will support both of these modes of operation. In addition, some products have a channel to support reasonably high-rate data transport as well.

If you search for ‘local positioning system TDOA UWB’ you’ll start getting into the right area. I would start small and test heavily in realistic venue conditions, as the protocols used are incredibly simple and can suffer from noise, reflections, etc. Most of the systems I've seen have relatively low-power transmitters, so you may want to see if a licensed band is an option. You may also need to integrate onboard IMU/GPS streams with a Kalman filter or similar mechanism to patch over data loss and noise. GPS and/or visual failsafes will also be essential for safety. I'm sure there are plenty of regulations here as well if you want to go commercial.
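The "patch over data loss" part is usually a Kalman filter. Here is a minimal, illustrative 1D constant-velocity filter (all noise parameters are made-up values, not tuned figures) that keeps predicting through measurement dropouts and snaps back when fixes return:

```python
# Minimal 1D constant-velocity Kalman filter. Pass None for a sample when
# the position fix is lost; the filter coasts on its motion model and
# corrects again when data returns.

def kalman_track(measurements, dt=0.1, q=0.5, r=1.0):
    """Track position from noisy, gappy position fixes (first fix required)."""
    x, v = measurements[0], 0.0   # state: position, velocity
    p = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
    estimates = []
    for z in measurements:
        # Predict: constant-velocity motion model, covariance grows by q.
        x += v * dt
        p = [[p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + q,
              p[0][1] + dt * p[1][1]],
             [p[1][0] + dt * p[1][1],
              p[1][1] + q]]
        if z is not None:
            # Update: blend the prediction with the new fix.
            k0 = p[0][0] / (p[0][0] + r)   # Kalman gain for position
            k1 = p[1][0] / (p[0][0] + r)   # Kalman gain for velocity
            y = z - x                       # innovation
            x += k0 * y
            v += k1 * y
            p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                 [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
        estimates.append(x)
    return estimates

# A drone moving steadily along one axis, with a three-sample GPS dropout:
zs = [0.0, 1.0, 2.0, 3.0, None, None, None, 7.0, 8.0, 9.0]
track = kalman_track(zs)
```

A real drone tracker would run this in 3D with the IMU feeding the prediction step, but the predict/update structure is the same.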

Either way good luck!


👤 bprater
Differential GPS is what drones use for light shows, mapping, or anything requiring high fidelity. It gives them centimeter-level precision, at the cost of a large sensor array on both the ground and the drone. If you are adding this to a drone, you'd also need to include orientation and other parameters in your data stream, as you'd only be getting XYZ from GPS. Unfortunately, no simple hardware exists for your request.

👤 yeldarb
I’ve seen a bunch of drone-detection computer vision projects. Usually they’re detecting drones from other drones, though (e.g. for autonomous racing[1] or drone defense).

A challenge with doing it from the ground is that the drones will be quite small relative to the size of the image. But with sufficient compute and several cameras, a tiling-based approach[2] should work.
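The tiling idea is simple to sketch: cut the big frame into overlapping crops so a small drone still covers a useful fraction of whatever the detector sees. The sizes below are illustrative, not a recommendation:

```python
# Slice a large frame into overlapping square crops that fully cover it.
# Each crop is fed to the detector independently; overlap prevents a drone
# on a tile boundary from being missed by both tiles.

def tile(width, height, tile_size, overlap):
    """Return (x, y, w, h) crops covering the full frame with overlap."""
    step = tile_size - overlap
    tiles = []
    y = 0
    while True:
        ty = min(y, height - tile_size)   # clamp the last row to the edge
        x = 0
        while True:
            tx = min(x, width - tile_size)  # clamp the last column too
            tiles.append((tx, ty, tile_size, tile_size))
            if x + tile_size >= width:
                break
            x += step
        if y + tile_size >= height:
            break
        y += step
    return tiles

# A 4K frame cut into 1280 px tiles with 128 px overlap:
crops = tile(3840, 2160, 1280, 128)
```

Detections then get mapped back to frame coordinates by adding each crop's offset, with duplicate boxes in the overlap regions merged by non-max suppression.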

If you want to do unique-identification you’ll also need object tracking[3].

This is exactly the type of project Roboflow (our startup) is built to empower! Happy to chat/help further (e.g. we might be able to help source a good dataset to start from). And if it’s for non-commercial use, it should be completely free.

[1] https://blog.roboflow.com/drone-computer-vision-autopilot/

[2] https://blog.roboflow.com/detect-small-objects/

[3] https://blog.roboflow.com/zero-shot-object-tracking/


👤 jdiez17
Hey, interesting question! I think using ground-based equipment to accurately measure the position of fast-moving drones in a large outdoor volume might be quite difficult. A pretty common way to do this indoors is to attach infrared retroreflectors to the object, with infrared cameras in different locations. Each camera viewpoint restricts the possible location of the marker to a ray along its view axis; using two or more viewpoints, you can estimate the actual 3D position. There are two problems for your application: 1) infrared light from the sun will add a lot of noise to your measurements, and 2) it is generally difficult to identify which marker is which. [1] is an example of an off-the-shelf system that works quite well, though.
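The two-viewpoint step can be sketched as intersecting rays. This is a toy version with made-up camera positions and ideal, noise-free ray directions; real systems solve a least-squares problem over many noisy cameras, but with two rays the answer is just the midpoint of the shortest segment between them:

```python
# Triangulate a 3D point from two camera rays (origin + direction each).

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1+t*d1 and o2+s*d2."""
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # zero only if the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, t))     # closest point on ray 1
    p2 = add(o2, scale(d2, s))     # closest point on ray 2
    return scale(add(p1, p2), 0.5)

# Two cameras on the ground, both observing a drone at (10, 20, 30):
target = (10.0, 20.0, 30.0)
cam1, cam2 = (0.0, 0.0, 0.0), (100.0, 0.0, 0.0)
est = triangulate(cam1, sub(target, cam1), cam2, sub(target, cam2))
```

With noisy directions the two rays no longer intersect, which is exactly why the midpoint (or a full least-squares solve) is used rather than a literal intersection.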

I think some onboard positioning system might work best for your application. kognate suggests using “inside-out” tracking based on the features observed by a camera on the drone. A nice thing here is that most FPV drones are already transmitting realtime video. It would require significant computational power on the ground to localize the drones from their camera feeds, though. See [2] for some inspiration.

Another idea is to run inertial sensor fusion algorithms on the data from the IMUs onboard the drones to reconstruct their trajectories in real time. However, this is quite a tricky business: the sensors would have to be characterized extremely well and be able to cope with the highly dynamic forces on a racing drone. It would probably make sense as a standalone module that accepts 5V from the drone's power system and has its own IMU(s) and telemetry radios.

[1] https://optitrack.com/

[2] https://matthewearl.github.io/2021/03/06/mars2020-reproject/


👤 xaedes
It won't be easy; drones are small and fast.

RGB-based detection will probably be too slow and error-prone. Instead, put active IR LEDs or similar markers on the drones so they can be detected easily, and use high-framerate cameras that only let through IR. Then use computer vision to spot the blobs, and finally compute the 3D position by triangulation.
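The blob-spotting step is straightforward. A minimal sketch, with the camera frame faked as a 2D brightness grid (a real pipeline would threshold an IR camera frame the same way, e.g. with OpenCV): threshold, flood-fill each connected bright region, and take its centroid as the marker position.

```python
# Find bright blobs in a 2D brightness grid and return their centroids.

def find_blobs(frame, threshold=200):
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected bright region.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Centroid of the blob = sub-pixel marker position (x, y).
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((cx, cy))
    return blobs

# A dark 8x8 frame with two bright 2x2 "LED" blobs:
frame = [[0] * 8 for _ in range(8)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2),
             (5, 5), (5, 6), (6, 5), (6, 6)]:
    frame[y][x] = 255
blobs = find_blobs(frame)  # → [(1.5, 1.5), (5.5, 5.5)]
```

Per-camera blob centroids like these are what you'd then feed into the triangulation step.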

Active IR tracking is still pretty much the state of the art for motion capture and the like.

A quick search leads to OptiTrack, which even advertises exactly this use case of drone tracking:

https://optitrack.com/applications/robotics/


👤 crispyambulance
There's a company called Sports Media Technology (SMT) that has some nice integrations with the NHL (hockey): https://www.smt.com/hockey#techinfo

They put emitters/sensors in the hockey puck as well as on the players. The data gets processed and displayed on video for audiences as an "augmented reality" experience.

My understanding is that the puck has an infrared emitter that is tracked by sensors in various locations around the rink and this can locate the realtime position of the puck. The players also have sensors/transmitters and this makes it possible to have really responsive position tracking (the video in the link shows how it looks quite nicely).

I suppose the speed and erratic motion of a hockey puck is not unlike that of an FPV drone.


👤 transistor-man
If you are able to race at dusk or in the evenings with moderate external lighting, tracking different-colored indicator LEDs doesn't require extensive or expensive hardware. This will only really give you a 2D overview, but some information on altitude can be gleaned from the amplitude of the light from each drone.

As a quick example of how well this works, here are a few Roombas bouncing around [1].

And this is what the integrated paths can look like rendered in a video [2].

[1] https://transistor-man.com/PhotoSet/roomba_dance/animated/da...

[2] https://vimeo.com/645355520#t=30s


👤 shireboy
A few ideas: have a computer vision tool analyze each frame from one or more cameras above the field, look for the drones, and overlay your graphics. You might put a QR code or similar marker on each drone, or even a simple color badge if it's just a handful of drones; then your CV code just has to look for a big chunk of pink, blue, or green pixels. Look up color tracking, object tracking, etc. Look into https://en.wikipedia.org/wiki/FoxTrax as an old example, and OpenCV or TensorFlow as possible tools now.

As you noted in another comment, GPS by itself probably isn't accurate enough, but there is GPS augmentation tech. You put a base station in the area that measures the drift and sends corrections over the air. I'm thinking you'd take the raw GPS from the drones, apply the correction, and hopefully get sub-meter positioning. Look up DGPS and WAAS for options there.
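The correction idea fits in a few lines. A minimal sketch in flat local metres (real DGPS works on raw pseudoranges per satellite, but the principle is the same): a base station at a surveyed position measures the current GPS error and broadcasts it, and nearby rovers subtract it.

```python
# Differential-GPS-style correction in miniature.

def dgps_correction(base_truth, base_measured):
    """Error vector currently affecting GPS fixes near the base station."""
    return tuple(m - t for m, t in zip(base_measured, base_truth))

def apply_correction(rover_measured, correction):
    """Subtract the broadcast error from a rover's raw fix."""
    return tuple(m - c for m, c in zip(rover_measured, correction))

base_truth = (0.0, 0.0)        # surveyed base-station position
base_measured = (1.5, -0.75)   # what its receiver reports right now
corr = dgps_correction(base_truth, base_measured)

# A drone whose raw fix suffers the same atmospheric error:
drone_raw = (51.5, 29.25)
drone_corrected = apply_correction(drone_raw, corr)  # → (50.0, 30.0)
```

This only works while the rover is close enough to the base station that both see roughly the same error, which is why DGPS networks quote a service radius.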

The other idea that comes to mind is to triangulate based on their radios. You'd have base stations around the perimeter, each measuring the signal strength and direction of the target frequencies. Positioning would then be a matter of fairly simple trig plus error correction. I don't know if there's anything doing this off the shelf, but indoor positioning systems may be a rabbit hole worth going down (even if used outdoors).
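The "fairly simple trig" can be made concrete. A 2D sketch with made-up station positions and bearings: each station reports the direction to the transmitter, and intersecting the two bearing lines gives the position.

```python
import math

# Bearing-only triangulation: intersect two rays, one per base station.

def intersect_bearings(p1, theta1, p2, theta2):
    """Intersect rays from p1 at angle theta1 and p2 at theta2 (radians)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t via Cramer's rule.
    denom = d1[0] * d2[1] - d1[1] * d2[0]   # zero if bearings are parallel
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Simulate: a transmitter at (50, 50), stations at two field corners
# measuring the bearing to it, then recovering the position:
target = (50.0, 50.0)
s1, s2 = (0.0, 0.0), (100.0, 0.0)
b1 = math.atan2(target[1] - s1[1], target[0] - s1[0])
b2 = math.atan2(target[1] - s2[1], target[0] - s2[0])
est = intersect_bearings(s1, b1, s2, b2)
```

With more than two stations you'd do a least-squares fit instead, which is where the "error correction" part comes in: direction-finding bearings are noisy, so redundant stations help a lot.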

A final idea is to use the video feeds from the drones. You'd place QR codes throughout the course, process the video feeds, and use the codes seen in each feed to tell which drones are ahead. Or, instead of QR codes, build a point cloud of the course to use for positioning.

Sounds fun!


👤 kognate
I would suggest using the video feed from each drone to localize it (since it's in a known space). You'll need to do a few passes beforehand and record the background. Doing a puzzle-piece-style match isn't terribly difficult (computationally), and then you can fuse the position data into your visualization.

👤 dcanelhas
If the drones have cameras and you can scatter AR markers around the space with known positions, you can probably just use a Kalman filter based on detecting the markers.

👤 weeeee
An RTK base station and a ZED-F9P, if you own the drones. I haven't heard the best about the TDOA systems, but YMMV.

👤 brk
It would help to know which drones and/or the software/OS controlling them. Many drones already have APIs that can give you X/Y/Z positioning data. If this is indoors you may not be able to rely on GPS data, but you may have options for other coordinates based on triangulating the wifi control signals.

Doing it with machine vision would likely be challenging (and I say this having a fair bit of experience with AI/MV systems). The area you are covering is a very large field of view, and drones are generally very small, relatively speaking.

If you can't do it with native APIs, I would probably look into an RF-style system with a small transmitter on each drone and antennas placed around the stadium to detect the signals and triangulate them into 3D space.


👤 Aspos
If there are multiple drones in the frame and you rely on object detection only, you will have a hard time telling the drones apart. Even if you use something like lidar and get a bunch of coordinates, you still need to know who is who.

You may want to predict future coordinates of drones to increase tracking accuracy.

Drones have inertia, and if you split a trajectory into small enough chunks, each chunk can be expressed as a Bezier curve. Given a few past coordinates you can predict the future one, which helps with object detection and with keeping track of each individual drone.

When doing object detection, instead of scanning the whole frame for a drone, you then only scan the areas where it is likely to be, meaning you can run at a higher FPS and with a higher frame resolution.
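The predict-then-gate idea above can be sketched in a few lines. This is a toy version: extrapolating a parabola through the last three positions (equivalent to extending a quadratic Bezier chunk at equal time steps) and searching only a small window around the prediction. The radius is an illustrative assumption.

```python
# Predict a drone's next position from its last three, then build a
# small search window around the prediction for the detector.

def predict_next(p0, p1, p2):
    """Next point of the parabola through p0, p1, p2 at equal time steps."""
    return tuple(3 * c - 3 * b + a for a, b, c in zip(p0, p1, p2))

def search_window(pred, radius):
    """Axis-aligned bounding box to scan for the drone in the next frame."""
    return tuple((c - radius, c + radius) for c in pred)

# A drone accelerating along x, positions sampled at t = 0, 1, 2:
history = [(0.0, 10.0), (1.0, 10.0), (4.0, 10.0)]
pred = predict_next(*history)            # → (9.0, 10.0)
window = search_window(pred, radius=2.0)
```

If the drone isn't found inside the window (a crash, or an occlusion), you'd fall back to scanning the full frame, so the gating is purely an optimization.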

https://hsto.org/r/w1560/webt/vq/ga/at/vqgaat7sqymkhlro_8vef...


👤 MayeulC
Ultra-wideband is starting to make its appearance in some consumer-level products, such as AirTags. I am not sure whether interference would be an issue when tracking multiple objects, but the technology seems like a good fit! From a quick search, this product might work: https://www.inpixon.com/technology/rtls/anchors/nanoanq-uwb

You could also go the custom hardware route and trilaterate signals from small embedded transmitters. That would require a lot of effort, but it should work using FPGAs and/or analog electronics.

Another approach could be to use radar and/or RFID: http://rfidradar.com/howworks.html


👤 ivankirigin
If the drones localize themselves, you can get a constantly updating position over radio. Even with other tools, this will help disambiguate.

There are a few different parts here from robotics that can help.

  - Tracking lets you find how a patch of pixels moves. Look up "KLT" and "SIFT features". Older.
  - Recognition lets you find a given object. Look up "YOLO". Newer.
  - Motion modeling lets you predict where something should be, and can incorporate the transponder's ego data. Look up "Kalman filter".
  - All three of the above should be available in existing libraries.
  - If possible, engineer the environment, e.g. by putting easy-to-spot LED patterns on the drones. This is always the easiest option.
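The "tracking" item above can be illustrated with a bare-bones patch search: find where a small pixel patch from the last frame reappears in the new frame by minimizing the sum of squared differences. (Real trackers like KLT add gradient-based sub-pixel refinement; frames here are plain 2D lists purely for illustration.)

```python
# Exhaustive SSD template match: where did this patch move to?

def track_patch(patch, frame):
    """Return the (x, y) offset in frame where patch matches best (SSD)."""
    ph, pw = len(patch), len(patch[0])
    fh, fw = len(frame), len(frame[0])
    best, best_cost = None, float("inf")
    for y in range(fh - ph + 1):
        for x in range(fw - pw + 1):
            cost = sum((patch[j][i] - frame[y + j][x + i]) ** 2
                       for j in range(ph) for i in range(pw))
            if cost < best_cost:
                best, best_cost = (x, y), cost
    return best

# The drone's 2x2 patch from the last frame, and a new frame where the
# same pixels have moved to offset (3, 1):
patch = [[9, 8], [7, 9]]
frame = [[0] * 6 for _ in range(5)]
for j in range(2):
    for i in range(2):
        frame[1 + j][3 + i] = patch[j][i]
pos = track_patch(patch, frame)  # → (3, 1)
```

In practice you'd restrict the search to a window around the last known position (see the motion-modeling item), since exhaustive search over a full frame is far too slow.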

👤 mhb_eng
RTK GPS? Assuming you have a base station, you can correct the drones' positions to ~1 cm.

👤 azur_w
I've been working with motion capture systems for several years as a PhD student in robotics. The first thing that came to mind is a motion capture system, which someone already mentioned above. I worked in one of the largest indoor motion capture facilities in academia, the ASU Drone Studio, which might be a useful reference if you search for it. But the cost of such a setup is pretty high, since you basically have to cover every corner with cameras, each of which costs $4k-6k. It can easily track tens of objects with sub-mm accuracy at 360 frames per second.

👤 goodlinks
Could you expand on the gates (using many more of them) that work off the FPV video signal? You could add cameras detecting motion and not worry about which drone is which (getting that from the FPV signal as it passes the gate).

Then interpolate with accelerometer data from the drones' telemetry?

Dunno, it seems hard any which way, but having more than one tech involved to cross-check seems like it might help.


👤 KaiserPro
Depends on how you want to do it.

The big boys use sensor blending and kinematic GPS to get cm-level accuracy.

However, that requires good GPS coverage (i.e. no roof).

Second to that is fiducial tracking: either attaching markers to each drone and using multiple cameras to work out the six degrees of freedom, or giving the drones enough horsepower to do it onboard.


👤 carapace
I would try sound. Make each drone emit a distinct (ultrasonic) frequency and triangulate them that way. Maybe you could get velocity estimates too from the Doppler shift? I don't really know what I'm talking about, but it feels like this would be easier than a visual method.

👤 svet_0
Did you consider using Lidar? If you need high positional accuracy it might be better than cameras.

👤 sandreas
You might take a look at OpenCV or BoofCV:

http://boofcv.org/index.php?title=Example_Tracker_Object

BoofCV also has a great Android App to check out its features.


👤 ganzuul
SteamVR's Lighthouse system is limited to about 10x10 m, but if they let you hack it, the drones would only need to carry a few small sensors. It would work in indoor arenas too if this is a traveling show.

👤 Evidlo
I have a similar question: how would you track an animal like a rat in a one-acre plot of land to something like 10 cm accuracy?

Could decawave work for this? What if the rat goes behind a tree?


👤 tgflynn
Multiview cameras.

Once you have the tracking data, how do you plan to view it?


👤 emddudley
What is your budget?

👤 dorianmariefr
A very blurry camera, being far away, and at night; then you can call it an OVNI (UFO).

👤 roberto
I only wanted to add that I'm curious too.

👤 ezconnect
Multiple cameras are probably the fastest (latency & update rate) tracking sensor you can build.

👤 poindontcare
ORB, SIFT, ICP, OpenCV