Researchers at the Massachusetts Institute of Technology have proposed a new approach to time-of-flight imaging that increases its depth resolution 1,000-fold. This level of resolution could make self-driving cars more practical and address safety concerns.
Central to the new development is the concept of ‘time of flight’: an approach that gauges distance by measuring the time it takes light projected into a scene to bounce back to a sensor. More generally, time of flight is the time an object, a particle, or a wave (acoustic, electromagnetic, or otherwise) needs to travel a distance through a medium, and a variety of techniques exist to measure it.
By measuring this time (i.e. the time of flight), a time standard can be developed for the specific particle or medium. With the new technology, a short burst of light is fired into a scene; a camera then measures the time the light takes to return, which indicates the distance of the object that reflected it.
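The round-trip arithmetic behind pulsed time of flight is simple: the pulse travels to the object and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name is illustrative, not from the paper):

```python
# Basic pulsed time-of-flight: a light pulse travels out to an object and
# back, so the object's distance is half the round trip.
C = 299_792_458.0  # speed of light in vacuum, m/s


def distance_from_round_trip(t_seconds: float) -> float:
    """Return the one-way distance (m) implied by a round-trip time (s)."""
    return C * t_seconds / 2.0


# A pulse returning after 100 nanoseconds implies an object about 15 m away.
print(distance_from_round_trip(100e-9))
```

This also shows why depth resolution is hard: resolving centimeters requires timing the return to within tens of picoseconds.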
For this technique to be effective, good detectors are required. Here the researchers used interferometry, where a light beam is split in two. Half of the light beam is kept circulating locally while the other half — the “sample beam” — is fired into a visual scene. When the reflected sample beam is recombined with the locally circulated light, the difference in phase between the two beams produces a precise measure of the distance the sample beam has traveled.
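The phase comparison described above can be sketched numerically. In a phase-based scheme, a signal modulated at frequency f that travels out and back accumulates a phase shift proportional to distance, so distance can be recovered from the measured phase. This is an illustrative simplification of interferometric ranging, not the paper's exact method, and the names below are assumptions:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s


def distance_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Return the one-way distance (m) implied by the phase difference
    between the reflected sample beam and the reference beam.

    Only valid within one unambiguous range: the phase wraps every
    modulation wavelength of round-trip travel.
    """
    wavelength = C / mod_freq_hz  # modulation wavelength, m
    round_trip = (phase_rad / (2 * math.pi)) * wavelength
    return round_trip / 2.0


# With 1 GHz modulation, a half-cycle (pi) phase shift corresponds to
# roughly 7.5 cm of one-way distance.
print(distance_from_phase(math.pi, 1e9))
```

Note the trade-off this exposes: a higher modulation frequency gives finer phase resolution per unit distance, but a shorter unambiguous range before the phase wraps.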
What is interesting about the new research, beyond the improved accuracy, is that the method can generate accurate distance measurements through fog. Assessing distance through fog has been a significant obstacle in the development of self-driving cars to date. This capability comes from the added use of a gigahertz optical system, which compensates for fog better than standard lower-frequency systems do.
The new research has been published in the journal IEEE Access. The research paper is titled “Rethinking Machine Vision Time of Flight With GHz Heterodyning.”