See objects through the fog

(by Giovanni Calcerano) A group of researchers at MIT (Massachusetts Institute of Technology) has developed a system capable of producing images of objects wrapped in fog even when the fog is so dense that human vision cannot penetrate it. The same system can also measure the distance of those objects from a reference point such as a moving car. The hope is to obtain an integrated module that assists human drivers and, likewise, enables autonomous driving even in fog and low-visibility conditions.

The researchers tested the system using a small tank of water with a humidifier immersed in it. The resulting fog limited human vision to a depth of only 36 centimeters, whereas the system was able to identify images of objects up to a depth of 57 centimeters.

That is certainly not a great distance, but the fog produced for the study was far denser than anything a human driver usually has to deal with; in the real world, typical fog conditions would allow a visibility of about 30 to 50 meters. "I decided to take on the challenge of developing a system that could see through fog, and I knew it would not be simple," says Guy Satat, the MIT Media Lab researcher who led the development team. "We are dealing with realistic situations, in which the fog is dynamic and heterogeneous, in constant movement and change, with denser and less dense areas. Other methods are not designed to cope with such scenarios."

The new system uses a camera that fires ultrashort bursts of laser light and measures the time the reflections take to return. On a clear day, the time the light takes to return to the sensor faithfully indicates the distances of the objects that reflect it. But fog causes the light to "scatter," bouncing it around at random. Most of the light that reaches the camera sensor is therefore reflected by water droplets suspended in the air, not by the objects a vehicle must avoid.
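As a rough illustration of the time-of-flight principle described above (a minimal sketch, not the team's actual code), the round-trip time of a laser pulse in clear air translates directly into distance:

```python
# Time-of-flight in clear air: the round-trip time of a laser pulse
# directly yields the distance to the reflecting object.

C = 299_792_458.0  # speed of light, in meters per second

def distance_from_return_time(round_trip_seconds: float) -> float:
    """Distance to the reflector, given the pulse's round-trip time."""
    # The pulse travels out to the object and back, so halve the path.
    return C * round_trip_seconds / 2.0

# A reflection arriving after ~3.8 nanoseconds corresponds to ~0.57 m,
# roughly the depth the system reached in the fog-tank test.
print(distance_from_return_time(3.8e-9))
```

In fog, of course, most return times come from droplet scattering rather than from the object itself, which is exactly the problem the statistical step below addresses.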

The MIT system circumvents this problem using statistical analysis. The researchers were able to show that, regardless of the fog's density, the arrival times of the fog-reflected light follow a statistical pattern known as a gamma distribution. The system estimates the parameters of this distribution and uses the resulting curve to filter the fog's contribution out of the light signal that reaches the camera sensor.
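A minimal sketch of that idea, under simplifying assumptions (simulated photon arrival times for a single pixel, and a plain maximum-likelihood gamma fit via SciPy; the numbers are illustrative, not the paper's):

```python
# Hypothetical sketch: model fog backscatter at one pixel as
# gamma-distributed arrival times, fit that distribution, and
# subtract it to expose the object's return peak.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated measurement: mostly gamma-distributed fog backscatter,
# plus a small cluster of object returns around t = 3.8 ns.
fog_times = rng.gamma(shape=2.0, scale=0.5e-9, size=9000)
object_times = rng.normal(loc=3.8e-9, scale=0.05e-9, size=1000)
arrivals = np.concatenate([fog_times, object_times])

# Fit a gamma distribution to the arrival times; since fog photons
# dominate, the fit approximates the fog component of the signal.
a_fit, loc_fit, scale_fit = stats.gamma.fit(arrivals, floc=0.0)

# Histogram the arrivals and subtract the expected fog counts per bin.
counts, edges = np.histogram(arrivals, bins=200)
centers = (edges[:-1] + edges[1:]) / 2
bin_width = edges[1] - edges[0]
fog_counts = stats.gamma.pdf(centers, a_fit, loc_fit, scale_fit) * arrivals.size * bin_width
residual = counts - fog_counts

# The largest above-fog peak is the candidate object return time.
t_object = centers[np.argmax(residual)]
print(f"estimated object return time: {t_object:.2e} s")  # ~3.8e-09
```

Once the fog curve is subtracted, the surviving peak gives a return time, and a distance follows as in the previous sketch.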

Basically, the system computes a separate distribution for each of the sensor's 1,024 pixels, which is why it can handle the density variations that defeat current systems: it adapts to circumstances in which each pixel sees a different kind of fog.
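To picture that per-pixel adaptation, here is a hypothetical sketch that treats the sensor as a 32 × 32 array (32 × 32 = 1,024) and fits an independent fog model for every pixel; the array shape and function names are assumptions for illustration, not the authors' implementation:

```python
# Hypothetical per-pixel processing: each of the 1,024 pixels gets its
# own fitted gamma model, so denser and thinner patches of fog in the
# scene are handled independently.
import numpy as np
from scipy import stats

C = 299_792_458.0  # speed of light, in meters per second

def estimate_depths(arrival_times_per_pixel, bins=200):
    """arrival_times_per_pixel: 32 x 32 nested lists of photon arrival times."""
    depths = np.zeros((32, 32))
    for i in range(32):
        for j in range(32):
            t = np.asarray(arrival_times_per_pixel[i][j])
            # Per-pixel fog model: fit a gamma distribution to this
            # pixel's own arrival times, ignoring all other pixels.
            a, loc, scale = stats.gamma.fit(t, floc=0.0)
            counts, edges = np.histogram(t, bins=bins)
            centers = (edges[:-1] + edges[1:]) / 2
            fog = stats.gamma.pdf(centers, a, loc, scale) * t.size * (edges[1] - edges[0])
            # The strongest above-fog return gives this pixel's depth.
            t_obj = centers[np.argmax(counts - fog)]
            depths[i, j] = C * t_obj / 2.0
    return depths
```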

"The nice thing is that it's all pretty simple," says Satat. "If one analyzes the method, one realizes that it is surprisingly not very complex. In addition, the system does not require any prior knowledge of fog and its density, which helps to work in the widest fog conditions. "

Satat and his colleagues will describe their system in detail in a paper at the International Conference on Computational Photography in May. Satat was joined in the work by his thesis supervisor, Ramesh Raskar, associate professor of media arts and sciences, and by Matthew Tancik, a graduate student in electrical engineering and computer science.
