A new algorithm can now counteract the distorting effect of water on underwater photographs.
The underwater pictures you have seen probably look blandly blue-green. The reason has to do with how light travels through water: water absorbs the longer red wavelengths within a few metres, leaving mostly blues and greens behind.
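The falloff follows the Beer–Lambert law: intensity decays exponentially with distance, at a rate that depends on wavelength. The sketch below illustrates this with rough, textbook-style attenuation coefficients for clear ocean water; the exact values vary by water type and are only assumptions here.

```python
import math

# Illustrative attenuation coefficients (per metre) for clear ocean water.
# These are rough, hand-picked figures, not measured values.
beta = {"red": 0.40, "green": 0.07, "blue": 0.03}

depth_m = 10.0
# Beer-Lambert: the fraction of light surviving a path of length z is exp(-beta * z)
surviving = {colour: math.exp(-b * depth_m) for colour, b in beta.items()}
```

With these figures, only about 2% of the red light survives a 10-metre path, while most of the blue does, which is why unprocessed underwater photos skew blue-green.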
Derya Akkaynak, an oceangoing engineer, wants to change this. She has created an algorithm called Sea-thru that recovers the colours in photos of the deep.
The new technique isn't about improving your holiday snaps. Underwater distortion makes it very difficult for coral scientists to use modern computer-vision and machine-learning algorithms to identify, count and classify species in underwater images, leaving them stuck with time-consuming human evaluation instead.
The effects that Sea-thru could have on ocean science may be far-reaching: many other kinds of biologists also need to see true colours beneath the surface.
Sea-thru's image-analysis algorithm works by factoring in the physics of light absorption and scattering in the atmosphere and comparing that with how light behaves in the ocean. The software then restores the lost colours by reversing the water's distortion pixel by pixel.
This process won't work on a single image, though: the software relies on detailed distance data to do its job.
Akkaynak takes numerous photographs of the same subject from different angles so that Sea-thru can estimate the distances involved. That doesn't mean scientists have to reshoot their images, however. Many oceanographers have already captured distance information in existing image sets because they rely on a process called photogrammetry, and Sea-thru can work with these photo sets as they are.
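The per-pixel reversal can be sketched with a simplified underwater image-formation model: the camera sees the subject's true colour dimmed by distance-dependent absorption, plus backscattered "veiling light" added by the water itself. This is only a toy version of the idea; Sea-thru estimates range-dependent coefficients and backscatter from the imagery itself, whereas here the coefficients are constant and hand-picked for illustration.

```python
import numpy as np

def recover_colour(image, depth, beta_D, beta_B, B_inf):
    """Invert a simplified underwater image-formation model.

    image:  H x W x 3 float array in [0, 1] (raw underwater photo)
    depth:  H x W array of camera-to-subject distances in metres (e.g. from photogrammetry)
    beta_D: per-channel attenuation coefficients (assumed constant here)
    beta_B: per-channel backscatter coefficients (assumed constant here)
    B_inf:  per-channel veiling-light colour at infinite distance
    """
    z = depth[..., None]                        # broadcast range map over colour channels
    backscatter = B_inf * (1.0 - np.exp(-beta_B * z))
    direct = image - backscatter                # remove the haze the water added
    recovered = direct * np.exp(beta_D * z)     # undo distance-dependent absorption
    return np.clip(recovered, 0.0, 1.0)

# Toy example: a uniform grey subject photographed through 5 m of water.
beta_D = np.array([0.40, 0.12, 0.08])           # red attenuates fastest
beta_B = np.array([0.30, 0.15, 0.10])
B_inf  = np.array([0.05, 0.30, 0.35])           # blue-green veiling light
depth = np.full((4, 4), 5.0)
true_colour = np.full((4, 4, 3), 0.5)

# Simulate the distorted photo with the same model, then recover it.
z = depth[..., None]
photo = true_colour * np.exp(-beta_D * z) + B_inf * (1.0 - np.exp(-beta_B * z))
restored = recover_colour(photo, depth, beta_D, beta_B, B_inf)
```

Because every pixel carries its own distance value, each one is corrected by a different amount, which is why a single photo without range data is not enough.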
The process shouldn't be confused with software that lets users enhance images by uniformly boosting reds or yellows. The new approach is about colour accuracy, which is critical to identifying coral species.
Akkaynak and engineer Tali Treibitz, her postdoctoral adviser, first detailed the process in a paper presented in June.

This article was published with the title "Ocean Vision" in Scientific American 321, 6, 16 (December 2019). doi:10.1038/scientificamerican1219-16