Everyone hates washed-out spots in images, but there isn't much that can be done to prevent them today. Things might be different in the future, however. MIT researchers believe they have found a way to weed out overexposed elements from pictures, using HDR-like technology without its limitations.
Several smartphones come equipped with HDR technology, which reduces washed-out spots by offering a wide dynamic range. But MIT has a better solution. The researchers have combined next-generation camera hardware, called the "modulo camera", with a new image-processing algorithm to get rid of overexposed spots.
The researchers claim the modulo camera is designed to never overexpose an image. Created in a collaboration between the Media Lab's Camera Culture group, MIT Lincoln Lab, and the Singapore University of Technology and Design, the modulo camera requires only one shot to create an HDR image, whereas today's widely used HDR sensors require many photos to be taken. Current HDR technology also requires the camera to be kept still while shooting, and if done incorrectly it can produce blur and other artefacts.
With the modulo camera and the new image-processing algorithm, the researchers have essentially created what they call a 'real-time HDR' camera that removes the hassle of HDR photography completely, from correctly choosing the aperture and exposure length to keeping the camera steady.
The other culprit in overexposure is light, or rather its overabundance. Conventional camera sensors can only accept a certain amount of light because of limited "well capacity", the amount of light an individual pixel can hold. The modulo camera, by contrast, resets the sensor capacitors every time the "well" gets full. It then uses "an inverse modulo algorithm to calculate how much light the reset sensors took in," the news release said.
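To make the idea concrete, here is a minimal Python sketch, not MIT's implementation: the 8-bit well capacity and the pixel values are assumptions for illustration. It contrasts a conventional sensor, which clips once its well is full, with a modulo sensor, which wraps around and keeps only the remainder of the incoming light.

```python
import numpy as np

# Hypothetical 8-bit well, i.e. 2**8 brightness levels per pixel.
WELL_CAPACITY = 256

def conventional_capture(scene_brightness):
    # A conventional sensor saturates: anything beyond the well capacity
    # is clipped, producing the familiar washed-out spots.
    return np.minimum(scene_brightness, WELL_CAPACITY - 1)

def modulo_capture(scene_brightness):
    # A modulo sensor "resets" each time the well fills up, so it records
    # the brightness modulo the well capacity instead of clipping it.
    return scene_brightness % WELL_CAPACITY

scene = np.array([50, 300, 700, 1023])  # true brightness, exceeding 8 bits
print(conventional_capture(scene))      # -> [50 255 255 255], detail lost
print(modulo_capture(scene))            # -> [50  44 188 255], wrapped values
```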
"For example, if a certain camera sensor can record eight bits of information, then when those eight bits are filled, the capacitor will be reset to zero. The number of resets is recovered by the algorithm, which then calculates the relative brightness of each area of the photo," the researchers explained, adding the technology can be used for computer vision and astronomy, or any field that deals with bright and low-light sources.
Sadly, this technology isn't available in current digital and smartphone cameras, and there's no word yet on when we can expect it on our devices. But what's exciting is that many companies are working to take existing camera technology to the next level. Earlier in August, MIT and Google partnered to develop an algorithm that removes reflections and obstructions from images.