Have you ever tried taking a photo of the outside world from inside a hotel room? Did your photo also end up with a big reflection of you and very little of what you wanted to capture? You're not alone. Thankfully, researchers at MIT and Google teamed up to devise an algorithm that removes such reflections and obstructions from images.
In their paper, titled "A Computational Approach for Obstruction-Free Photography", the researchers detail exactly how the algorithm removes the reflection from images, and what a photographer needs to do to make that happen. Essentially, the photographer takes a short sequence of images while slightly moving the camera between frames.
Multiple photos are needed because the slight camera motion creates parallax: the background and the reflection (or other obstruction) sit at different depths, so they shift by different amounts between frames. That difference in motion is what lets the algorithm separate out the obstructing layer and recover the desired background scene "as if the visual obstructions were not there."
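To make the multi-frame idea concrete, here is a minimal Python/OpenCV sketch. It is our simplified illustration, not the researchers' actual layer-decomposition pipeline (which jointly estimates dense motion for both layers): it aligns the frames on the background and then takes a per-pixel temporal median, so content that moves with the background is kept while the differently-moving reflection is largely suppressed. The function name and parameters are hypothetical.

```python
# Simplified sketch of the multi-frame principle, not the paper's algorithm.
import cv2
import numpy as np

def suppress_reflection(frames, reference_index=0):
    """Align a burst of frames on the background layer, then take a per-pixel
    temporal median so the differently-moving reflection mostly averages away."""
    ref_gray = cv2.cvtColor(frames[reference_index], cv2.COLOR_BGR2GRAY)
    h, w = ref_gray.shape
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 100, 1e-6)
    aligned = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Estimate a single global homography for the dominant (background) motion.
        warp = np.eye(3, 3, dtype=np.float32)
        _, warp = cv2.findTransformECC(ref_gray, gray, warp,
                                       cv2.MOTION_HOMOGRAPHY, criteria, None, 5)
        # Warp the frame back into the reference frame's coordinates.
        aligned.append(cv2.warpPerspective(frame, warp, (w, h),
                                           flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP))
    # After alignment the background is consistent across frames; the reflection
    # is not, so the median keeps the former and discards much of the latter.
    return np.median(np.stack(aligned), axis=0).astype(np.uint8)

# Usage (illustrative): frames = [cv2.imread(p) for p in ["f0.jpg", "f1.jpg", "f2.jpg"]]
# clean = suppress_reflection(frames)
```

This crude version only works when the reflection is faint and the camera motion is small; the actual paper solves a much harder joint estimation problem to handle strong reflections and opaque obstructions such as fences.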
The algorithm in question analyses the obstructing layer and gets rid of the reflections. It also works on raindrops, fences, and other similar obstructions. In the video (embedded below), the researchers show how accurately they are able to recover the correct background.
Which brings us to the question you are probably most eager to have answered: when is it coming to our smartphones? There isn't any concrete information yet. "The ideas here can progress into routine photography, if the algorithm is further robustified and becomes part of toolboxes used in digital photography," the researchers said. We hope camera companies get behind this project soon.