The iPhone has relied on Apple’s computational photography techniques for years, and one would assume the company’s image-processing algorithms only improve with time. That assumption was put to the test when a woman posed in front of a mirror in her wedding dress: in the captured photo, she appears in a different pose than each of her own reflections, exposing a flaw in Apple’s processing pipeline.
Tessa Coates, a U.K. comedian and actress, was trying on a wedding dress and posed in front of a mirror while someone behind her took a picture with an iPhone to show how the dress looked from various angles. Inspecting the photo afterwards, Coates was perplexed to find what appeared to be three different people in the frame, even though two of them were merely her reflections. She described the experience on Instagram:
“Okay hello! Upgraded from stories to the grid. I went wedding dress shopping and the fabric of reality crumbled. This is a real photo, not photoshopped, not a pano, not a Live Photo. If you can’t see the problem, please keep looking and then you won’t be able to unsee it. Full story in my highlights (THE MIRROR) Please enjoy this glitch in the matrix/photo that made me nearly vomit in the street.”
So, what is really going on here? PetaPixel’s investigation, with some help from AppleInsider, concluded that Apple’s computational photography failed to recognize the reflections as the same person. The iPhone’s camera effectively treated the scene as containing three different people in a single frame, producing a result in which Coates’ hands are in one position while each reflection’s hands are somewhere else.
While Coates was moving, the iPhone captured a series of frames in quick succession and merged them into one image, freezing each of the “three” subjects at a slightly different moment.
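To make that explanation concrete, here is a minimal, purely illustrative Swift sketch of how a multi-frame merge that selects frames independently per region can desynchronize a subject from her reflections. This is not Apple’s actual pipeline; the burst data, region names, and the round-robin selection rule are all invented for illustration.

```swift
// Hypothetical sketch: a burst of frames, each recording the subject's
// pose at a slightly different instant.
struct Frame {
    let timestamp: Double
    let pose: String // e.g. "arms down", "arms crossed"
}

let burst = [
    Frame(timestamp: 0.00, pose: "arms down"),
    Frame(timestamp: 0.05, pose: "arms crossed"),
    Frame(timestamp: 0.10, pose: "hand on hip"),
]

// A merge step that picks a source frame independently for every region
// it segmented. Because the pipeline treated the woman and her two
// reflections as three separate subjects, each region can end up sourced
// from a different instant.
func mergeBurst(_ frames: [Frame], regions: [String]) -> [String: Frame] {
    var composite: [String: Frame] = [:]
    for (index, region) in regions.enumerated() {
        // Stand-in for a per-region quality score (sharpness, exposure,
        // etc.): here we simply rotate through the burst so the mismatch
        // is easy to see.
        composite[region] = frames[index % frames.count]
    }
    return composite
}

let result = mergeBurst(burst, regions: ["subject", "left mirror", "right mirror"])
for (region, frame) in result.sorted(by: { $0.key < $1.key }) {
    print("\(region): pose '\(frame.pose)' captured at t=\(frame.timestamp)s")
}
```

Running the sketch prints a different pose for each “person,” which is essentially the effect visible in the wedding-dress photo: one scene, three subjects, three moments in time.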