Moon photos taken using the “Space Zoom” of Samsung’s flagship smartphone models appear to be more of an AI trick than anything else, a Reddit user’s investigation into the feature claims.
Samsung’s flagship Galaxy smartphone lineup, including the Galaxy S23 Ultra, has a very high level of zoom on its rear cameras. With a 100x zoom level, created by extending the 3x and 10x telephoto cameras with digital zoom aided by Samsung’s AI Super Resolution technology, it can capture shots of objects that are very far away.
The so-called Space Zoom could potentially allow users to take pictures of the moon, and many are doing so. However, the level of detail in moon shots may simply be higher due to software quirks.
In a Friday post on the Android subreddit, user “u/ibreakphotos” declared that Samsung’s Space Zoom moon shots are “fake,” and that they have proof. The lengthy post then demonstrates that claim in a fairly convincing way.
Referring to earlier reporting claiming that the moon photos from the S20 Ultra and later models were genuine, the Redditor pointed out that no one had conclusively proven whether they were real or fake, until their post.
The user tested the effect by downloading a high-resolution image of the moon, downsizing it to a 170-by-170-pixel image, and then applying a gaussian blur to remove any remaining surface detail.
They then displayed the low-res, blurry moon full-screen on their monitor, walked to the other end of the room, zoomed in on the fake celestial body, and took a photo. After some processing, the smartphone produced an image of the moon, but with more surface detail than the doctored source image actually contained.
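The downscale-and-blur preparation described above can be sketched with the Pillow imaging library. The 170-by-170 size comes from the post; the blur radius is an assumption, since the Redditor did not specify an exact value.

```python
from PIL import Image, ImageFilter

def degrade_moon(img: Image.Image, size: int = 170, blur_radius: float = 2.0) -> Image.Image:
    """Prepare a test image like the Redditor's: downscale the moon photo
    to size x size pixels, then gaussian-blur away remaining surface detail.
    (blur_radius is an assumed value; the post does not give one.)"""
    small = img.resize((size, size), Image.LANCZOS)
    return small.filter(ImageFilter.GaussianBlur(radius=blur_radius))

# Stand-in for a downloaded high-resolution moon photo.
hi_res = Image.new("L", (2048, 2048), color=40)
degraded = degrade_moon(hi_res)
print(degraded.size)  # (170, 170)
```

Displaying the result full-screen and photographing it from across the room, as the Redditor did, guarantees the camera has no real detail to recover optically.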
The user speculates that Samsung is “using an AI model to put craters and other details in areas that are just a blur.” They further point out that while super resolution processing uses multiple images to recover otherwise lost detail, this seems different.
The post suggests that this is a case “where you have a specific AI model trained on a set of moon images, to recognize the moon and slap a moon texture on it.”
“It’s not the same kind of processing that’s done when you zoom in on another object, when multiple exposures and different data from each frame are combined,” they suggested. “It is specific to the moon.”
They reckon that because the moon is tidally locked to the Earth, “it’s very easy to train your model on other moon images and just slap that texture on when it sees something that looks like the moon,” meaning the AI is “doing most of the work, not the optics.”
Following an earlier attempt to debunk the quality of Space Zoom, Samsung insisted that the feature captures up to 20 images, then processes them as a composite with AI. That AI determines the content of the scene, and then performs detail enhancement on the subject.
In a previous investigation in 2021, attempts to trigger an overlay or AI processing using a clove of garlic on a black background, and a table tennis ball, failed to trick the smartphone. The 2023 test, using a 170-by-170-pixel image of the real moon, may have given the AI processing enough basic detail to think it was looking at the actual moon.
The new test also removes any kind of multi-frame sharpening from use, since it’s a shot of the same low-resolution moon for every frame.
It remains to be seen whether this brief investigation will prompt closer scrutiny of the use of AI in photography, but the concept is one that has been used throughout the mobile industry. Even Apple has leaned into computational photography to improve the quality of images from its cameras.
While the public may be convinced that AI processing techniques applied to images from smartphone cameras are a good thing in general, specific instances like this may give pause for thought to people who care about photography as an art form.