Faced with the controversy raging online over the Moon photos captured with the Galaxy S23 Ultra, Samsung has published a press release explaining the algorithms behind its highly (perhaps overly) optimized shots of Earth's satellite.
Samsung's week kicked off with a controversy over the highly detailed Moon photos its smartphones can take with the 100x digital zoom. The Galaxy S23 Ultra in particular is in the spotlight.
It has been shown that the smartphone can generate a very clean image of the Moon from a very blurry one. Unsurprisingly, accusations of cheating have flourished online. But above all, a broader and more interesting debate is being reopened: how far can the photo-processing algorithms our smartphones are packed with go before they are accused of distorting reality too much?
In truth, there is nothing really shocking about the Galaxy S23 Ultra relying on a powerful behind-the-scenes software process to capture a beautiful photo of the Moon. However, Samsung can be blamed for not being clear enough in its communication, conveniently suggesting that this feat was entirely down to the quality of the 100x zoom.
It is undoubtedly to put out this small fire that the company has published a press release playing on transparency: "How Samsung Galaxy cameras combine super-resolution technologies and AI technology to produce high-quality images of the Moon". The title is not misleading: it does offer a detailed explanation of the methods used. The controversy itself is not mentioned, but the timing of the publication leaves no room for doubt.
Samsung places special emphasis on the "Scene Optimizer" feature, which can be turned on or off in the Camera app settings. The brand explains that this option is what activates the automatically generated enhancements.
"When Scene Optimizer is on and the Moon has been recognized as an object, the camera provides users with a clear and bright image thanks to Scene Optimizer's detail enhancement engine in addition to Super Resolution technology."
To ensure its camera system detects the Moon reliably, Samsung explains that it trained the system on a variety of lunar shapes and details, "from full moon to crescent moon". Thanks to an AI deep-learning model, the camera can recognize the shape of the Moon even in situations that were not part of its training data.
Add to that a well-measured brightness adjustment, work on shot stabilization and various optimizations, and you have the recipe for good Moon photos according to Samsung.
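To make the described recipe concrete, here is a minimal, hypothetical sketch of such a pipeline: a scene detector gates the processing, followed by brightness adjustment and a detail-enhancement pass. All function names, thresholds, and the toy detector are illustrative assumptions, not Samsung's actual implementation, and simple unsharp masking stands in for the real detail-enhancement engine.

```python
import numpy as np

def looks_like_moon(frame: np.ndarray) -> bool:
    """Toy detector: a bright blob on a mostly dark sky.

    A real system would use a deep-learning classifier trained on
    many Moon phases, as Samsung describes.
    """
    dark_fraction = np.mean(frame < 0.1)
    has_bright_spot = frame.max() > 0.8
    return dark_fraction > 0.7 and has_bright_spot

def adjust_brightness(frame: np.ndarray, target_max: float = 1.0) -> np.ndarray:
    """Scale the frame so its brightest pixel reaches target_max."""
    peak = frame.max()
    return frame if peak == 0 else frame * (target_max / peak)

def enhance_details(frame: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Unsharp masking: add back the difference from a blurred copy."""
    # 3x3 box blur built from shifted views of an edge-padded frame
    # (a crude stand-in for a real super-resolution engine).
    padded = np.pad(frame, 1, mode="edge")
    h, w = frame.shape
    blurred = sum(
        padded[i:i + h, j:j + w] for i in range(3) for j in range(3)
    ) / 9.0
    return np.clip(frame + amount * (frame - blurred), 0.0, 1.0)

def scene_optimizer(frame: np.ndarray, enabled: bool = True) -> np.ndarray:
    """Apply enhancements only when the scene is recognized as the Moon."""
    if not enabled or not looks_like_moon(frame):
        return frame
    return enhance_details(adjust_brightness(frame))

# Usage: a dim "moon" on a dark sky gets brightened and sharpened;
# with the option off, the frame passes through untouched.
sky = np.zeros((32, 32))
sky[12:20, 12:20] = 0.85  # bright square standing in for the Moon
out = scene_optimizer(sky)
untouched = scene_optimizer(sky, enabled=False)
```

The key design point, mirroring Samsung's explanation, is that the heavy processing is conditional: it only fires when the detector recognizes the scene, which is exactly why disabling Scene Optimizer yields the blurrier, unassisted image.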
With this detailed release, Samsung is largely stating the obvious, but it has the merit of being educational and transparent. The Verge also notes that this explanation already existed in Korean and that this new press release is essentially a slightly modified translation.
Users disappointed by these tricks may not be satisfied with Samsung's explanations; it must be said that the processing applied to Moon photos is particularly zealous. In the opposite camp, defenders of the brand can always argue that it merely goes one step further than the many adjustments to reality our smartphones already make when they take photos.
This whole affair is reminiscent of a similar case that affected the Huawei P30 Pro a few years ago. Regardless, it is mostly a good reminder that our phone cameras are still a long way from guaranteeing such beautiful shots of the Moon without heavy software assistance. That remains true even for the best camera phones.