200 megapixel sensor, Super HDR video, Super Quad Pixel technology… we help you understand the photo and video technologies of the Galaxy S23 Ultra.

As expected, Samsung presented its new range of smartphones, the Galaxy S23, S23 Plus and S23 Ultra, on Wednesday, February 1 during its Unpacked event. But even more than in previous years, the Korean manufacturer emphasized above all the photographic capabilities of its "Ultra" model. Of the hour-long conference, a total of 32 minutes were devoted to the smartphones alone, including 27 minutes specifically on the Galaxy S23s' photo and video performance.
Suffice it to say that for Samsung, photo and video are now at the center of communication around its smartphones. But these novelties still need deciphering. Between the 200-megapixel sensor, the "Super Quad Pixel" autofocus, the "OIS + VDIS" video stabilization and the 12-bit dynamic range, here is what to remember about the Galaxy S23 Ultra's photo and video performance.
A 200 megapixel sensor for 50 or 12.5 megapixel photos
Unsurprisingly, Samsung has chosen to equip the main camera of its Galaxy S23 Ultra with a 200-megapixel sensor. It is in fact the Samsung Isocell HP2, a 1/1.3″ format sensor unveiled by the firm in mid-January.
To be more precise, we should really speak of a sensor with 200 million photosites. In a photo or video sensor, each pixel corresponds to a photosite, a cell that converts the light it receives into an electrical signal. On the Samsung Galaxy S23 Ultra, however, capturing 200-megapixel shots is not the default behavior.

In fact, the Galaxy S23 Ultra uses this ultra-high-definition sensor to group pixels together depending on the lighting conditions. In a dark scene, 16 adjacent photosites are grouped to create a single pixel, yielding a 12.5-megapixel photo. In more favorable light, the photosites group by 4, for 50-megapixel shots.
The idea behind this technology is twofold: sharper images when the light is sufficient, and greater dynamic range, with more light captured and less digital noise, when the light is too low.

For the first case, remember that conventional camera sensors exceeding 100 million pixels are rare. That threshold is only reached by medium-format sensors, such as the Fujifilm GFX 100S's. On full-frame bodies, the sharpest models top out at 61 million pixels, as on the Sony A7R V, while APS-C cameras are limited to 40 million pixels, as on the Fujifilm X-H2. As a rule, the larger the sensor, the higher the definition it can support: too high a definition on too small a sensor shrinks the photosites by increasing their density. The Galaxy S23 Ultra nevertheless lets you capture 50-megapixel shots by default, or 200-megapixel shots via a dedicated option, for people who want to crop their photos in post-production. This greater definition also allows the main sensor to contribute to the hybrid zoom, alongside the telephoto modules.
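The link between definition and hybrid zoom can be made concrete: cropping the central quarter of a 200-megapixel capture still leaves 50 megapixels, which is roughly the headroom hybrid zoom exploits. Here is a minimal sketch of that idea (illustrative only, not Samsung's actual zoom pipeline):

```python
# Center-crop as "lossless" digital zoom: a 2x crop of a high-definition
# frame still holds a quarter of its pixels, so a 200 Mpx capture can
# yield a 50 Mpx "zoomed" image without interpolation.

def center_crop(frame, zoom):
    """Keep the central (1/zoom x 1/zoom) region of `frame` (list of rows)."""
    h, w = len(frame), len(frame[0])
    ch, cw = h // zoom, w // zoom
    top, left = (h - ch) // 2, (w - cw) // 2
    return [row[left:left + cw] for row in frame[top:top + ch]]

# A toy 12x16 "sensor": a 2x crop keeps the central 6x8 region.
frame = [[0] * 16 for _ in range(12)]
crop = center_crop(frame, 2)
print(len(crop), len(crop[0]))  # -> 6 8
```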


For the second case, we get pixel binning, that is, the merging of photosites. By grouping them by 4 or 16 to create a single pixel, the sensor forms much larger virtual photosites. This brings two advantages. First, grouping pixels reduces the risk of digital noise, the colored grain linked to increased ISO sensitivity. Second, it increases the dynamic range of these virtual photosites, which, being larger, can more easily capture both very bright and very dark elements. Thus, while each photosite measures 0.6 μm on a side, we get virtual photosites of 1.2 μm on a side in 50-megapixel mode and 2.4 μm on a side in 12.5-megapixel mode.
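The grouping step itself is simple to picture. Here is a toy sketch of 16-to-1 binning (averaging for illustration; a real sensor combines charge or signal in hardware, and this is not Samsung's implementation):

```python
# Pixel binning sketch: average each (factor x factor) block of photosite
# readings into one larger "virtual photosite".

def bin_pixels(frame, factor):
    """Bin `frame` (list of rows of readings) by `factor` in each direction."""
    h, w = len(frame), len(frame[0])
    binned = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [frame[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))  # one virtual pixel per block
        binned.append(row)
    return binned

# An 8x8 read binned 4x4 gives a 2x2 image, just as 200 Mpx -> 12.5 Mpx.
frame = [[(y * 8 + x) % 7 for x in range(8)] for y in range(8)]
small = bin_pixels(frame, 4)
print(len(small), len(small[0]))  # -> 2 2
```

Binning by 4 (a 2×2 block) doubles the virtual photosite's side, and binning by 16 (4×4) quadruples it, which is exactly the 0.6 μm → 1.2 μm → 2.4 μm progression quoted above.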
For comparison, the iPhone 14 Pro, which uses the same technology with a 48-million-photosite sensor for 12-megapixel shots, merges four 1.22 μm photosites to create virtual 2.44 μm photosites.
Autofocus based on groups of four pixels
However, the photo announcements about the Galaxy S23 Ultra were not limited to the main sensor's definition alone. Samsung has also emphasized its autofocus technology, dubbed "Super Quad Pixel". Specifically, it is a phase-detection autofocus technology that uses the different photosites forming a single pixel, measuring the focus difference between each of them.

During the Galaxy Unpacked conference, Jaclyn Wyatt, Samsung's social media manager, explained: "We've made autofocus simple with Super Quad Pixel, which uses every one of the 200 million photosites to easily focus on subjects. By using four adjacent photosites to detect differences between left and right, and up and down, it allows the camera to focus faster as it has more reference points."
This technology is not new in smartphone photography. Google has for years used an autofocus system called "dual pixel", with each photosite split into two parts to better distinguish the difference in perspective thanks to stereoscopy, enough to gauge depth, and therefore focus, from a three-dimensional view. Since "Super Quad Pixel" measures depth for autofocus, it could also prove useful for portrait-mode photos, gauging the depth of the scene as precisely as possible to generate background blur.
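The principle behind phase detection can be illustrated with a toy example: the left and right sub-views of a photosite group see the scene slightly shifted, and the shift that best aligns them indicates how far out of focus the lens is. This is a deliberately simplified sketch, not Samsung's algorithm:

```python
# Toy phase-detection sketch: find the horizontal shift that best aligns
# two 1-D sub-pixel signals. A shift of 0 means the subject is in focus;
# larger shifts tell the lens how far, and in which direction, to move.

def best_shift(left, right, max_shift=3):
    """Return the shift (in photosites) minimizing the alignment error."""
    def cost(s):
        pairs = [(left[i], right[i + s])
                 for i in range(len(left) - max_shift)
                 if 0 <= i + s < len(right)]
        return sum(abs(a - b) for a, b in pairs)
    return min(range(-max_shift, max_shift + 1), key=cost)

left = [0, 0, 5, 9, 5, 0, 0, 0]
right = [0, 0, 0, 5, 9, 5, 0, 0]   # same edge, seen one photosite later
print(best_shift(left, right))     # -> 1
```

With four sub-views per pixel group, the same comparison can be made both horizontally and vertically, which is the "left and right, and up and down" advantage Samsung describes.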
More advanced video features
Samsung has also focused on the video performance of its Galaxy S23 Ultra, and here too the ambitions are high. Admittedly, the manufacturer has not committed to RAW video capture, as Apple has offered on its iPhone Pro models for a year and a half, but Samsung has nevertheless announced the arrival of a "Super HDR" mode for video.

Specifically, this function is obviously reminiscent of the HDR modes already offered for photos. As a reminder, the principle is for the smartphone to capture several shots of the same scene with different exposures. By merging them, the smartphone retains detail in dark areas as well as in brighter ones.
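The merging step can be sketched as a weighted blend: favor mid-tone pixels from each exposure, so the dark frame contributes highlight detail and the bright frame contributes shadow detail. This is a minimal, illustrative fusion, not Samsung's HDR pipeline:

```python
# Toy exposure fusion: weight each pixel by how close it is to mid-grey
# (0.5 on a 0..1 scale), so clipped shadows and blown highlights get
# little weight in the merged result.

def fuse(exposures):
    """Merge same-length exposures (lists of 0..1 brightness values)."""
    fused = []
    for px in zip(*exposures):
        weights = [1.0 - abs(v - 0.5) * 2 + 1e-6 for v in px]
        fused.append(sum(w * v for w, v in zip(weights, px)) / sum(weights))
    return fused

dark = [0.05, 0.10, 0.40]    # underexposed frame keeps highlight detail
bright = [0.45, 0.70, 0.98]  # overexposed frame keeps shadow detail
print([round(v, 2) for v in fuse([dark, bright])])  # -> [0.41, 0.55, 0.43]
```

Note how the third pixel stays near the dark frame's value: the bright frame is nearly clipped there (0.98), so it barely contributes.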
In video, it is the same technique that is implemented, capturing the scene at several different exposures. This enables a 12-bit dynamic range, according to Samsung. Whereas a classic video sequence is limited to 8 bits, i.e. 256 brightness values, a 12-bit dynamic range allows up to 4,096. In other words, videographers will be able to enjoy a more detailed image.
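The quoted figures follow directly from the bit depth: an n-bit signal encodes 2^n distinct levels.

```python
# Number of brightness levels per bit depth: 2 ** n.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels")
# 8-bit gives 256 levels and 12-bit gives 4096, matching the figures above.
```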

Also in the field of video, Samsung has announced a new stabilization technology, "OIS + VDIS". Concretely, this means the Galaxy S23 Ultra's sensor is stabilized mechanically, moving to compensate for motion, but also digitally. On its website, Samsung states that VDIS (video digital image stabilization) helps to "reduce the level of blur or distortion in videos that can result from movement or shake" and that this stabilization works by increasing the sensitivity and shutter speed to produce the smoothest image possible.
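The digital half of that pairing is typically a crop-and-shift: the camera reserves a margin around the frame, estimates the shake between frames (real implementations also use gyroscope data), and moves the crop window in the opposite direction. A toy sketch, not Samsung's implementation:

```python
# Toy VDIS-style stabilization: crop each frame with an offset opposite
# to the detected shake so the subject stays put. The reserved margin is
# why digital stabilization slightly narrows the field of view.

def stabilize(frame, shake_x, shake_y, margin=2):
    """Crop `frame` (list of rows), shifted opposite to (shake_x, shake_y).
    Shake magnitudes must not exceed `margin`."""
    h, w = len(frame), len(frame[0])
    top = margin - shake_y
    left = margin - shake_x
    return [row[left:w - margin - shake_x]
            for row in frame[top:h - margin - shake_y]]

# A 8x10 frame with a 2-pixel margin yields a steady 4x6 output.
frame = [[x + y * 10 for x in range(10)] for y in range(8)]
steady = stabilize(frame, shake_x=1, shake_y=-1)
print(len(steady), len(steady[0]))  # -> 4 6
```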
Features for photo and video professionals
Finally, as has been the fashion for a few years, Samsung has also made an effort to court photography and video professionals. In addition to enlisting filmmakers Ridley Scott and Na Hong-jin, the Korean firm highlighted the professional photo and video modes of its Galaxy S23 Ultra.
As with previous models, these modes allow you to capture shots with manual settings: not only the shutter speed, but also the ISO sensitivity, the white balance and the manual focus distance. For this last point, Samsung has also integrated a "focus peaking" feature that should appeal to videographers, highlighting the in-focus elements of the image with a green outline visible only while framing. This feature, present for years on mirrorless cameras, helps ensure the focus is as precise as possible.
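Focus peaking exploits the fact that in-focus regions have strong local contrast. A minimal sketch of the idea, flagging pixels where the brightness gradient is large (illustrative only, not Samsung's implementation):

```python
# Toy focus peaking on a 1-D row of brightness values: a sharp edge
# produces a large jump between neighbors, a blurry one a gentle ramp.
# The flagged positions are where the green outline would be drawn.

def peaking_mask(row, threshold=10):
    """Flag positions where the jump to the next pixel exceeds `threshold`."""
    return [abs(b - a) > threshold for a, b in zip(row, row[1:])]

blurry = [50, 52, 54, 56, 58]    # soft gradient: nothing flagged
sharp = [50, 50, 120, 120, 50]   # crisp edge: flagged at the jumps
print(peaking_mask(blurry))  # -> [False, False, False, False]
print(peaking_mask(sharp))   # -> [False, True, False, True]
```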
Another feature that will certainly appeal to budding videographers: the "clean preview on HDMI displays" option within the Camera Assistant app. Specifically, it lets professional or semi-professional videographers output the image to an HDMI display without the interface elements. While this use will logically remain niche, it should also allow the Galaxy S23 to be connected to an HDMI capture dongle on a computer to turn it into a webcam.

Finally, still targeting imaging professionals, Samsung has partnered with Adobe to offer Lightroom out of the box as the default photo editing app on the Galaxy S23, alongside Samsung's Expert RAW app. This will allow photographers used to Adobe's suite to keep their habits for developing RAW files.
Obviously, among the functions Samsung presented for its Galaxy S23 Ultra, many will only be used by a handful of people. Above all, we are still far from the capabilities, formats and optical quality offered by mirrorless or professional cameras. The fact remains that, like Apple before it, Samsung is betting more and more on imaging, and that certain functions, invisible to neophytes, should make it possible to take full advantage of the camera module of Samsung's high-end smartphone.