Apple didn’t care about megapixels until the iPhone 14 Pro arrived
- September 10, 2022
Apple is used to adopting technologies that have been on the market for years. The truth is that the move usually pays off, and proof of that is its reinterpretation of widgets in iOS 14, or the return of the Always-On Display. But the biggest leap forward in this year's iPhone 14 Pro and 14 Pro Max is in the camera: finally, goodbye to 12 megapixels.
This generation's bet is on pixel binning, the pixel-grouping technique we have known since Huawei implemented it on the Huawei P20 Pro under the name Light Fusion. Four years later, Apple has made the same bet. The challenge ahead is not small: it needs to show that its implementation is better than its competitors'.
The megapixel war heated up with 108-megapixel sensors, cooled down in the years that followed, and over the last two years manufacturers have shown that 12 megapixels is no longer enough. Google took a historic step with the Pixel 6 by giving up on 12 megapixels and embracing 50.
Apple has done the same with its camera, which now groups pixels 4-to-1 to shoot at 12 megapixels despite having a 48-megapixel sensor. On paper this technique produces brighter photos (after all, several small pixels are combined into one larger effective pixel). The fact is that few manufacturers have taken real advantage of it, either in brightness or in the raw detail the sensor can recover. As proof, here is the maximum-resolution section of our photo comparison.
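To make the idea concrete, here is a minimal sketch of 4-to-1 binning, not Apple's actual pipeline: a small synthetic raw frame is collapsed 2x2 into a quarter-resolution image, trading resolution for cleaner, brighter pixels. The frame size and the use of simple averaging are illustrative assumptions.

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """4-to-1 pixel binning: collapse every 2x2 block of photosites into one pixel.

    Real quad-Bayer sensors combine same-colour photosites and sum their charge;
    averaging a plain 2D array is a simplification, but it still shows the
    resolution-for-light trade-off.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even to bin 2x2"
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Scaled-down stand-in for a 48 MP raw frame (noisy low-light exposure).
raw_frame = np.random.poisson(lam=5, size=(756, 1008)).astype(np.float32)
binned = bin_2x2(raw_frame)

print(raw_frame.shape, "->", binned.shape)   # (756, 1008) -> (378, 504)
print(raw_frame.std(), ">", binned.std())    # binning averages out shot noise
```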
On this front, Apple now claims it can triple the light captured in dim areas. At the same time, the aperture of the main camera has been reduced, so Apple has to make sure the camera still ends up brighter overall.
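As a rough sanity check of that figure, the arithmetic below combines the 4x light-collecting area gained by binning with the light lost to a narrower aperture. The f/1.5 and f/1.78 values are assumptions commonly cited for the 13 Pro and 14 Pro main cameras, not numbers from this article.

```python
# Back-of-envelope check of the "triple the light" claim per output pixel.
# Assumed apertures (not stated in the article): ~f/1.5 on the iPhone 13 Pro
# main camera, ~f/1.78 on the iPhone 14 Pro; treat both as illustrative.
binning_gain = 4.0                    # 4-to-1 binning: 4x collecting area per output pixel
aperture_loss = (1.5 / 1.78) ** 2     # gathered light scales with 1 / f-number^2
net_gain = binning_gain * aperture_loss
print(f"net low-light gain: ~{net_gain:.1f}x")   # ~2.8x, i.e. roughly triple
```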
The iPhone 14 Pro will also be capable of a 2x "optical-quality" zoom, using the central portion of the 48-megapixel sensor and (in theory) processing it into a lower-resolution photo with more detail than classic 2x digital zoom (the kind we get by pinching to zoom). It looks promising on paper, but it is something Samsung has already done on its high-end phones.
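A sketch of how a crop-based 2x zoom can work in principle, not Apple's implementation: taking the central half of each dimension of the 48-megapixel frame leaves roughly 12 megapixels of real sensor data, so no upscaling is needed. The function name and frame size here are illustrative.

```python
import numpy as np

def crop_zoom_2x(full_res: np.ndarray) -> np.ndarray:
    """Take the central quarter (half of each dimension) of the full-res frame.

    From a 48 MP sensor this leaves ~12 MP of genuine pixels for a 2x field of
    view, whereas classic digital zoom crops an already-binned 12 MP image and
    has to upscale it.
    """
    h, w = full_res.shape[:2]
    return full_res[h // 4: h - h // 4, w // 4: w - w // 4]

full_res = np.zeros((6048, 8064), dtype=np.uint16)   # stand-in 48 MP raw frame
zoomed = crop_zoom_2x(full_res)
print(zoomed.shape)                                  # (3024, 4032), about 12 MP
```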
There is also word that Apple ProRAW will improve significantly: it now allows RAW capture at 48 megapixels. Here, the key will be for Apple to stop smearing away noise (and detail along with it) as it did in the last generation, so we can recover the maximum detail the sensor offers. Whether it works as it should or not, it is a feature already available on Android phones.
Apple has a big challenge ahead of it: it has to prove that the jump to 48 megapixels makes sense, that the difference in detail between the 12- and 48-megapixel modes is real (not every phone manages to show it), and that it was worth the wait.
If the iPhone 14 Pro does not show a significant camera improvement over the iPhone 13 Pro, it will have boarded this train late and practically in vain.
Source: Xataka