A recent thread on our news forum asked whether sensor technology had reached its peak. This was reinforced by an article claiming that new cameras aren’t getting any better, pointing out that the best-scoring full-frame sensors according to DxOMark are models from six years ago. Neither claim, I believe, is true: sensor technology continues to improve and cameras absolutely are getting better.
The fundamental flaw in both cases is the unacknowledged assumption that improvements in sensors and improvements in cameras mean improvements in image quality. It’s true that we’ve reached a plateau in image quality, but both sensors and cameras continue to improve.
Sensors continue to improve
The last big sensor contributions to image quality were arguably the adoption of more ADCs, close to the pixel (enabling a big leap forward in dynamic range), and dual conversion gain designs that meant you could have this additional DR at low ISOs as well as really good performance at high ISOs. This brought us to the current state where sensors capture significantly more than half the light that hits them, and add very, very little noise of their own. Without a major change in the way color is captured, it’s not obvious where significant additional IQ gains will come from.
Stacked CMOS sensors unlock high readout speeds with few drawbacks to image quality.
But this doesn’t mean sensors aren’t improving. In large sensors, BSI technology’s main benefit is that it allows still more ADCs to be included, allowing faster readout without undermining noise performance. Current Stacked CMOS designs take this even further, allowing the inclusion of still more complex designs that deliver faster readout and, in some cases, in-sensor RAM to cope with these speed increases, again with limited IQ cost.
Cameras continue to improve
Even without improvements in IQ, newer sensors are making cameras better. Faster readout speed can enable features such as more usable electronic shutter modes, faster burst rates, faster and higher resolution feeds to the autofocus processing, less viewfinder lag, as well as faster, smoother and higher resolution video.
Subject detection makes shooting at wide apertures faster and more reliable. Canon RF 50mm F1.2L | ISO 400 | 1/1600 sec | F1.2
In parallel with these sensor improvements, more powerful processors and subject recognition algorithms trained by machine learning are making significant changes to what cameras can do and how easy it is to get them to do it. It’s easy to overlook if you’ve not used a recent camera but performance that used to be reserved for professional sports photographers is now available in sub-$1000 cameras, and it’s often easier to use.
It’s easy to assume ‘well, he’s a camera reviewer, it’s at least to some extent in his interest to say new cameras are good.’ But the flipside of that is that I’m a camera reviewer: I’d have got bored long ago if cameras weren’t getting better. I’m currently finishing up my review of the Nikon Z8, so there’s simply no way you can convince me that cameras aren’t getting better.
Image quality continues to improve
Finally, just because sensor output has plateaued, that doesn’t mean image quality has stagnated. Sensors play a fundamental role in image quality but they’re not the sole contributor. The lenses launched in the past five years are some of the best we’ve ever seen. There are a number of factors at play, which I hope to delve into in a coming article, but we’re living through a golden period of lens development.
Getting back to the original arguments: it’s true to say that sensors aren’t delivering better IQ performance than, say, the chip in the Nikon D850, but it’s just as true to say that the latest lenses and AF systems mean you can more consistently capture images with detail levels that exceed it.
This article comes from DP Review and can be read on the original site.