You can already find them all over the Internet: D800 versus D810 comparisons. These come in two forms, both of which have some implicit problems you need to consider:
- JPEG comparisons — In order to be “fair,” these are always done at “default settings.” I’d be surprised if any update to a camera didn’t show better JPEG results at default settings. First, the camera makers know that virtually all so-called critical testing by Web sites tends to be done with JPEGs at default settings (again, to be “fair”). So if the camera maker learned anything about the response of their camera between version one and version two, they should have adjusted the default settings by the time the update rolls out, in order to be “more optimal” ;~). Second, we’re now also getting incremental differences in what the JPEG engines in the cameras actually do. In this case, we’re comparing EXPEED3 with EXPEED4, and those chips have completely different performance and architectures (one would hope that 4 is better than 3). While the Flat Picture Control and Clarity setting aren’t involved in the default settings, other things behind the scenes sometimes are. New sharpening, compression, or even linearity routines definitely show up in new imaging ASICs over time as the camera companies get better at their algorithmic software. If an updated camera doesn’t look better than the previous version, I’d seriously wonder what the camera maker’s engineers were doing in the intervening months/years.
- NEF comparisons — Here’s the thing that most people don’t understand: raw converters are also constantly tweaking what they do with different raw file formats. Adobe, in particular, will issue “preliminary support” for a new file format and then adjust it over time as they learn more about the camera’s actual performance. Adobe’s own demosaic has gone through three different iterations over the years. Part of the problem is that the camera makers aren’t helping the converter makers at all: full disclosure of the attributes of a sensor, including accurate Bayer profiling, would help immensely. But they don’t do that, so everyone has to go through a long trial-and-error period before they get close to optimal results from a raw file. I suspect that’s even true when the camera maker controls the converter, as in the case of Capture NX-D. D810s were still in their testing phase while Capture NX-D was being locked down in its 1.0.0 form.
Many of the raw comparisons also try to “take out” extra noise reduction in an attempt to show you the native noise levels. With converters such as Adobe’s, there is a base level of noise reduction applied at defaults, and I believe that level varies among cameras. In other words, converters have “default settings,” too, and the same problems apply to using them as apply to JPEGs.
In short, what you see in “side-by-side” comparisons on the net early in a camera’s release may or may not actually be reflective of any real difference. To some degree, this matches my own experience: it generally takes me several months to “optimize” my processing of raw files from a new camera, even with stable and mature tools that aren’t changing my workflow.
Sometimes that even means exposing a camera a bit differently to get optimal results. The clearest example of that was the Fujifilm S5 Pro, which had a very different highlight-to-shadow relationship than virtually any other camera. To some degree, the D800 was another of those, as it clearly had a bit of highlight latitude that wasn’t there in the D700.
I’ve always believed in comparing “optimal results.” What’s the very best I can get out of Camera X versus Camera Y (or converter X versus converter Y)? I’ll argue that none of the comparisons you’ll see this week between the D800 and D810 will tell you what that is.