At the request of Zacuto in Chicago, Bob Primes, ASC, led a group of over 100 intrepid volunteers and ASC colleagues in the creation of The Single Chip Camera Evaluation: a well-planned and fairly comprehensive shootout between the Arri Alexa, Sony F35, RED MX, Phantom Flex, WeissCam, Sony F3, Panny AF100, Canon 5D, 1D and 7D, Nikon D7000, and Kodak 5213 film.
The following write-up is my assessment of the presentation of the event. As the organizers point out, many of the results are subjective and there was no clear "winner," though several of the panelists expressed a pretty unrestrained love for the clarity, beauty and simplicity of the Alexa, with the F35 a close second in their adoration.
To start the event - held at the end of the day Tuesday at NAB to a packed, standing-room-only auditorium of appreciative attendees - they showed the video of the tests themselves. The organizers hope to release a Blu-ray disc of the video soon. For more on the availability and more specific results of the tests, please visit thescce.org.
Here are my quick notes on the video itself:
They broke the testing down into several specific categories, including sharpness, low-light capability, exposure latitude, highlight detail, shadow detail, color reproduction, skin reproduction, and shutter artifacts.
In the sharpness test, they were looking at fine detail and contrast, using a Siemens star chart. The RED ONE and film were tops, while the 1D, 5D, 7D and Nikon were among the worst. The results were displayed as a graph.
As for the comparative sharpness test, the image was blown up about 230% to determine how sharp it was. Obviously, cameras with a larger native image - like the 4K RED - fared a lot better when blown up. The results, however, were subjective and open to interpretation, so someone may disagree with my impressions. Many high-level DPs were in the audience, and I walked out of the event with Art Adams and Adam Wilt; hopefully they will post their impressions of the test as well. With that caveat, I thought the F35, Alexa, RED, film and Phantom were good, while the Panny AF100 and the slew of DSLRs were the worst.
Throughout the test, there were several times that the audience broke into knowing laughter or at least audible recognition of the fault of a specific camera. The softness of the 5D in this test was the first place that this communal realization happened.
Let me just point out that of ALL the cameras tested, the only one I personally OWN is the 5D, so please don't feel I'm slamming the DSLRs due to some elitist bias. I WANT the 5D to come out ahead! But I'm realistic and trying to be objective.
In the low-light sensitivity test, they wanted to see which cameras could resolve an image in the lowest possible light - above the base noise floor. This was about the only category in the entire slate of tests where the Canon 5D came out on top, objectively.
In the real-world shooting test for low light, it was hard to see a difference. In these subjective, comparative tests, the D.P.s determined one "best" image to use as a reference to go back and forth between the other cameras. They explained that choosing that reference was subjective but, in some sense, was representative of the best camera performance in that particular test. In the case of the "real world" low-light test, they chose the Alexa as the reference. The 5D was at the bottom, subjectively, though there wasn't a lot of time to judge nuances.
The dynamic range test was an objective test based on how many stops of range could be seen. Film and Alexa were at the top - actually, the Alexa had the best latitude, beating film's roughly 14 stops with something like 14.2 stops. The RED was also very good. The F3 and DSLRs also rated well.
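(A quick aside for readers newer to these tests: a "stop" is a doubling of light, so a camera's dynamic range in stops is just log base 2 of the ratio between the brightest and darkest luminance where it still holds detail. Here's a tiny illustrative calculation of my own - not part of the SCCE methodology - just to show what a number like "14 stops" means:)

```python
import math

def dynamic_range_stops(max_luminance, min_luminance):
    """Dynamic range in stops: each stop is a doubling of light,
    so the range is log2 of the usable luminance ratio."""
    return math.log2(max_luminance / min_luminance)

# A sensor holding detail from 1 up to 16384 (2**14) relative units
# spans 14 stops -- roughly what film scored in this test.
print(round(dynamic_range_stops(16384, 1), 1))  # 14.0
```

So the Alexa's extra 0.2 stops over film translates to holding detail across a luminance ratio about 15% wider.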
In the highlight detail real-world test, they shot a scene with a bright window in the background. They exposed to blow out the window in the camera, then used color correction to bring back whatever they could using a Power Window. In this test, which was subjective, the Phantom, Weisscam, F3, AF100, and all of the DSLRs were bad. Again, a knowing group chuckle went up at the 5D. The organizers wanted to point out that the Weisscam and one of the other RAW cameras had some problem where they were unable to use the RAW camera data and had to use the HD-SDI. Obviously that's a BIG disadvantage when testing a RAW camera with only the HD-SDI image, so those results should be viewed with a kind of an asterisk - like Barry Bonds' batting record.
In the low-light shadow detail test, the F3 was excellent. The RED MX was used as the reference image. Film was actually bad; as Robert Primes pointed out, film has phenomenal highlight detail but isn't actually that good in shadow detail, and this test proved that point. The F3 looked good, as did the AF100, and the DSLRs all looked good.
The Ringer chart - by DSC, I believe - showed color sharpness and moiré issues. This was subjective, but the F35 was used as the reference and was nearly perfect looking. The Weisscam was bad, the F3 was nice, the Panny was nice, the 5D was bad, the 1D was worse, the 7D was bad, and the Nikon was decent.