BACK TO #NerdyFilmTechStuff

REPLIES TO SOME INQUIRIES ABOUT THE RESOLUTION DEMO

Thanks, A---

I'm not sure what you mean by "easily see the difference" [between 2k source for 4k and 6k source for 4k in Part 2 of the Resolution Demo]. The premise is not "it is impossible to forensically tell the difference." The premise is "the overall perceptual experience for the viewer is interchangeable." I've shown the demo uncompressed (not the online compressed version you've seen) on a huge screen to well-established filmmakers and studio execs who sat at normal theatrical viewing distance (which means that, with the 4k image shown 1-to-1, they were looking at double the comfortable linear size, or quadruple the area), and they were gasping, laughing, and hooting at how interchangeable the images are [contrary to cultural conditioning].

I can't see your screen or your setup, how far you're sitting, or what exactly you're looking at; and I don't know what your definition of "easily see" is. Again, the claim is not "you cannot see anything changing at all when you compare back to back," but "does the difference between these two make any non-negligible difference to a cinema audience's experience, given that we know they won't be able to do an a/b comparison?" and also "is the difference in pixel count even as perceptually relevant as the difference in scaling algorithm?"

Given that the whole thing is about audience experience, not forensic analysis, I can tell you that my own experience with my own eyes, plus that of everyone who's seen it and spoken to me about it (except you), has been a perceptual experience of practical interchangeability (even if not of literal identity).

As for the article you sent about the resolution of the human eye, I think you are both misunderstanding the article and ignoring the stated scope of my demo, whose topic is limited to "the perceptual experience of the audience for the purposes of cinema" only and is not about visual acuity in general. So, there are a few things that render untrue your statement that "we are still far from human eye resolution."

1. From [what very little I know about the physiology aspect], the article you sent is correct that human visual acuity (the limit of human perception) is around 1 arc-minute, which is about .017º. When I get my personal first-choice seats at the cinema, the screen fills about 68º of my view, which means that each pixel I see is about .04º for a 2k DCP or about .02º for 4k. There are almost always about 20 times as many people behind me as in front of me (if not more) when the theater is not sold out. So, most people are viewing at around 50º or less, and that's their CHOICE seat for the best cinema experience, not a compromise (since it's not sold out). For them, that puts each pixel at .03º for 2k or .01º for 4k. So if 1 arc-minute is the limit of human acuity, "far from the human eye resolution" is quite a hyperbole: even at my close viewing distance, I am close to the limit (okay, granted, "close" is not a well-defined term), but most people who get their first-choice seats are viewing pixels smaller than the limit of human acuity (.01º pixels when the limit is .017º) when they view 4k, and are a hair's breadth from it at 2k. And of course, we routinely view even smaller pixels on big-screen TVs. (The first sketch after this list shows the arithmetic.)

2. Human acuity is not binary (on/off); it's analog, so it tapers off and doesn't have a hard cut. The fact that there is any perception at all at .017º does not mean it's good perception: that's the absolute limit of what can just barely be perceived. Things very close to the bottom limit but not quite at it are in the long tail of the falloff and are extremely close to indiscernible.

3. This is actually the most important point of the 3: Have you actually tested whether you can see single pixels at various distances? I have. Do you remember from the demo that when you have a photographed image and not generated graphics, you never have a one-pixel edge but a gradient? In fact, the higher the resolution of the camera, the MORE pixels the gradient is made out of, and the farther you are from having a one-pixel edge. This is very important. The human visual acuity of 1 arc-minute (.017º) only holds at high contrast that no photographed image ever has. I've tested this with my own eyes thusly: if I stand at a distance from my TV so that the viewing angle of the TV is a bit under 30º, which puts a single HD pixel somewhere near the limit of .017º, and turn on a single red or white pixel on a black screen, then I can see that one pixel. However, unlike generated graphics, no photographed image can have a fully black pixel next to a fully white (or red) one; there's always a gradient (and there's almost never full black and full white even at the extremes of the gradient). So if I change the contrast to something you could actually have in a photographed shot (or if I embed the extreme pixel in the complexity of a photographed shot instead of a black field), then I can no longer see the single pixel until I get way up close to the TV, closer than I'd ever sit comfortably. (The second sketch after this list simulates the gradient effect.) This is what I meant when I carefully chose my words in the demo to say that when we went from SD to HD we surpassed the limits of human perception "for practical purposes for cinema" (I pointedly did not say that we literally surpassed the technical/nominal limit of human acuity).
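To make the arithmetic in point 1 concrete, here's a minimal sketch in Python. It assumes a flat screen whose full width subtends the stated viewing angle and computes the angle subtended by a pixel at screen center; the 68º/50º seats and the 2048/4096 horizontal pixel counts are the same illustrative numbers used above.

```python
import math

ACUITY_DEG = 1 / 60  # the cited acuity limit: ~1 arc-minute (~.017 degrees)

def central_pixel_angle_deg(screen_angle_deg, h_pixels):
    """Angle subtended by one pixel at the center of a flat screen whose
    full width subtends screen_angle_deg at the viewer's eye."""
    # Screen width, measured in units of the viewing distance:
    width = 2 * math.tan(math.radians(screen_angle_deg / 2))
    pixel_pitch = width / h_pixels
    # A central pixel sits roughly straight ahead of the viewer:
    return math.degrees(math.atan(pixel_pitch))

for seat, screen_angle in [("my close seat", 68), ("typical choice seat", 50)]:
    for k, h in [("2k", 2048), ("4k", 4096)]:
        a = central_pixel_angle_deg(screen_angle, h)
        verdict = "below acuity limit" if a < ACUITY_DEG else "above acuity limit"
        print(f"{seat} ({screen_angle} deg), {k}: {a:.3f} deg/pixel ({verdict})")
```

This reproduces the figures above: roughly .04º/.02º per pixel at the 68º seat and .03º/.01º at the 50º seat, with the 4k pixel at the typical seat landing below the .017º limit.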
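And here's a second sketch, for point 3. It's a made-up numerical illustration, not my TV test: fix an optical blur of constant physical size on the film/sensor plane (the gate width and blur radius below are assumptions chosen only for illustration), then sample the same blurred edge at 2k, 4k, and 6k and count how many pixels the transition spans.

```python
import numpy as np
from math import erf

GATE_MM = 24.9        # assumed active image width on the film/sensor, in mm
BLUR_SIGMA_MM = 0.01  # assumed optical blur (lens, filtration, grain), in mm

def edge_transition_pixels(h_pixels):
    """Sample a blurred step edge and count the pixels in its 10%-90% ramp."""
    pitch_mm = GATE_MM / h_pixels
    x_mm = (np.arange(h_pixels) - h_pixels / 2) * pitch_mm
    # A hard step convolved with a Gaussian blur is an error-function ramp:
    edge = np.array([0.5 * (1 + erf(x / (BLUR_SIGMA_MM * 2 ** 0.5))) for x in x_mm])
    return int(((edge > 0.1) & (edge < 0.9)).sum())

for k, h in [("2k", 2048), ("4k", 4096), ("6k", 6144)]:
    print(f"{k}: the same physical edge spans ~{edge_transition_pixels(h)} pixels")
```

With these assumptions, the same physical edge spans about 2 pixels at 2k, 4 at 4k, and 6 at 6k: the higher the capture resolution, the more pixels the gradient occupies, and the further you are from a single-pixel, full-contrast edge.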

The 3 numbered points above are all very techy. But here's a less rigorous but more meaningful test that I've done (have you?): I've watched both of my demos at a theatrical viewing distance (around 68º) on the same projector back to back, comparing the 2k and 4k DCPs, and I can't tell the difference (and neither could the post-house professionals watching with me; they didn't even know what order they'd been played in). This is a controlled-variable test: same camera, same mastering, same projector, AND it was at a post-house where the projector is kept in rigorous focus (unlike at a multiplex). Everything was the same and the variables were controlled; the only thing changing was the DCP: 4k DCP versus 2k DCP (made from the same 4k master and scaled back up to 4k by the projector).

One last thing to think about: in the cinema, 4k is so finicky that if the projector lens is the tiniest bit out of focus, a 4k DCP will instantly dip to the same resolving power as a 2k DCP, or lower. But how come theaters advertise "4k" but never advertise that they rigorously check focus (which they don't!)? In most practical situations, you'll get a clearer image from a 2k projector with someone checking focus than from 4k at a commercial multiplex with no expert/dedicated projectionist. Buying a 4k projector once is cheaper than keeping an expert projectionist on staff, so guess which one they'd like you to think makes the bigger difference?

-Steve

Thanks, F---

[replying to inquiry as to whether ACES was used in The Resolution Demo or Display Prep Demo]

No, I didn't use ACES at all. ACES is a framework for doing color management; it doesn't define what you create within that framework. It's like using Final Draft instead of a plain text editor to write your screenplay: it enforces a certain kind of formatting but doesn't dictate or change what content you create. You won't have a different script because you wrote it in Final Draft; you'll just have standardized formatting. If I had used ACES, I still would have had to create all my own custom color and spatial transformations from scratch. The only difference is that I'd have had to do a lot of extra steps for no reason other than to be compliant with the framework, like converting from log to linear and then right back to log for no reason other than to say "I did the IDT" or whatever.
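To illustrate that last point, here's a minimal sketch of such a round trip. The curve is a made-up, invertible toy log encoding (not a real camera curve or a real ACES IDT); the point is just that detouring through linear and back leaves the image content untouched.

```python
import numpy as np

# A made-up log curve standing in for a camera's log encoding
# (illustrative only; NOT any real IDT or transfer function).
def log_to_lin(y):
    return 2.0 ** (14.0 * y - 8.0)

def lin_to_log(x):
    return (np.log2(x) + 8.0) / 14.0

log_footage = np.random.rand(1080, 1920)          # stand-in for log footage
round_trip = lin_to_log(log_to_lin(log_footage))  # log -> linear -> log

# The detour changes nothing about the image:
print(np.allclose(round_trip, log_footage))  # True (up to float precision)
```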

-Steve

Hi, O---

[The questioner had seen different versions of Interstellar and felt that, in perceptual clarity, the 15-perf 65mm source footage shown on a 15-perf 65mm print performed the best, but was surprised that the 4k DCP of the 35mm anamorphic source performed better than the 35mm anamorphic source on a 15-perf 65mm print]

Thanks so much for the kind words. When I hear things like what you said there in your first paragraph, it makes me really happy, as that’s the very reason I spent so much time on this thing.

I can't speak specifically to Interstellar, as I don't know precisely the pipeline that the footage went through, but I can answer very well about a hypothetical situation with the same details: it would not be the least bit surprising for the 4k DCP to have more real resolution and higher perceptual sharpness and clarity than the anamorphic shots on 15p65mm.

The 35mm would likely have been scanned at such a high resolution that the scanner is a negligible layer of degradation. For example, scanned at 6k with an ArriScanner or 4k with a Scanity, both of which are well oversampled [for the 35mm neg source]: that is, resolute enough that they're actually beyond the analog resolution failure of the film (you can zoom in on the 4k scan and see the film falling apart before you see pixels). So the scanner itself is not really a degradation, and downstream of it, in terms of spatial fidelity, there are really no additional degradations in the pipeline all the way to the DCP (at least no perceptual ones: compressing to JPEG2000 is technically a spatial degradation, but it's visually lossless).
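Here's a rough sketch of why a 4k or 6k scan counts as oversampled. The gate width is an assumption (roughly a full-aperture 35mm frame), and the film's usable resolving power is a placeholder figure, not a measurement; the point is only that the scan's Nyquist limit sits comfortably beyond where the film image itself gives out.

```python
# Illustrative numbers only: ~24.9 mm approximates a full-aperture 35mm
# gate width, and the film resolving limit below is a placeholder.
GATE_WIDTH_MM = 24.9
FILM_LIMIT_LP_MM = 50  # assumed usable resolving power of the developed image

for name, h_pixels in [("4k scan", 4096), ("6k scan", 6144)]:
    samples_per_mm = h_pixels / GATE_WIDTH_MM
    nyquist_lp_mm = samples_per_mm / 2  # each line pair needs two samples
    print(f"{name}: {samples_per_mm:.0f} px/mm -> Nyquist {nyquist_lp_mm:.0f} lp/mm "
          f"vs film's ~{FILM_LIMIT_LP_MM} lp/mm")
```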

This lack of additional degradations is very much not the case for 35mm scanned and then filmed out to 15p65mm, then printed. The film-out and the print are both degradations. And not only that, but in this case the film-out is even more of a degradation than a normal film-out because (to my limited knowledge) a laser film recorder was never developed for 15-perf 65mm, so if it was filmed out to 15p65mm, it may have gone through the pretty serious degradation of an old CRT film recorder (though of course I don't know this for sure). On top of that, rendering an image on print stock is much more of a degradation than on negative or intermediate stock (I've even heard from a reliable source that tests have shown that a traditional 35mm print from camera-neg -> interpositive -> internegative -> print has visibly lower resolving power than a 1k digital image, but I cannot verify this, as I have neither tested it myself nor seen the purported test).
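One way to see why each extra generation matters: to a first approximation, the system's MTF is the product of each stage's MTF, so every photochemical generation multiplies in another loss. The Gaussian shapes and MTF50 values below are made up for illustration, not measured from any real scanner, recorder, or stock.

```python
import numpy as np

def gaussian_mtf(freq_lp_mm, mtf50_lp_mm):
    """Toy Gaussian MTF curve that hits 50% response at mtf50_lp_mm."""
    sigma = mtf50_lp_mm / np.sqrt(2 * np.log(2))
    return np.exp(-((freq_lp_mm / sigma) ** 2) / 2)

f = 30.0  # probe frequency in lp/mm on the negative (illustrative)

scan      = gaussian_mtf(f, 80)  # oversampled scan: nearly transparent
film_out  = gaussian_mtf(f, 35)  # CRT film recorder: a real degradation
print_stk = gaussian_mtf(f, 40)  # print stock: another generation of loss

digital_chain = scan                       # scan -> DCP (visually lossless)
print_chain = scan * film_out * print_stk  # scan -> film-out -> print

print(f"scan -> DCP response at {f} lp/mm: {digital_chain:.2f}")
print(f"scan -> film-out -> print response at {f} lp/mm: {print_chain:.2f}")
```

With these made-up numbers, the digital chain keeps about 90% of the contrast at the probe frequency while the film-out/print chain keeps under 40%; the exact figures don't matter, but the multiplying of losses does.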

So, although you are correct that scaling algorithms must be used (i.e., from native scan size to mastering/film-out size), I would guess that the contribution of the scaling algorithm is much smaller than the contribution of these additional degradations in the scan -> film-out -> print process.

Hope that helps.

-Steve

Hi, --,

[In response to a questioner who said he/she "can't tell anything" from the compromised downloadable version of the Resolution Demo and was asking me to post a 4k uncompressed version]

Thanks for your interest. I think you and I both know that "can't tell anything" is quite a hyperbole and that a better description would be "it's slightly compromised and it's not the best way to view it rigorously" (which is why I have the disclaimer saying this online distribution version has been compromised from the hero format in order to reach more people). After all, BluRay-sized H264 (which is what the online version is) is still higher quality than most people will ever use to view most movies, and is perhaps the highest quality that is easily mass-distributed; it's certainly much better than the highest qualities from Netflix, iTunes, etc.

I am considering scheduling more theatrical DCP screenings in LA and perhaps other cities, but I do not yet have any firm plans; maybe you can come to one if they happen, so you can see the full-quality version. I do not currently plan on distributing 4k DCPs or 4k uncompressed files of Part 1 to the public.

There would be no reason to distribute 4k files of Part 2, because Part 2 is mastered at HD size, not 4k. In fact, Part 2 is only properly viewed at HD 1-to-1 size (otherwise, when I say in Part 2 "we're 1-to-1 pixels here," it's not actually 1-to-1, as it's been scaled to 4k). So, yes, the online version of Part 2 is also slightly compromised by compression, but not by re-sizing (whereas the online version of Part 1 is compromised by both compression and re-sizing).

Having seen both the full-quality and the online versions myself many times (including screenings of the full-sized DCP for some very established filmmakers and studio execs), the only place in the whole thing where there is more than a tiny difference (in my opinion) in perceptual experience between the full-quality version in the cinema and the online version at home is in the section that's about compression: the compressed image in that section is now twice-compressed, so the compression artifacts that I'm showing there are compounded in the online version [but this doesn't really matter anyway, because there are many different kinds/amounts of compression and I sort of picked that amount at random anyway].

I suppose you won't want to just take my word for it (fair enough) that watching the compromised online version at at-home viewing angles is only barely perceptually different from the full-quality version at cinema viewing angles, but that's my opinion; maybe you'll get to see a DCP so you can judge for yourself.

Don't forget, though, that using compressed files as a SOURCE from which to color grade, then master, then sub-master, then re-compress for consumer distribution is a much bigger degradation than using an uncompressed pipeline and merely compressing in the very last step to make a viewing copy (and the latter is what's going on with the demo). BluRay quality is a pretty damn good viewing format when it comes from a professional pipeline.
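A crude sketch of that difference in pipelines: the "lossy step" below (a tiny blur plus coarse quantization) is only a stand-in for a real codec, and the image is random noise rather than a real frame, but it shows how compressing at every stage compounds in a way that compressing once at the end does not.

```python
import numpy as np

rng = np.random.default_rng(0)

def lossy_step(img, levels=64):
    """Crude stand-in for one generation of lossy compression:
    a tiny horizontal blur plus coarse quantization."""
    blurred = (img + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 3
    return np.round(blurred * levels) / levels

def psnr(a, b):
    """Peak signal-to-noise ratio in dB for signals in [0, 1]."""
    return 10 * np.log10(1.0 / np.mean((a - b) ** 2))

master = rng.random((256, 256))  # stand-in for an uncompressed master

# Pipeline A: stay uncompressed, compress once for the viewing copy.
viewing_copy = lossy_step(master)

# Pipeline B: compressed source, then re-compress at the master,
# sub-master, and consumer-distribution stages.
generational = master
for _ in range(4):
    generational = lossy_step(generational)

print(f"compress once at the end: {psnr(master, viewing_copy):.1f} dB")
print(f"compress at every stage:  {psnr(master, generational):.1f} dB")
```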

-Steve

Hi, V---

[In response to a question about a thin white line visible in Part 2 of the Resolution Demo at 00:08:51, when comparing 6k source to 2k source for a 4k master]

Thanks for bringing this to my attention; I hadn't noticed it. I can say with certainty that it comes from the screen capture that I did. Remember: unlike Part 1, Part 2 is a screen-capture video (in a visually lossless codec, but a screen capture nonetheless). So, unlike Part 1, it hasn't been rendered out through a cinema mastering pipeline. [In Part 2, when I'm saying "professionally mastered 4k," I mean "professionally mastered through the pipe up to this point where it's being screen captured": the professionally mastered image is being screen captured, not properly rendered out to a file. If it were rendered, you wouldn't be able to see my screen controls and my mouse cursor and whatnot. So although the screen-capture software is visually lossless in that it looks the same if you a/b the source and destination, it's subject to some non-professional errors.]

The reason I know for sure that it's a screen-capture glitch and not something else is that the line is still there on the full-size master (not just the web compression that you were looking at), but it is NOT in the actual Nuke script and image that I was recording when I did the demo.

So, if your intent is to master a film rather than to make a screen-capture video, you can ignore this; it has nothing to do with scaling or K-count or anything. It's just a glitch with the screen-capture codec/software.

Thanks again for finding this.