Why does the ISO rating of my camera change with different gamma curves?
What happens if you turn the gain up on your camera?
Suppose you’re looking at your correctly exposed image and you add 6 dB of gain (in old money).
Your picture gets brighter and will be overexposed.
The logical thing to do now is to close the iris by a stop, and you’re back to where you started from.
Except you’re not. Because you’ve closed the iris by a stop, the photo-sites on the sensor are only half as full. You’re compensating for this with electronic gain, which also adds noise, but your sensor now has an extra stop of headroom before the photo-sites fill up.
Changing gain re-maps the output of your sensor. This is why cameras have a different ISO rating when used with log curves that can capture more dynamic range.
The Sony F5 is rated at 800 ISO with a Rec. 709 gamma curve. When you switch to S-Log the ISO rating changes to 2000. This is equivalent to turning the gain up, and ‘tricks’ you into closing the iris by a stop or so, leaving more headroom for the sensor.
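The arithmetic behind this can be sketched in a few lines. This assumes the usual rules of thumb: 6 dB of gain doubles the signal (one stop), and ISO doubles with every stop of gain.

```python
import math

def gain_db_to_stops(db):
    """6 dB of gain doubles the signal, i.e. one stop."""
    return db / 6.0

def iso_shift_in_stops(iso_from, iso_to):
    """How many stops of gain separate two ISO ratings (ISO doubles per stop)."""
    return math.log2(iso_to / iso_from)

print(gain_db_to_stops(6))            # 1.0 stop
print(iso_shift_in_stops(800, 2000))  # ~1.32 stops: roughly "a stop or so"
```

So the jump from 800 to 2000 ISO is a bit over one stop, which is exactly the amount by which the camera nudges you to close the iris.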
Another TLA to go with your HFR and your DoF for all you DoPs
High Dynamic Range may be the next big thing. You may feel you’ve seen enough next big things to last for the rest of your career, but that won't hold back the tide.
Is it going to give the consumer a real and obvious improvement in viewing experience?
HDR is obvious to the casual viewer, particularly in a side-by-side comparison. You sometimes have to look carefully to check you're looking at UHD rather than HD, whereas with HDR the benefit is clear. Assuming you’re looking at a high contrast scene, of course. We may find we are inundated with high contrast images made to show off people's HDR credentials; a bit like all those objects leaping out of the screen to show off 3D. Real life can be pretty low contrast.
What is it?
The dynamic range of an image is the space between the darkest discernible tones and the brightest.
The DR of your display is usually limited at one end by the amount of light your screen can push out, and at the other by the noise floor of your system, the minimum black level your screen can produce, or the ambient lighting in your viewing area.
If you want to increase the DR of your display you can increase the light output of the screen, reduce noise, improve black level reproduction, and improve viewing conditions. Ideally all of the above.
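That "space between the darkest and brightest" can be put in familiar camera terms: stops, i.e. doublings between black level and peak output. A quick sketch, using illustrative figures rather than measured ones:

```python
import math

def display_dr_stops(peak_nits, black_nits):
    """Dynamic range of a display in stops (doublings from black to peak)."""
    return math.log2(peak_nits / black_nits)

# Illustrative numbers only, not measurements of any particular panel:
print(display_dr_stops(100, 0.1))    # ~10 stops for an SDR-class display
print(display_dr_stops(1000, 0.01))  # ~16.6 stops for an HDR-class display
```

Raising the peak and lowering the black floor both widen the range, which is why the manufacturers' efforts listed above all help.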
It’s not just about the display though. You need to have all your ducks in a row. For the end user to see the benefit you have to capture an HDR image, record it, hold on to it through post and transmission; and deliver it to an HDR display.
Fortunately, many cameras are already able to capture a much higher dynamic range than current displays can deliver. 14 stops seems to be this week’s favourite figure to put on the back of your camera brochure, and that's about right for the new generation of HDR displays.
You just need a way of squeezing all that dynamic range through the doorway of existing recording and transmission paths. Again, rather fortunately, we sort of already have that. Log or RAW recording modes are the kind of thing you need.
The exact shape of the transfer characteristic you should use may require a bit of tweaking. The BBC have some ideas about that, as do Dolby, but the general principle is the same.
There are no confirmed standards yet, so it may need a little while to settle down.
As an example:
The HDC-4300 4K studio camera sends RAW data down the fibre to the BPU base station, holding on to all the dynamic range available from the sensor. The BPU can then process that data using an S-Log type gamma curve and wide colour space. This is then sent via a standard SDI interface to a monitor such as the BVM-X300, which can unpack it from its S-Log box and apply the matching electro-optical transfer function (EOTF) to display it in HDR.
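The pack-and-unpack idea can be sketched with a toy log curve. This is not Sony's actual S-Log formula (the constant `A` is made up for illustration); it just shows how a log encode squeezes a wide linear range into a small code range, and how the display's inverse function recovers it.

```python
import math

A = 0.1  # toy curve constant, chosen purely for illustration

def log_encode(linear):
    """Toy log curve: compress linear scene light (0..1) into a code value (0..1)."""
    return math.log(1 + linear / A) / math.log(1 + 1 / A)

def log_decode(code):
    """The inverse: the monitor unpacks the log signal back to linear light."""
    return A * ((1 + 1 / A) ** code - 1)

# The round trip recovers the original value:
x = 0.75
assert abs(log_decode(log_encode(x)) - x) < 1e-9
```

The point is that the curve and its inverse have to match end to end, which is why the whole chain, from BPU to monitor, has to agree on the transfer characteristic.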
Easy, as long as you have an HDR camera and monitor, and a connection that doesn't cut your signal down to 8 bit.
So what do we mean by an HDR display?
Most display manufacturers are aiming to increase DR by increasing the amount of light the screen can push out. The luminance of a screen is usually measured in candelas per square metre (cd/m²), or ‘nits’.
There’s no agreed standard for the output of an HDR display at present, but here are some typical numbers:
Current standards aim to deliver a signal suitable for a 120 nit display.
A monitor in a grading suite ought to be set to about 80 nits.
CRTs would typically deliver up to 100 nits.
Current LCDs might be capable of 300 to 500 nits.
Sony’s BVM-X300 HDR capable monitor can output 1000 nits, as can some consumer TVs (as of Oct. 2015)
The Dolby reference monitors can deliver about 4000 nits, though that may be a bit much for home viewing, unless you like to watch TV whilst covered in factor 40 sunblock.
Is it going to be painful and expensive to implement?
From the production point of view... not particularly. Transmission may be more difficult, as you could really do with at least 10 bit encoding to describe the extra range of an HDR signal. Displaying HDR images will mean a new telly... but it will soon be illegal to own consumer goods more than two weeks old anyway, so that’s all good. Probably worth waiting for an HDR compatible model if you're thinking of buying a UHD TV though.
Is there an easier way to increase the dynamic range of your viewing experience?
Yes. Turn the lights off in your living room.