Would you let the trainee grade your movie?

I was listening to a great Team Deakins podcast episode in which Roger Deakins and his partner James interviewed the amazing Joachim “JZ” Zell, vice chair of the ACES project, founder of the HBA and color management extraordinaire. They looked back at the dailies process of the old film days, before some people started rethinking it to avoid a terrible, invisible workflow monster that can bite you pretty hard.

Back in the days of shooting film, the rushes were developed in the lab and then run through a telecine machine to make a video tape that could be sent to editorial to cut the movie. That was the broken link in the Digital Intermediate workflow: the telecine is a creative machine, as opposed to a scanner. A scanner (at least in theory) will always output the same picture if the same frame is scanned several times, with a direct relationship between the density of the negative and the code value of the pixel. The telecine captures the film negative in real time; it is less stable and less precise, but good enough for offline editing. It also has a lot of ways to tweak the image, because it was designed to do so: you can change colors and contrast, even draw shapes, and create something very different from the film negative.

In film labs the negatives were telecine’d overnight, mostly by junior colorists and trainees, so that production and the editor could see what was shot as soon as possible. Too often unsupervised, the trainee could learn all the tools and get creative… For most engineers this wasn’t a problem, because the telecine output is not supposed to be the ground truth, just a “preview” for editing. They simply dismissed the way the brain works: in a lot of situations the director, the editor and even the DoP (director of photography) would fall in love with that “preview”, or just get used to it, because they had seen it for weeks and months in the editorial room, most of the time on uncalibrated TVs. Then, on the first day of the DI, after the camera negative had been carefully scanned at high precision and the film lab LUT applied, everybody shouts because that is not the color of the movie! And so they ask the highly skilled, well paid colorist to redo the look that the trainee did one night, the humiliation sometimes going as far as the director bringing the TV from the editorial room into the grading room for color matching…

As mentioned in the podcast, it was not only a matter of taste; it could also ruin a whole movie if that grading was hiding technical problems, like messing with contrast or compensating for underexposed footage. This is still a pretty common problem now in digital, with some DITs grading on set and generating LUTs that look good on the monitor but don’t show the reality of the signal: the LUT may be slightly lifted, producing an image that looks fine on a small monitor while the signal actually being recorded is underexposed… A VERY bad surprise when you see it on a large screen and the noise jumps out at you! Also remember that LUTs are not magical, and they should be made by someone who understands their limitations.
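To make that trap concrete, here is a toy sketch in Python/NumPy, with made-up numbers that don’t correspond to any real camera or LUT: a “lifted” viewing transform makes underexposed footage sit at a comfortable level on the monitor, while the signal actually being recorded stays down near the noise floor.

```python
# Toy illustration (hypothetical values, not a real camera or LUT):
# a lifted viewing transform can hide underexposure on the monitor
# while the recorded signal stays low and noisy.
import numpy as np

rng = np.random.default_rng(0)

# Pretend the scene is exposed ~2 stops under: low code values + a fixed noise floor.
scene = rng.uniform(0.05, 0.25, size=100_000)        # recorded signal, nominally 0..1
noise = rng.normal(0.0, 0.01, size=scene.shape)      # sensor noise floor
recorded = np.clip(scene + noise, 0.0, 1.0)

# Naive "make it look good" viewing LUT: lift + gain, applied only for display.
def lifted_view(x, lift=0.08, gain=2.8):
    return np.clip(x * gain + lift, 0.0, 1.0)

displayed = lifted_view(recorded)

# The monitor image sits at a comfortable level...
print(f"displayed mean level : {displayed.mean():.2f}")
# ...but what was recorded is still hugging the noise floor,
# so the grain will jump out once the image is blown up on a big screen.
print(f"recorded mean level  : {recorded.mean():.2f}")
print(f"approx SNR (recorded): {recorded.mean() / 0.01:.0f}:1")
```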

Systems were developed to make the color decisions flow the right way again, like the one developed by Efilm described in the podcast: the DoP would agree on a “look” with the senior colorist, then a color scientist would build LUTs and define a color pipeline so that the telecine machine showed the colors as they were meant to look, and every modification made by the telecine operator could be tracked. The ASC-CDL was developed as an industry standard to convey those modifications, and if people were willing to put together a color-managed workflow, it was possible to keep that information through the production process, from set to telecine, editorial, VFX and back to the lab for grading and finishing. It can still be used, though not that many facilities implemented it. I developed one of the first implementations of ASC-CDL as a Lustre plugin back in the day. These days it’s mostly LUTs that are exchanged between facilities, and Baselight has done a great job with their plugins and BLG format to exchange color info while keeping it editable. Now the ACES Metadata File adds some extra flexibility and lives in the ACES ecosystem, where it’s much easier to keep track of information, in a fully open way.
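For reference, a minimal sketch of the per-pixel math an ASC-CDL describes: slope, offset and power per RGB channel, followed by a global saturation around Rec.709-weighted luma. The values below are purely illustrative, and clamping negatives before the power is a simplification of the full specification.

```python
# Minimal ASC-CDL sketch (illustrative values, simplified negative handling).
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation):
    """rgb: array of shape (..., 3), values nominally in 0..1."""
    rgb = np.asarray(rgb, dtype=np.float64)
    # SOP: out = clamp(in * slope + offset) ** power, per channel
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    # Saturation: blend each channel against Rec.709-weighted luma
    luma = (out * REC709_LUMA).sum(axis=-1, keepdims=True)
    return np.clip(luma + saturation * (out - luma), 0.0, None)

# Example with the kind of numbers a CDL file or EDL comment might carry.
pixel = [[0.18, 0.18, 0.18]]                     # mid-grey test pixel
graded = apply_cdl(pixel,
                   slope=np.array([1.10, 1.00, 0.95]),
                   offset=np.array([0.01, 0.00, -0.01]),
                   power=np.array([0.90, 1.00, 1.05]),
                   saturation=0.9)
print(graded)
```

The point of the format is exactly that: ten numbers per shot that any vendor can parse and reapply, instead of a baked, opaque LUT.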

This is a great example of colorboration: it’s not the technology that does the job, and a workflow must describe not only the steps in the production process, but also the decision process. In this case, treating the camera tests as outside the actual workflow and not making the look definition explicit can lead to serious trouble. When shooting film the traditional way, the DoP would agree on a look with the lab timer after tests and would only see the result after a long process; these days we have an instant preview of the camera output, with a lot of tools and opportunities to make things look right, but also to make them go seriously wrong. In the future we may get an AI to do the job, but in the meantime, make sure you talk to the right guy to set up your workflow so you get the expected color! Thank you Roger Deakins and JZ for sharing the knowledge.

PS: A funny little one: the colorist who did O Brother, Where Art Thou? at Cinesite on that Pogle that JZ installed was Julius Friede, who was then hired by Eclair in Paris to help set up the digital intermediate workflow. I was in charge of implementing Colossus there, the color grading software that ultimately became Autodesk Lustre, and of training the colorists. Among them was Yvan Lucas, who later went to Efilm and worked with JZ… small world!

Feel free to comment in the dedicated Rockflowers Q&A channel