Is "House of the Dragon" Getting Too… Bright?

Plus: The Pixar Sequel That Almost Wasn't

Greetings Hollywood tech nerds!

In this week’s post:

Is House of the Dragon Getting Too… Bright?
The Pixar Sequel That Almost Wasn’t
A round-up of cool and interesting links about Hollywood and technology

Subscribe to get Hollywood Tech Nerds magically delivered to your inbox every Tuesday!

Is House of the Dragon Getting Too… Bright?

The second season of Game of Thrones prequel House of the Dragon premiered on Sunday (no spoilers in this article, don’t worry), and IndieWire uses it as a jumping-off point to pose some interesting questions about the visual language of art in the streaming era.

…the Season 1 team did more than employ chiaroscuro lighting to elevate this drama. The series’ look, feel, and lighting design was built into production designer Jim Clay’s impressive Red Keep set and the visual architecture of the entire series. While it’s a dance with a devil to put these darker images into a streaming world, darkness problems were never going to be solved by turning up the dial in the color grade or adding an extra light.

The Season 1 design is hardly the only way to tell this story, but when you maintain the same visual architecture and reach for a mushy middle ground that will play on any device, you run the risk of a muddy image. And the thing I find most depressing about the look of Season 2 is it has taken a huge step toward looking like everything else on TV and lost some of what made it visually distinct.

In rewatching Season 1 episodes after the Season 2 screeners, it’s easy to appreciate what we took for granted and see how lighting was part of an overall visual approach that elevated the drama. Season 2 looks worse, but the heavier burden is what the change represents. This is a rabid fan base that tunes in every Sunday night. If there was a chance for a show to get people to put their phones down, google how to adjust their TV settings and give themselves over to an onscreen viewing experience, it was this franchise. Instead, it has bent the knee.

The IW article is very negative on the lighting changes to Season 2, but I take a slightly different view. It IS very difficult to account for the various ways consumers are watching streaming services. Putting aside the basic idiocy of watching something like House of the Dragon on a phone, even in the realm of “high-quality TVs” the results can still vary wildly from living room to living room.

Is it an HDR TV? Is the HDR Dolby Vision, HDR10, or maybe HDR10+? Is the HDR mapped correctly? For instance, when Game of Thrones first appeared in UHD on Max, scenes with flames in the background would cause glitches on some TVs. Is the viewer’s Internet fast enough? Does the viewer’s TV simply have stupid settings, like my Toshiba Amazon Fire TV, which automatically turns on motion smoothing when you open certain apps???

We’re long past the days when the “look” of something was locked to its print and the biggest variable in whether it was viewed correctly was the strength of the projector bulb. Whether a creator’s intent comes through is now often decided at a work’s endpoint, the particular screen and settings it lands on, and until the industry confronts that paradox of choice and takes some of the burden off viewers, the question of how best to preserve a work’s visual language will remain open.

The Pixar Sequel That Almost Wasn’t

Disney-Pixar’s Inside Out 2 did huge business this weekend, and together with Bad Boys it’s doing serious work to fix this summer’s box office woes. It got me thinking about Pixar’s first sequel, Toy Story 2, itself a giant hit that very nearly wasn’t!

You’ve probably heard this story in one form or another, but back in 2012 The Next Web wrote up a substantive history of the numerous disasters that beset TS2. The first one was that a huge chunk of the movie was accidentally deleted during production!

The Toy Story 2 crew, about 150 people in the animation, lighting and modeling departments of Pixar, had been hard at work for some time on the movie. Simultaneously another 200-250 people were at work finishing up A Bug’s Life, which would be released that Fall.

One day, [Oren] Jacob was in the office of Larry Cutler — along with Larry Aupperle. In what is a crazy stroke of luck, they happened to be looking at a directory in which the assets for the character Woody were stored, when they noticed, on a refresh, that there were suddenly less and less files. “He had an error, I forget the exact [one]. It was like, “Directory no longer valid,” because he’s in a place that had just been deleted. Then he thought to walk up [a directory] and he walked back up and then we saw Hamm, Potato Head, and Rex. Then we looked at it again and there was just Hamm and then nothing.”

The command that had been run was most likely ‘rm -r -f *’, which — roughly speaking — commands the system to begin removing every file below the current directory. This is commonly used to clear out a subset of unwanted files. Unfortunately, someone on the system had run the command at the root level of the Toy Story 2 project and the system was recursively tracking down through the file structure and deleting its way out like a worm eating its way out from the core of an apple.

That’s right, in a move familiar to tech and media businesses the world over, someone wasn’t paying attention to the directory they were in and made a deletion error.
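If you want to feel how little it takes, here’s a sandbox version you can run safely. The project layout below is entirely invented (it is not Pixar’s real file tree), and everything lives under a throwaway directory in /tmp:

    # Build a disposable fake project tree.
    mkdir -p /tmp/rmdemo/project/characters/woody /tmp/rmdemo/project/shots/seq01
    touch /tmp/rmdemo/project/characters/woody/model.ma /tmp/rmdemo/project/shots/seq01/frame.0001

    # The command is only as dangerous as the directory you happen to be in.
    # Intended: run it inside shots/seq01 to clear out stale frames.
    # Actual: the shell is sitting at the project root.
    cd /tmp/rmdemo/project
    rm -r -f *        # '*' expands to characters/ and shots/; -r recurses, -f never asks

    ls /tmp/rmdemo/project    # prints nothing: the whole tree is gone
    rm -r /tmp/rmdemo         # tidy up the sandbox itself

Simple to fix with backups, right?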

Pixar, at the time, did not continuously test their backups. This is where the trouble started, because the backups were stored on a tape drive and as the files hit 4 gigabytes in size, the maximum size of the file was met. The error log, which would have told the system administrators about the full drive, was also located on the full volume, and was zero bytes in size. This meant that new data was being written to the drive, but it was ‘pushing’ the older files off. But no-one at Pixar knew this yet.

Wow! The next time you’re beating yourself up for not saving more frequently, just remember that even a high-level tech/entertainment operation like Pixar can completely blow it. It happens to everyone, from your shoppinglist.docx to a multimillion-dollar sequel!
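The deeper lesson is the old sysadmin maxim that a backup you haven’t test-restored isn’t really a backup. Here’s a rough sketch of what routine verification can look like today, assuming GNU tar and coreutils and completely made-up paths (Pixar’s actual tape-based pipeline looked nothing like this):

    # Hypothetical nightly check: trust the newest archive only after a trial restore.
    set -e
    latest=$(ls -t /backups/project_*.tar | head -n 1)

    # A full backup volume can quietly produce tiny or empty archives,
    # so refuse anything suspiciously small (the 1 MB floor is arbitrary).
    [ "$(stat -c %s "$latest")" -gt 1000000 ] || { echo "backup looks truncated: $latest" >&2; exit 1; }

    # Restore into a scratch directory and spot-check a known-critical file.
    scratch=$(mktemp -d)
    tar -xf "$latest" -C "$scratch"
    test -s "$scratch/characters/woody/model.ma" && echo "restore check passed: $latest"

So how did they solve it?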

[Galyn] Susman, the Supervising Technical Director…, had given birth to her son Eli shortly before, and had been working from home. This meant that she had a Silicon Graphics workstation at her house. It was either an Indigo 2 or an Octane, and it was loaded up with a full copy of the movie.

In order for her to work on the movie while out, they had plugged the machine up to the local network and copied the whole file tree over. Then she would receive incremental updates over her ISDN internet connection. For those not in the know, that was like two 56kbps modems duct-taped together (welcome to 1998).

The last update that her machine had gotten could have been as old as a couple of weeks, but at this point, the Pixar team had an incomplete backup and a corrupt tree full of files, and they needed anything they could get their hands on to fix the problem. This was the difference between rebuilding every missing file from scratch and, well, shipping the movie on time.

The next time some control-freak executive in your office claims there’s no benefit to working from home, just send them this article. The entire movie was saved because a Pixar employee just happened to be working from home (in 1998!).

Check out the full article for some of the even crazier details that followed, like the fact that the film was completely reworked once again about nine months before release. Who would have thought Toy Story 2 and Apocalypse Now had such similarly stressful productions?

Here’s a round-up of cool and interesting links about Hollywood and technology:

How Expats recreated a long-gone Hong Kong… twice! (link)

Theater chain chief Michael O’Leary on how to boost box office. (link)

Taking the AI fight to Big Tech. (link)

Also, don’t forget to take our reader survey. Please and thank you! (link)