Why Robot No Make Movie?

PLUS: Frame Rate Flame Bait

Greetings, Hollywood tech nerds!

In this week’s post:

- Why Robot No Make Movie?
- Frame Rate Flame Bait
- A round-up of cool and interesting links

Subscribe to get Hollywood Tech Nerds magically delivered to your inbox every Tuesday!

Why Robot No Make Movie?

I’ve been assiduously trying to avoid writing too much about AI stories because what else can be written about this tediously over-discussed topic? I can’t even enjoy Super Bowl commercials without this slop being force-fed to me!

However, I also find cleaning my bathtub tedious, yet clean I must, and not because I enjoy a long bubble bath. Those are lies! So I at least found this entertainment-related AI story from Variety interesting: “Visual Effects Workforce Isn’t Feeling AI Pinch Yet.”

Despite the dramatic proclamations from such Hollywood luminaries as Ben Affleck and Jeffrey Katzenberg about AI transforming the visual effects and animation industries, the data tells a more measured story.

While AI promises to revolutionize the notoriously expensive and labor-intensive VFX process, its actual impact on production — particularly for high-end feature films and premium television — appears to be much more incremental.

Our analysis of findings collected for the Visual Effects World Atlas reveals surprising stability among the largest VFX studios. Employment at the 10 biggest studios peaked in April 2023, just before the WGA strike, and has since declined by only 9%.

Does AI promise “to revolutionize the notoriously expensive and labor-intensive VFX process”? What does that mean? What does that look like?

AI has already revolutionized these things over the years. To use a very simple example in the realm of audio, just look at a free audio program like Audacity. Audacity has functions called Noise Reduction and Noise Gate. Let’s say you’ve recorded something important, like, I don’t know, a podcast dedicated to hologram T-shirts that you could get via Ghostbusters cereal in the late 80s.

To your shock and horror, when you play back the audio, the background contains the hum of a motorcycle your unhinged aunt left running in the garage as she brawled with a garbage man she was convinced was stealing from her. In the past that recording would have been considered ruined; with Audacity, you just highlight a clean sample of the motorcycle sound and, using the tools above, remove it from your podcast entirely.

This functionality is essentially AI: you are training the software on what sound to remove. It has also been present in Audacity for over 10 years; there’s nothing new or shocking about it, or about the other tools creative people use in their workflows.
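To make that “training” point concrete, here’s a minimal sketch of the spectral-gating idea behind noise removal tools, assuming a mono recording and a noise-only clip as NumPy arrays. The function name and parameters are illustrative; this is not Audacity’s actual implementation.

```python
import numpy as np
from scipy.signal import stft, istft

def spectral_gate(audio, noise_clip, rate, threshold_db=6.0):
    """Toy spectral gate: learn a per-frequency noise floor from a
    noise-only clip, then silence time-frequency bins in the full
    recording that don't rise above that floor."""
    # Short-time Fourier transforms of the noise sample and the recording.
    _, _, noise_spec = stft(noise_clip, fs=rate, nperseg=2048)
    _, _, audio_spec = stft(audio, fs=rate, nperseg=2048)

    # Noise floor per frequency bin: mean magnitude of the noise sample.
    noise_floor = np.mean(np.abs(noise_spec), axis=1, keepdims=True)

    # Keep bins that exceed the floor by threshold_db; zero the rest.
    keep = np.abs(audio_spec) > noise_floor * 10 ** (threshold_db / 20)
    _, cleaned = istft(audio_spec * keep, fs=rate, nperseg=2048)
    return cleaned
```

Real tools smooth the gate across time and frequency and attenuate bins rather than hard-zeroing them, but the core move is the same: the clean sample you highlight is the training data.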

The problem is that the hype around generative AI has made it seem like we are just around the corner from being able to type “The Avengers fight 20 Robert Downey Jr. variants” and having it come out looking exactly as needed, thus eradicating the need for VFX artists. But that isn’t the case: generative AI continues to be a slot machine, and the supposed killer app that rids the moneymen of these troublesome humans has yet to be created. However, AI/machine learning tools will continue to improve and make tasks easier.

Frame Rate Flame Bait

I really loved this video mega-essay from Media Division on creating films that feel cinematic. It’s basically a film-school masterclass, with plenty for even seasoned professionals to consider.

Watch the entire thing! The section I wanted to focus on a bit was the mysterious and magical qualities of frame rates. Many tech-minded modern directors, from Ang Lee to Peter Jackson, have experimented with High Frame Rate (HFR) production and projection, yet it never seems to take off. Why’s that? Well, see below:

This is a 60fps clip from Ang Lee’s Billy Lynn’s Long Halftime Walk, which was shot and in some cases projected at 120fps.

But… it looks weird. Right? According to the film’s Wikipedia page, “Lee… wanted the film to be an ‘immersive’ and ‘realistic’ experience…” However, IMO, in practice it has the complete opposite effect. When I saw The Hobbit: An Unexpected Journey in HFR in 2012 I felt the same way. It seemed like actors walking around on sets!

James Cameron probably had the best approach, mixing HFR and standard 24fps on his second Avatar film:

“We’re using [high frame rate] to improve the 3D where we want a heightened sense of presence, such as underwater or in some of the flying scenes,” Cameron told audiences via video call at the Busan International Film Festival in South Korea. “For shots of just people standing around talking, [high frame rate] works against us because it creates a kind of a hyper-realism in scenes that are more mundane, more normal. And sometimes we need that cinematic feeling of 24fps.”
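Part of what makes this mixing workable is simple arithmetic: 24 divides evenly into common HFR capture rates, so a high-frame-rate master can be conformed to a lower rate just by keeping every Nth frame. Here’s a minimal sketch of that frame-selection math; the function is illustrative, not anything from Cameron’s actual pipeline.

```python
def decimate_frames(frames, source_fps, target_fps):
    """Conform a high-frame-rate sequence to a lower rate by keeping
    every Nth frame. Only valid when the source rate is an integer
    multiple of the target (120 -> 60, 120 -> 24, 48 -> 24)."""
    if source_fps % target_fps != 0:
        raise ValueError("source fps must be an integer multiple of target fps")
    step = source_fps // target_fps
    return frames[::step]

# One second of a 120fps master, as frame indices:
frames_120 = list(range(120))
print(len(decimate_frames(frames_120, 120, 60)))  # 60: every 2nd frame
print(len(decimate_frames(frames_120, 120, 24)))  # 24: every 5th frame
```

By the same arithmetic, 120 divides cleanly by 24, 30, and 60, which is part of what makes it an attractive capture rate: one HFR master can serve every standard deliverable.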

I’m very excited to see what he has in store for Avatar 3!

Here’s a round-up of cool and interesting links about Hollywood and technology:

Oscars consider requiring disclosure of AI usage. (link)

Rethinking our expectations of tech. (link)

TikTok is back on US app stores. (link)