Seth MacFarlane Uses Augmented Reality to Create "Ted"

Plus: Day for Night Done Right

Happy New Year, Hollywood tech nerds!

In this week’s post: Seth MacFarlane uses augmented reality to create Ted, plus day for night done right.

Subscribe to get Hollywood Tech Nerds magically delivered to your inbox every Tuesday!

Seth MacFarlane Uses Augmented Reality to Create Ted

My first experience with augmented reality was all the way back in 2016 when Pokémon Go came on the scene; I have many happy memories of riding the bus around LA, catching Pokémon and taunting my Aunt Joan with my significantly higher score. Who will “never amount to anything” now, Aunt Joan???

Anyway, I assumed AR’s use cases would never extend far beyond kitschy video gaming, but my prognosticatory powers were once again proved poor by this great piece in Fast Company on how Seth MacFarlane’s new show Ted uses AR to shoot scenes with its eponymous teddy bear character.

The series, a prequel to the Ted movies debuting Thursday on Peacock, harnesses an augmented reality tool called ViewScreen Studio, developed by Fuzzy Door Tech, the production company’s technology unit. It allows the film crew to see digital assets, like the title animated teddy bear from Ted, in real time as human actors interact with them, making it easier to accurately frame shots around the digital elements and speeding up production.

“We not only make them visible, we allow the camera operators to see the scene through their viewfinders,” Faith Sedlin, president of Fuzzy Door Tech, tells Fast Company.

The article continues, “The technology even allowed MacFarlane, who directs the series and plays Ted, to act in the series without needing to leave his workstation or don special motion capture equipment. ViewScreen was able to capture MacFarlane’s facial expressions, animating them onto the bear, without him needing the cumbersome suit, while crew members separately controlled Ted’s virtual body with a video game controller.”
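ViewScreen Studio is proprietary, so the details aren’t public, but the split the article describes — facial-expression capture from one performer, body input from a crew member’s game controller, merged every frame — can be sketched in a few lines. This is a toy Python illustration of that workflow; all the names, fields, and values here are hypothetical, not anything from Fuzzy Door’s actual tool:

```python
from dataclasses import dataclass

@dataclass
class PuppetPose:
    """One frame of a digital character's pose, assembled from two
    independent inputs, as the article describes for Ted."""
    blendshapes: dict  # facial-expression weights, clamped to 0.0-1.0
    body: dict         # joint values driven by the controller operator

def merge_frame(face_capture: dict, controller_input: dict) -> PuppetPose:
    # Face and body come from different operators, so they arrive
    # separately and get merged per frame before rendering.
    face = {k: max(0.0, min(1.0, w)) for k, w in face_capture.items()}
    return PuppetPose(blendshapes=face, body=dict(controller_input))

frame = merge_frame(
    {"smile": 0.8, "brow_raise": 1.3},  # raw capture can overshoot
    {"left_arm_pitch": 35.0, "head_yaw": -10.0},
)
print(frame.blendshapes["brow_raise"])  # → 1.0 (clamped)
```

The point of the split is that neither operator blocks the other: MacFarlane can stay at his workstation performing the face while the crew drives the body live on set.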

Side note: does Seth MacFarlane sleep? This man is involved with like ten different TV shows; I can barely watch ten different TV shows. I guess this is why MacFarlane is a huge film/TV director/producer and I’m feuding with a one-legged woman in Minnesota!

Aunt Joan aside, I love this type of tech story, taking an innovation with seemingly limited use cases and finding an entirely new one for it. Now if you’ll excuse me, there’s a Mimikyu down the street and Aunt Joan is catching up!

Day for Night Done Right

I just wanted to take a second to share a fun post I found on r/Filmmakers from filmmaker Borja Moya, who shares some details about how he used day for night to shoot scenes on his short horror film “Trapped.”

From his Reddit post:

I used a Sony A7IV, so… I couldn't change the color temperature in post. I had to commit to a color temp and prep even more….

I [shot] at 2800K and then I played with the overall levels, saturation and contrast. I did this in prep, then loaded a LUT onto the monitor. This was crucial because I could see how much I could push the image on set.

In these shots you can see I added light to the neighbor’s window, and paid attention to little details, like the light coming from the screen of the phone. If it’s dark, it should be brighter.

In order to get clean shadows I had to shoot below native ISO, which gives you great shadows by sacrificing dynamic range stops in the highlights.
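The grade Moya describes — warm-balanced capture pushed dark, desaturated, with a contrast boost — is essentially a per-pixel transform. Here’s a minimal Python sketch of that kind of day-for-night look; the multipliers are illustrative guesses, not Moya’s actual LUT values:

```python
def day_for_night(pixel):
    """Rough day-for-night grade on one (r, g, b) pixel: cool the
    balance, pull exposure down, desaturate, add contrast."""
    r, g, b = pixel
    # Cool white balance: pull red down, nudge blue up, mimicking
    # a 2800K camera setting left out in daylight.
    r, b = r * 0.6, min(255.0, b * 1.1)
    # Pull overall exposure down roughly a stop and a half.
    r, g, b = (v * 0.35 for v in (r, g, b))
    # Desaturate toward luma (Rec. 601 weights) so the "night"
    # doesn't read as candy-colored.
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    r, g, b = (luma + (v - luma) * 0.6 for v in (r, g, b))
    # Mild contrast boost around a dark pivot so shadows read
    # as night, not flat gray.
    pivot, contrast = 40.0, 1.15
    return tuple(
        max(0, min(255, round(pivot + (v - pivot) * contrast)))
        for v in (r, g, b)
    )

# A daylight mid-gray comes out dark and slightly blue.
print(day_for_night((128, 128, 128)))
```

Moya’s trick of baking this look into a LUT on the on-set monitor means everyone sees roughly this transform live, instead of guessing how far the flat daylight footage can be pushed later.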

Click through to the link above to see the original footage and the final product. Very impressive, somebody get a time machine and call the Game of Thrones production team!

Here’s a round-up of cool links about Hollywood and technology:

The Hollywood Reporter’s animation roundtable with Seth Rogen, Kemp Powers, and more! (link)

Get ready for the great AI disappointment. (link)

Variety’s 2024 content-spending projections. (link)