The Tribeca Film Festival will debut a bunch of short films made by AI
The Tribeca Film Festival will debut five short films made by AI, as detailed by The Hollywood Reporter. The shorts will use OpenAI’s Sora model, which transforms text prompts into video clips. This is the first time this type of technology will take center stage at the long-running film festival.
“Tribeca is rooted in the foundational belief that storytelling inspires change. Humans need stories to thrive and make sense of our wonderful and broken world,” said Jane Rosenthal, co-founder and CEO of Tribeca Enterprises. Who better to chronicle our wonderful and broken world than some lines of code owned by a company that just dissolved its dedicated safety team to let CEO Sam Altman and other board members self-police everything?
The unnamed filmmakers were all given access to the Sora model, which isn’t yet available to the public, though they have to follow the terms of the agreements negotiated during the recent strikes as they pertain to AI. OpenAI’s COO, Brad Lightcap, says the feedback provided by these filmmakers will be used to “make Sora a better tool for all creatives.”
“Introducing Sora, our text-to-video model. Sora can create videos of up to 60 seconds featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions. Prompt: ‘Beautiful, snowy…’” — OpenAI (@OpenAI), February 15, 2024
When we last covered Sora, it could only handle 60 seconds of video from a single prompt. If that’s still the case, these short films will make Quibi shows look like a Ken Burns documentary. The software also struggles with cause and effect and, well, that’s basically what a story is. However, all of these limitations come from the ancient days of February, and this tech tends to move quickly. Also, I assume there’s no rule against using prompts to create single scenes, which the filmmaker can string together to make a story.
We don’t have long to find out if cold technology can accurately peer into our warm human hearts. The shorts will screen on June 15, with a conversation with the filmmakers immediately following the debut.
This follows a spate of agreements between OpenAI and various media companies. Vox Media, The Atlantic, News Corp, Dotdash Meredith and even Reddit have all struck deals with OpenAI to let the company train its models on their content. Meanwhile, Meta and Google are looking for similar partnerships with Hollywood film studios to train their models. It looks like we are going to get this “AI creates everything” future, whether we want it or not.

This article originally appeared on Engadget at https://www.engadget.com/the-tribeca-film-festival-will-debut-a-bunch-of-short-films-made-by-ai-181534064.html?src=rss