Is Hollywood on board or not? Early signs suggest that major studios and talent agencies are beginning to push back against OpenAI’s latest product, Sora 2, an invite-only, TikTok-style video app that debuted September 30 and lets users scan their faces and place themselves in hyper-realistic clips.
Creative Artists Agency, the major talent firm run by Bryan Lourd and representing stars like Brad Pitt and Scarlett Johansson, is the latest to publicly draw a line in the sand on Sora 2, which can generate clips of characters from major studios featuring the likenesses of star talent. The CAA statement, which is not attributed to any executive, takes a broader approach than simply declaring that the agency is opting its clients out of OpenAI’s latest tool. In fact, it doesn’t explicitly use the words “opt out” at all, instead presenting Sora 2 as a “misuse” of an emerging technology that “exposes our clients and their intellectual property to significant risks.”
The missive, which emphasizes “control, authorization of use and compensation” for the agency’s clients, leaves open the possibility that OpenAI could develop a “solution” to its own platform’s copyright problem.
CAA’s statement takes a slightly different approach than that of longtime rival agency WME. WME’s memo, issued by Head of Digital Strategy Chris Jacquemin to guide agents, states that “we have informed OpenAI that all WME clients will be opted out of the latest Sora AI update, regardless of whether intellectual property rights holders have deactivated the intellectual property our clients are associated with.”
Other major agencies, including United Talent Agency and Gersh, have yet to take a public position.
Days earlier, even the normally reserved Motion Picture Association, the main lobbying group representing Disney, Netflix, Paramount, Amazon MGM Studios, Sony, Universal and Warner Bros. Discovery, spoke out against OpenAI’s current plan for Sora 2. MPA chief Charles Rivkin said in an October 6 statement that OpenAI “must recognize that it is its responsibility – not that of the rights holders – to prevent any infringement on the Sora 2 service” and that Altman’s team “must take immediate and decisive action to resolve this issue.”
The next question is whether Altman will backtrack or compromise further. On October 3, the OpenAI chief at least nodded to the concerns of rights holders in a post tellingly titled “Sora Update #1,” acknowledging that the company wants to “let rights holders decide how to proceed (our goal is of course to make it so compelling that many people want it). There may be edge cases of generations that shouldn’t make it.”
Altman added: “Please expect a very high rate of change from us.”
The full unsigned CAA memo from October 8 is below:
“CAA is unwavering in its commitment to protecting its clients and the integrity of their creations. The misuse of new technologies has consequences that extend far beyond entertainment and media, posing serious and harmful risks to individuals, businesses and societies around the world. It is clear that OpenAI/Sora exposes our clients and their intellectual property to significant risks. The question is whether OpenAI and its partner companies believe that humans – writers, artists, actors, directors, producers, musicians and athletes – deserve to be paid and credited for the work they create.
Or does OpenAI believe it can simply steal it, disregarding global copyright principles and openly rejecting the rights of creators, as well as the many people and companies who fund the production, creation and publication of these humans’ work? In our opinion, the answer to this question is obvious. Control, authorization of use and compensation are fundamental rights of these workers. Anything less than protecting creators and their rights is unacceptable.
We are open to OpenAI’s solutions to these critical issues and remain steadfast in our work with IP companies and leaders, creative guilds and unions, as well as state and federal legislators and global policymakers, to address these challenges and chart an aligned path forward.”