Creative Artists Agency (CAA), one of the top entertainment and sports talent agencies, is hoping to be at the forefront of AI protection services for celebrities in Hollywood.
With many stars having their digital likeness used without permission, CAA has built a virtual media storage system for A-list talent — actors, athletes, comedians, directors, musicians, and more — to store their digital assets, such as their names, images, digital scans, voice recordings, and so on. The new development is a part of “theCAAvault,” the company’s studio where actors record their bodies, faces, movements, and voices using scanning technology to create AI clones.
CAA teamed up with AI tech company Veritone to provide its digital asset management solution, the agency announced earlier this week.
The announcement arrives amid a wave of AI deepfakes of celebrities, often created without their consent. Tom Hanks, an actor on CAA’s roster, fell victim to an AI scam seven months ago, saying a company had used an AI-generated video of him to promote a dental plan without his permission.
“Over the last couple of years or so, there has been a vast misuse of our clients’ names, images, likenesses, and voices without consent, without credit, without proper compensation. It’s very clear that the law is not currently set up to be able to protect them, and so we see many open lawsuits out there right now,” said Alexandra Shannon, CAA’s head of strategic development.
A significant amount of personal data is necessary to create digital clones, which raises numerous privacy concerns due to the risk of compromising or misusing sensitive information. CAA clients can now store their AI digital doubles and other assets within a secure personal hub in theCAAvault, which can only be accessed by authorized users, allowing them to share and monetize their content as they see fit.
“This is giving the ability to start setting precedents for what consent-based use of AI looks like,” Shannon told TechCrunch. “Frankly, our view has been that the law is going to take time to catch up, and so by the talent creating and owning their digital likeness with [theCAAvault]… there is now a legitimate way for companies to work with one of our clients. If a third party chooses not to work with them in the right way, it’s much easier for legal cases to show there was an infringement of their rights and help protect clients over time.”
Notably, the vault also ensures actors and other talent are rightfully compensated when companies use their digital likenesses.
“All these assets are owned by the individual client, so it is largely up to them if they want to grant access to anybody else… It is also completely up to the talents to decide the right business model for opportunities. This is a new space, and it is very much forming. We believe these assets will increase in value and opportunity over time. This shouldn’t be a cheaper way to work with somebody… We view [AI clones] as an enhancement rather than being for cost savings,” Shannon added.
CAA also represents Ariana Grande, Beyoncé, Reese Witherspoon, Steven Spielberg, and Zendaya, among others.
The use of AI cloning has sparked many debates in Hollywood, with some believing it could lead to fewer job opportunities, as studios might choose digital clones over real actors. This was a major point of contention during the 2023 SAG-AFTRA strikes, which ended in November after members approved a new agreement with AMPTP (Alliance of Motion Picture and Television Producers) that recognized the importance of human performers and included guidelines on how “digital replicas” should be used.
There are also concerns surrounding the unauthorized use of AI clones of deceased celebrities, which can be disturbing to family members. For instance, Robin Williams’ daughter expressed her disdain for an AI-generated voice recording of the star. However, some argue that, when done ethically, it can be a sentimental way to preserve an iconic actor and recreate their performances in future projects for all generations to enjoy.
“AI clones are an effective tool that enables legacies to live on into future generations. CAA takes a consent and permission-based approach to all AI applications and would only work with estates that own and have permissions for the use of these likeness assets. It is up to the artists as to whom they wish to grant ownership of and permission for use after their passing,” Shannon noted.
Shannon declined to share which of CAA’s clients are currently storing their AI clones in the vault, saying only that a select few are participating at the moment. CAA also charges a fee for clients to participate in the vault, but didn’t say exactly how much it costs.
“The ultimate goal will be to make this available to all our clients and anyone in the industry. It is not inexpensive, but over time, the costs will continue to come down,” she added.