From risk to revenue: talent & creatives in the age of AI

August 6, 2025
Hester Bates

In the age of AI, your face, your voice, and your personality are more valuable and more vulnerable than ever. As a creator, influencer, actor, athlete, or public figure, you’ve spent years building an identity, and that identity is now prime material for AI. Whether it’s voice cloning, deepfakes, AI-generated avatars, or digital twins, technology can now replicate you in ways that were unthinkable to most of us five years ago.

Why this matters

Until recently, a talent’s IP was fairly straightforward: your content, your likeness, your name. But AI has blurred these lines. Today, someone could:

  • Train a model on your voice and use it in an ad you didn’t agree to.
  • Create a synthetic version of your face and license it without your consent.
  • Repurpose content you posted years ago for paid campaigns you were never paid for.
  • Use AI to create versions of you in languages you don’t even speak.

It’s scary, but unless you’ve clearly defined your rights and have systems in place to monitor them, there may be little you can do to stop it.

What rights do you have?

As talent, you have more rights than you might think, but still far fewer than you should.

In the UK and many other markets, there’s no specific “personality rights” law yet. That means your protection depends on a patchwork of IP law, contract law, and data rights. You might also have signed away more than you realise in old contracts. You might not even know if your image is being used without your permission.

That’s why at TrueRights, we’re pushing for statutory personality rights: a legal framework that protects your identity and ensures you have the power to decide how it’s used.

What could go wrong?

Unfortunately, this isn’t hypothetical, nor a risk that will only emerge further down the road as AI develops. Steven Bartlett, entrepreneur and host of The Diary of a CEO, has spoken publicly about crypto scams that use AI-generated videos of his face and voice to promote products and investment schemes he has nothing to do with. These videos are incredibly convincing, and they’ve misled fans and consumers while exploiting his likeness without consent.

This is just one example. Over the past few months, we’ve seen creators’ voices used in scam ads, AI-generated versions of creators modelling clothing they never wore, and brands running paid campaigns on creators’ old content without renewing usage rights or even telling them. Without the right safeguards, AI will be used to scale this kind of exploitation, and talent stands to lose everything from control to revenue to trust.

It’s not all bad 

AI doesn’t have to be a threat. In fact, at TrueRights we believe it will be a new frontier for talent.

With the right tools and agreements in place, talent can license a digital version of themselves and earn revenue while they sleep. Imagine creating multilingual content without ever stepping into a studio, or collaborating with brands faster, more creatively, and more globally, because you’ve built the legal and commercial structure to do it safely. The key is consent, control, and compensation.

What talent can do now
  • Review your contracts. Make sure they’re clear on AI usage and digital rights.
  • Track your usage. Keep tabs on where and how your content is being used, especially in paid campaigns.
  • Set boundaries. Define what you are and aren’t comfortable licensing.
  • Ask for support. Talent agencies, platforms, and partners should be helping you navigate this.
  • Join the conversation. Push for stronger protections. Help shape the future of talent rights.

At TrueRights, we’re building tools and frameworks to help you do all of the above, from usage pricing and tracking to licensing templates, and from legal education to policy change. If you want to understand your rights or your risks in the age of AI, get in touch. We’d love to help.