Voice artists Paul and Linnea are suing AI company Lovo for cloning their voices without consent, a case that raises legal questions about AI and personality rights.
Key Takeaways
- AI companies have cloned voices without explicit consent, fueling legal disputes over personality rights.
- Licenses for voice recordings must be specific and respected to prevent misuse in AI applications.
- Artists and voice actors are increasingly fighting back against unauthorized use of their work by AI firms.
- The case underscores the evolving legal landscape around AI, intellectual property, and personal identity.
- Transparency and consent are crucial when using human voices for AI-generated content.
Summary
- Voiceover actors Paul and Linnea discovered their voices were cloned and sold by AI company Lovo without their permission.
- They recorded generic scripts for a freelancing site, unaware their voices would be used for AI voice cloning.
- Paul and Linnea filed a class action lawsuit against Lovo for stealing their voices and identities without proper consent or compensation.
- The legal case centers on the right of publicity, which treats a person's voice as part of their identity, rather than on copyright.
- The couple granted only limited licenses for their recordings, which Lovo allegedly exceeded through broader commercial exploitation.
- Lovo co-founder Tom Lee has publicly discussed the company's voice-cloning technology, but the company did not respond to the BBC's interview requests.
- The voices have been removed from Lovo's website, but ads featuring Paul's cloned voice remain online.
- This lawsuit is part of a growing trend of artists suing AI companies to protect their creative work and livelihoods.
- The case highlights ethical and legal challenges posed by AI technologies in creative industries.
- Paul and Linnea emphasize the need to stand up against exploitation disguised as innovation.