WHY WE NEED AI REGULATION NOW


 

AI IS HURTING VOICE ACTORS NOW

In the Department of Industry, Science and Resources' "Safe and Responsible AI in Australia" discussion paper, much is said of "high risk" applications of AI. "High risk" mainly refers to medical applications and future concerns such as self-driving vehicles. Of course, advancements in AI could bring great benefits in medicine and productivity. However, the positive potential of AI technology comes with a downside that must be considered.

When it comes to applications of AI technology in the communications, media, and creative sectors, the damage from AI is already being felt, placing these sectors at imminent risk: real-world harm to industry, labour displacement, erosion of media trust, and intellectual property theft.

With an epidemic of violence towards women, AI voice clones can become one more tool for those who impose coercive control over others. The damage AI voice clones can do to a person's reputation and identity can be irreversible, which is why this technology needs appropriate regulation.


WHAT IS AN AI VOICE?

According to ElevenLabs, an AI developer and a leader in the creation of AI voices, "AI voice cloning is a technology that creates synthetic copies of human voices. It analyses audio recordings to mimic the tone, pitch, and characteristics of a person's voice."

The technology is getting better every day and is, in some instances, indistinguishable from the real thing, but AI needs your voice to do that. Remember: it's YOUR VOICE, YOUR CHOICE.

AAVA acknowledges that this technology is here and some people and organisations will choose to work with it on occasion. Our aim is that this will be done ethically with the rights and considerations of humans put first. AAVA urges AI developers to adhere to practices that respect the artist at all times.


CONTRACTS - WHAT TO LOOK FOR

In the 2024 US "State of Voice Over Survey", 6% of Voice Actors reported that a synthetic version of their voice, or of a character they voiced, had been created without their permission, and those are only the performers who have become aware of it. This statistic is only set to rise as take-up of this technology increases. If you see any of the following language in a contract, chances are the client is looking to use your voice to create an AI voice:
  • “Simulation”
  • “Synthesisation”
  • “Digital double”
  • “Machine Learning”
  • "Clone"

You should also be mindful of any contract that uses phrases such as "in perpetuity". This means indefinitely, or put simply, forever.


IS AAVA ANTI-AI?

AAVA is not anti-AI. But we are pro-human.
AI technology has the potential to make our lives better, but there are risks in using this technology without guardrails and regulation. At AAVA we understand that there are some jobs a person simply cannot do. In those circumstances, AI technology should always be employed ethically.


The legislative landscape in the U.S. is very different from Australia's, and states have more power to legislate on various matters. Two state laws have recently passed in the U.S.: AB2602 in California and S7676B in New York, which afford Voice Actors (and performers in general) stronger protections when it comes to generative AI.

Under these state laws in California and New York, any contract for an AI digital replica must include a "Reasonably Specific Description of Intended Uses" if the talent is (1) not working under a union contract or (2) not represented by a lawyer.

These laws don't only protect those in California and New York: if you're a performer engaging with a production based in those states, you will be protected by them too!

A comprehensive explanation of AB2602 and S7676B can be found on the website of NAVA (the National Association of Voice Actors, USA).

Find out more about Ethical AI

