WHY WE NEED A.I. REGULATION NOW
A.I. IS HURTING VOICE ACTORS NOW
In the discussion paper, much is said of “high risk” applications of AI. “High risk” mainly refers to medical applications and future concerns such as self-driving vehicles. Of course, advancements in AI could bring great benefits in medicine and productivity. However, the positive potential of AI technology also has a downside that must be considered.
When it comes to applications of AI technology in the communications, media, and creative sectors, the damage is already being felt, putting these categories at a level of imminent risk: actual real-world damage to industry, labour displacement, erosion of media trust, and intellectual property theft.
Amid an epidemic of violence against women, AI voice clones can become one more tool for those who impose coercive control over others. The damage an AI voice clone can do to a person's reputation and identity can be irreversible, which is why this technology needs appropriate regulation.
WHAT IS AN AI VOICE?
An AI voice is a synthetic recreation of a person's voice, generated by training AI models on recordings of their speech. The technology is getting better every day and is, in some instances, indistinguishable from the real thing, but AI needs your voice to do that. Remember: it's YOUR VOICE, YOUR CHOICE.
AAVA acknowledges that this technology is here and some people and organisations will choose to work with it on occasion. Our aim is that this will be done ethically with the rights and considerations of humans put first. AAVA urges AI developers to adhere to practices that respect the artist at all times.
CONTRACTS - WHAT TO LOOK FOR
- “Simulation”
- “Synthesisation”
- “Digital double”
- “Machine Learning”
- “Clone”
If a contract mentions any of these things, there is a good chance that cloning of your voice is being considered. You should also be mindful of any contract that uses phrases such as "in perpetuity". This means indefinitely, or put simply, forever.
IS AAVA ANTI-AI?
The legislative landscape in the U.S. is very different from Australia's, and states have more power to legislate on various matters. Two state laws have recently passed in the U.S.: AB 2602 in California and S7676B in New York, which afford Voice Actors (and performers in general) stronger protections when it comes to generative AI.
Per state laws in CA and NY, any contract for an AI digital replica must include a “Reasonably Specific Description of Intended Uses” if the talent is (1) Not working under a union contract or (2) Not represented by a lawyer.
These laws don't only protect those in California and New York: if you're a performer engaging with a production based in those states, you will be protected by them too!
A comprehensive explanation of AB 2602 and S7676B can be found on the website of NAVA (the National Association of Voice Actors, USA).