Press "Enter" to skip to content

“I Died That Day”: Lawmakers in the US are getting calls from dead kids

It has been called “very strange indeed”: lawmakers in the US are getting calls from dead kids, adding a new wrinkle to the debate over the legality of deepfakes. Beginning on the sixth anniversary of the Parkland school shooting, which claimed 17 lives, a series of automated deepfake calls in the voices of the children who were killed is part of an effort to raise public awareness of gun safety and advocate for stricter gun laws. The topic of this piece isn’t gun control, though; it’s a pivotal moment for artificial intelligence and the use of deepfakes to deliver messages in the voices of the dead. Will this use turn out to be illegal?

Joaquin Oliver lost his life in the Parkland school massacre. Thanks to AI, the 17-year-old’s voice can now be heard once more. “This is a United States problem, and we have not been able to fix it,” Joaquin’s father, Manny Oliver, said in an interview filmed at the family’s house. “If we need to use creepy stuff to fix it, welcome to the creepy.”

The Olivers run a nonprofit website where you can choose from six AI-generated calls, enter your ZIP code, and place a call to one of your elected representatives, delivering a voice from beyond the grave, so to speak. Wall Street Journal journalist Joanna Stern sat down with Joaquin’s parents to ask some poignant, searching questions about why they turned to deepfakes and why they are using their son’s legacy this way. Stern says that, having spoken with the parents, she now understands them and their motivation.

WSJ’s Stern experimented with an AI voice generator from ElevenLabs last year, typing what she wanted her audio deepfake to say, and the results were convincing enough to fool her bank and her family. The parents of the deceased children used the same service, with similarly convincing results.

These parents couldn’t get clean audio recordings of their children; most of the samples pulled from home videos had music or background noise. The technology has advanced to the point, however, that only a few seconds of found audio are needed to create a nearly flawless facsimile of a voice.

What should a voice from the dead say?

Some see an ethical dilemma here: should anyone put words in someone else’s mouth when that person is no longer there to defend the position? Joaquin’s parents took their cue from what he had posted on social media. Before he was killed, he had spoken about shielding kids from guns and school shootings. His father, Manny, said his son wished to be known as someone “big—like Ali, like Mandela, like Lennon.” He sees the deepfake of his son as a means of fulfilling that wish.

All of the parents in the project believe their children would have wanted these deepfakes, and that the calls help cement their children’s legacies and may bring new protections for kids at risk of being killed by guns. Still, many parents whose children were killed in shootings decided not to participate; these parents don’t want a “bot of their child,” as one person put it.

As with other applications of AI, one concern is misuse. Imagine these same kids’ voices being exploited to spread hate speech or false information. The Federal Communications Commission recently ruled that using AI-generated audio deepfakes in scam robocalls is illegal, citing instances in which people were duped by voices that sounded like familiar relatives or coworkers.

To the disapproval of their families, audio deepfakes of deceased celebrities, including Robin Williams and George Carlin, have been making the rounds on the internet lately.

The debate over deepfakes, for good or ill, is likely to continue, with passionate arguments on both sides of the issue.


Source: ReadWriteWeb