“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.

Manipulating recorded sounds and images isn't new. But the ease with which someone can alter information is a recent phenomenon. So is the speed with which it can spread on social media.
The bogus audio forced Eiswert to go on leave while police guarded his house, authorities said. Angry phone calls inundated the school, and hate-filled messages accumulated on social media.

Given AI's growing capabilities, Farid said the Maryland case serves as a “canary in the coal mine” about the need to better regulate this technology. That's partly because the technology has improved so quickly. Human ears also can't always identify telltale signs of manipulation, while discrepancies in videos and images are easier to spot. Scammers have used cloned voices over the phone to get ransom money from parents, experts say.
Farid said more needs to be done. For instance, all companies should require users to submit phone numbers and credit cards so that files can be traced back to those who misuse the technology. Another safeguard is digital watermarking. “You modify the audio in ways that are imperceptible to the human auditory system, but in a way that can be identified by a piece of software downstream,” Farid said.
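The idea Farid describes is to embed a signal too faint for listeners to hear but easy for software to find. As an illustration only, and not any company's actual method, a toy spread-spectrum watermark might look like this sketch in Python: a keyed pseudorandom sequence is added to the audio at very low amplitude, and a detector that knows the key recovers it by correlation.

```python
import numpy as np

def embed_watermark(audio, key=42, strength=1e-3):
    """Add a keyed, low-amplitude pseudorandom sequence to the audio.

    At strength 1e-3 the added noise sits far below typical signal
    levels, so it is effectively inaudible.
    """
    rng = np.random.default_rng(key)
    mark = rng.standard_normal(audio.shape[0])
    return audio + strength * mark

def detect_watermark(audio, key=42):
    """Correlate the audio with the keyed sequence.

    A watermarked file scores noticeably higher than an unmarked one,
    because the embedded sequence lines up with itself.
    """
    rng = np.random.default_rng(key)
    mark = rng.standard_normal(audio.shape[0])
    return float(np.dot(audio, mark) / audio.shape[0])

# Demo: one second of a 440 Hz tone sampled at 16 kHz.
sr = 16_000
t = np.arange(sr) / sr
signal = 0.5 * np.sin(2 * np.pi * 440 * t)
marked = embed_watermark(signal)

score_marked = detect_watermark(marked)   # elevated by the watermark
score_clean = detect_watermark(signal)    # near zero
```

Real systems are far more robust than this sketch (they must survive compression, resampling, and deliberate removal attempts), but the principle is the same: the change is imperceptible to the ear yet detectable by software that holds the key.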