New research suggests that, thanks to the anatomy of the human vocal tract, voice deepfakes may be easier to detect than previously thought: synthetic voices fail to match the physical constraints of real human speech.
University of Florida researchers have developed a method that estimates the apparent shape and movement of a human vocal tract while a voice clip, real or fake, is being played.
Patrick Traynor, a professor of computer and information science and engineering, and graduate student Logan Blue wrote that they and their colleagues found that the vocal tracts reconstructed from deepfaked audio were not constrained by “the same anatomical limitations that humans have”, with some estimated vocal tracts having “the same relative diameter and consistency as a drinking straw”.
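The idea behind the check can be sketched in a few lines. This is a minimal, hypothetical illustration, not the University of Florida team's actual code: the bounds, input format, and function names below are all assumptions made for the example.

```python
# Hypothetical sketch of an anatomical-plausibility check on a simulated
# vocal tract. The thresholds and data are illustrative assumptions only.

PLAUSIBLE_MIN_CM = 0.3  # assumed lower bound for a human tract segment diameter
PLAUSIBLE_MAX_CM = 5.0  # assumed upper bound

def looks_anatomically_plausible(diameters_cm, max_violations=0):
    """Return True if every estimated segment diameter falls within an
    assumed human range. Deepfakes in the study reportedly produced tracts
    with 'the same relative diameter and consistency as a drinking straw',
    which a check like this would flag."""
    violations = [d for d in diameters_cm
                  if not PLAUSIBLE_MIN_CM <= d <= PLAUSIBLE_MAX_CM]
    return len(violations) <= max_violations

# A straw-like, nearly constant and implausibly narrow tract (deepfake-like):
fake_tract = [0.25] * 10
# A varying tract whose diameters stay within the assumed human bounds:
real_tract = [1.2, 1.8, 2.5, 3.0, 2.2, 1.5, 1.0, 0.8]

print(looks_anatomically_plausible(fake_tract))  # False
print(looks_anatomically_plausible(real_tract))  # True
```

The real system estimates the tract shape from the audio signal itself; this sketch only shows the final plausibility test once such estimates exist.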
Although scientists are beginning to detect voice deepfakes using these simulations and anatomical comparisons, the risk that a deepfake fools an ordinary listener, potentially leading to identity theft, remains a problem.
Ordinary people don’t have access to these tools yet, and even if they did, they would struggle to interpret the results until intuitive, widely available audio-based detection tools come into use.
Because deepfakes are so difficult for ordinary eyes and ears to detect, expert advice on spotting them is not yet widely known or available. People are also poorly prepared to stay critically alert to what they see and hear over the internet, the phone, or any other medium that puts distance between them and what is really happening.
“Distrust by default”, in which people treat anything they see or hear with skepticism unless it comes from a trusted source, is a useful tactic here. The problem within the problem is that not everyone will adopt this strategy, either because they don’t understand the threat or because they refuse to engage with it.
Media literacy has been an important skill for years, as anyone can come across election disinformation or baseless conspiracy theories, but schools have been slow to teach it, and the problem of closing this skills gap among adults remains.
Through this skills gap, fake news has proliferated and become embedded in our societies and in our relationships with loved ones. Anyone concerned about a loved one’s media literacy may therefore want to consider investing in identity theft protection for families.
The rise of persuasive audiovisual deepfakes has once again underlined the need for a structured, widespread program to educate users in media literacy and in thinking critically about anything that carries even a thin veneer of authenticity.
This article is republished from The Conversation.
Source: Voice deepfakes are getting easier to spot, TechRadar: https://www.techradar.com/news/voice-deepfakes-are-getting-easier-to-spot