VOICE DEEPFAKES

WHY IN NEWS?

  • Recently, the Indian government instructed “social media intermediaries” to remove morphed videos or deepfakes from their platforms within 24 hours of a complaint being filed, in accordance with a requirement outlined in the IT Rules 2021.

WHAT ARE VOICE DEEPFAKES?

  • A voice deepfake is a synthetic audio clip that closely mimics a real person’s voice.
  • The clone can accurately replicate the tonality, accent, cadence, and other unique characteristics of the target person.
  • People use AI and robust computing power to generate such voice clones or synthetic voices.
  • It can sometimes take weeks to produce such voices, according to Speechify, a text-to-speech conversion app.

HOW ARE THEY CREATED?

  • Creating deepfakes requires high-end computers with powerful graphics cards, often leveraging cloud computing power.
  • Powerful computing hardware accelerates rendering, which can take hours, days, or even weeks, depending on the process.
  • Besides specialised tools and software, generating deepfakes requires training data to be fed to AI models.
  • This data typically consists of original recordings of the target person’s voice.
  • AI uses this data to render an authentic-sounding voice, which can then be made to say anything.
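The pipeline described above — feed original recordings of the target to a model, then render new speech — can be caricatured in a few lines of Python. This is a deliberately simplified toy (spectral averaging plus noise shaping with NumPy), not a real voice-cloning model; all function names and parameters here are illustrative assumptions, not any actual tool’s API.

```python
import numpy as np

def extract_voice_profile(recordings, n_bins=64):
    """'Training' stage (toy): average the magnitude spectrum of the
    target's sample recordings into a single spectral profile."""
    spectra = [np.abs(np.fft.rfft(r, n=2 * n_bins - 2)) for r in recordings]
    return np.mean(spectra, axis=0)

def synthesize(profile, length=1024, seed=0):
    """'Rendering' stage (toy): shape white noise with the learned
    spectral profile to produce a new signal of the requested length."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(length)
    spectrum = np.fft.rfft(noise)
    # Resample the 64-bin profile to match the synthesis spectrum length.
    shaped = spectrum * np.interp(
        np.linspace(0, 1, spectrum.size), np.linspace(0, 1, profile.size), profile
    )
    return np.fft.irfft(shaped, n=length)
```

A real voice-cloning system replaces both stages with a trained neural model, but the data flow — recordings in, reusable voice profile out, then unlimited synthetic speech — is the same.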

POSITIVE USES OF DEEPFAKES

  • The technology is powerful and, no doubt, needs guardrails to defend against abuse.
  • It has been used to help people who have lost their voices to throat disease or other medical conditions speak again.
  • From a business perspective, it can be used to create a brand mascot or add variety to content such as weather and sports reports in the broadcast world.
  • Entertainment companies can also bring back past talent or incorporate a historical figure’s voice into their programming.

POSSIBLE THREATS FROM VOICE DEEPFAKES

  • With no barriers to creating AI-synthesized text, audio and video, the potential for misuse in identity theft, financial fraud and tarnishing reputations has sparked global alarm.
  • Voice deepfakes can be used to defraud users, steal their identities, and carry out other illegal activities such as phone scams and posting fake videos on social media platforms.
  • Making deepfake voices to impersonate others without their consent is a serious concern that could have devastating consequences.
  • Voice deepfakes used in filmmaking have also raised ethical concerns about the technology. For example, Morgan Neville’s documentary film on the late legendary chef Anthony Bourdain used voice-cloning software to make Bourdain say words he never spoke, which sparked criticism.
  • As voice-capture technology improves, the data fed to AI models becomes more accurate, making deepfake voices even more believable and more dangerous in the hands of bad actors.

IMPACT OF VOICE DEEPFAKES ON ELECTIONS:

  • There are several audio clips of political leaders available online, and a clip of barely one or two minutes is all that is required to create a convincing audio clone.
  • During elections, this misinformation can influence public opinion, damage the reputation of candidates, and manipulate the democratic process.
  • Sections of the Indian Penal Code and the IT Act can also be invoked by police against those spreading misinformation and fake news.

HOW CAN WE DETECT VOICE DEEPFAKES?

  • Detecting voice deepfakes requires highly advanced technologies, software, and hardware to analyse speech patterns, background noise, and other acoustic elements.
  • USE OF BLOCKCHAIN TECHNOLOGY: Research labs use watermarks and blockchain technologies to verify the provenance of media and flag deepfakes.
  • USE OF SOPHISTICATED SOFTWARE: “Deeptrace” uses a combination of antivirus- and spam-filter-style monitoring that scans incoming media and quarantines suspicious content.
  • Researchers at the University of Florida developed a technique to measure acoustic and fluid dynamic differences between original voice samples of humans and those generated synthetically by computers. They estimated the arrangement of the human vocal tract during speech generation and showed that deepfakes often model impossible or highly unlikely anatomical arrangements.
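As a toy illustration of the feature-based detection idea above (not the University of Florida technique itself), the sketch below computes spectral flatness — a standard audio descriptor — and flags clips whose value falls outside a plausible band for natural speech. The thresholds are uncalibrated placeholders chosen for illustration only.

```python
import numpy as np

def spectral_flatness(signal):
    """Geometric mean / arithmetic mean of the power spectrum (range 0..1).
    A pure tone scores near 0, white noise near 1; natural speech falls
    somewhere in between."""
    power = np.abs(np.fft.rfft(signal)) ** 2 + 1e-12  # floor avoids log(0)
    return float(np.exp(np.mean(np.log(power))) / np.mean(power))

def looks_suspicious(signal, low=0.005, high=0.4):
    """Flag a clip whose flatness is implausible for human speech.
    The (low, high) band is an arbitrary placeholder, not a real model."""
    flatness = spectral_flatness(signal)
    return flatness < low or flatness > high
```

Production detectors combine many such acoustic features (and, as in the Florida work, physical models of the vocal tract) rather than a single threshold, but the principle — measure properties of the audio and reject physically implausible values — is the same.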

WAY FORWARD

  • Different countries around the globe have passed legislation to curb the misuse of deepfake technology.
  • The EU has issued guidelines for the creation of an independent network of fact-checkers to help analyse the sources and processes of content creation.

SYLLABUS: GS 3, MAINS – SCIENCE & TECHNOLOGY, INTERNAL SECURITY

SOURCE : THE HINDU

CIVIL SERVICES EXAM