Deepfakes: Delving into the world of Voice cloning and scams

Deepfakes and voice cloning: should we be worried?

Deepfakes, voice cloning, and the scams built on them dominate today's headlines, but how much do we really know about these technologies? What are their beneficial uses, and what scams do they enable? Deepfakes are videos in which a person's face or body has been digitally altered so that they appear to be someone they are not; voice cloning is the audio equivalent, where artificial intelligence alters or synthesizes audio to make specific people sound as though they said things they never did. One of the most renowned deepfakes features Mark Zuckerberg taunting Facebook users about the platform's access to their personal data, stating, "I have total control over billions of people's stolen data." Read on to find out more about deepfakes and voice cloning.

Beneficial Deepfakes and Voice Cloning

Deepfakes and voice cloning have spread across a plethora of industries and have many beneficial uses, from protecting the confidentiality of users to aiding individuals with health issues, and new applications emerge with each passing day. One example of a beneficial deepfake is the Zero Malaria Britain advert, in which David Beckham raises awareness of the fight against malaria in nine different languages. This shows that deepfakes can scale marketing campaigns and make them more inclusive, at a lower cost than hiring voice actors and in-person actors, while offering customers a more personalized experience.

Deepfakes can also serve educational purposes. Recent developments have enabled AI-generated lecture videos to be produced from text-based content and audio narration, making lessons more interactive and effective for visual learners. Voice cloning has also given historians the means to replicate the voices of historical figures so that their speeches and stories can be delivered in their own voices. Deepfakes and voice cloning can even power AI-generated teachers matched to a student's preferences for gender, voice, personality, and other factors, tailoring instruction to the learning method best suited to each student.

The Vices and Scams

One of the leading harms of deepfake technology is identity theft and the spread of misinformation from non-credible sources. This is especially dangerous when politicians and political parties are imitated to spread false information and propaganda, which can erode trust in public institutions and manipulate media systems. The rise of social media platforms has also accelerated identity theft targeting celebrities worldwide. The most notoriously accurate example to date is a synthetic video of President Obama created by the University of Washington. Just imagine the implications of being able to put words in the mouth of the leader of one of the world's superpowers! Unfortunately, the victims of this malicious technology are often innocent civilians and internet users, and its spread diminishes trust in online news and tarnishes the reputations of trustworthy politicians and celebrities.

The circulation of pornographic and indecent content created by mapping celebrities' faces onto other bodies is a drastic misuse of deepfake technology and can damage the reputation, dignity, and mental health of the people depicted. As of 2019, 96% of deepfakes on the internet were sexual in nature, and virtually all of those depicted non-consenting women, with devastating repercussions for their lives (Deepfake Adult Content Is a Terrifying and Serious Issue, June 2023). While AI-generated pornography might seem harmless at first glance, depicting a person in sexual acts without their consent can be classed as a form of sexual abuse. It can also upend victims' lives: in the United States, a teacher was dismissed after the parents of her students discovered an AI-generated explicit video of her, costing her both her job and her reputation.

How can we avoid these scams?

So, the most pressing question is: how can we avoid these scams, and what tools can we use to protect ourselves against this AI-generated vice? To understand the solution, consider how a deepfake is created. Deepfake synthesizers replicate a 2D face and then warp it to fit the 3D perspective of the video, so observing which way the nose is pointing is a significant giveaway. Other subtle signs to look out for include jerky movements, shifts in skin tone, strange blinking or no blinking at all, poor lip-syncing, shifts in lighting, and digital artifacts in the image.
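The blinking cue above can be checked programmatically. A common approach in automated detection is the eye aspect ratio (EAR): the ratio of eyelid-to-eyelid distances against eye-corner width, which drops sharply when the eye closes. Below is a minimal, illustrative sketch; it assumes you already have six (x, y) eye landmarks per frame from some face-tracking library, and the coordinates shown are hypothetical, not real tracker output.

```python
import math

def eye_aspect_ratio(landmarks):
    """Compute the eye aspect ratio (EAR) from six (x, y) eye landmarks.

    landmarks[0] and landmarks[3] are the horizontal eye corners; the
    remaining four points lie on the upper and lower eyelid. EAR stays
    roughly constant while the eye is open and collapses toward zero
    during a blink, so an unnaturally flat EAR curve over a whole clip
    is one hint that blinking may have been synthesized.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    p1, p2, p3, p4, p5, p6 = landmarks
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def blink_count(ear_series, threshold=0.2):
    """Count blinks as transitions of the EAR below a fixed threshold."""
    blinks, below = 0, False
    for ear in ear_series:
        if ear < threshold and not below:
            blinks += 1
            below = True
        elif ear >= threshold:
            below = False
    return blinks

# Hypothetical landmark sets for an open and a nearly closed eye.
open_eye = [(0, 3), (2, 5), (4, 5), (6, 3), (4, 1), (2, 1)]
closed_eye = [(0, 3), (2, 3.3), (4, 3.3), (6, 3), (4, 2.7), (2, 2.7)]
```

A real detector would feed per-frame EAR values from a video into `blink_count` and flag clips whose blink rate is implausibly low for their length; production tools combine many such signals rather than relying on any single one.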

State-of-the-art detection tools such as Facebook's reverse-engineering method and Microsoft's Video Authenticator now help authenticate videos and images using grayscale elements and AI model fingerprints. These tools inspect various aspects of the media, such as eye movements, facial expressions, and skin textures, to reveal whether it has been tampered with. However, the most effective defense against these scams is to educate yourself and those around you (especially the older generation) on how deepfakes work and the challenges they bring, stay media literate, and always verify claims against trusted sources.


This blog has armed you with the knowledge and resources to combat this 21st-century, AI-generated form of cyber-attack, and shown how deepfakes and voice cloning can be used for virtuous purposes. To conclude: like any other advancing technology, deepfakes have their share of vices, but they also have the potential to change the world for the better and open a route to modernization like never before.

Written by: Subiksha Sivachandran
