Deepfake videos are a threat to the world, and the problem is getting out of hand.

Deepfakes (a portmanteau of “deep learning” and “fake”) are synthetic media in which one person’s likeness is convincingly replaced with another’s through deep generative methods. While creating fake content is nothing new, deepfakes leverage powerful machine learning and artificial intelligence techniques to manipulate or generate visual and audio content that can deceive far more easily. The main methods used to create deepfakes are based on deep learning and involve training generative neural network architectures, such as autoencoders or generative adversarial networks (GANs). In turn, the field of image forensics develops techniques to detect manipulated images.
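To make the autoencoder idea concrete, here is a minimal sketch of the classic face-swap training setup: a single shared encoder maps faces of two identities into one latent space, and each identity gets its own decoder; the “swap” is encoding a face of identity A and decoding it with B’s decoder. This is purely illustrative toy code on random vectors (real systems use convolutional networks on images); the encoder here is a simple SVD projection rather than a trained network, and all names and dimensions are assumptions, not anyone’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: flattened 64-dim "face" vectors for two identities,
# each living in its own low-dimensional subspace.
dim, latent, n = 64, 8, 200
basis_a = rng.normal(size=(latent, dim))   # identity A's subspace
basis_b = rng.normal(size=(latent, dim))   # identity B's subspace
faces_a = rng.normal(size=(n, latent)) @ basis_a
faces_b = rng.normal(size=(n, latent)) @ basis_b

# Shared "encoder": principal directions of BOTH identities together, so the
# latent space captures structure common to A and B (stand-in for a trained
# shared encoder network).
both = np.vstack([faces_a, faces_b])
_, _, Vt = np.linalg.svd(both - both.mean(0), full_matrices=False)
W_enc = Vt[: 2 * latent].T                 # dim x 2*latent projection

def fit_decoder(X):
    """Least-squares decoder mapping shared latent codes back to one identity."""
    Z = X @ W_enc
    W_dec, *_ = np.linalg.lstsq(Z, X, rcond=None)
    return W_dec

W_dec_a = fit_decoder(faces_a)             # decoder specialised to identity A
W_dec_b = fit_decoder(faces_b)             # decoder specialised to identity B

# Sanity check: A reconstructed through its own decoder should be near-exact
# in this linear toy setting.
recon_a = faces_a @ W_enc @ W_dec_a
recon_err = float(((recon_a - faces_a) ** 2).mean())

# The "swap": encode a face of identity A, decode with B's decoder, producing
# an output that lies in identity B's subspace.
swapped = faces_a[:1] @ W_enc @ W_dec_b
print(recon_err, swapped.shape)
```

The key design point the sketch preserves is that the encoder is shared while the decoders are identity-specific: because both identities pass through the same latent space, decoding A’s code with B’s decoder yields output with B’s characteristics, which is the mechanism behind autoencoder-based face swaps.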

Deepfakes have garnered widespread attention for their potential use in creating child abuse material, celebrity pornographic videos, revenge porn, fake news, bullying, and financial fraud. The spread of disinformation and hate speech through deepfakes can undermine core functions and norms of democratic systems by interfering with people’s ability to participate in decisions that affect them, determine collective agendas, and express political will through informed decision-making. This has elicited responses from both industry and government to detect and limit their use.

AI-generated deepfake technology has been making waves across the internet. Recently, a couple of videos emerged on social media in which Anushka Sharma and Aishwarya Rai Bachchan appeared to talk about an investment opportunity. Both clips were later removed.

In the now-deleted viral clips, Anushka and Aishwarya wore pink dresses and spoke about investing in a project. On investigation, however, the videos turned out to be deepfakes created using artificial intelligence: original footage had been altered to promote what looked like a financial scam, and in both clips the audio had been replaced with a fabricated track.

Earlier, Priyanka Chopra also fell victim to this disturbing trend: a manipulated video showed her divulging her yearly earnings during a brand-promotion interview. From Rashmika Mandanna and Katrina Kaif to Kajol and Alia Bhatt, multiple AI-generated deepfake videos have stirred up social media in recent times. Many Bollywood celebrities, including Amitabh Bachchan, Mrunal Thakur, Naga Chaitanya, and Rashmika Mandanna, have since expressed concern over such fake clips.

Even Prime Minister Narendra Modi reacted to the issue, calling the AI-powered technology ‘problematic.’ He said, ‘A new crisis is emerging due to deepfakes produced through artificial intelligence. This will grow into a big challenge.’ Rashmika Mandanna, Priyanka Chopra Jonas, and Alia Bhatt are among the stars who have been targeted by such videos, in which their faces or voices were replaced with someone else’s.

These are the latest in a string of deepfake videos that have gone viral in recent weeks.

From traditional entertainment to gaming, deepfake technology has grown increasingly convincing and accessible to the public, disrupting the entertainment and media industries.

The investigation into the deepfake video of actress Rashmika Mandanna has hit a wall, as the US-based tech companies whose portals were purportedly used to make and share the AI-edited video have not provided the details needed to take the case forward, police sources said.

A senior police officer said they had detained a man from Bihar and seized his device, on which they found a URL and details of the Instagram reel the suspect allegedly used to create the deepfake with an AI tool.

“We sent the URL of the Instagram account and reel to Meta authorities. They came back to us saying they don’t have data for the said account. We told them the suspect had deleted the account and its information and asked if they could hand over their old data. They have not been cooperating with us. Similarly, we found a URL hosted on another platform and sent its team a letter asking for help with the website, but they replied saying they don’t have records for the URL. These are companies that usually retain old data,” said the officer.
