Hania Aamir tracks down 'Indian culprit' behind viral explicit deepfake video
Pakistani actress Hania Aamir has identified the individual responsible for spreading AI-generated deepfake videos of her, reported the account, and urged her fans to do the same.
Hania Aamir, known for her charm and popularity in Pakistan’s entertainment industry, became the centre of controversy when explicit videos featuring her went viral on social media.
While it was initially unclear whether she was the person in the footage, it soon became evident that the videos were deepfakes, created using AI to superimpose her face onto someone else's body.
The actress responded strongly to the incident on Instagram, calling out the dangers of AI misuse.
"This AI is extremely dangerous. Are there no laws to stop this?" Aamir asked, referring to the video, which she confirmed was not hers.
In a recent update, Hania Aamir revealed that she had tracked down the individual responsible for sharing the deepfake content.
The actress shared a screenshot of the Instagram account in question, belonging to a user named "Anureet Sandhu," who had amassed over 22,000 followers.
The account featured more than 100 posts, many of which were AI-generated videos featuring her face.
The actress asked her followers to help by reporting the account, stating, "She has blocked me, but can you all report this account?"
Soon after her post, the account's name was changed to "Core Sandhu" in an apparent attempt to evade detection.
The user later renamed the account once more to "Sandhu Core."
Her fans responded in large numbers by reporting the profile, and the videos have since been removed.
Further investigation revealed the account was created in India and had its location listed as Chandigarh.
According to a screenshot shared by the actress, the account had changed its name 18 times since it was set up in September 2022.
The incident has reignited conversations about the ethical challenges of AI technology and the need for stronger regulations to prevent its misuse.