In a startling illustration of the growing menace of deepfake technology, a manipulated video featuring Aaj Tak news anchor Chitra Tripathi has gone viral, falsely portraying her as predicting a Congress victory in the upcoming 2024 Lok Sabha elections. The clip, which has been widely circulated on social media platforms, has been debunked by multiple fact-checking organizations and denounced as fake by Tripathi herself.
The Viral Video: A Fabricated Narrative
The video in question shows Chitra Tripathi seemingly criticizing Prime Minister Narendra Modi and citing a supposed News 24 survey indicating that over 80% of respondents favor the opposition I.N.D.I.A alliance over the ruling NDA. The clip suggests a significant shift in public opinion ahead of the general elections.
However, upon closer examination, fact-checkers have determined that the video is a doctored version of an old broadcast. The original footage is from a July 2023 episode of Tripathi’s debate show “Dangal” on Aaj Tak, where she was discussing the ethnic violence in Manipur. The viral video has been manipulated by overlaying a fake audio track that mimics Tripathi’s voice, inserting fabricated statements about the political scenario and a non-existent News 24 survey.
Fact-Checkers Debunk the Video
Several reputable fact-checking organizations, including BOOM Live, FACTLY, and NewsMeter, have investigated the viral video and confirmed its inauthenticity. Their analyses reveal discrepancies between the lip movements and the audio, indicating that the voiceover was artificially generated. Furthermore, there is no record of any such survey conducted by News 24 predicting a Congress victory in the 2024 elections.
BOOM Live traced the original footage to a specific segment of the “Dangal” show, confirming that the manipulated video uses visuals from that episode with a fabricated audio overlay. Similarly, FACTLY’s investigation highlighted that the original discussion was centered around the Manipur violence and not the general elections.
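The lip-sync mismatch that fact-checkers point to can be illustrated with a simple heuristic: if the voice track was swapped in after the fact, mouth movement in the video tends not to track the loudness of the audio. The sketch below is a minimal, hypothetical illustration of that idea only; it is not the tooling BOOM Live, FACTLY, or NewsMeter actually used, the “mouth region” is a crude lower-half-of-frame stand-in, and the file name `viral_clip.mp4` is an assumed placeholder. It also assumes `opencv-python`, `librosa`, and an ffmpeg-backed audio loader are available.

```python
# Hypothetical sketch: compare mouth-region motion energy against short-time
# audio energy. Low correlation can hint at a swapped-in voice track, but it
# is a rough signal, not proof of manipulation.
import numpy as np
import cv2       # opencv-python
import librosa   # audio loading (needs ffmpeg/audioread for video containers)

def mouth_motion_energy(video_path, samples_per_sec=5):
    """Rough per-sample frame-difference energy in the lower half of each frame."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25
    step = max(int(fps // samples_per_sec), 1)
    prev, energies, i = None, [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % step == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(float)
            lower = gray[gray.shape[0] // 2:, :]   # crude stand-in for the mouth region
            if prev is not None:
                energies.append(np.mean(np.abs(lower - prev)))
            prev = lower
        i += 1
    cap.release()
    return np.array(energies)

def audio_energy(video_path, sr=16000, samples_per_sec=5):
    """Short-time audio energy sampled at roughly the same rate as the video samples."""
    y, _ = librosa.load(video_path, sr=sr)
    hop = sr // samples_per_sec
    frames = librosa.util.frame(y, frame_length=hop, hop_length=hop)
    return frames.std(axis=0)

def sync_score(video_path):
    """Correlation between mouth motion and audio loudness; lower is more suspicious."""
    v = mouth_motion_energy(video_path)
    a = audio_energy(video_path)
    n = min(len(v), len(a))
    if n < 2:
        return float("nan")
    return float(np.corrcoef(v[:n], a[:n])[0, 1])

# print(sync_score("viral_clip.mp4"))  # hypothetical file name
```

In practice, professional verification combines such audio-visual cues with provenance checks, as the fact-checkers did here by tracing the clip back to the original July 2023 “Dangal” broadcast.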
Chitra Tripathi Responds
Chitra Tripathi has publicly denounced the video as fake and has called for strict action against those responsible for its creation and dissemination. In a post on her official X (formerly Twitter) account, she stated:
“फेक न्यूज़ है , मेरे वीडियो का ग़लत इस्तेमाल किया जा रहा है. इन लोगों के खिलाफ कड़े एक्शन की जरुरत है.”
Translation: “This is fake news; my video is being misused. Strict action is needed against these people.”
Tripathi has tagged the Delhi Police and the Ministry of Electronics and Information Technology in her post, urging authorities to investigate the matter and hold the perpetrators accountable.
The Broader Threat of Deepfake Technology
This incident is part of a growing trend where deepfake technology is used to create misleading content involving public figures. Previously, similar manipulated videos have targeted celebrities like Kajol, Rashmika Mandanna, and Sachin Tendulkar. The increasing sophistication of deepfake tools poses significant challenges to information integrity and public trust.
Cybersecurity experts warn that such technologies can be exploited to spread misinformation, influence public opinion, and even interfere with democratic processes. The Chitra Tripathi video serves as a stark reminder of the potential dangers posed by deepfakes, especially in the context of political discourse and upcoming elections.
Calls for Regulatory Measures
The viral spread of the fake video has sparked discussions about the need for stricter regulations and technological solutions to detect and prevent the dissemination of deepfake content. Advocates are urging social media platforms to implement more robust verification mechanisms and for legislative bodies to establish clear legal frameworks addressing the creation and distribution of manipulated media.
As the 2024 general elections approach, ensuring the authenticity of information becomes paramount. The Chitra Tripathi incident underscores the urgency for collaborative efforts between media organizations, technology companies, and government agencies to combat the spread of misinformation and protect the integrity of democratic processes.
Conclusion
The deepfake video involving Chitra Tripathi is a concerning example of how technology can be misused to fabricate narratives and mislead the public. It highlights the pressing need for awareness, vigilance, and proactive measures to safeguard against such manipulations. As society grapples with the implications of deepfake technology, collective responsibility and concerted action are essential to uphold truth and trust in the digital age.