DEEPFAKE DECEPTION: HOW AI-GENERATED VIDEO OF DONALD TRUMP DUPED A KARNATAKA LAWYER OF ₹5.93 LAKH

Author: Thirisha S, School of Excellence in Law

ABSTRACT


This paper investigates a cyber fraud case in India in which artificial intelligence was exploited to impersonate former U.S. President Donald Trump in a video used to lure victims into a fake investment scheme. A Karnataka-based advocate lost ₹5.93 lakh in the scam; this study highlights how generative AI can be weaponized in financial fraud. The case provides critical insight into emerging threats at the intersection of deepfake technology and digital financial crime.


INTRODUCTION


In the age of artificial intelligence, the boundary between reality and fabrication is increasingly blurred. Generative AI models, once confined to academic and entertainment settings, are now being repurposed for malicious intent. One such example emerged from Karnataka, India, in 2025, where a local advocate was among more than 200 victims scammed through a convincing deepfake video of Donald Trump. The video promoted a fraudulent investment scheme called “Trump Hotel Rental,” resulting in the lawyer losing nearly ₹6 lakh. This case study demonstrates how AI-driven disinformation and identity manipulation can be weaponized to exploit public trust and financial systems.


THE INCIDENT: A BRIEF OVERVIEW


In early 2025, several social media users in Karnataka came across short videos and advertisements promoting a seemingly lucrative online investment platform known as “Trump Hotel Rental.” The videos featured what appeared to be Donald Trump endorsing the app and inviting users to invest for daily profits; the fabricated footage showed him describing a hotel-based revenue model that promised daily returns of up to 3%.
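To put such a promise in perspective, a simple compounding calculation (an illustrative figure, not drawn from the case records, assuming the advertised 3% return compounds daily) shows why a sustained 3% daily yield is a hallmark of fraud rather than a plausible business model:

(1 + 0.03)^365 ≈ 48,500

At that rate, even the ₹1,500 activation fee mentioned below would have to grow to roughly ₹7.3 crore within a single year, a return no legitimate hotel-rental operation could generate.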


A 38-year-old advocate from Haveri, intrigued by the video, downloaded the app and made an initial payment of ₹1,500 as an activation fee.

Over the course of three months, from January to April 2025, he was gradually convinced to invest a total of ₹5.93 lakh, lured by small but consistent daily returns shown in the app interface.

However, when he attempted to withdraw his funds, he was met with additional charges—purportedly for “withdrawal taxes.” After paying the extra fees, the app suddenly disappeared, along with the website and customer service contacts.


ANATOMY OF THE SCAM


The operation behind this scam was sophisticated and methodically executed, combining AI-generated content with traditional phishing and investment fraud tactics.


Deepfake Technology:
The central tool of deception was a hyper-realistic AI-generated video of Donald Trump. Using deepfake software, the scammers replicated Trump’s face and voice, lending an air of legitimacy to the investment platform. The video was circulated widely on social media platforms such as YouTube Shorts and Instagram Reels.


Social Engineering and App Interface:
Once users clicked on the link accompanying the video, they were redirected to download an Android application. The app mimicked professional financial platforms, complete with user dashboards, transaction histories, and growth charts. Victims could see their “profits” accumulate, which increased their confidence in the system.


Escalating Payments and Trust Building:
To gain trust, the scammers initially displayed modest returns (around ₹30 per day). These small but regular profits encouraged users to reinvest higher amounts. The app even showed bonuses for referrals, pushing victims to invite friends and colleagues—further expanding the scam’s reach.


Vanishing Act:
When users attempted to withdraw their accumulated profits, the app imposed unexpected “tax” fees. After paying, victims discovered that the app had vanished, along with the domain and all forms of contact. The sudden disappearance left them with no recourse.


VICTIM PROFILE AND WIDER IMPACT


While the Haveri-based lawyer’s case gained attention due to the relatively large sum involved, he was not alone. More than 200 people across Karnataka, including residents of Bengaluru, Mangaluru, and Tumakuru, were duped. The total estimated loss exceeded ₹2 crore. Victims came from diverse backgrounds—lawyers, government employees, and small business owners—indicating that this scam did not solely rely on targeting digitally illiterate populations.


The victims’ trust was largely driven by the perceived credibility of Donald Trump, who, despite being a controversial figure, is globally recognized and often associated with real estate investments. The deepfake video played a key psychological role in reducing skepticism.


CYBERSECURITY AND LEGAL RESPONSE


Following multiple complaints, Karnataka’s Cybercrime, Economic Offenses and Narcotics (CEN) unit launched an investigation. However, identifying and prosecuting the perpetrators is challenging due to the cross-border nature of digital fraud, anonymity provided by VPNs, and the use of decentralized servers.


Law enforcement officials have issued public warnings and advised people to avoid clicking on unsolicited links or investing in schemes that promise unusually high returns. Nevertheless, the case underscores the urgent need for advanced cybercrime detection units trained in AI and digital forensics.


IMPLICATIONS AND POLICY RECOMMENDATIONS


This case illustrates the dangerous intersection of artificial intelligence and financial fraud. Several broader implications emerge:


Deepfake Threats: AI-generated impersonations are no longer limited to misinformation or pranks; they are now powerful tools for fraud, identity theft, and cyber extortion.


Digital Literacy: There is a pressing need to educate citizens about the potential risks of deepfake media and online financial schemes.


Platform Accountability: Social media platforms must improve their detection mechanisms for synthetic media and regulate financial advertisements more strictly.


Legal Framework: Existing cyber laws may not adequately address AI-driven fraud. Governments need to update legal definitions and enforcement protocols for crimes involving deepfakes.


CONCLUSION


The scam involving a deepfake video of Donald Trump and the loss of ₹5.93 lakh by a Karnataka lawyer is a harrowing example of the dangers posed by AI-generated content when used maliciously. It underscores the need for multi-pronged responses: improved public awareness, robust cybersecurity infrastructure, platform responsibility, and updated legal frameworks. As AI continues to evolve, so must the mechanisms to detect and deter its misuse. This case should serve as a wake-up call for regulators, tech companies, and consumers alike.

FAQS


What is the Trump AI scam?
AI-generated video clips of Donald Trump are used to promote investment scams such as “Trump Hotel Rental” and “Golden Eagle’s Project.”


How does it work?
Scammers use deepfake Trump videos promising high returns to lure users into downloading apps and investing money. The money then vanishes.


Where has it happened?
In India (e.g., Karnataka), over ₹2 crore was lost; in the U.S., people have lost thousands of dollars to fake coin schemes.


Why use AI/Trump’s image?
Trump’s global recognition and the realism of AI-generated video make the scams appear legitimate and trustworthy.


Is it illegal?
Yes. These are criminal frauds. The U.S. passed legislation addressing deepfake misuse in May 2025, and India investigates such cases under its cybercrime laws.


How to stay safe?
Avoid suspicious ads, do not download unknown apps, verify information independently, and report scams to the authorities.


What should victims do?
File a cybercrime report, alert platforms, and spread awareness to prevent others from falling for it.

REFERENCES


https://m.economictimes.com/news/international/us/ai-generated-video-of-donald-trump-becomes-new-tool-for-cyber-fraud/articleshow/121417859.cms
https://www.hindustantimes.com/india-news/how-ai-generated-video-of-donald-trump-duped-a-karnataka-lawyer-of-5-93-lakh-details-101748274008281-amp.html
https://indianexpress.com/article/cities/bangalore/ai-generated-trump-video-dupe-people-karnataka-10029617/lite/
https://www.oneindia.com/bengaluru/ai-generated-trump-video-used-in-karnataka-to-dupe-over-200-in-investment-scam-4164151.html
