A deceptive video circulating on social media platforms like X and Instagram falsely portrays Polish President Karol Nawrocki endorsing an investment platform called Bitcoin Trader AI. The clip, which claims the president is promoting a limited-time government program, has been identified as a manipulated deepfake created using artificial intelligence. This incident highlights a growing trend of online scams leveraging the likeness of prominent political figures to defraud investors.
The Deepfake and Its Origins: Misinformation Around Bitcoin Trader AI
The fabricated video alleges that President Nawrocki stated, “Exactly after midnight today is the last chance to become part of the government’s Bitcoin Trader AI programme.” However, Polish fact-checking organization Demagog has determined the clip is a digitally altered version of a speech delivered in November 2025. The original footage showed Nawrocki explaining his use of veto power and advocating for government consultation on draft legislation.
A reverse image search confirmed Demagog’s findings, revealing the source material was unrelated to cryptocurrency investments. President Nawrocki, who was backed by the Law and Justice (PiS) party, has a history of political clashes with Polish Prime Minister Donald Tusk and frequently uses his constitutional veto power. He has recently vetoed legislation concerning Ukrainian refugee benefits, animal welfare, and the implementation of the EU’s Digital Services Act.
AI-Powered Manipulation
Demagog’s analysis indicates the deepfake was created by manipulating Nawrocki’s mouth movements and voice to align with the advertisement’s script. This demonstrates the increasing sophistication and accessibility of AI tools used to generate convincing, yet entirely false, content. The use of AI in these scams makes it more difficult for individuals to discern legitimate information from fraudulent schemes.
While Nawrocki has not publicly endorsed any cryptocurrency platform, he did veto a bill in December 2025 aimed at regulating the crypto-assets market in Poland. He argued at the time that the proposed regulations were overly restrictive. This action, while not an endorsement, has been exploited by scammers to lend a false sense of legitimacy to their schemes.
A Wider Pattern of Political Deepfake Scams
This incident is not isolated. Investigations have revealed a concerning pattern of similar deepfakes targeting other European leaders. Euronews reports that German Defence Minister Boris Pistorius has also been featured in AI-generated investment scams.
Additionally, scammers have created YouTube videos mimicking Dutch politician Geert Wilders, using synthetic voice cloning and AI-generated visuals to promote fraudulent trading platforms. These schemes typically redirect victims to websites requesting personal information or initial deposits, and consumer protection groups report that the resulting losses often run to thousands of dollars.
The EU’s Digital Services Act includes provisions designed to compel online platforms to swiftly remove illegal content, including financial scams. Nawrocki’s recent veto of Poland’s legislation implementing the Act, coupled with the rise of these deepfakes, complicates efforts to combat online fraud: the Act aims to create a safer digital space for users, but it depends on national implementation to be effective.
The proliferation of these scams underscores the growing threat of AI-powered misinformation. The ease with which convincing deepfakes can be created is accelerating the pace of these fraudulent activities and eroding public trust.
As authorities continue to investigate these incidents, consumers are urged to exercise extreme caution when encountering investment opportunities promoted by public figures online. Verify information through official channels and be wary of promises of guaranteed returns or limited-time offers. Staying informed about the latest scam tactics and practicing healthy skepticism are essential defenses against these increasingly sophisticated threats.
The situation remains fluid, and further investigation is needed to identify the perpetrators behind these deepfake scams. Consumers should remain vigilant and report any suspicious activity to the appropriate authorities.