President Nawrocki promoting an alleged government crypto program? An AI scam is spreading on social media!

A video is circulating online in which Polish President Karol Nawrocki allegedly promotes the Bitcoin Trader AI investment platform. It is a deepfake that is making waves not only in Poland but also abroad. What's more, the scammers haven't stopped at a single ad. How can you avoid falling for it?

When AI works for fraudsters

The viral clip shows Nawrocki allegedly saying: “At exactly midnight today is the last chance to become part of the government’s Bitcoin Trader AI program.” A classic scam using deepfake technology.

The original recording comes from November 2025, when the president justified his use of the veto and called on the government to consult draft legislation with him at an early stage.

The fact-checking organization Demagog confirmed the manipulation: Nawrocki's mouth movements and voice do not match the original recording. AI did its job, superimposing an advertisement for a supposed cryptocurrency platform over the president's real speech. Notably, another video has since appeared on social media, this time built on an interview President Nawrocki gave on a bus during the presidential campaign.

President Nawrocki and other politicians used by scammers

President Nawrocki is not the first European politician whose image has been used to push crypto scams. Similar deepfakes have exploited the likenesses of Germany's defense minister and of Dutch politician Geert Wilders. The pattern is always the same: victims are redirected to an external website, where they are asked for personal information or a deposit. The losses? Potentially thousands of dollars.

AI democratizes not only content creation, but also… fraud. Deepfake technology means that convincing scam ads are created in minutes, not days.

What should you pay attention to so as not to be fooled?

Would President Nawrocki really advertise an investment platform? Common-sense skepticism is the most effective defense against deepfakes, but it is also worth watching for purely technical tells. Even as deepfake and genAI tools grow ever more convincing, matching lip movements to a generated voice often remains imperfect.

Always check whether the speaker's lips actually form the words you hear in the audio track. Pay attention to the sound quality as well: the more distorted the audio and the more unnatural the intonation, the greater the chance that the clip is a classic deepfake scam.