Truth & Goodness
05 April 2026
Until recently, a video recording was treated as almost unquestionable proof. Now even authentic footage can be dismissed with ease. How can we spot a deepfake when awareness of the technology itself is already starting to destroy trust?
Until quite recently, a video recording was treated almost as unquestionable proof. If you saw something on film, it had happened. But what happens when every video can be dismissed as fake? A research team from the University of Würzburg set out to answer that question. Lara Grohmann, Franziska Halle and Professor Markus Appel conducted an experiment whose results were published in Psychology of Popular Media. The study feeds a growing debate in which asking how to spot a deepfake no longer seems like the right question.
More than 180 participants were shown authentic videos of politicians whose speech sounded slightly slurred, suggesting that they might have been under the influence of alcohol. In one group, the politician apologised for the incident. In the other, the politician denied it and claimed that the recording was a deepfake. The results were striking. Politicians who denied the authenticity of the recording received higher ratings for competence, leadership ability and credibility. What is more, people who saw the denial were more likely to believe that the video had been manipulated, even though it was genuine.
The researchers described this mechanism as the liar's dividend, a term coined by legal scholars Bobby Chesney and Danielle Citron for the way public awareness of deepfakes allows public figures to dismiss authentic recordings as fabricated. All that is needed is a seed of doubt, and a real video begins to lose credibility. Faced with widespread knowledge of deepfakes, the mind reaches for shortcuts. Instead of analysing the recording itself, people trust the authority that questions the evidence. As a result, authentic video loses its evidentiary power.
What is interesting is not only what the Würzburg study showed directly, but also what it suggested indirectly. The politician’s deepfake claim did not directly reduce trust in the media. Yet people who believed the recording had been manipulated also showed lower levels of general trust in media overall. It is not misinformation alone that destroys trust. It is uncertainty.
These findings align closely with Nina Schick's arguments in Deepfakes: The Coming Infocalypse. In that 2020 book, Schick warned of an approaching "infocalypse": a moment when we would no longer be able to distinguish truth from fabricated evidence. She also pointed to a crucial paradox: the more we know about deepfakes, the more we begin to suspect even authentic material.
Schick stressed that deepfake technology is not the problem in itself. The problem is the human capacity for manipulation, now reinforced by the power of artificial intelligence. As a result, deepfakes have become a serious threat to democracy, human rights and even the very idea of community. The loss of trust in video evidence genuinely weakens society, leading to polarisation, apathy and easier emotional manipulation.
People start to question the authenticity and veracity of authentic media, which is quite devastating in a world where trust in digital media is absolutely essential for the functioning of society and politics.
– Nina Schick, warning of the danger in an interview for the 80,000 Hours podcast.
The loss of trust in visual evidence is not only a technological problem. It is a deeply human one. A society that can no longer distinguish truth from falsehood loses the capacity to act together. Democracy requires a shared basis of facts — and that basis is beginning to crack.
Learning how to spot a deepfake still matters, but it no longer answers the most urgent question. That question is what to do with the fact that any recording, real or fake, can now be challenged. If authentic video can be dismissed as a deepfake, and fabricated material can be accepted as truth, then what can trust still rest on?
Media literacy, critical thinking and tools for content verification are now essential. But building a culture of trust matters just as much — one grounded in human honesty, source transparency and a willingness to accept uncomfortable truths. In a world where everything can be forged, truth becomes the most valuable resource of all — along with the courage to defend it.
Read this article in Polish: Nawet prawdziwe nagranie to deepfake. Czeka nas infokalipsa