In his striking article, “Lying Increases Trust in Science,” philosopher B. V. E. Hyde unpacks a paradox at the heart of public science communication: transparency, long hailed as the cornerstone of trustworthiness, can sometimes erode public trust. His key argument is that while transparency is vital, what is disclosed matters greatly. Revealing bad news—errors, failed predictions, or internal conflict—can damage trust, even when honesty is the communicator’s intent.
This leads to what Hyde calls the transparency paradox: To remain trusted, institutions might be incentivized to withhold or distort negative information, creating a moral and epistemic hazard.
🔍 Key Claims

- **Transparency ≠ Automatic Trust**
  Hyde distinguishes between being trustworthy and being trusted. Transparent institutions may paradoxically be less trusted because the public interprets fallibility as incompetence.

- **"Lying works," but it's wrong**
  While lying or selective silence can increase trust in the short term (by avoiding any display of fallibility), Hyde cautions against this practice on both moral and strategic grounds.

- **The deeper issue is public idealization**
  People often expect science to be infallible, unified, and purely objective. But science is an iterative, uncertain, and often contested process. It is this gap between public idealization and scientific reality that produces disillusionment when negative disclosures occur.

- **The real fix is to recalibrate expectations, not obscure reality.**
✅ Commendations

- **Theoretical sophistication:** Hyde's nuanced distinction between trust and trustworthiness pushes beyond oversimplified narratives.
- **Courageous moral stance:** The paper acknowledges uncomfortable truths (yes, lies can be effective) but never condones deception.
- **Cross-disciplinary reach:** Drawing on empirical psychology, sociology, philosophy, and science studies, it bridges theoretical and practical debates.
⚠️ Critical Reflections

- **Empirical ambiguity:** The claim that lies or omissions build trust is inferred, not directly tested. Experimental validation is needed.
- **Social context underplayed:** The focus on public naïveté may underestimate the impact of political polarization, media distortion, and institutional betrayal.
- **Vague solutions:** While Hyde calls for better public understanding, he leaves open how institutions can realistically reshape public perceptions.
🚀 The Way Forward: From Naïve Trust to Informed Confidence
If Hyde is right, and the transparency paradox stems from a mismatch between how science works and how it is perceived, then the solution isn't to lie better. It's to communicate better and educate more deeply. Here's a roadmap:
1. Teach Scientific Uncertainty as Strength, Not Weakness
Most science education glosses over how knowledge evolves. Students memorize facts, not how those facts were contested or refined. Curricula should emphasize uncertainty, error correction, and probabilistic reasoning—hallmarks of scientific progress.
Example: Use historical case studies (e.g., germ theory, climate modeling) to show how dissent and failure are integral, not antithetical, to science.
2. Embrace Narrative Transparency
Scientific institutions can present bad news more effectively by contextualizing it. Rather than “We were wrong,” frame it as “We’ve learned more.” When framed as part of a larger narrative of progress, mistakes can increase credibility.
Example: In vaccine updates, explain how variant-driven changes reflect responsiveness, not original failure.
3. Build Institutional Reflexivity
Trustworthy institutions should proactively audit their own transparency practices. Are they disclosing selectively? Are press releases cherry-picking findings? Are they equipping communicators to handle uncertainty gracefully?
Example: Journals and universities can develop “Trust Impact Statements”—disclosures about how results are communicated and what limitations are acknowledged.
4. Promote Two-Way Engagement, Not Top-Down Communication
Science communication should move beyond public lectures and expert statements. Citizen science, public deliberation forums, and co-production of knowledge empower people to engage with science on their terms—reducing idealization and increasing ownership.
Example: Climate councils in which local citizens and researchers jointly make decisions about adaptation plans.
5. Separate Institutional Trust from Scientific Trust
One of the most insidious dynamics is misplaced mistrust. People might distrust science because they distrust governments, corporations, or media outlets associated with it. Clarifying which institution is making the claim—and on what evidence—helps preserve the epistemic core of science from political fallout.
🧠 Final Thought
B. V. E. Hyde’s “Lying Increases Trust in Science” is not about endorsing lies—it’s a warning: the current social contract between science and society is fragile, skewed by unrealistic ideals and shaped by selective transparency.
Rather than succumbing to dishonesty or panic, the way forward is through cultural transformation—teaching that science, like democracy, is built on deliberation, disagreement, and revision. Trust won’t come from perfection—it will come from honest complexity.
🔗 Read the Full Article:
Hyde, B. V. E. (2025). Lying increases trust in science. *Theory and Society*. (Springer | ResearchGate)