Press "Enter" to skip to content

Institutional Authority Has Vanished. Wikipedia Points to the Answer.

Much as his predecessors warned Americans against tobacco and opioid abuse, Surgeon General Vivek Murthy issued a formal advisory last Thursday that misinformation—such as the widespread propaganda now sowing doubts about coronavirus vaccines on social media—is “an urgent threat to public health.” It is, but the discussion soured quickly. After President Joe Biden said social-media platforms that turn users against vaccines are “killing people,” an anonymous Facebook official told CNN that “the White House is looking for scapegoats for missing their vaccine goals.” When Press Secretary Jen Psaki said the White House is “flagging problematic posts for Facebook,” conservatives and Twitter contrarians inferred that the government was telling the company to censor people. The journalist Glenn Greenwald described the effort as “fascism.”

Greenwald and Facebook are minimizing a genuine problem: An “infodemic”—involving the viral spread of misinformation, as well as the mingling of facts with half-truths and falsehoods in a fractured media environment—has compounded the COVID-19 pandemic. But critics of Murthy’s initiative and Biden’s comments are right about one thing: The official health establishment has made the infodemic worse through its own inability to cope with conflicting scientific views. In the early days of the pandemic, experts at the World Health Organization, CDC Director Robert Redfield, National Institute of Allergy and Infectious Diseases Director Anthony Fauci, and then–Surgeon General Jerome Adams discouraged mask wearing and only belatedly reversed course; some of the same voices later pooh-poohed the notion that the coronavirus first began spreading after escaping from a Chinese research lab—a possibility now being taken far more seriously.

[Daniel Engber: Don’t fall for these lab-leak traps]

Anti-vaccination propagandists and social-media provocateurs alike have exploited these missteps to great effect; even those inclined to trust the government have lost some confidence in official pronouncements. If the Biden administration hopes to reverse that, it should ask itself: What could the CDC do differently if the lab-leak hypothesis first surfaced today?

What the United States needs if it hopes to combat misinformation is a better system for communicating with the public—a system that keeps up with continuous changes in scientific knowledge; that incorporates expertise from people in a variety of fields, not just those anointed with official titles at well-known institutions; and that weaves dissenting perspectives into a larger narrative without overemphasizing them.

Fortunately, the internet has produced a model for this approach: Wikipedia. The crowdsourced reference site is the simplest, most succinct summary of the current state of knowledge on almost any subject you can imagine. If an agency such as the CDC launched a health-information site, and gave a community of hundreds or thousands of knowledgeable people the ability to edit it, the outcome would be far more complete and up-to-date than individual press releases. The same model—tapping distributed expertise rather than relying on institutional authority—could be useful for other government agencies that find themselves confronting rumors.

[Renée DiResta: Virus experts aren’t getting the message out]

The idea of making government websites more like Wikipedia may sound far-fetched, even comical. People of a certain age—people such as me—remember our teachers telling us, “Wikipedia is not a source!” And yet, over two decades, Wikipedia has flourished. Though perhaps still not citable for academic work, the site provides reliable, up-to-date information about millions of topics, backed by robust sourcing. And it meets the needs of the moment: the incorporation of a wide swath of voices; transparency about who is saying what; and a clear accounting, via the “Talk” page accompanying each entry, of every change to the consensus narrative.

An officially sanctioned but broadly sourced version of Wikipedia for health matters could also serve as a robust resource for Facebook, Twitter, YouTube, and other social-media companies to point their users to. Tech platforms are currently expected to counter misinformation by amplifying authoritative sources, but they are also aware that simply linking to the CDC’s and WHO’s official sites is not resonating with many audiences. When internet users show that they trust crowdsourced information more than any one agency’s pronouncements, figuring out how to generate the best crowdsourced information possible becomes a matter of urgency.

As a researcher, I study misinformation, but I’m also concerned about threats to freedom of expression. Although health misinformation can cause significant harm to communities, heavy-handed content moderation—even when intended to limit that harm—exacerbates deep distrust and fears of censorship. Knowledge evolves. New facts should change people’s minds. Sometimes—as with masks—the loudest calls to reconsider the prevailing consensus come from those outside of government.

Murthy’s advisory recognizes this: “It is important to be careful and avoid conflating controversial or unorthodox claims with misinformation,” he writes. “Transparency, humility, and a commitment to open scientific inquiry are critical.” Forthrightly acknowledging that consensus does change and that, at key moments, the government does not yet know all the facts might help rebuild the public’s trust; at the very least, it might minimize the impact of the tedious “Gotcha!” tweets that present two seemingly conflicting headlines as evidence of wholesale expert, media, and government incompetence.

[Renée DiResta: The anti-vaccine influencers who are merely asking questions]

Wikipedia, with its army of 97,000 volunteers contributing to COVID-related pages, has already been forced to confront the lab-leak hypothesis—an emblematic example of how hard it is to fact-check online information when scientific consensus is in flux or has not yet formed. The “Talk” page linked to the Wikipedia entry on the origin of the coronavirus provides visibility into the roiling editing wars. Sock-puppet accounts descended, trying to nudge the coverage of the topic to reflect particular points of view. A separate page was created, dedicated specifically to the “COVID-19 lab leak hypothesis,” but site administrators later deleted it—a decision that remains in dispute within the Wikipedia community. The “Talk” pages for some pandemic-related entries have been labeled with one of the site’s standard warnings: “There have been attempts to recruit editors of specific viewpoints to this article. If you’ve come here in response to such recruitment, please review the relevant Wikipedia policy on recruitment of editors, as well as the neutral point of view policy. Disputes on Wikipedia are resolved by consensus, not by majority vote.”

On June 17, the site’s supreme court, the Arbitration Committee, made the decision to place COVID-19 pages under “discretionary sanctions,” a rubric that involves a higher standard of administrator oversight and greater “friction” in the editing process, and that is in place for other topics, such as abortion, the Arab-Israeli conflict, and Falun Gong. But the point is that Wikipedia has developed a consistent framework to handle these turbulent topics. The site has clearly articulated guidelines to foster the incorporation of the most accurate information and provide visibility into exactly how the current version of any entry came about. These are significant achievements.

Maintaining and expanding the site requires countless hours of volunteer labor. Because laypeople may not be able to evaluate the significance of highly technical scientific findings, a Wikipedia-style communications model for government would require tapping a variety of reputable contributors, including people outside of government, as initial authors or editors, who would then invite others to join the effort, perhaps for a set term. The editorial conversations—the process of mediating consensus—would be viewable by everyone, so allegations of backroom dealing would not be credible.

Ultimately, Wikipedia remains a platform on which consensus develops in full public view. In fact, some other platforms—including YouTube—chose to point to Wikipedia quite prominently beginning in 2018, in efforts to direct people toward reliable information as they watched videos discussing various conspiracy theories, such as one about the 1969 moon landing. Wikipedia is regularly the top link in search results, suggesting that internet users rely on it even though they understand the limitations of a source written—and constantly rewritten—by pseudonymous volunteer authors. During the pandemic, platforms have struggled to decide which posts are misinformation and how to direct users to authoritative sources. A Wikipedia-style deployment of distributed expertise and transparent history is promising regardless of whether we’re talking about how a novel coronavirus spreads or what happened to some ballots in a dumpster or what really transpired in the latest viral protest video.

Although Biden blamed Facebook and other social-media platforms for the spread of misinformation, Murthy’s advisory offers useful advice to everyone in the media ecosystem. Limiting the spread, he declares, “is a moral and civic imperative that will require a whole-of-society effort.” Physicians can use social media themselves, to counter bad information with good. Journalists can avoid publishing clickbait headlines and more carefully evaluate studies that have yet to be peer-reviewed. Tech platforms can redesign algorithms and product features to surface reliable information about health. And individual social-media users can think before they share things online.

The surgeon general’s exhorting ordinary Americans to do their part in stopping viral misinformation is a remarkable acknowledgment that, in the modern information environment, the distribution of power has shifted. The unfortunate irony is that a surgeon general’s advisory may not break through the noise—or may immediately become fodder in a roiling, unending online battle.

Public officials who hope to solve problems in this environment need to be willing to try new tactics—and not just on matters of health. Any message that agencies put before citizens will be richer if shaped by processes that account for the changed relationship between fact and opinion, between expertise and influence, and between the public and its leaders.

Renée DiResta is the technical research manager at the Stanford Internet Observatory.

This article was originally published in The Atlantic.