How Wikipedia survives while the rest of the internet breaks

When armies invade, hurricanes form, or governments fall, a Wikipedia editor will typically update the relevant articles seconds after the news breaks. So quick are editors to change “is” to “was” in cases of notable deaths that they are said to have the fastest past tense in the West. So it was unusual, according to one longtime editor who was watching the page, that on the afternoon of January 20th, 2025, hours after Elon Musk made a gesture resembling a Nazi salute at a rally following President Donald Trump’s inauguration and well into the ensuing public outcry, no one had added the incident to the encyclopedia.

Then, just before 4PM, an editor by the name of PickleG13 added a single sentence to Musk’s 8,600-word biography: “Musk appeared to perform a Nazi salute,” citing an article in The Jerusalem Post. In a note explaining the change, the editor wrote, “This controversy will be debated, but it does appear and is being reported that Musk may have performed a Hitler salute.” Two minutes later, another editor deleted the line for violating Wikipedia’s stricter standards for unflattering information in biographies of living people.

But PickleG13 was correct. That evening, as the controversy over the gesture became a vortex of global attention, another editor called for an official discussion about whether it deserved to be recorded in Wikipedia. At first, the debate on the article’s “talk page,” where editors discuss changes, was much the same as the one playing out across social media and press: it was obviously a Nazi salute vs. it was an awkward wave vs. it couldn’t have been a wave, just look at the touch to his shoulder, the angle of his palm vs. he’s autistic vs. no, he’s antisemitic vs. I don’t see the biased media calling out Obama for doing a Nazi salute in this photo I found on Twitter vs. that’s just a still photo, stop gaslighting people about what they obviously saw. But slowly, through the barbs and rebuttals and corrections, the trajectory shifted.

Wikipedia is the largest compendium of human knowledge ever assembled, with more than 7 million articles in its English version, the largest and most developed of 343 language projects. Started nearly 25 years ago, the site was long mocked as a byword for the unreliability of information on the internet, yet today it is, without exaggeration, the digital world’s factual foundation. It’s what Google puts at the top of search results otherwise awash in ads and spam, what social platforms cite when they deign to correct conspiracy theories, and what AI companies scrape in their ongoing quest to get their models to stop regurgitating info-slurry — and consult with such frequency that they are straining the encyclopedia’s servers. Each day, it’s where approximately 70 million people turn for reliable information on everything from particle physics to rare Scottish sheep to the Erfurt latrine disaster of 1184, a testament both to Wikipedia’s success and to the total degradation of the rest of the internet as an information resource. 

“It’s basically the only place on the internet that doesn’t function as a confirmation bias machine.”

But as impressive as this archive is, it is the byproduct of something that today looks almost equally remarkable: strangers on the internet disagreeing on matters of existential gravity and breathtaking pettiness and, through deliberation and debate, building a common ground of consensus reality.

“One of the things I really love about Wikipedia is it forces you to have measured, emotionless conversations with people you disagree with in the name of trying to construct the accurate narrative,” said DF Lovett, a Minnesota-based writer and marketer who mostly edits articles about local landmarks and favorite authors but later joined the salute debate to argue that “Elon Musk straight-arm gesture controversy” was a needlessly awkward description. “It’s basically the only place on the internet that doesn’t function as a confirmation bias machine,” he said, which is also why he thinks people sometimes get mad at it. Wikipedia is one of the few platforms online where tremendous computing power isn’t being deployed in the service of telling you exactly what you want to hear.

Whether Musk had made a Nazi salute or was merely awkward, the editors decided, was not for them to say, even if they had their opinions. What was a fact, they agreed, was that on January 20th, Musk had “twice extended his right arm toward the crowd in an upward angle,” that many observers compared the gesture to a Nazi salute, and that Musk denied any meaning behind the motion. Consensus was reached. The lines were added back. Approximately 7,000 words of deliberation to settle, for a time, three sentences. This was Wikipedia’s process working as intended.

It was at this point that Musk himself cannonballed into the discourse, tweeting that the encyclopedia was “legacy media propaganda!”

This was not Musk’s first time attacking the site — that appears to have been in 2019, when he complained that it accurately described him as an early investor in Tesla rather than its founder. But recently he has taken to accusing the encyclopedia of a liberal bias, mocking it as “wokepedia,” and calling for it to be defunded. In so doing, he has joined a growing number of powerful people, groups, and governments that have made the site a target. In August, Republicans on the US House Oversight Committee sent a letter to the Wikimedia Foundation requesting information on attempts to “inject bias” into the encyclopedia and data about editors suspected of doing so.

When governments have cowed the press and flooded social platforms with viral propaganda, Wikipedia has become the next target, and a more stubborn one. Because it is edited by thousands of mostly pseudonymous volunteers around the world — and in theory, by anyone who feels like it — its contributors are difficult for any particular state to persecute. Since it’s supported by donations, there is no government funding to cut off or advertisers to boycott. And it is so popular and useful that even highly repressive governments have been hesitant to block it.

Instead, they have developed an array of more sophisticated strategies. In Hong Kong, Russia, India, and elsewhere, government officials and state-aligned media have accused the site of ideological bias while online vigilantes harass editors. In several cases, editors have been sued, arrested, or threatened with violence.

When several dozen editors gathered in San Francisco this February, many were concerned that the US could be next. The US, with its strong protections for online speech, has historically been a refuge when the encyclopedia has faced attacks elsewhere in the world. It is where the Wikimedia Foundation, the nonprofit that supports the project, is based. But the site has become a popular target for conservative media and influencers, some of whom now have positions in the Trump administration. In January, the Forward published slides from the Heritage Foundation, the think tank responsible for Project 2025, outlining a plan to reveal the identities of editors deemed antisemitic for adding information critical of Israel, a cudgel that the administration has wielded against academia. 

“It’s about creating doubt, confusion, attacking sources of trust,” an editor told the assembled group. “It came for the media and now it’s coming for Wikipedia and we need to be ready.”


In 1967, Hannah Arendt published an essay in The New Yorker about what she saw as an inherent conflict between politics and facts. As varieties of truth go, she wrote, facts are fragile. Unlike axioms and mathematical proofs that can be derived by anyone at any time, there is nothing necessary about the fact, to use Arendt’s example, that German troops crossed the border with Belgium on the night of August 4th, 1914, and not some other border at some other time. Like all facts, this one is established through witnesses, testimony, documents, and collective agreement about what counts as evidence — it is political, and as the propaganda machines of the 20th century showed, political power is perfectly capable of destroying it. Furthermore, those in power will always be tempted to, because facts represent a sort of rival power, a constraint and limit “hated by tyrants who rightly fear the competition of a coercive force they cannot monopolize,” and at risk in democracies, where they are suspiciously impervious to public opinion. Facts, in other words, don’t care about your feelings. “Unwelcome facts possess an infuriating stubbornness,” Arendt wrote.

This infuriating stubbornness turns out to be important, though. A lie might be more plausible or useful than a fact, but it lacks a fact’s dumb arbitrary quality of being the case for no particular reason and no matter your opinion or influence. History once rewritten can be rewritten again and becomes insubstantial. Rather than believe the lie, people stop believing anything at all, and even those in power lose their bearings. This gives facts “great resiliency” that is “oddly combined” with their fragility. Having a stubborn common ground of shared reality turns out to be a basic precondition of collective human life — of politics. Even political power seems to recognize this, Arendt wrote, when it establishes ideally impartial institutions insulated from its own influence, like the judiciary, the press, and academia, charged with producing facts according to methods other than the pure exercise of power.

Wikipedia has come to play a similar role of factual ballast to an increasingly unmoored internet, but without the same institutional authority and with its own methods developed piecemeal over the last two decades for arriving at consensus fact. How to defend it from political attacks is not straightforward. At the conference, many editors felt both that attacks from the Trump administration were a genuine threat and that being cast as “the resistance” risked jeopardizing the encyclopedia’s position of trusted neutrality.

“I would really argue not to take the attack approach, to really take the passive approach,” said one editor when someone broached the idea of actively debunking some of the false information swamping the rest of the internet. “People see us as credible because we don’t attack, because we are just providing information to everyone all the time in a boring way. Sometimes boring is good. Boring is credible.”

Even the editor at the summit who had been most directly affected by the Trump administration urged against a direct response. Jamie Flood had been a librarian and outreach specialist at the National Agricultural Library, where among other duties she led group trainings and uploaded research on topics like germplasm and childhood nutrition to Wikipedia. Museums and libraries around the world employ such “Wikipedians in residence” to act as liaisons with the encyclopedia’s community for the same reason that the World Health Organization partnered with Wikipedia during the covid-19 pandemic to make the latest information available: if you want research to reach the public, there is no better place.

Along with several other Wikipedians employed by the federal government, Flood had just been laid off by DOGE, collateral damage in a general dismantling of research and archival institutions. “I’m a casualty of this administration’s war on information,” Flood said.

“‘Imagine a world where all knowledge is freely available to everyone.’”

Still, Wikipedia absolutely should not counterattack, Flood said. “Wikipedia is always in the background. They’re not making a big statement, and I don’t think they should. I’ve been training people for a long time and I still go back to this early quote of Jimmy Wales, one of the founders: ‘Imagine a world where all knowledge is freely available to everyone.’ That’s enough. That’s a statement in and of itself. In a time of misinformation, in a time of suppression, having this place where people can come and bring knowledge and share knowledge, that is a statement.”

Wikipedia should be, in other words, as stubborn as a fact. But then, facts are fragile things. 


A common refrain among Wikipedians is that the site works in practice but not in theory. It seems to flout everything we’ve learned about human behavior online: anonymous strangers discussing divisive topics and somehow, instead of dissolving into factions and acrimony, working together to build something of value.

The project’s origins go back to 1999. Wales, a former options trader who had founded a laddish web portal called Bomis, wanted to start a free online encyclopedia. He hired an acquaintance from an Ayn Rand listserv that Wales previously ran, a philosophy PhD student named Larry Sanger. Their first attempt, called Nupedia, was not so different from encyclopedias as they have existed since Diderot’s Encyclopédie in 1751. Experts would write articles that went through seven stages of editorial review. It was slow going. After a year, Nupedia had just over 20 articles.

In an attempt to speed things along, they decided to experiment with wikis, a web format gaining popularity among open-source software developers that allowed multiple people to collaboratively edit a project. (Wiki is the Hawaiian word for “quick.”) The wiki was intended to be a forum where the general public could contribute draft articles that would then be fed into Nupedia’s peer-review pipeline, but the experts objected and the crowdsourced site was given its own domain, Wikipedia.com. It went live on January 15th, 2001. Within days, it had more articles than all of Nupedia, albeit of varying quality. After a year, Wikipedia had more than 20,000 articles.

“…write about what people believe, rather than what is so”

There were few rules at first, but one that Wales said was “non-negotiable” was that Wikipedia should be written from a “neutral point of view.” The policy, abbreviated as NPOV, was imported from the “nonbias policy” Sanger had written for Nupedia. But on Wikipedia, Wales considered it as much a “social concept of cooperation” as an editorial standard. If this site was going to be open to anyone to edit, the only way to avoid endless flame wars over who is right was, provocatively speaking, to set questions of truth aside. “We could talk about that and get nowhere,” Wales wrote to the Wikipedia email list. “Perhaps the easiest way to make your writing more encyclopedic is to write about what people believe, rather than what is so,” he explained.

Ideally, the neutrality principle would allow people of different views to agree, if not on the matter at hand, then at least on what it was they were disagreeing about. “If you’ve got a kind and thoughtful Catholic priest and a kind and thoughtful Planned Parenthood activist, they’re never going to agree about abortion, but they can probably work together on an article,” Wales would later say.

This view faced an immediate challenge, which is that people believe all sorts of things: that the Earth is 6,000 years old, that climate change is a scam, that the Holocaust was a hoax, that the Irish potato famine was overblown, that chiropractors are all charlatans, that they have discovered a new geometry, and that Mother Teresa was a jerk.

In response, the early volunteers added another rule. You can’t just say things; any factual claim needs a citation that readers can check for themselves. When people started emailing Wales their proofs that Einstein was wrong about relativity, he clarified that the cited source could not be your own “original research.” Sorry, Wales wrote to an Einstein debunker, it doesn’t matter whether your theory is true. When it is published in a physics journal, you can cite that.

Instead of trying to ascertain the truth, editors assessed the credibility of sources, looking to signals like whether a publication had a fact-checking department, got cited by other reputable sources, and issued corrections when it got things wrong. 

At their best, these ground rules ensured debates followed a productive dialectic. An editor might write that human-caused climate change was a fact; another might change the line to say there was ongoing debate; a third editor would add the line back, backed up by surveys of climate scientists, and demand peer-reviewed studies supporting alternate theories. The outcome was a more accurate description of the state of knowledge than many journalists were promoting at the time by giving “both sides” equal weight, and also a lot of work to arrive at. A 2019 study published in Nature found that Wikipedia’s most polarizing articles — eugenics, global warming, Leonardo DiCaprio — are the highest quality, because each side keeps adding citations in support of their views. Wikipedia: a machine for turning conflict into bibliographies. 

These ground rules, coupled with some technical features of wikis, like the ability for anyone to edit anyone else’s writing, and some early administrative rules, like a prohibition on undoing someone else’s edit more than three times per day, practically forced users to talk through disagreements and arrive at “consensus.” This became Wikipedia’s governing principle.

This may make the process sound more peaceful than it is. Disputes were constant. Early on, Sanger, who had remained partial to a more hierarchical, expert-driven model, clashed repeatedly with editors he decried as “anarchists” and demanded greater authority for himself, which the editors rejected. When revenue from Bomis dried up after the dot-com crash, Wales laid Sanger off and took over management of the project.

Wales governed from a greater remove, appearing only occasionally to broker peace between warring editors, resolve an impasse, or reassure people that they didn’t need to spend time devising procedures to screen out a sudden influx of neo-Nazis who were planning to overwhelm discussion, because if they showed up, “I will personally ban them all if necessary, and that’s that.” Editors sometimes ironically referred to him as their “God King” or “benevolent dictator,” but he described his role as a sort of constitutional monarch safeguarding the community as it developed the processes to fully govern itself. Because Wikipedia’s content was freely licensed, anyone who didn’t like the way the project was run could copy it and start their own, as a group of Spanish users did when the possibility of running ads was raised in 2002. The next year, Wales established a nonprofit, the Wikimedia Foundation, to raise funds and handle the technical and legal work required to keep the project running. The encyclopedia itself, however, would be entirely edited and managed by volunteers.

In early 2004, Wales delegated his moderating powers to a group of elected editors, called the Arbitration Committee. From that point onward, he was essentially another editor, screenname Jimbo Wales, liable to have his edits undone like anyone else. He attempted several times to update his own birthdate to reflect the fact that his mother says he was born slightly before midnight on August 7th, 1966, not on August 8th, as his birth certificate read, only to be reprimanded for editing his own page and trying to cite his own “original research.” (After several years of debates and citable coverage from reliable sources, August 7th eventually won, with a note explaining the discrepancy.)

AGF

Over the ensuing two decades, editors amended policies to cope with conspiracy theorists, revisionist historians, militant fandoms, and other perennial goblins of the open web. There are the three core content policies of Neutral Point of View, Verifiability, and No Original Research; the five pillars of Wikipedia; and a host of rules around editor conduct, like the injunction to avoid ad hominem attacks and assume good faith of others, defined and refined in interlinked articles and essays. There are specialized forums and noticeboards where editors can turn for help making an article more neutral, figuring out whether a source is reliable, or deciding whether a certain view is fringe or mainstream. By 2005, the pages where editors stipulated policy and debated articles were found to be growing faster than the articles themselves. Today, this administrative backend is at least five times the size of the encyclopedia it supports.

The most important thing to know about this system is that, like the neutrality principle from which it arose, it largely ignores content to focus on process. If editors disagree about, for example, whether the article for the uninhabited islands claimed by both Japan and China should be titled “Senkaku Islands,” “Diaoyu Islands,” or “Pinnacle Islands,” they first try to reach an agreement on the article’s Talk page, not by arguing who is correct, but by arguing which side’s position better accords with specific Wikipedia policies. If they can’t agree, they can summon an uninvolved editor to weigh in, or file a “request for comment” and open the issue to wider debate for 30 days.

If this fails and editors begin to quarrel, they might get called before the Arbitration Committee, but this elected panel of editors will also not decide who is right. Instead, they will examine the reams of material generated by the debate and rule only on who has violated Wikipedia process. They might ban an editor for 30 days for conspiring off-Wiki to sway debate, or forbid another editor from working on articles about Pacific islands over repeated ad hominem attacks, or in extreme cases ban someone for life. Everyone else can go back to debating, following the process this time.

As a result, explosive political controversies and ethnic conflicts are reduced to questions of formatting consistency. But because process decides all, process itself can be a source of intense strife. The topics of “gun control” and “the Balkans” are officially designated as “contentious” due to recurring edit wars, where people keep reverting each other’s edits without attempting to build consensus; so, too, are the Wikipedia manual of style and the question of what information belongs in sidebars. In one infamous battle, debate over whether to capitalize “into” in the film title Star Trek Into Darkness raged for more than 40,000 words.

Because disputes on Wikipedia are won or lost based on who has better followed Wikipedia process, every dispute becomes an opportunity to reiterate the project’s rules and principles

In 2009, law professors David A. Hoffman and Salil K. Mehra published a paper analyzing conflicts like these on Wikipedia and noted something unusual. Wikipedia’s dispute resolution system does not actually resolve disputes. In fact, it seems to facilitate them continuing forever.

These disputes may be crucial to Wikipedia’s success, the researchers wrote. Online communities are in perpetual danger of dissolving into anarchy. But because disputes on Wikipedia are won or lost based on who has better followed Wikipedia process, every dispute becomes an opportunity to reiterate the project’s rules and principles.

Trolls who repeatedly refuse to follow the process eventually get banned, but initial infractions are often met with explanations of how Wikipedia works. Several of the editors I spoke with began as vandals only to be won over by someone explaining to them how they could contribute productively. Editors will often restrict who can work on controversial topics to people who have logged a certain number of edits, ensuring that only those bought into the ethos of the project can participate.

In 2016, researchers published a study of 10 years of Wikipedia edits about US politics. They found that articles became more neutral over time — and so, too, did the editors themselves. When editors arrived, they often proposed extreme edits, received pushback, and either left the project or made increasingly moderate contributions.

This is obviously not the reigning dynamic of the rest of the internet. The social platforms where culture and politics increasingly play out are governed by algorithms that have the opposite effect of Wikipedia’s bureaucracy in nearly every respect. Optimized to capture attention, they boost the novel, extreme, and sensational rather than subjecting them to increased scrutiny, and by sending content to users most likely to engage with it, they sort people into clusters of mutual agreement. This phenomenon has many names. Filter bubbles, epistemological fragmentation, bespoke realities, the sense that everyone has lost their minds. On Wikipedia, it’s called a “point of view split,” and editors banned it early. You are simply not allowed to make a new article on the same topic. Instead, you must make the case for a given perspective’s place amid all the others while staying, literally, on the same page.


In February, the conservative organization Media Research Center released a report claiming that “Wikipedia Effectively Blacklists ALL Right-Leaning Media.” It was essentially a summary of a publicly available policy page on Wikipedia that lists discussions about the reliability of sources and color codes them according to the latest consensus — green for generally reliable, yellow for lack of clear consensus, and red for generally unreliable. ProPublica is green because it has an “excellent reputation for fact-checking and accuracy, is widely cited by reliable sources, and has received multiple Pulitzer Prizes.” Newsweek is yellow after a decline in editorial standards following its 2013 acquisition and recent use of AI to write articles. Newsmax, the One America News Network, and several other popular right-leaning sources are red due to repeatedly publishing stories that were proven wrong. (As are some left-leaning sources, like Occupy Democrats.) The New York Post (generally unreliable, but marginally reliable on entertainment) used the report as the basis for an editorial titled “Big Tech must block Wikipedia until it stops censoring and pushing disinformation.”

The page is called Reliable sources/Perennial sources, as in sources that are perennially discussed. Editors made the page in 2018 as a repository for past discussions that they could refer to instead of having to repeatedly debate the reliability of the Daily Mail — the first publication to be deprecated, the year before — every time someone tried to cite it. It is not a list of preapproved or banned sources, the page reads. Context matters, and consensus can change.

But to Wikipedia’s critics, the page has become a symbol of the encyclopedia’s biases. Sanger, the briefly tenured cofounder, has found a receptive audience in right-wing activist Christopher Rufo and other conservatives by claiming Wikipedia has strayed from its neutrality principle by making judgments about the reliability of sources. Instead, he argues, it should present all views equally, including things “many Republicans believe,” like the existence of widespread fraud in the 2020 election and the FBI playing a role in the January 6th Capitol attack.

Last spring, the reliable source page collided with one of the most intense political flashpoints on Wikipedia, the Israel-Palestine conflict. In April, an editor asked whether it was time to reevaluate the reliability of the Anti-Defamation League in light of changes to the way it categorizes antisemitic incidents to include protests of Israel, among other recent controversies. About 120 editors debated the topic for two months, producing text equal to 1.9 The Old Man and the Seas, or “tomats,” a standard unit of Wikipedia discourse. The consensus was that the ADL was reliable on antisemitism generally but not when the Israel-Palestine conflict was involved.

Unusually for a Wikipedia administrative process, the decision received enormous attention. The Times of Israel called it a “staggering blow” for the ADL, which mustered Jewish groups to petition the foundation to overrule the editors. The foundation responded with a fairly technical explanation of how Wikipedia’s self-governing reliability determinations work.

In the year since, conservative and pro-Israel organizations have published a series of reports examining the edit histories of articles to make a case that Wikipedia is biased against Israel. In March, the ADL itself issued one such report, called “Editing for Hate,” claiming that a group of 30 “malicious editors” slanted articles to be critical of Israel and favorable to Palestine. As evidence, the report highlights examples like the removal of the phrase “Palestinian terrorism” from the introduction of the article on Palestinian political violence.

Yet the edit histories show that these examples are often plucked from long editing exchanges, the outcome of which goes unmentioned. The “terrorism” line that the ADL cited was indeed removed — it had also only just been added, was added back shortly after being cut, then was removed again, added back, and revised repeatedly before editors brokered a compromise on the talk page.

Breitbart, Pirate Wires, and other right-leaning publications now regularly mine Wikipedia’s lengthy debates for headlines like “How Wikipedia Launders Regime Propaganda,” accusing the site of being a mouthpiece for the Democratic Party, or “Cover Up: Wikipedia Editors Propose Deleting Page on Iran Advocating for Israel’s Destruction,” despite the article having just been created and the outcome being to merge the contents into the article on Iran-Israel relations. These reports are a dependable source of viral outrage on X. The strategy also appears effective at convincing lawmakers. In May, Rep. Debbie Wasserman Schultz (D-FL) and 22 other members wrote to the Wikimedia Foundation citing the ADL report and demanding Wikimedia “rein in antisemitism, uphold neutrality.”

The August letter from House Republicans requesting information on attempts to influence the encyclopedia, data on editors who had been disciplined by ArbCom, and other records also cited the ADL report.

While some search for bias in the minutiae of edit histories, others try to encompass all of Wikipedia. Last year, a researcher at the conservative Manhattan Institute scraped Wikipedia for mentions of political terms and public officials and used a GPT language model to analyze them for bias. The report found “a mild to moderate” tendency to associate figures on the political right with more negative sentiment than those on the left. The study, which was not peer reviewed, has become a regular fixture in claims of liberal bias on Wikipedia.

Still, the report illustrates the challenges of evaluating the neutrality of a text as vast and stripped of subjective opinion as Wikipedia. An examination of the datasets shows that the passages GPT classified as non-neutral are often anodyne factual statements: that a lawmaker won or lost an election, represented a certain district, or died. It also conflated unrelated people of the same name, so, for example, most of the non-neutral statements about Mike Johnson concern not Mike Johnson the current Republican Speaker of the House but a robber in a 1923 silent film, a prog-rock guitarist, multiple football players, and a famous yodeler.

But the more fundamental question is whether balanced sentiment, or balanced anything, across the contemporary political spectrum is the correct expectation for a project that operates by a different standard, one based on measures of reliability. Supposing the sentiment readings do reflect a real imbalance, is that due to the biases of editors, biases in their sources, or some other external imbalance, like a tendency by right-leaning politicians to express negative sentiments of fear or anger (a possibility the report raises, then dismisses)?

Wikipedia has a long history of attempting to disentangle and correct its various biases. The site’s editor community has been overwhelmingly white, male, and based in the United States and Europe since the site began. In 2018, 90 percent of editors were men, and only 18 percent of biographies in the encyclopedia were of women. That year, the Canadian physicist Donna Strickland won a Nobel Prize, and people turning to Wikipedia to learn about her discovered she lacked an article.

Women have been historically excluded from the sciences, underrepresented in coverage of the sciences, and therefore underrepresented in the sources Wikipedia editors can cite

But the causal connection between these facts was not straightforward. Women have been historically excluded from the sciences, underrepresented in coverage of the sciences, and therefore underrepresented in the sources Wikipedia editors can cite. An editor had tried to make an article on Strickland several months before the Nobel but was overruled due to a lack of coverage in reliable sources. “Wikipedia is a mirror of the world’s biases, not the source of them. We can’t write articles about what you don’t cover,” tweeted then-executive director Katherine Maher.

Wikipedia’s sourcing guidelines are conservative in their deference to traditional institutions of knowledge production, like established newsrooms and academic peer review, and this means that it is sometimes late to ideas in the process of moving from fringe to mainstream. The possibility that covid-19 emerged from a lab was relegated to a section on conspiracy theories and is only now, after reporting by reliable sources, gaining a toehold on the covid pandemic article. Similarly, as awareness grew of the ways Western academic and journalistic institutions have excluded the perspectives of colonized people, critics argued that Wikipedia’s reliance on these same institutions made it impossible for the encyclopedia to be truly comprehensive.

Not all the bias comes from the project’s sources, though. A study that attempted to control for offline inequalities by examining only contemporary sociologists of similar achievement found that male academics were still more likely to have articles. As volunteers, editors work on topics they think are important, and the encyclopedia’s emphases and omissions reflect their demographics. Minor skirmishes in World War II and every episode of The Simpsons have their own articles, some of which are longer than the articles on the Ethiopian civil war or climate change in the Maldives. In an effort to fill in these gaps, the foundation has for several years funded editor recruitment and training initiatives under the banner of “knowledge equity.”

“Most editors on Wikipedia are English-speaking men, and our coverage is of things that are of interest to English-speaking men,” said a retired market analyst in Cincinnati who has been editing for over 20 years. “Our sports coverage is second to none. Video games, we got it covered. Wars, the history of warfare, my god. Trains, radio stations… But our coverage of foods from other countries is very low, and there is an absolute systemic bias against coverage of women and people of color.” For her part, she tries to fill gaps around food, creating new articles whenever she encounters a Peruvian chili sauce or African fufu that lacks one.

Yet these initiatives have come under attack as “DEI” by conservative influencers and Musk, who called for Wikipedia to be defunded until “they restore balance.”

If you think something is wrong on Wikipedia, you can fix it yourself

These accusations of bias, familiar from attacks on the media and social platforms, encounter some unique challenges when leveled against Wikipedia. Crucially, if you think something is wrong on Wikipedia, you can fix it yourself, though it will require making a case based on verifiability rather than ideological “balance.”

Over the years, Wikipedia has developed an immune response to outside grievances. When people on X start complaining about Wikipedia’s suppression of UFO sightings or refusal to change the name of the Gulf of Mexico to Gulf of America, an editor often restricts the page to people who are logged in and puts up a notice directing newcomers to read the latest debate. If anything important was missed, they are welcome to suggest it, the notice reads, provided their suggestion meets Wikipedia’s rules, which can be read about on the following pages. That is, Wikipedia’s first and best line of defense is to explain how Wikipedia works. 

Occasionally, people stick around and learn to edit. More often, they get bored and leave.


It was not unusual for skirmishes to break out over the Wikipedia page for Asian News International, or ANI. It is the largest newswire service in India, and as its Wikipedia article explains, it has a history of promoting false anti-Muslim and pro-government propaganda. It was these facts that various anonymous editors — not logged into Wikipedia accounts, so appearing only as IP addresses — attempted to remove last spring. 

As typically happens, an experienced editor quickly reinstated the deleted sentences, noting that they had been removed without explanation. Then came another drive-by edit: actually, ANI is not propaganda and very credible, someone wrote, citing a YouTube video. Reverted: YouTube commentary is not a reliable source. Then another IP address, deleting a sentence about ANI promoting a false viral story about necrophilia in Pakistan. Reverted again. Another IP address, deleting the mention of propaganda with the explanation that the sources were “leftist dogs and swine.”

As the edit battle escalated, an editor locked the page so that only people who were logged in and had made a certain number of edits could make changes, ending the barrage of IP addresses.

Two months later, ANI sued. 

The lawsuit revealed that several of the IP addresses had belonged to representatives of ANI attempting to remove unflattering information about the company. Blocked from doing so, ANI sued for defamation under a recent amendment to India’s equivalent of Section 230 that places stricter requirements on platforms to moderate content. When the Wikimedia Foundation declined to reveal the identities of three editors who had defended the page, the presiding judge said he would ask the government to block the site, threatening to cut off the country with the highest number of English Wikipedia readers after the US and the UK. “If you don’t like India,” the judge said, “please don’t work in India.”

During the appeal, Wikimedia’s lawyer argued that disclosing the identities of editors would destroy the encyclopedia’s self-regulating system and expose contributors to reprisals. Also, he noted, the sentences in question, like every assertion on Wikipedia, were only summarizing other sources, and those sources — the publications The Caravan and The Ken — had not been sued for defamation. (As with editors, the foundation’s first response to external threats is often to explain how Wikipedia works.) The judge dismissed the argument, saying that journalism might be “read by a hundred people, you don’t bother about it… it does not have the gravitas.” Wikipedia, however, is read by millions.

By this point, the case had garnered enough coverage to warrant its own Wikipedia page. This seemed to enrage the judge, particularly the line noting that his demand to reveal the identities of editors had been described as “censorship and a threat to the flow of information.” This “borders on contempt,” the judge said, demanding that the foundation take the page down within 36 hours. In a rare move, the foundation complied.

The case alarmed editors around the world. An open letter calling on the Wikimedia Foundation to protect the anonymity of the editors garnered more than 1,300 signatures, the most of any letter directed at the foundation. Nevertheless, last December, the foundation disclosed the editors’ identities to the judge under seal. Responding to outrage on Wikipedia’s editor forum, Wales asked for calm and urged people not to jump to conclusions.

The Wikimedia Foundation has historically taken a hard line against attempts to influence the project. In 2017, when the Turkish government demanded several articles be deleted, Wikipedia refused and was blocked for nearly three years as it fought the ban all the way to the country’s Constitutional Court and won. For the second half of 2024, the most recent data available, the foundation complied with about 8 percent of requests for user data, compared to Google’s 82 percent and Meta’s 77 percent. And the data provided was sparse, because Wikipedia retains almost none.

Instead of brute censorship, what has emerged is a sort of gray-zone information warfare

But attempts to influence the site have grown more sophisticated. The change is likely due to multiple factors: a global rise of political movements that wish to control independent media, the increased centrality of Wikipedia, and a technical change to the website itself. In 2015, Wikipedia switched to encrypted HTTPS connections by default, making it impossible to see which pages users visited, only that they were visiting the Wikipedia domain. This meant that governments that had previously been censoring specific articles on opposition figures or historic protests had to choose between blocking all of Wikipedia or none of it. Almost every country save China (and Russia, for several hours) chose not to block it. This was a victory for open knowledge, but it also meant governments had a greater interest in controlling what was written in the encyclopedia.

Instead of brute censorship, what has emerged is a sort of gray-zone information warfare. After mainland China imposed a national security law on Hong Kong to quash the protests of 2019, a battle began over how the protests would be remembered. Editors in mainland China — who can edit using VPNs — argued for the inclusion of state-friendly media that described the protests as “riots” or “terrorist attacks” while removing citations to independent media for unreliability and bias. In one case, an editor attempted to strip all citations to one of Hong Kong’s premier papers, Apple Daily, hours before it was shut down by the government. By conspiring offline and using fake accounts, they won elections to admin positions and with them the power to see other editors’ IP addresses, which they discussed using to reveal their opponents’ identities to the police. Shortly afterward, the Wikimedia Foundation banned or restricted more than a dozen editors operating from mainland China, saying that the project had been “infiltrated” and that “some users have been physically harmed as a result.”

Russia employed similar tactics after its invasion of Ukraine in 2022. State media and government officials attacked Wikipedia in the press with accusations of anti-Russian bias, promulgation of fake news, and foreign manipulation. The site remained accessible, but Russian search engines put a banner above it saying it was in violation of the law. Meanwhile, the government harassed the foundation with a series of fines for publishing “false” information about the military, which the foundation has refused to pay. Finally, on the encyclopedia, state-aligned editors pushed the government’s view while vigilantes doxxed and threatened their opposition. Last year, the head of Wikimedia Russia was declared a “foreign agent” and forced to resign from his job as a professor at Moscow State University.

In neighboring Belarus, editor Mark Bernstein was doxxed by a pro-Russian group in 2022, arrested, and sentenced to three years of home confinement. As many as five other editors have been detained by Belarusian authorities in recent months, according to media reports and editors.

As these battles continued, the Russian government supported the creation of a more compliant alternative, called Ruwiki, which launched early last year with 1.9 million articles copied from the original and edited to reflect the government’s view. On Ruwiki, edits must comply with Russian laws and are subject to approval from outside experts. There, the map of Ukraine does not include Donetsk or Kherson, the war is a “special operation” in response to NATO aggression, and accounts of torture in Bucha are fake news.

Wikipedia remains online in Russia, but with Ruwiki, the government may now feel emboldened to block it. In May, at a hearing on media safety for children, the head of the Russian Duma Committee on the Protection of the Family said that the encyclopedia’s “interpretation of our historical events feels so hostile that we need to raise the issue of blocking this information resource,” and that the encyclopedia’s depiction of history is opposed to Russian “traditional, spiritual values.”

The goal of these campaigns is what the Wikimedia Foundation calls “project capture.” The term originates in an independent report the foundation commissioned in response to the takeover of the Croatian-language Wikipedia by a cabal of far-right editors.

In 2010, a group of editors won election to admin positions and began citing far-right alternative media to rewrite history. On Croatian Wikipedia, the Nazis invaded Poland to stop a genocide against the German people, Croatia’s role in the Holocaust is foreign propaganda, and Ratko Mladić was a decorated military leader whose conviction by the UN for genocide (briefly noted quite far down) was the result of an international conspiracy. When other editors attempted to correct the articles, the admins banned them for violating rules against hate speech or harassment.

The encyclopedia became so warped that it began receiving press coverage. The Croatian Minister of Education warned students not to use it. In an interview with a Croatian paper, Wales confirmed the foundation was aware of the problem and looking into it. Yet the foundation has a policy of allowing Wikipedia projects to self-govern, and interfering with Croatian Wikipedia risked opening a door to the many governments and companies that want things on Wikipedia changed.

Editors mounted a resistance and attempted to vote the admins out, but the admins defeated the attempt using votes from what were later revealed to be dozens of fake accounts. Because the admins were the only ones with the technical ability to trace IP addresses, however, the opposition had no way to prove this. The cabal now controlled all the levers of power. By 2019, nearly all of the editors who opposed them had been banned or harassed off the project.

In 2020, one of the few remaining dissident editors compiled a comprehensive textual and statistical analysis of editing patterns of dozens of accounts and filed a request for an admin to run IP traces to see if they were sock puppets. The admin stalled, then attempted to fudge the traces, but did so in such a transparent way that it was clear the accounts were indeed fakes.

This was the evidence required to procedurally break the cabal. High-ranking admins called “stewards” from other-language Wikipedias administered a new vote on banning the Croatian admins. This time, the admins lost. Their ringleader, username Kubura, was banned from all Wikipedia projects forever, a punishment that had been leveled against fewer than a dozen others in Wikipedia history. A local daily covered the incident with the headline “Kubura’s Downfall: Banned Globally, His Followers Retreat, Leaderless.”

Wikipedia’s processes are only effective if they are administered by people who believe in the spirit of the project

The foundation’s postmortem analysis compared the takeover to “state capture, one of the most pressing issues of today’s worldwide democratic backsliding.” The clique still cited the reliability of sources and invoked rules of debate, but it bent these processes to serve its nationalist purpose. As many governments have discovered, it is extremely difficult to insert propaganda into Wikipedia without running afoul of some rule or another. But what the Croatia capture showed is that Wikipedia’s processes are only effective if they are administered by people who believe in the spirit of the project. If they can be silenced or replaced, it becomes possible to steer the encyclopedia in a different direction.

One editor I spoke with, who asked to remain anonymous for reasons that will be obvious, had been editing Wikipedia for several years while living in a Middle Eastern country where much other media is tightly controlled. One day he received a call from a member of the intelligence service inviting him to lunch. He cried for hours — everyone knew what this meant. 

The meeting was cordial but clear. They didn’t want him to stop editing Wikipedia. They wanted his help. They knew the encyclopedia has rules and you can’t just insert flagrant propaganda, but as a respected member of the community, maybe he could edit in ways that were a little friendlier to the government, maybe decide in its favor when certain topics came up for debate. In exchange, maybe the service could help him if he ever got in trouble with the police, for example, over his sexuality; he was gay in a country where that was illegal. 

He fled the country weeks later. He now edits from abroad, but he knows of five to 10 others who have faced arrest or intimidation over their editing. They must do constant battle with editors he believes to be government agents who push the state’s perspective, debating tirelessly for hours because it is literally their job. 

It’s a rare person who is able to uproot their life in the service of a volunteer side project. Understandably, many others faced with such threats become more cautious in their editing or stop altogether. Multiple editors based in India said that they now avoid editing topics related to their country. The ANI case had a chilling effect, as have recurring harassment campaigns. The far-right online publication OpIndia regularly accuses Wikipedia of “anti-Hindu and anti-India bias,” in ways that parallel attacks from the US right, down to citations of Manhattan Institute research and quotes from the disgruntled cofounder, Sanger. The organization has published the real names and employers of editors it accuses of being “leftists” or “Islamists,” leading at least one veteran editor to delete their account.

Even ancient history can be cause for reprisals. In February, after the release of a Bollywood action film about Chhatrapati Sambhaji Maharaj, a 17th-century king who fought the Mughals, accounts on X began whipping up outrage over several facts on Sambhaji’s Wikipedia page that they deemed to be anti-Hindu. When editors reversed attempts to delete the offending lines, another X user posted their usernames and called on government officials to investigate them. Days later, local press reported that the Maharashtra cyber police opened cases against at least four editors.

“If you issue cases and file complaints against editors, they tend not to edit those pages anymore”

“Various editors have left Wikipedia over this persecution, fearing their own safety,” said an Indian Wikipedia editor who asked to remain anonymous out of fear of retaliation. “I believe this is completely useful for the right wing, if you issue cases and file complaints against editors, they tend not to edit those pages anymore, fearing for their safety in real life.”

He still edits, but mostly sticks to the safer ground of the Roman Empire.


In April, the Trump administration’s interim US attorney for DC, Edward Martin Jr., sent a letter to the Wikimedia Foundation accusing the organization of disseminating “propaganda” and intimating that it had violated its duties as a tax-exempt nonprofit.

From a legal perspective, it was an odd document. The tax status of nonprofits is not generally the jurisdiction of the US attorney for DC, and many of the supposed violations, like having foreign nationals on its board or permitting “the rewriting of key, historical events and biographical information of current and previous American leaders,” are not against the law. Sanger is quoted, criticizing editor anonymity. In several cases, the rules Martin accuses Wikipedia of violating are Wikipedia’s own, like a commitment to neutrality. But the implied threat was clear.

“We’ve been anticipating something like this letter happening for some time,” a longtime editor, Lane Rasberry, said. It fits the pattern seen in India and elsewhere. He has been hearing more reports of threats against editors who work on pages related to trans issues and has been conducting security trainings to prevent their identities being revealed. Several US-based editors told me they now avoid politically contentious topics out of fear that they could be doxxed and face professional or legal retaliation. “There are more Wikipedia editors getting threats, more people getting scared,” Rasberry said.

Talking to editors, I encountered a confounding spread of opinions about the seriousness of the threat to Wikipedia, often in the same conversation. The site has sloughed off more than two decades of attacks, and so far the latest round is no different. The Heritage Foundation plan to dox editors has yet to materialize. Musk’s calls for his followers to stop donating have resulted in surges in donations, according to publicly available data.

In India, the High Court struck down the order to take down the article about ANI’s defamation case, though the case itself is ongoing. Wikipedia’s critics on the right and in the Silicon Valley elite often propose generative AI as the solution to Wikipedia’s perceived biases, for each user a bespoke source of ideologically agreeable information. Yet all these projects remain wholly reliant on Wikipedia, and so far the most aggressive such initiative, Musk’s Grok, has spent much of its existence flailing between fact-checking Musk’s own conspiracy theories and proclaiming itself MechaHitler.

But new threats continue to appear. In August, the foundation lost its case arguing for an exemption from the UK’s Online Safety Act, which would force Wikipedia to verify the identities of its editors, though it is continuing to appeal. In Portugal, the foundation received a court order arising from a defamation case brought by Portuguese American businessman Cesar DePaço, who objected to information on his page about past criminal allegations and links to the far-right Portuguese party Chega. Complying with the ruling, the foundation struck several facts from his biography and disclosed “a small amount of user data” about eight editors. The foundation is now bringing the case before the European Court of Human Rights. And in the US, there is the recent House Oversight letter.

No matter the outcome, these cases contribute to a general increase in pressure on the project’s already strained editors. English Wikipedia has fewer than 40,000 active editors, defined as users who have made five or more edits in the last month. The number of active administrators, crucial to maintaining the site and enforcing policy, peaked in 2008 and now stands at around 450. AI threatens to squeeze the editor pipeline further. The more people get their information from AI summaries of Wikipedia rather than from the site itself, the fewer will wander down a rabbit hole, encounter an error that needs correcting, and become editors themselves.

“Wikipedia should not be taken for granted.”

At the same time, people are using AI to add plausible-looking but false or biased information to the encyclopedia, increasing the workload for editors. Harassment, ideological editing campaigns, government investigations, targeted lawsuits — even if they lead nowhere, they will make the prospect of editing more daunting and increase the odds that current editors burn out. “Wikipedia should not be taken for granted,” Rasberry said. “This is an existential threat.” 

The first reactions to the Martin letter on the Wikipedia editor forums were radical: the foundation should leave the US, maybe for France, or Iceland, or Germany. This would not be unprecedented, an editor pointed out. The Encyclopédistes fled to Switzerland when the ancien régime attempted to censor them. Maybe the site should go dark in protest. 

But moderation soon prevailed. “The community needs to chill on the blackout talk,” wrote an editor by the name of Tazerdadog. “We’re not there yet.” Right now, the best response to these threats is to double down on Wikipedia’s policies, particularly the refusal to be censored and its dedication to neutral point of view, they wrote. 

NPOV

“I 100% agree with you, Tazerdadog,” replied “Jimbo Wales.” “Emphasizing to the WMF that NPOV is non-negotiable is not really the issue.” In fact, Wales wrote, he is chairing a working group on strengthening the policy. The initiative was announced in March, framed as a response to the global rise in threats to sources of neutral information, and to a fragmentation of the public’s understanding of the very concepts of neutrality and facts. Wikipedia’s response, it seemed, would be to neutral harder. 

In May, I met Wales for coffee at a members club in Chelsea where he had been granted an honorary membership after giving a talk. (Wikipedia, as journalists have noted for years, did not make Wales a tech billionaire.) Extravagant bouquets of pastel flowers were arranged in an arch above the doorway and festooned the tables of the interior. Wales, dressed to meet his wife at the Chelsea Flower Show, matched the decor in a green linen suit and floral shirt. He does not, he said, normally dress like a leprechaun. 

He was not particularly concerned about the attacks on Wikipedia, he said, though he warned that he is “pathologically optimistic.” Wikipedia has been attacked since it began. It fought Turkey’s ban to the Constitutional Court and won. Even Russian Wikipedia has proven resilient. In the US, the government lacks much of the leverage it has deployed against other institutions. Wikipedia doesn’t rely on government funding, and protections for online speech are strong. In the last fiscal year, the foundation took in $170 million in donations, with an average size of about $10.

As for the accusations of bias, why not investigate? Whether the attacks are in good faith or bad, it doesn’t really matter, Wales said. The foundation had already decided that it was a good time, given the fragmented and polarizing world, to examine and bolster Wikipedia’s neutrality processes. Wales, leaning over the coffee table, seemed excited at the prospect. 

“If somebody turns up on a talk page and says, ‘Hey, this article is a mess, it’s wrong. It’s really biased,’ the right answer is to not scream at them and run and hide. The right answer is go, ‘Oh, tell me more. Let’s dig in. Where is it biased? How do we think about how do we fix that?’”

Let’s figure out the best methodologies for studying neutrality, Wales said. Let’s look at how editors evaluate the reliability of sources. Maybe Wikipedia does use the label “far-right” more than “far-left,” Wales said, a criticism that has been leveled at the site. Is that because the media uses the term more, and does Wikipedia use the term more or less than the media does, and does the media use the term more because there are more far-right movements in the world today? 

“You have to chew on these things. There’s no simple answers.”

But there are answers. If the social platforms and language models that increasingly shape our understanding of the world are inscrutable black boxes, Wikipedia is the opposite, maybe the most legible, endlessly explainable information management system ever made. For any sentence, there is a source, and a reason that that source was used, and a reason for that reason. 

“Let’s dig in,” Wales repeated. “Let’s assess the evidence. Let’s talk to a lot of different people. Let’s really try and understand.” Come, be part of the process. His working group is starting to discuss the best approach. The meetings, Wales acknowledged, have been very tedious so far.

As for the letter from the interim DC attorney, Trump withdrew Martin’s nomination in May, though he still has a position leading the Justice Department’s retribution-oriented “task force on weaponization.” In any case, the Wikimedia Foundation responded promptly. 

“The foundation staff spent a lot of passion writing it,” Wales said of the reply. “Then they ran it by me for review, and I was ready to jump in, but I was like, actually, it’s perfect.” 

“It’s very calm,” Wales said. “Here are the answers to your questions, here is what we do.” It explains how Wikipedia works.