This is an encyclopedic article discussing Wikipedia's reliability. For Wikipedia's own standpoint on reliability, see Wikipedia:General disclaimer.
The reliability of Wikipedia and its user-generated editing model, particularly its English-language edition, has been questioned and tested. Wikipedia is written and edited by volunteer editors, who generate online content with the editorial oversight of other volunteer editors via community-generated policies and guidelines. The reliability of the project has been tested statistically through comparative review, analysis of its historical patterns, and examination of the strengths and weaknesses inherent in its editing process.[3] The online encyclopedia has been criticized for factual unreliability, principally regarding its content, presentation, and editorial processes. Studies and surveys attempting to gauge the reliability of Wikipedia have produced mixed results. Wikipedia's reliability was frequently criticized in the 2000s but has improved over time; its English-language edition was generally praised in the late 2010s and early 2020s.[4][5][6]
Some assessments of its reliability have examined how quickly vandalism—content perceived by editors to constitute false or misleading information—is removed. An early IBM study, based on a sample of edits from 2003, found that "vandalism is usually repaired extremely quickly—so quickly that most users will never see its effects".[7][8] The inclusion of false or fabricated content has, at times, lasted for years on Wikipedia due to its volunteer editorship.[9][10] Its editing model facilitates multiple systemic biases, namely selection bias, inclusion bias, participation bias, and group-think bias. The majority of the encyclopedia is written by male editors, leading to a gender bias in coverage, and the makeup of the editing community has prompted concerns about racial bias, spin bias, corporate bias, and national bias, among others.[11][12][13] An ideological bias on Wikipedia has also been identified on both conscious and subconscious levels. A series of studies from Harvard Business School in 2012 and 2014 found Wikipedia "significantly more biased" than Encyclopædia Britannica but attributed the finding more to the online encyclopedia's length than to slanted editing.[14][15]
Instances of non-neutral or conflict-of-interest editing and the use of Wikipedia for "revenge editing" have attracted attention to false, biased, or defamatory content in articles, especially biographies of living people.[16][17] Articles on less technical subjects, such as the social sciences, humanities, and culture, have been known to suffer from misinformation cycles, cognitive biases, coverage discrepancies, and editor disputes. The online encyclopedia does not guarantee the validity of its information. It is nevertheless seen as a valuable "starting point" for researchers when they move beyond the article text to examine the listed references, citations, and sources. Academics suggest reviewing reliable sources when assessing the quality of articles.[18][19]
Wikipedia allows anonymous editing; contributors (known as "editors") are not required to provide any identification or an email address. A 2009 Dartmouth College study of the English Wikipedia noted that, contrary to usual social expectations, anonymous editors were some of Wikipedia's most productive contributors of valid content.[32] The Dartmouth study was criticized by John Timmer of Ars Technica for its methodological shortcomings.[33]
Wikipedia trusts the same community to self-regulate and become more proficient at quality control. Wikipedia has harnessed the work of millions of people to produce the world's largest knowledge-based site along with software to support it, resulting in more than nineteen million articles written, across more than 280 different language versions, in fewer than twelve years.[34] For this reason, the project has attracted considerable interest from academia and from fields as diverse as information technology, business, project management, knowledge acquisition, software development, other collaborative projects, and sociology, which have explored whether the Wikipedia model can produce quality results, what collaboration of this kind reveals about people, and whether the scale of involvement can overcome the individual limitations and poor editorship that would otherwise arise.[citation needed]
Wikipedia's degree of truthfulness derives from its technology, policies, and editor culture. Edit histories are publicly visible. Footnotes show the origins of claims. Editors remove unverifiable claims and undo ("revert") edits not written from a neutral point of view (NPOV). Wikipedia editors also tend towards self-examination and acknowledge Wikipedia's flaws. Its open model permits article-tampering (vandalism), including short-lived jokes and longer-running hoaxes. Some editors dedicate as much time to trolling (creating vandalism, spam, and harassment) as others do to improving the encyclopedia. The English Wikipedia's editor pool, roughly 40,000 active editors who make at least five edits a month, largely skews male and white, leading to gender- and race-based systemic biases in coverage. Variations in coverage mean that Wikipedia can be, as online communities professor Amy S. Bruckman put it, "the most accurate form of information ever created by humans" on the whole, while its short articles can be "total garbage".[35]
Academics view Wikipedia as representing a "consensus truth" against which readers can check reality in an age of contested facts. For example, when facts surrounding the COVID-19 pandemic rapidly changed or were debated, editors removed claims that did not adhere to the "verifiability" and "NPOV" guidelines.[35]
Fact-checking of Wikipedia is the process through which Wikipedia editors perform fact-checking of content published in Wikipedia, while fact-checking using Wikipedia is the use of Wikipedia for fact-checking other publications. The broader topic of fact checking in the context of Wikipedia also includes the cultural discussion of the place of Wikipedia in fact-checking. Major platforms including YouTube[36] and Facebook[37] use Wikipedia's content to confirm the accuracy of information in their own media collections. Seeking public trust is a major part of Wikipedia's publication philosophy.[38]
Wikipedia has grown beyond a simple encyclopedia to become what The New York Times called a "factual netting that holds the digital world together".[35] Common questions asked of search engines are answered using knowledge ingested from Wikipedia, and the answers often credit or link to Wikipedia as their source. Wikipedia is likely the most important single source used to train generative artificial intelligence (AI) models, such as ChatGPT, for which Wikipedia is valued as a well-curated data set with highly structured formatting.[39] The accuracy of AI models depends on the quality of their training data, but these models are also fundamentally unable to cite the original sources of their knowledge, so AI users draw on Wikipedia-derived information without knowing that Wikipedia is its source. AI users also receive results that intertwine facts originating from Wikipedia with fictional data points (AI hallucinations), lowering the quality of the information absent a real-time fact-check during retrieval.[35]
Assessments
Criteria for evaluating reliability
The reliability of Wikipedia articles can be measured by the following criteria:
Accuracy of information provided within articles
Appropriateness of the images provided with the article
Appropriateness of the style and focus of the articles[40]
Susceptibility to, and exclusion and removal of, false information
Comprehensiveness, scope and coverage within articles and in the range of articles
Several "market-oriented" extrinsic measures demonstrate that large audiences trust Wikipedia in one way or another. For instance, "50 percent of [US] physicians report that they've consulted ... [Wikipedia] for information on health conditions", according to a report from IMS Institute for Healthcare Informatics.[41]
Comparative studies
On October 24, 2005, the British newspaper The Guardian published a story entitled "Can you trust Wikipedia?" in which a panel of experts was asked to review seven entries related to their fields, giving each article a score from 0 to 10. Most of the reviewed articles received marks between 5 and 8. The most common criticisms were poor prose or ease-of-reading issues (three mentions); omissions or inaccuracies, often small but in some articles involving key points (three mentions); and poor balance, with less important areas given more attention and vice versa (one mention). The most common praise was that articles were factually sound and correct, with no glaring inaccuracies (four mentions), and contained much useful information, including well-selected links, making it possible to "access much information quickly" (three mentions).[42]
In December 2005, the journal Nature published the results of an attempted blind study seeking reviewer evaluations of the accuracy of a small subset of articles from Wikipedia and Encyclopædia Britannica. The non-peer-reviewed study was based on Nature's selection of 42 articles on scientific topics, including biographies of well-known scientists. Reviewers found 162 factual errors, omissions, or misleading statements in the sampled Wikipedia articles and 123 in the Britannica articles (a ratio of roughly 4:3). Four serious errors, such as misinterpretations of important concepts, were found in each encyclopedia (1:1). The study concluded that "Wikipedia comes close to Britannica in terms of the accuracy of its science entries",[25] although Wikipedia's articles were often "poorly structured".[25]
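The headline comparison follows directly from the published totals; a quick check of the arithmetic, assuming (as reported) the same 42 sampled topics for each encyclopedia:

```latex
% Errors per sampled article implied by the published totals
\frac{162}{42} \approx 3.9 \quad\text{(Wikipedia)}, \qquad
\frac{123}{42} \approx 2.9 \quad\text{(Britannica)}

% The same totals underlie both framings of the result
\frac{162}{123} \approx 1.32 \;\Rightarrow\; \text{about one-third more errors in Wikipedia, i.e. a ratio of roughly } 4{:}3
```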
Encyclopædia Britannica expressed concerns, leading Nature to release further documentation of its survey method.[43] Based on this additional information, Encyclopædia Britannica denied the validity of the Nature study, stating that it was "fatally flawed". Among Britannica's criticisms were that excerpts rather than the full texts of some of their articles were used, that some of the extracts were compilations that included articles written for the youth version, that Nature did not check the factual assertions of its reviewers, and that many points the reviewers labeled as errors were differences of editorial opinion. Britannica further stated that "While the heading proclaimed that 'Wikipedia comes close to Britannica in terms of the accuracy of its science entries,' the numbers buried deep in the body of the article said precisely the opposite: Wikipedia in fact had a third more inaccuracies than Britannica. (As we demonstrate below, Nature's research grossly exaggerated Britannica's inaccuracies, so we cite this figure only to point out the slanted way in which the numbers were presented.)"[44] Nature acknowledged the compiled nature of some of the Britannica extracts, but denied that this invalidated the conclusions of the study.[45] Encyclopædia Britannica also argued that a breakdown of the errors indicated that the mistakes in Wikipedia were more often the inclusion of incorrect facts, while the mistakes in Britannica were "errors of omission", making "Britannica far more accurate than Wikipedia, according to the figures".[44] Nature has since rejected the Britannica response,[46] stating that any errors on the part of its reviewers were not biased in favor of either encyclopedia, that in some cases it used excerpts of articles from both encyclopedias, and that Britannica did not share particular concerns with Nature before publishing its "open letter" rebuttal.[47][48]
The point-for-point disagreement between the two parties—over the use of compiled excerpts rather than full texts and over the very small sample size, both argued to bias the outcome in favor of Wikipedia as against a comprehensive, full-article, large-sample comparison favoring Britannica's quality-controlled format—has been echoed in online discussions,[49][50] including discussions of articles citing the Nature study. Commenters have noted the "flawed study design" of manually selecting articles or article portions, the lack of statistical power in comparing 40-odd articles drawn from more than 100,000 Britannica articles and more than 1 million English Wikipedia articles, and the absence of any statistical analysis in the study (for example, reported confidence intervals for its results).[51] Science communicator Jonathan Jarry said in 2024 that the study was historically important and has been cited in almost every scientific paper on Wikipedia's reliability since, but that research of this kind provides only a "snapshot" and quickly becomes unreliable.[52]
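To illustrate the kind of statistical analysis critics said was missing, the minimal sketch below computes an approximate 95% confidence interval for the Wikipedia-to-Britannica error ratio from the published totals alone, treating the 162 and 123 errors as counts from two Poisson processes. That modeling assumption is introduced here purely for illustration and is not part of the original study:

```python
import math

# Published error totals from the 2005 Nature comparison
wiki_errors, brit_errors = 162, 123
total = wiki_errors + brit_errors

# Conditional on the total, Wikipedia's share of the errors is binomial;
# a Wald interval for that share converts into an interval for the ratio.
p_hat = wiki_errors / total
se = math.sqrt(p_hat * (1 - p_hat) / total)
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se

ratio = wiki_errors / brit_errors
ratio_lo, ratio_hi = lo / (1 - lo), hi / (1 - hi)

print(f"point estimate: {ratio:.2f}x as many errors")
print(f"approx. 95% CI: {ratio_lo:.2f}x to {ratio_hi:.2f}x")
# The interval is wide (roughly 1.0x to 1.7x), illustrating how little a
# 42-article sample can say about the two encyclopedias as a whole.
```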
In June 2006, Roy Rosenzweig, a professor specializing in American history, published a comparison of the Wikipedia biographies of 25 Americans to the corresponding biographies found on Encarta and American National Biography Online. He wrote that Wikipedia is "surprisingly accurate in reporting names, dates, and events in U.S. history" and described some of the errors as "widely held but inaccurate beliefs". However, he stated that Wikipedia often fails to distinguish important from trivial details, and does not provide the best references. He also complained about Wikipedia's lack of "persuasive analysis and interpretations, and clear and engaging prose".[53][nb 2]
A web-based survey conducted from December 2005 to May 2006 by Larry Press, a professor of Information Systems at California State University at Dominguez Hills, assessed the "accuracy and completeness of Wikipedia articles".[54] Fifty people accepted an invitation to assess an article. Of the fifty, seventy-six percent (76%) agreed or strongly agreed that the Wikipedia article was accurate, and forty-six percent (46%) agreed or strongly agreed that it was complete. Eighteen people compared the article they reviewed to the article on the same topic in the Encyclopædia Britannica. Opinions on accuracy were almost equal between the two encyclopedias (6 favoring Britannica, 7 favoring Wikipedia, 5 stating they were equal), and eleven of the eighteen (61%) found Wikipedia somewhat or substantially more complete, compared to seven of the eighteen (39%) for Britannica. The survey did not attempt a random selection of the participants, and it is not clear how the participants were invited.[55]
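Because the sample was small, the reported percentages correspond to modest absolute counts:

```latex
0.76 \times 50 = 38 \text{ respondents judging the article accurate}, \qquad
0.46 \times 50 = 23 \text{ judging it complete}

\frac{11}{18} \approx 61\% \text{ favoring Wikipedia for completeness}, \qquad
\frac{7}{18} \approx 39\% \text{ favoring Britannica}
```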
The German computing magazine c't performed a comparison of Brockhaus Multimedial, Microsoft Encarta, and the German Wikipedia in October 2004: Experts evaluated 66 articles in various fields. In overall score, Wikipedia was rated 3.6 out of 5 points (B-).[56] A second test by c't in February 2007 used 150 search terms, of which 56 were closely evaluated, to compare four digital encyclopedias: Bertelsmann Enzyklopädie 2007, Brockhaus Multimedial premium 2007, Encarta 2007 Enzyklopädie and Wikipedia. It concluded: "We did not find more errors in the texts of the free encyclopedia than in those of its commercial competitors."[57]
Viewing Wikipedia as fitting the economists' definition of a perfectly competitive marketplace of ideas, George Bragues of the University of Guelph-Humber examined Wikipedia's articles on seven top Western philosophers: Aristotle, Plato, Immanuel Kant, René Descartes, Georg Wilhelm Friedrich Hegel, Thomas Aquinas, and John Locke. Wikipedia's articles were compared to a consensus list of themes culled from four reference works in philosophy. Bragues found that, on average, Wikipedia's articles covered only 52% of the consensus themes. No errors were found, though there were significant omissions.[58]
PC Pro magazine (August 2007) asked experts to compare four articles (a small sample) in their scientific fields between Wikipedia, Britannica and Encarta. In each case Wikipedia was described as "largely sound", "well handled", "performs well", "good for the bare facts" and "broadly accurate". One article had "a marked deterioration towards the end" while another had "clearer and more elegant" writing, a third was assessed as less well written but better detailed than its competitors, and a fourth was "of more benefit to the serious student than its Encarta or Britannica equivalents". No serious errors were noted in Wikipedia articles, whereas serious errors were noted in one Encarta and one Britannica article.[59]
In October 2007, the Australian magazine PC Authority published a feature article on the accuracy of Wikipedia. The article compared Wikipedia's content to other popular online encyclopedias, namely Britannica and Encarta. The magazine asked experts to evaluate articles pertaining to their field. A total of four articles were reviewed by three experts. Wikipedia was comparable to the other encyclopedias, topping the chemistry category.[60]
In December 2007, German magazine Stern published the results of a comparison between the German Wikipedia and the online version of the 15-volume edition of Brockhaus Enzyklopädie. The test was commissioned to a research institute (Cologne-based WIND GmbH), whose analysts assessed 50 articles from each encyclopedia (covering politics, business, sports, science, culture, entertainment, geography, medicine, history and religion) on four criteria (accuracy, completeness, timeliness and clarity), and judged Wikipedia articles to be more accurate on the average (1.6 on a scale from 1 to 6 versus 2.3 for Brockhaus, with 1 as the best and 6 as the worst). Wikipedia's coverage was also found to be more complete and up to date; however, Brockhaus was judged to be more clearly written, while several Wikipedia articles were criticized as being too complicated for non-experts, and many as too lengthy.[61][62][63]
In its April 2008 issue British computing magazine PC Plus compared the English Wikipedia with the DVD editions of World Book Encyclopedia and Encyclopædia Britannica, assessing for each the coverage of a series of random subjects. It concluded, "The quality of content is good in all three cases" and advised Wikipedia users "Be aware that erroneous edits do occur, and check anything that seems outlandish with a second source. But the vast majority of Wikipedia is filled with valuable and accurate information."[64]
A 2008 paper in Reference Services Review compared nine Wikipedia entries on historical topics to their counterparts in Encyclopædia Britannica, The Dictionary of American History and American National Biography Online. The paper found that Wikipedia's entries had an overall accuracy rate of 80 percent, whereas the other encyclopedias had an accuracy rate of 95 to 96 percent.[65]
A 2010 study assessed the extent to which Wikipedia pages about the history of countries conformed to the site's verifiability policy. It found that, in contradiction of this policy, many claims in these articles were not supported by citations, and that many of those that were cited popular media and government websites rather than academic journal articles.[66]
In April 2011, a study by Adam Brown of Brigham Young University was published in the journal PS: Political Science & Politics, examining "thousands of Wikipedia articles about candidates, elections, and officeholders". The study found that while the information in these articles tended to be accurate, the articles examined contained many errors of omission.[67]
A 2012 study co-authored by Shane Greenstein examined a decade of Wikipedia articles on United States politics and found that the more contributors there were to a given article, the more neutral it tended to be, in line with a narrow interpretation of Linus's law.[68]
Reavley et al. (2012) compared the quality of articles on select mental health topics on Wikipedia with corresponding articles in Encyclopædia Britannica and a psychiatry textbook. They asked experts to rate article content with regard to accuracy, up-to-dateness, breadth of coverage, referencing and readability. Wikipedia scored highest on all criteria except readability, and the authors concluded that Wikipedia is as good as or better than Britannica and a standard textbook.[24]
A 2014 perspective piece in the New England Journal of Medicine examined Wikipedia pages about 22 prescription drugs to determine whether they had been updated to include the most recent FDA safety warnings. It found that 41% of these pages were updated within two weeks of the warning, 23% were updated more than two weeks later, and the remaining 36% had still not been updated more than a year later, as of January 2014.[69]
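In absolute terms, the reported percentages of the 22 examined pages correspond to roughly:

```latex
0.41 \times 22 \approx 9 \text{ pages updated within two weeks}, \quad
0.23 \times 22 \approx 5 \text{ updated later}, \quad
0.36 \times 22 \approx 8 \text{ still not updated as of January 2014}
```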
A 2014 study in the Journal of the American Pharmacists Association examined 19 Wikipedia articles about herbal supplements, and concluded that all of these articles contained information about their "therapeutic uses and adverse effects", but also concluded that "several lacked information on drug interactions, pregnancy, and contraindications". The study's authors therefore recommended that patients not rely solely on Wikipedia as a source for information about the herbal supplements in question.[70]
Another study published in 2014 in PLOS ONE found that Wikipedia's information about pharmacology was 99.7% accurate when compared to a pharmacology textbook, and that the completeness of such information on Wikipedia was 83.8%. The study also determined that completeness of these Wikipedia articles was lowest (68%) in the category "pharmacokinetics" and highest (91.3%) in the category "indication". The authors concluded that "Wikipedia is an accurate and comprehensive source of drug-related information for undergraduate medical education".[71]
Expert opinion
Librarians' views
In a 2004 interview with The Guardian, self-described information specialist and Internet consultant[72] Philip Bradley said that he would not use Wikipedia and was "not aware of a single librarian who would". He then explained that "the main problem is the lack of authority. With printed publications, the publishers have to ensure that their data are reliable, as their livelihood depends on it. But with something like this, all that goes out the window."[73]
In 2005, the library at Trent University in Ontario stated Wikipedia had many articles that are "long and comprehensive", but that there is "a lot of room for misinformation and bias [and] a lot of variability in both the quality and depth of articles". It adds that Wikipedia has advantages and limitations, that it has "excellent coverage of technical topics" and articles are "often added quickly and, as a result, coverage of current events is quite good", comparing this to traditional sources which are unable to achieve this task. It concludes that, depending upon the need, one should think critically and assess the appropriateness of one's sources, "whether you are looking for fact or opinion, how in-depth you want to be as you explore a topic, the importance of reliability and accuracy, and the importance of timely or recent information", and adds that Wikipedia can be used in any event as a "starting point".[74]
A 2006 review of Wikipedia by Library Journal, using a panel of librarians, "the toughest critics of reference materials, whatever their format", asked "long standing reviewers" to evaluate three areas of Wikipedia (popular culture, current affairs, and science), and concluded: "While there are still reasons to proceed with caution when using a resource that takes pride in limited professional management, many encouraging signs suggest that (at least for now) Wikipedia may be granted the librarian's seal of approval". A reviewer who "decided to explore controversial historical and current events, hoping to find glaring abuses" said, "I was pleased by Wikipedia's objective presentation of controversial subjects" but that "as with much information floating around in cyberspace, a healthy degree of skepticism and skill at winnowing fact from opinion are required". Other reviewers noted that there is "much variation" but "good content abounds".[75]
Information Today (March 2006) cites librarian Nancy O'Neill (principal librarian for Reference Services at the Santa Monica Public Library System) as saying that "there is a good deal of skepticism about Wikipedia in the library community" but that "she also admits cheerfully that Wikipedia makes a good starting place for a search. You get terminology, names, and a feel for the subject."[77]
PC Pro (August 2007) cites the head of the European and American Collection at the British Library, Stephen Bury, as stating "Wikipedia is potentially a good thing—it provides a speedier response to new events, and to new evidence on old items". The article concludes: "For [Bury], the problem isn't so much the reliability of Wikipedia's content so much as the way in which it's used." "It's already become the first port of call for the researcher", Bury says, before noting that this is "not necessarily problematic except when they go no further". According to Bury, the trick to using Wikipedia is to understand that "just because it's in an encyclopedia (free, web or printed) doesn't mean it's true. Ask for evidence ... and contribute."[59]
Articles on contentious issues
A 2006 article for the Canadian Library Association (CLA)[78] discussed the Wikipedia approach, process and outcome in depth, commenting for example that in controversial topics, "what is most remarkable is that the two sides actually engaged each other and negotiated a version of the article that both can more or less live with". The author comments that:
In fact Wikipedia has more institutional structure than at first appears. Some 800 experienced users are designated as administrators, with special powers of binding and loosing: they can protect and unprotect, delete and undelete and revert articles, and block and unblock users. They are expected to use their powers in a neutral way, forming and implementing the consensus of the community. The effect of their intervention shows in the discussion pages of most contentious articles. Wikipedia has survived this long because it is easier to reverse vandalism than it is to commit it...
Shi et al. extended this analysis in discussing "The wisdom of polarized crowds" in 2017 based on content analysis of all edits to English Wikipedia articles relating to politics, social issues and science from its start to December 1, 2016. This included almost 233,000 articles representing approximately 5 percent of the English Wikipedia. They wrote: "Political speech [at least in the United States] has become markedly more polarized in recent years ... . [D]espite early promise of the world-wide-web to democratize access to diverse information, increased media choice and social networking platforms ... [create] echo chambers that ... degrade the quality of individual decisions, ... discount identity-incongruent opinions, stimulate and reinforce polarizing information ... foment conflict and even make communication counter-productive. Nevertheless, a large literature documents the largely positive effect that social differences can exert on the collaborative production of information, goods and services. Research demonstrates that individuals from socially distinct groups embody diverse cognitive resources and perspectives that, when cooperatively combined ... outperform those from homogeneous groups." They translated edit histories of millions of Wikipedia editors into a 7-point political identification scale and compared that with Wikipedia's six-level article quality score (stub, start, C, B, good, featured) assigned via a machine learning algorithm. They found that "articles attracting more attention tend to have more balanced engagement ... [and] higher polarization is associated with higher quality."[79]
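The sketch below illustrates the general shape of such an analysis, not the study's exact metrics: it scores an article's editor pool by the dispersion of contributors on a 1-to-7 political alignment scale, with higher dispersion standing in for the "balanced engagement" the finding describes. The editor pools and the dispersion measure are hypothetical simplifications.

```python
from itertools import combinations
from statistics import mean

def polarization(alignments: list[float]) -> float:
    """Mean pairwise ideological distance among an article's editors.

    `alignments` holds one score per editor on a 1 (left) to 7 (right)
    scale, echoing Shi et al.'s setup; the dispersion measure itself is a
    simplified stand-in for the paper's polarization metric.
    """
    if len(alignments) < 2:
        return 0.0
    return mean(abs(a - b) for a, b in combinations(alignments, 2))

# Hypothetical editor pools for two articles
homogeneous = [2, 2, 3, 2, 3]      # editors cluster on one side
polarized = [1, 7, 2, 6, 1, 7]     # balanced engagement from both sides

print(polarization(homogeneous))   # low dispersion
print(polarization(polarized))     # high dispersion
```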
Academia
Academics have also criticized Wikipedia for its perceived failure as a reliable source and because Wikipedia editors may have no expertise, competence, or credentials in the topics on which they contribute.[80][81] Adrian Riskin, a mathematician at Whittier College, commented that while highly technical articles may be written by mathematicians for mathematicians, more general mathematics topics, such as the article on polynomials, are written in a very amateurish fashion with a number of obvious mistakes.[82]
Because Wikipedia cannot be considered a reliable source, many schools and universities do not accept it as a citation in formal papers; some educational institutions have banned it as a primary source, while others have limited its use to that of a pointer to external sources.[80][83][84] This criticism may apply not only to Wikipedia but to encyclopedias in general—some university lecturers are not impressed when students cite print-based encyclopedias in assigned work.[85] Instructors may, however, have underestimated how widely Wikipedia is used in academia because of these concerns. Researchers and academics contend that while Wikipedia may not be a fully accurate source for final papers, it is a valuable jumping-off point for research that can lead to many possibilities if approached critically, and that what may be missing in secondary and higher education is an emphasis on critical analysis of its use. On this view, Wikipedia should not be dismissed entirely (it contains fewer inaccuracies than errors of omission) but should instead be supported and taught as an educational tool in tandem with the critical thinking skills that allow students to filter and analyze the information found on the online encyclopedia.[86][attribution needed]
An empirical study conducted in 2006 by an Information Systems lecturer at the University of Nottingham's Business School,[87] and reviewed on the technology website Ars Technica,[88] asked 55 academics to review specific Wikipedia articles that either were in their field of expertise (group 1) or were chosen at random (group 2). It concluded: "The experts found Wikipedia's articles to be more credible than the non-experts. This suggests that the accuracy of Wikipedia is high. However, the results should not be seen as support for Wikipedia as a totally reliable resource as, according to the experts, 13 percent of the articles contain mistakes (10% of the experts reported factual errors of an unspecified degree, 3% of them reported spelling errors)."[89]
The Gould Library at Carleton College in Minnesota has a web page describing the use of Wikipedia in academia. It asserts that "Wikipedia is without question a valuable and informative resource", but that "there is an inherent lack of reliability and stability" to its articles, again drawing attention to similar advantages and limitations as other sources. As with other reviews, it comments that one should assess one's sources and what is desired from them, and that "Wikipedia may be an appropriate resource for some assignments, but not for others." It cited Wikipedia co-founder Jimmy Wales' view that Wikipedia may not be ideal as a source for all academic uses, and (as with other sources) suggests that at the least, one strength of Wikipedia is that it provides a good starting point for current information on a very wide range of topics.[90]
In 2007, the Chronicle of Higher Education published an article written by Cathy Davidson, Professor of Interdisciplinary Studies and English at Duke University, in which she asserts that Wikipedia should be used to teach students about the concepts of reliability and credibility.[91]
In 2008, Hamlet Isakhanli, founder and president of Khazar University, compared the Encyclopædia Britannica and English Wikipedia articles on Azerbaijan and related subjects. His study found that Wikipedia covered the subject much more widely, more accurately, and in more detail, though with some lack of balance, and that Wikipedia was the best source for a first approximation.[92]
In 2011, Karl Kehm, associate professor of physics at Washington College, said: "I do encourage [my students] to use [Wikipedia] as one of many launch points for pursuing original source material. The best Wikipedia entries are well researched with extensive citations".[93]
Some academic journals do refer to Wikipedia articles, but do not elevate it to the same level as traditional references. For instance, Wikipedia articles have been referenced in "enhanced perspectives" provided online in the journal Science. The first of these perspectives to provide a hyperlink to Wikipedia was "A White Collar Protein Senses Blue Light" in 2002,[94] and dozens of enhanced perspectives have provided such links since then. The publisher of Science states that these enhanced perspectives "include hypernotes—which link directly to websites of other relevant information available online—beyond the standard bibliographic references".[95]
Sverrir Steinsson[who?] investigated factors that influenced the credibility of English Wikipedia in 2023, and found that "Wikipedia transformed from a dubious source of information in its early years to an increasingly reliable one over time."[96] This was due to it becoming "an active fact-checker and anti-fringe",[97] with "pro-fringe editors" leaving the site as the Wikipedia community changed its interpretation of the NPOV policy and began to more accurately label misleading content as pseudoscience, conspiracy theory, etc., in harmony with the citations used to source that content.[97] This reinterpretation of NPOV "had meaningful consequences, turning an organization that used to lend credence and false balance to pseudoscience, conspiracy theories, and extremism into a proactive debunker, fact-checker and identifier of fringe discourse."[96]
Educational and cognitive psychologist Sam Wineburg said in 2024 that "No, Wikipedia isn't an unreliable source that anyone can edit and that should be avoided. In 2024, it has become a remarkably rigorous self-correcting resource that all of us should be using more often."[98]
Journalism and use of Wikipedia in the newsroom
In his 2014 book Virtual Unreality, Charles Seife, a professor of journalism at New York University, noted Wikipedia's susceptibility to hoaxes and misinformation, including manipulation by commercial and political organizations "masquerading as common people" making edits to Wikipedia. In conclusion, Seife presented the following advice:[99]
Wikipedia is like an old and eccentric uncle.
He can be a lot of fun—over the years he's seen a lot, and he can tell a great story. He's also no dummy; he's accumulated a lot of information and has some strong opinions about what he's gathered. You can learn quite a bit from him. But take everything he says with a grain of salt. A lot of the things he thinks he knows for sure aren't quite right or are taken out of context. And when it comes down to it, sometimes he believes things that are a little bit, well, nuts.
If it ever matters to you whether something he said is real or fictional, it's crucial to check it out with a more reliable source.[99]
Seife observed that when false information from Wikipedia spreads to other publications, it sometimes alters truth itself.[99] On June 28, 2012, for example, an anonymous Wikipedia contributor added the invented nickname "Millville Meteor" to the Wikipedia biography of baseball player Mike Trout. A couple of weeks later, a Newsday sports writer reproduced the nickname in an article, and "with that act, the fake nickname became real".[99] Seife pointed out that while Wikipedia, by some standards, could be described as "roughly as accurate" as traditional publications, and is more up to date, "there's a difference between the kind of error one would find in Wikipedia and what one would in Britannica or Collier's or even in the now-defunct Microsoft Encarta encyclopedia ... the majority of hoaxes on Wikipedia could never have appeared in the old-fashioned encyclopedias."[99] Dwight Garner, reviewing Seife's book in The New York Times, said that he himself had "been burned enough times by bad online information", including "Wikipedia howlers", to have adopted a very skeptical mindset.[100]
In November 2012, judge Brian Leveson was accused of having forgotten "one of the elementary rules of journalism" when he named a "Brett Straub" as one of the founders of The Independent newspaper in his report on the culture, practices and ethics of the British press. The name had been added to the Wikipedia article on The Independent over a year prior, and turned out to be that of a 25-year-old Californian, whose friend had added his name to a string of Wikipedia pages as a prank.[101] Straub was tracked down by The Telegraph and commented, "The fact someone, especially a judge, has believed something on Wikipedia is kind of shocking. My friend went on and edited a bunch of Wikipedia pages and put my name there. [...] I knew my friend had done it but I didn't know how to change them back and I thought someone would. At one point I was the creator of Coca-Cola or something. You know how easy it is to change Wikipedia. Every time he came across a red linked name he put my name in its place."[102]
A 2016 BBC article by Ciaran McCauley similarly noted that "plenty of mischievous, made-up information has found its way" on to Wikipedia and that "many of these fake facts have fallen through the cracks and been taken as gospel by everyone from university academics to major newspapers and broadcasters."[103] Listing examples of journalists being embarrassed by reproducing hoaxes and other falsifications from Wikipedia in their writing, including false information propagated by major news organizations in their obituaries of Maurice Jarre and Ronnie Hazlehurst, McCauley stated: "Any journalist in any newsroom will likely get a sharp slap across the head from an editor for treating Wikipedia with anything but total skepticism (you can imagine the kicking I've taken over this article)."[103]
The Daily Mail—itself banned as a source on Wikipedia in 2017 because of its perceived unreliability—has publicly stated that it "banned all its journalists from using Wikipedia as a sole source in 2014 because of its unreliability".[104]
Slate said in 2022 that "Screenshots of vandalized Wikipedia articles, even when reverted within minutes, often have a much longer afterlife in news reports and on social media, creating the public impression that the platform is more vulnerable to abuse than it actually is."[105]
Science and medicine
Science and medicine are areas where accuracy is of high importance and peer review is the norm. While some of Wikipedia's content has passed a form of peer review, most has not.[106]
A 2008 study examined 80 Wikipedia drug entries. The researchers found few factual errors in this set of articles, but determined that these articles were often missing important information, such as contraindications and drug interactions. One of the researchers noted that "If people went and used this as a sole or authoritative source without contacting a health professional...those are the types of negative impacts that can occur." The researchers also compared Wikipedia to Medscape Drug Reference (MDR) by looking for answers to 80 different questions covering eight categories of drug information, including adverse drug events, dosages, and mechanism of action. They determined that MDR provided answers to 82.5 percent of the questions, while Wikipedia could answer only 40 percent, and that Wikipedia's answers were also less likely to be complete. None of the answers from Wikipedia were determined to be factually inaccurate, while four inaccurate answers were found in MDR. The researchers did, however, find 48 errors of omission in the Wikipedia entries, compared to 14 for MDR. The lead investigator concluded: "I think that these errors of omission can be just as dangerous [as inaccuracies]", and he pointed out that drug company representatives have been caught deleting information from Wikipedia entries that makes their drugs look unsafe.[23]
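In absolute terms, out of the 80 questions posed:

```latex
0.825 \times 80 = 66 \text{ questions answered by MDR}, \qquad
0.40 \times 80 = 32 \text{ answered by Wikipedia}
```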
A 2009 survey asked US toxicologists how accurately they rated the portrayal of health risks of chemicals in different media sources. It was based on the answers of 937 members of the Society of Toxicology and found that these experts regarded Wikipedia's reliability in this area as far higher than that of all traditional news media:
In perhaps the most surprising finding in the entire study, all these national media outlets [U.S. newspapers, news magazines, health magazines, broadcast and cable television networks] are easily eclipsed by two representatives of "new media": WebMD and Wikipedia. WebMD is the only news source whose coverage of chemical risk is regarded as accurate by a majority (56 percent) of toxicologists, closely followed by Wikipedia's 45 percent accuracy rating. By contrast, only 15 percent describe as accurate the portrayals of chemical risk found in The New York Times, Washington Post, and Wall Street Journal.[21]
In 2010 researchers compared information about 10 types of cancer on Wikipedia to similar data from the National Cancer Institute's Physician Data Query and concluded "the Wiki resource had similar accuracy and depth to the professionally edited database" and that "sub-analysis comparing common to uncommon cancers demonstrated no difference between the two", but that ease of readability was an issue.[107]
A 2011 study found that the categories most frequently absent from Wikipedia's drug articles were drug interactions and medication use during breastfeeding.[108] Other categories with incomplete coverage were descriptions of off-label indications, contraindications and precautions, adverse drug events, and dosing.[108] The information most frequently deviating from the other sources used in the study concerned contraindications and precautions, drug absorption, and adverse drug events.[108]
A 2012 study reported that Wikipedia articles about pediatric otolaryngology contained twice as many errors and omissions as the medical database eMedicine.[109]
In a U.S. study in 2014, 10 researchers examined the Wikipedia articles on the 10 most costly medical conditions in the United States and found that 90% of the entries contained errors and statements that contradicted the latest medical research. However, according to Stevie Benton of Wikimedia UK, the sample size used in the research may have been too small to be considered representative.[110][111]
Only part of the data was made public, and for two statements that were released for other researchers to examine, the claim that Wikipedia contradicted the peer-reviewed literature was called into question.[112] However, more open studies, published in 2017 and 2020, concluded that Wikipedia provided less accurate medical information than paid-access online encyclopedias.[113][114]
A 2014 study published in PLOS One looked at the quality of Wikipedia articles on pharmacology, comparing articles from English and German Wikipedia with academic textbooks. It found that "the collaborative and participatory design of Wikipedia does generate high quality information on pharmacology that is suitable for undergraduate medical education".[115]
A 2024 review of online information sources for healthcare-related research cautioned against using Wikipedia as a primary reference, and noted its value as a resource to identify sources of information.[116] Jarry said in 2024 that evaluating Wikipedia's reliability on medicine or any subject is challenging and that researchers "have to pick a sample and hope it is representative," saying also that "Wikipedia, overall, has no business being this good."[52]
Judiciary
References to Wikipedia in United States judicial opinions have increased each year since 2004. In a 2017 ruling, the Supreme Court of Texas advised against reliance on the information in Wikipedia for judicial rulings, arguing that its lack of reliability prevents using it as a source of authority in legal opinions.[117][118]
The Supreme Court of India in its judgment in Commr. of Customs, Bangalore vs. ACER India Pvt. (Citation 2007(12)SCALE581) held that "We have referred to Wikipedia, as the learned Counsel for the parties relied thereupon. It is an online encyclopaedia and information can be entered therein by any person and as such it may not be authentic."[119]
Editors of Encyclopædia Britannica
In a 2004 piece called "The Faith-Based Encyclopedia", Robert McHenry, a former editor-in-chief of Encyclopædia Britannica, stated that Wikipedia errs in billing itself as an encyclopedia, because that word implies a level of authority and accountability that he believes cannot be possessed by an openly editable reference. McHenry argued that "the typical user doesn't know how conventional encyclopedias achieve reliability, only that they do".[120] He added:
[H]owever closely a Wikipedia article may at some point in its life attain to reliability, it is forever open to the uninformed or semiliterate meddler... The user who visits Wikipedia to learn about some subject, to confirm some matter of fact, is rather in the position of a visitor to a public restroom. It may be obviously dirty, so that he knows to exercise great care, or it may seem fairly clean, so that he may be lulled into a false sense of security. What he certainly does not know is who has used the facilities before him.[120]
Similarly, Britannica's executive editor, Ted Pappas, was quoted in The Guardian as saying:
The premise of Wikipedia is that continuous improvement will lead to perfection. That premise is completely unproven.[73]
In the September 12, 2006, edition of The Wall Street Journal, Jimmy Wales debated with Dale Hoiberg, editor-in-chief of Encyclopædia Britannica. Hoiberg focused on a need for expertise and control in an encyclopedia and cited Lewis Mumford that overwhelming information could "bring about a state of intellectual enervation and depletion hardly to be distinguished from massive ignorance". Wales emphasized Wikipedia's differences, and asserted that openness and transparency lead to quality. Hoiberg replied that he "had neither the time nor space to respond to [criticisms]" and "could corral any number of links to articles alleging errors in Wikipedia", to which Wales responded: "No problem! Wikipedia to the rescue with a fine article", and included a link to the Wikipedia article Criticism of Wikipedia.[121]
Tools for testing the reliability of articles
While experienced editors can consult an article's history and discussion page, it is not so easy for ordinary readers to check whether information from Wikipedia is reliable. University projects from California, Switzerland, and Germany have tried to improve this through methods of formal analysis and data mining. Wiki-Watch from Germany, which was inspired by WikiBu from Switzerland, shows an evaluation of up to five stars for every English or German article in Wikipedia. Part of this rating is the tool WikiTrust, which indicates the trustworthiness of individual text passages of Wikipedia articles with white (trustworthy) or orange (not trustworthy) markings.[122]
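The principle behind such markings can be illustrated with a minimal sketch: text that survives scrutiny by later revisions accumulates trust, while newly inserted text starts out untrusted. The scoring below is a simplified stand-in only; WikiTrust's actual algorithm is more elaborate and also takes into account the reputation of the authors involved.

```python
def text_trust(revisions: list[set[str]], step: float = 0.25) -> dict[str, float]:
    """Assign each text fragment a trust score in [0, 1] based on how many
    later revisions retained it (a simplified revision-survival heuristic).

    `revisions` is the article history, oldest first; each revision is the
    set of text fragments (e.g. sentences) it contains.
    """
    trust: dict[str, float] = {}
    for revision in revisions:
        for fragment in revision:
            if fragment not in trust:
                trust[fragment] = 0.0  # new text: untrusted (would render orange)
            else:
                trust[fragment] = min(1.0, trust[fragment] + step)  # survived another edit
        # fragments removed by this revision keep their last score but are no longer shown
    return trust

# Hypothetical four-revision history of a short article
history = [
    {"Paris is the capital of France."},
    {"Paris is the capital of France.", "Its population is 90 million."},  # dubious insertion
    {"Paris is the capital of France."},                                    # insertion reverted
    {"Paris is the capital of France.", "It lies on the Seine."},
]
print(text_trust(history))
```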
Sources accepted as reliable for Wikipedia may themselves rely on Wikipedia as a reference, sometimes indirectly. If information originally added to Wikipedia was false, then once it has been reported in sources considered reliable, Wikipedia can cite those sources to reference the false information, lending apparent credibility to the falsehood. This in turn increases the likelihood of the false information being reported in other media.[123] A known example is the Sacha Baron Cohen article, where false information added to Wikipedia was apparently used by two newspapers, leading to it being treated as reliable on Wikipedia.[124][better source needed] This process of creating reliable sources for false facts has been termed "citogenesis" by Randall Munroe, artist of the webcomic xkcd.[125][126][127]
Propagation of misinformation
Somewhat related to this "information loop" is the propagation of misinformation to other websites (Answers.com is just one of many), which often quote misinformation from Wikipedia verbatim without mentioning that it came from Wikipedia. A piece of misinformation originally taken from a Wikipedia article may live on in perhaps dozens of other websites, even if Wikipedia itself has deleted the unreliable material.[128]
Other
In one article, Information Today (March 2006) likens comparisons between Wikipedia and Britannica to "apples and oranges":[77]
Even the revered Encyclopædia Britannica is riddled with errors, not to mention the subtle yet pervasive biases of individual subjectivity and corporate correctness... There is no one perfect way. Britannica seems to claim that there is. Wikipedia acknowledges there's no such thing. Librarians and information professionals have always known this. That's why we always consult multiple sources and counsel our users to do the same.
Andrew Orlowski, a columnist for The Register, expressed similar criticisms in 2005, writing that the use of the term "encyclopedia" to describe Wikipedia may lead users into believing it is more reliable than it may be.[129]
BBC technology specialist Bill Thompson wrote that "Most Wikipedia entries are written and submitted in good faith, and we should not let the contentious areas such as politics, religion or biography shape our view of the project as a whole", that it forms a good starting point for serious research but that:[130]
No information source is guaranteed to be accurate, and we should not place complete faith in something which can so easily be undermined through malice or ignorance... That does not devalue the project entirely, it just means that we should be skeptical about Wikipedia entries as a primary source of information... It is the same with search engine results. Just because something comes up in the top 10 on MSN Search or Google does not automatically give it credibility or vouch for its accuracy or importance.[130]
Thompson adds the observation that since most popular online sources are inherently unreliable in this way, one byproduct of the information age is a wiser audience who are learning to check information rather than take it on faith due to its source, leading to "a better sense of how to evaluate information sources".[130]
In his 2007 Guide to Military History on the Internet, Simon Fowler rated Wikipedia as "the best general resource" for military history research, and stated that "the results are largely accurate and generally free of bias".[131] When rating Wikipedia as the No. 1 military site he mentioned that "Wikipedia is often criticised for its inaccuracy and bias, but in my experience the military history articles are spot on."[132]
In July 2008, The Economist magazine described Wikipedia as "a user-generated reference service" and noted that Wikipedia's "elaborate moderation rules put a limit to acrimony" generated by cyber-nationalism.[133]
Jimmy Wales, a co-founder of Wikipedia, stresses that encyclopedias of any type are not usually appropriate as primary sources, and should not be relied upon as being authoritative.[134]
Carnegie Mellon professor Randy Pausch offered the following anecdote in his book The Last Lecture. He was surprised that the entry on virtual reality he had contributed to the World Book Encyclopedia was accepted without question, so he concluded, "I now believe Wikipedia is a perfectly fine source for your information, because I know what the quality control is for real encyclopedias."[135]
Removal of false information
Fernanda Viégas of the MIT Media Lab and Martin Wattenberg and Kushal Dave of IBM Research studied the flow of editing in the Wikipedia model, with emphasis on breaks in flow (from vandalism or substantial rewrites), showing the dynamic flow of material over time.[136] From a sample of vandalism edits on the English Wikipedia during May 2003, they found that most such acts were repaired within minutes, summarizing:
We've examined many pages on Wikipedia that treat controversial topics, and have discovered that most have, in fact, been vandalized at some point in their history. But we've also found that vandalism is usually repaired extremely quickly—so quickly that most users will never see its effects.[8]
They also stated that "it is essentially impossible to find a crisp definition of vandalism".[136]
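The repair times reported in such studies can be derived from public edit histories. The minimal sketch below assumes a simple hypothetical data model (each revision record carries an ID, a timestamp, and, if it undoes an earlier revision, the ID of that revision); it is not Wikipedia's API schema.

```python
from datetime import datetime, timedelta

def repair_times(revisions: list[dict]) -> list[timedelta]:
    """Time between each damaging edit and the edit that reverted it.

    Each revision is a dict with 'id', 'timestamp' (datetime) and an
    optional 'reverts' key naming the id of the revision it undoes.
    """
    by_id = {rev["id"]: rev for rev in revisions}
    delays = []
    for rev in revisions:
        target = rev.get("reverts")
        if target in by_id:
            delays.append(rev["timestamp"] - by_id[target]["timestamp"])
    return delays

# Hypothetical three-edit history: good edit, vandalism, repair
history = [
    {"id": 1, "timestamp": datetime(2003, 5, 1, 12, 0)},
    {"id": 2, "timestamp": datetime(2003, 5, 1, 12, 7)},                # vandalism
    {"id": 3, "timestamp": datetime(2003, 5, 1, 12, 9), "reverts": 2},  # repair
]
print(repair_times(history))  # [timedelta of 2 minutes] -> repaired within minutes
```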
Lih (2004) compared articles before and after they were mentioned in the press, and found that externally referenced articles were of higher quality. An informal assessment by the popular IT magazine PC Pro for its 2007 article "Wikipedia Uncovered"[59] tested Wikipedia by introducing 10 errors that "varied between bleeding obvious and deftly subtle" into articles (the researchers later corrected the articles they had edited). Labeling the results "impressive", it noted that all but one were detected and fixed within the hour, and that "the Wikipedians' tools and know-how were just too much for our team." A second series of another 10 tests, using "far more subtle errors" and additional techniques to conceal their nature, met with similar results: "despite our stealth attempts the vast majority... were discovered remarkably quickly... the ridiculously minor Jesse James error was corrected within a minute and a very slight change to Queen Anne's entry was put right within two minutes". Two of the latter series were not detected. The article concluded that "Wikipedia corrects the vast majority of errors within minutes, but if they're not spotted within the first day the chances... dwindle as you're then relying on someone to spot the errors while reading the article rather than reviewing the edits".
A study in late 2007 systematically inserted inaccuracies into Wikipedia entries about the lives of philosophers. Depending on how exactly the data are interpreted, either one third or one half of the inaccuracies were corrected within 48 hours.[137]
A 2007 peer-reviewed study[138] that measured the actual number of page views with damaged content stated: "42% of damage is repaired almost immediately, i.e., before it can confuse, offend, or mislead anyone. Nonetheless, there are still hundreds of millions of damaged views."[138]
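The two halves of that finding are consistent: if a damaged page is viewed at rate $r$ and the damage persists for time $T$, the expected exposure per incident is roughly $r \cdot \mathbb{E}[T]$, and these exposures accumulate across the whole site. A back-of-the-envelope illustration with purely hypothetical numbers:

```latex
\mathbb{E}[\text{damaged views per incident}] \approx r \cdot \mathbb{E}[T]
= 10 \ \tfrac{\text{views}}{\text{hour}} \times 2 \ \text{hours} = 20
```

Even when most damage is repaired almost immediately, a very large number of incidents on a heavily read site can therefore still add up to the hundreds of millions of damaged views the study reports.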
Loc Vu-Quoc, professor for Mechanical and Aerospace Engineering at the University of Florida, stated in 2008 that "sometimes errors may go for years without being corrected as experts don't usually read Wikipedia articles in their own field to correct these errors".[139]
In August 2007, WikiScanner, a tool developed by Virgil Griffith of the California Institute of Technology, was released to match anonymous IP edits in the encyclopedia with an extensive database of addresses. News stories appeared about IP addresses from various organizations such as the Central Intelligence Agency, the Democratic Congressional Campaign Committee, Diebold, Inc. and the Australian government being used to make edits to Wikipedia articles, sometimes of an opinionated or questionable nature.[140] The BBC quoted a Wikimedia spokesperson as praising the tool: "We really value transparency and the scanner really takes this to another level. Wikipedia Scanner may prevent an organization or individuals from editing articles that they're really not supposed to."[141]
The WikiScanner story was also covered by The Independent, which stated that many "censorial interventions" by editors with vested interests on a variety of articles in Wikipedia had been discovered:
[Wikipedia] was hailed as a breakthrough in the democratisation of knowledge. But the online encyclopedia has since been hijacked by forces who decided that certain things were best left unknown... Now a website designed to monitor editorial changes made on Wikipedia has found thousands of self-serving edits and traced them to their original source. It has turned out to be hugely embarrassing for armies of political spin doctors and corporate revisionists who believed their censorial interventions had gone unnoticed.[142]
Not everyone hailed WikiScanner as a success for Wikipedia. Oliver Kamm, in a column for The Times, argued instead that:
The WikiScanner is thus an important development in bringing down a pernicious influence on our intellectual life. Critics of the web decry the medium as the cult of the amateur. Wikipedia is worse than that; it is the province of the covert lobby. The most constructive course is to stand on the sidelines and jeer at its pretensions.[143]
WikiScanner only reveals conflict of interest when the editor does not have a Wikipedia account and their IP address is used instead. Conflict of interest editing done by editors with accounts is not detected, since those edits are anonymous to everyone—except for "a handful of privileged Wikipedia admins".[144]
Wikipedia has been accused of systemic bias, which is to say that its general nature leads, without necessarily any conscious intention, to the propagation of various prejudices. Although many articles in newspapers have concentrated on minor, indeed trivial, factual errors in Wikipedia articles, there are also concerns about large-scale, presumably unintentional effects from the increasing influence and use of Wikipedia as a research tool at all levels. In an article in the Times Higher Education magazine (London), philosopher Martin Cohen describes Wikipedia as having "become a monopoly" with "all the prejudices and ignorance of its creators", which he characterizes as a "youthful cab-drivers" perspective.[145] Cohen draws a grave conclusion from these circumstances: "To control the reference sources that people use is to control the way people comprehend the world. Wikipedia may have a benign, even trivial face, but underneath may lie a more sinister and subtle threat to freedom of thought."[145] That freedom is undermined, in his view, by what actually matters on Wikipedia: "not your sources but the 'support of the community'."[145]
Critics also point to the tendency to cover topics in detail disproportionate to their importance. For example, Stephen Colbert once mockingly praised Wikipedia for having a "longer entry on 'lightsabers' than it does on the 'printing press'."[146] In an interview with The Guardian, Dale Hoiberg, the editor-in-chief of Encyclopædia Britannica, noted:
People write of things they're interested in, and so many subjects don't get covered; and news events get covered in great detail. In the past, the entry on Hurricane Frances was more than five times the length of that on Chinese art, and the entry on Coronation Street was twice as long as the article on Tony Blair.[73]
This critical approach has been satirised as "Wikigroaning", a term coined by Jon Hendren[147] of the website Something Awful.[148] In the game, two articles (preferably with similar names) are compared: one about an acknowledged serious or classical subject and the other about a popular topic or current event.[149] Defenders of broad inclusion criteria have held that the encyclopedia's coverage of pop culture does not impose space constraints on the coverage of more serious subjects (see "Wiki is not paper"). Ivor Tossell wrote:
That Wikipedia is chock full of useless arcana (and did you know, by the way, that the article on "Debate" is shorter than the piece that weighs the relative merits of the 1978 and 2003 versions of Battlestar Galactica?) isn't a knock against it: Since it can grow infinitely, the silly articles aren't depriving the serious ones of space.[150]
Wikipedia has been accused of deficiencies in comprehensiveness because of its voluntary nature, and of reflecting the systemic biases of its contributors. Wikipedia co-founder Larry Sanger stated in 2004, "when it comes to relatively specialized topics (outside of the interests of most of the contributors), the project's credibility is very uneven."[151] He expanded on this 16 years later, in May 2020, by comparing how coverage affects the tone of the articles on U.S. presidents Donald Trump (seen as negative) and Barack Obama (seen as positive).[citation needed]
In a GamesRadar editorial, columnist Charlie Barrat juxtaposed Wikipedia's coverage of video game-related topics with its smaller content about topics that have greater real-world significance, such as God, World War II and former U.S. presidents.[152] Wikipedia has been praised for making it possible for articles to be updated or created in response to current events. Its editors have also argued that, as a website, Wikipedia is able to include articles on a greater number of subjects than print encyclopedias can.[153]
A 2011 study reported evidence of cultural bias in Wikipedia articles about famous people on both the English and Polish Wikipedias. These biases included those pertaining to the cultures of both the United States and Poland on each of the corresponding-language Wikipedias, as well as a pro-U.S./English-language bias on both of them.[154]
Wikipedia's notability guidelines, which editors use to determine whether a subject merits its own article, and the application of those guidelines, are the subject of much criticism.[155] In May 2018, a Wikipedia editor rejected a draft article about Donna Strickland before she won the Nobel Prize in Physics in October of the same year, because no independent sources had been given to show that Strickland was sufficiently notable by Wikipedia's standards. Journalists highlighted this as an indicator of the limited visibility of women in science compared to their male colleagues.[156][157]
The gender bias on Wikipedia is well documented and has prompted a movement to increase the number of notable women covered on Wikipedia through the Women in Red WikiProject. In an article entitled "Seeking Disambiguation", Annalisa Merelli interviewed Catalina Cruz, a candidate for office in Queens, New York, in the 2018 election, who had the notorious SEO disadvantage of sharing a name with a porn star who had a Wikipedia page. Merelli also interviewed the Wikipedia editor who wrote the candidate's ill-fated article (which was deleted, then restored after she won the election). Merelli described the Articles for Deletion process and pointed to other candidates who had pages on the English Wikipedia despite never having held office.[158]
Novelist Nicholson Baker, critical of deletionism, writes: "There are quires, reams, bales of controversy over what constitutes notability in Wikipedia: nobody will ever sort it out."[159]
Journalist Timothy Noah wrote of his treatment: "Wikipedia's notability policy resembles U.S. immigration policy before 9/11: stringent rules, spotty enforcement". In the same article, Noah mentions that the Pulitzer Prize-winning writer Stacy Schiff was not considered notable enough for a Wikipedia entry until she wrote her article "Know it All" about the Wikipedia Essjay controversy.[160]
On a more general level, a 2014 study found no correlation between the characteristics of a given Wikipedia page about an academic and the academic's notability as determined by citation counts. The metrics examined for each Wikipedia page included its length, the number of links to the page from other articles, and the number of edits made to the page. The study also found that Wikipedia did not adequately cover researchers identified by the ISI as highly cited.[161]
In 2020, Wikipedia was criticized for the amount of time it took for an article about Theresa Greenfield, a candidate for the 2020 United States Senate election in Iowa, to leave Wikipedia's Articles for Creation process and become published. Particularly, the criteria for notability were criticized, with The Washington Post reporting: "Greenfield is a uniquely tricky case for Wikipedia because she doesn't have the background that most candidates for major political office typically have (like prior government experience or prominence in business). Even if Wikipedia editors could recognize she was prominent, she had a hard time meeting the official criteria for notability."[162] Jimmy Wales also criticized the long process on his talk page.[163]
Wikipedia co-founder Jimmy Wales stated in 2006: "The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don't, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. There are no data or surveys to back that."[164]
A number of politically conservative commentators have argued that Wikipedia's coverage is affected by liberal bias.[165] Andrew Schlafly created Conservapedia because he found Wikipedia "increasingly anti-Christian and anti-American", citing its frequent use of British spelling and its coverage of topics like creationism and the effect of Christianity on the Renaissance.[166] In 2007, an article in The Christian Post criticised Wikipedia's coverage of intelligent design, saying that it was biased and hypocritical.[167] Lawrence Solomon of the National Review stated that Wikipedia articles on subjects like global warming, intelligent design, and Roe v. Wade are slanted in favor of liberal views.[168][non-primary source needed] In a September 2010 issue of the conservative weekly Human Events, Rowan Scarborough presented a critique of Wikipedia's coverage of American politicians prominent in the approaching midterm elections as evidence of systemic liberal bias. Scarborough compared the biographical articles of liberal and conservative opponents in Senate races in the Alaska Republican primary and the Delaware and Nevada general elections, emphasizing the quantity of negative coverage of Tea Party movement-endorsed candidates. He also cited criticism by Lawrence Solomon and quoted in full the lead section of Wikipedia's article on the conservative wiki Conservapedia as evidence of an underlying bias.[169][non-primary source needed] Jonathan Sidener of The San Diego Union-Tribune wrote that "vandalism and self-serving misinformation [are] common particularly in the political articles".[170][non-primary source needed] A 2015 study found that negative facts were more likely than positive facts to be removed from Wikipedia articles on U.S. senators, but found no significant difference relating to political affiliation.[171]
Amid the George Floyd protests, there were several disputes over racial justice on Wikipedia.[165] The Wikipedia community voted against a proposal to black out the website in support of Black Lives Matter, on the grounds that doing so might threaten Wikipedia's reputation for neutrality.[165][nb 3] The protests also led to the creation of the WikiProject Black Lives Matter, in line with AfroCROWD's Juneteenth efforts to improve the coverage of civil rights movement-related topics; the Black Lives Matter project was nominated for deletion on the grounds that it was "non-neutral advocacy".[165] On Wikipedia, neutrality is largely a process achieved through consensus. Social scientist Jackie Koerner took issue with the word neutrality, saying she preferred the word balance because she believed that one of Wikipedia's goals should be knowledge equity.[165]
Although Wikipedia states that it is not a primary source, it has been used as evidence in legal cases. In January 2007, The New York Times reported that U.S. courts vary greatly in their treatment of Wikipedia as a source of information, with over 100 judicial rulings having relied on the encyclopedia, including cases involving taxes, narcotics, and civil matters such as personal injury and matrimonial issues.[177]
In April 2012, The Wall Street Journal reported that in the five years since the 2007 New York Times story, federal courts of appeals had cited Wikipedia about 95 times. The story also reported that the U.S. Court of Appeals for the Fourth Circuit had vacated convictions in a cockfighting case because a juror used Wikipedia to research an element of the crime, expressing in its decision concerns about Wikipedia's reliability.[178]
In one notable case, a trademark decision concerning Formula One racing,[179] the UK Intellectual Property Office considered both the reliability of Wikipedia and its usefulness as a source of evidence:
"Wikipedia has sometimes suffered from the self-editing that is intrinsic to it, giving rise at times to potentially libellous statements. However, inherently, I cannot see that what is in Wikipedia is any less likely to be true than what is published in a book or on the websites of news organizations. [Formula One's lawyer] did not express any concerns about the Wikipedia evidence [presented by the plaintiff]. I consider that the evidence from Wikipedia can be taken at face value." The case turned substantively upon evidence cited from Wikipedia in 2006 as to the usage and interpretation of the term Formula One.
In the United States, the United States Court of Federal Claims has ruled that "Wikipedia may not be a reliable source of information"[180] and that "...Articles [from Wikipedia] do not—at least on their face—remotely meet this reliability requirement...A review of the Wikipedia website reveals a pervasive and, for our purposes, disturbing series of disclaimers...".[177][181] Such disclaimers include Wikipedia's statements that it cannot guarantee the validity of the information in its articles and that it has no formal peer review.
Among the other reasons given for such statements about Wikipedia's reliability are the lack of stability of its articles (ongoing editing may cause new readers to find information that differs from what was originally cited) and, according to Stephen Gillers, a professor at New York University Law School, the fact that "the most critical fact is public acceptance"; therefore "a judge should not use Wikipedia when the public is not prepared to accept it as authority".[182]
Wikipedia has also become a key source for coverage of some current news events, such as the 2007 Virginia Tech massacre, when The New York Times cited Wikimedia figures showing 750,000 page views of the article in the two days after the event:
Even The Roanoke Times, which is published near Blacksburg, Virginia, where the university is located, noted on Thursday that Wikipedia "has emerged as the clearinghouse for detailed information on the event".[183]
The Washington Post commented, in the context of 2008 presidential election candidate biographies, that despite occasional brief vandalism, "it's hard to find a more up-to-date, detailed, thorough article on Obama than Wikipedia's. As of Friday (14 September 2007), Obama's article—more than 22 pages long, with 15 sections covering his personal and professional life—had a reference list of 167 sources."[184]
Broad opinions
Several commentators have staked out a middle ground, asserting that the project contains much valuable knowledge and has some reliability, even if the degree has not yet been assessed with certainty. Among those taking this view are danah boyd, [sic] who in 2005 discussed Wikipedia as an academic source, concluding that "[i]t will never be an encyclopedia, but it will contain extensive knowledge that is quite valuable for different purposes",[185] and Bill Thompson, who stated, "I use the Wikipedia a lot. It is a good starting point for serious research, but I would never accept something that I read there without checking."[130]
The inconvenient reality is that people and their products are messy, whether produced in a top-down or bottom-up manner. Almost every source includes errors... Many non-fiction books are produced via an appallingly sloppy process... In this author's opinion, the flap over Wikipedia was significantly overblown, but contained a silver lining: People are becoming more aware of the perils of accepting information at face value. They have learned not to consult just one source.
Dan Gillmor, a Silicon Valley commentator and author, commented in October 2004: "I don't think anyone is saying Wikipedia is an absolute replacement for a traditional encyclopedia. But in the topics I know something about, I've found Wikipedia to be as accurate as any other source I've found."[73]
Larry Sanger stated on Kuro5hin in 2001 that "Given enough eyeballs, all errors are shallow",[186] which is a paraphrase of Linus' Law of open-source development.
Likewise, technology figure Joi Ito wrote on Wikipedia's authority, "[a]lthough it depends a bit on the field, the question is whether something is more likely to be true coming from a source whose resume sounds authoritative, or a source that has been viewed by hundreds of thousands of people (with the ability to comment) and has survived."[187]
In a 2008 letter to the editor of Physics Today, Gregg Jaeger, an associate professor at Boston University,[188] characterized Wikipedia as a medium that is susceptible to fostering "anarchy and distortions" in relation to scientific information.[189][nb 4]
Jean Goodwin has written on the reasons why Wikipedia may be trusted. According to Goodwin, while readers may not be able to assess the actual expertise of the authors of a given article, they can assess the passion of the Wikipedians, and this in itself provides a reason for trust.[203]
Dariusz Jemielniak, a Wikimedia Foundation Board of Trustees member, suggested in 2019 that given the arrival of Wikipedia's 18th birthday, "maybe academics should start treating it as an adult".[204]
Inaccurate information may persist in Wikipedia for a long time before it is challenged. The most prominent cases reported by mainstream media involved biographies of living persons. The Seigenthaler incident demonstrated that the subject of a biographical article must sometimes fix blatant lies about his or her own life. In May 2005, a user edited the biographical article on John Seigenthaler Sr. so that it contained several false and defamatory statements.[10] The inaccurate claims went unnoticed between May and September 2005 when they were discovered by Victor S. Johnson, Jr., a friend of Seigenthaler. Wikipedia content is often mirrored at sites such as Answers.com, which means that incorrect information can be replicated alongside correct information through a number of web sources. Such information can develop a misleading air of authority because of its presence at such sites: "Then [Seigenthaler's] son discovered that his father's hoax biography also appeared on two other sites, Reference.com and Answers.com, which took direct feeds from Wikipedia. It was out there for four months before Seigenthaler realized and got the Wikipedia entry replaced with a more reliable account. The lies remained for another three weeks on the mirror sites downstream."[206]
Seth Finkelstein reported in an article in The Guardian on his efforts to remove his own biography page from Wikipedia after it was subjected to defamation: "Wikipedia has a short biography of me, originally added in February 2004, mostly concerned with my internet civil liberties achievements. After discovering in May 2006 that it had been vandalised in March, possibly by a long-time opponent, and that the attack had been subsequently propagated to many other sites which (legally) repackage Wikipedia's content, the article's existence seemed to me overall to be harmful rather than helpful." He added: "For people who are not very prominent, Wikipedia biographies can be an 'attractive nuisance'. It says, to every troll, vandal, and score-settler: 'Here's an article about a person where you can, with no accountability whatsoever, write any libel, defamation, or smear. It won't be a marginal comment with the social status of an inconsequential rant, but rather will be made prominent about the person, and reputation-laundered with the institutional status of an encyclopedia.'"[207]
In the same article, Finkelstein recounts how he voted in the deletion discussion on his own biography, arguing that he was "not notable enough", in order to have it removed from Wikipedia. He goes on to recount a similar story involving Angela Beesley, previously a prominent member of the foundation that runs Wikipedia. Taner Akçam, a Turkish history professor at the University of Minnesota, was detained at the Montreal airport in 2007 after his Wikipedia article had been vandalized by Turkish nationalists. Although that mistake was resolved, he was arrested again in the United States on the same suspicion two days later.[208]
On March 2, 2007, MSNBC reported that Hillary Clinton had been incorrectly listed for 20 months in her Wikipedia biography as valedictorian of her class of 1969 at Wellesley College. (Hillary Rodham was not the valedictorian, though she did speak at commencement.)[209] The article included a link to the Wikipedia edit,[210] where the incorrect information was added on July 9, 2005. After the msnbc.com report, the inaccurate information was removed the same day.[211][nb 5]
Attempts to perpetrate hoaxes may not be confined to editing existing Wikipedia articles. In October 2005, Alan McIlwraith, a former call center worker from Scotland, created a Wikipedia article in which he claimed to be a highly decorated war hero. The article was quickly identified by other users as unreliable (see Wikipedia Signpost article, April 17, 2006); however, McIlwraith had also succeeded in convincing a number of charities and media organizations that he was who he claimed to be: "The 28-year-old, who calls himself Captain Sir Alan McIlwraith, KBE, DSO, MC, has mixed with celebrities for at least one fundraising event. But last night, an Army spokesman said: 'I can confirm he is a fraud. He has never been an officer, soldier or Army cadet.'"[212]
In May 2010, French politician Ségolène Royal publicly praised the memory of Léon-Robert de l'Astran, an 18th-century naturalist, humanist and son of a slave trader, who had opposed the slave trade. The newspaper Sud-Ouest revealed a month later that de l'Astran had never existed—except as the subject of an article in the French Wikipedia. Historian Jean-Louis Mahé discovered that de l'Astran was fictional after a student, interested by Royal's praise of him, asked Mahé about him. Mahé's research led him to realize that de l'Astran did not exist in any archives, and he traced the hoax back to the Rotary Club of La Rochelle. The article, created by members of the Club in January 2007, had thus remained online for three years—unsourced—before the hoax was uncovered. Upon Sud-Ouest's revelation—repeated in other major French newspapers—French Wikipedia administrator DonCamillo immediately deleted the article.[9][205][213][214][215][216]
There have also been instances of users deliberately inserting false information into Wikipedia in order to test the system and demonstrate its alleged unreliability. Journalist Gene Weingarten ran such a test in 2007 by anonymously inserting false information into his own biography. The fabrications were removed 27 hours later by a Wikipedia editor who was regularly watching changes to that article.[217] Television personality Stephen Colbert lampooned this drawback of Wikipedia, calling it wikiality.[218]
"Death by Wikipedia" is a phenomenon in which a person is erroneously proclaimed dead through vandalism. Articles about the comedian Paul Reiser, British television host Vernon Kay, French professor Bertrand Meyer, and the West Virginia Senator Robert Byrd, who died on June 28, 2010, have been vandalized in this way.[219][220][221][nb 6]
Other false information
In June 2007, an anonymous Wikipedia contributor became entangled in the investigation of the Chris Benoit double murder and suicide because of an unverified piece of information he had added to the "Chris Benoit" English Wikipedia article. The information, regarding the death of Benoit's wife, was added fourteen hours before police discovered the bodies of Benoit and his family.[222] Police detectives seized computer equipment from the man held responsible for the postings, but believed he was uninvolved and did not press charges.[223] The IP address from which the edit was made was traced to earlier instances of Wikipedia vandalism. The contributor apologized on Wikinews, saying: "I will never vandalize anything on Wikipedia or post wrongful information. I will never post anything here again unless it is pure fact ... ."[224]
On August 29, 2008, shortly after the first-round draw for the UEFA Cup (now the UEFA Europa League) was completed, an edit was made to the article on the football club AC Omonia, apparently by users of the website B3ta, inserting a fabricated fan nickname.[225][nb 7] On September 18, 2008, David Anderson, a British journalist writing for the Daily Mirror, repeated the nickname in his match preview ahead of Omonia's game with Manchester City, which appeared in the web and print versions of the Mirror; the nickname was quoted again in subsequent editions on September 19.[226][227]
In May 2009, University College Dublin sociology student Shane Fitzgerald added an incorrect quote to the article on the recently deceased composer Maurice Jarre. Fitzgerald wanted to demonstrate the potential dangers of news reporters' reliance on the internet for information.[228] Although Fitzgerald's edits were removed three times from the Wikipedia article for lack of sourcing,[229] they were nevertheless copied into obituary columns in newspapers worldwide.[230] Fitzgerald believes that if he had not come forward his quote would have remained in history as fact.[229]
After the 2010 FIFA World Cup, FIFA president Sepp Blatter was presented with the Order of the Companions of Oliver Reginald Tambo. The citation, however, read: "The Order of the Companions of OR Tambo in Gold—awarded to Joseph Sepp Bellend Blatter (1936–) for his exceptional contribution to the field of football and support for the hosting of the Fifa World Cup on the African continent", after the name on his Wikipedia entry was vandalized.[232]
In October 2012, the Asian Football Confederation official website published an article about the United Arab Emirates national football team's bid to qualify for the 2015 AFC Asian Cup, in which the team's nickname was stated to be the "Sand Monkeys". This was the indirect result of vandalism of the Wikipedia article on the team, and the AFC was forced to apologise for what was perceived as a racist slur.[233][234]
In December 2012, an article titled "Bicholim conflict"[235] was deleted after standing since 2007.[236] It described a war supposedly fought in India between 1640 and 1641, which was later confirmed to be completely fictitious.[237] The hoax article had won Wikipedia's "Good Article" status, conferred on fewer than 1 percent of articles on the site, a few months after its creation in 2007, and held that status for five years.[238]
In March 2013, it was discovered that both Wikipedia and IMDb had for three-and-a-half years contained articles on a fictitious Russian filmmaker named Yuri Gadyukin. False information had been planted in both sites as part of a viral promotion campaign for an upcoming film.[239]
In May 2014, The New Yorker reported that a 17-year-old student had added an invented nickname to the Wikipedia article on the coati in 2008, saying coatis were also known as "Brazilian aardvarks". The taxonomically false information, inserted as a private joke, lasted for six years in Wikipedia and over this time came to be propagated by hundreds of websites, several newspapers (one of which was later cited as a source in Wikipedia) and even books published by university presses. It was only removed from Wikipedia after publication of the New Yorker article, in which the student explained how the joke had come about.[1][2]
In March 2015, it became known that an article on Wikipedia entitled "Jar'Edo Wens", purportedly about an Australian aboriginal deity of that name, was a hoax. The article had survived for more than nine years before being deleted, making it one of the longest-lived documented hoax articles in Wikipedia's history. The article spawned mentions of the fake god on numerous other websites as well as in a book titled Atheism and the Case Against Christ.[240][241][242]
In June 2022, it was discovered that an editor known as Zhemao (Chinese: 折毛) had created over 200 articles on the Chinese Wikipedia about fabricated events in medieval Russian history.[246] Dubbed the Zhemao hoaxes, the hoax articles combined research and fantasy, creating an alternate history centered around a "Kashin silver mine" and political ties between "princes of Tver" and "dukes of Moscow".[247]
In August 2022, Wikipedia criticism site Wikipediocracy published an interview with a hoaxer who ten years prior had added a hoax to Wikipedia, claiming that an "Alan MacMasters" had invented the electric toaster. The false information was widely reproduced online as well as in newspapers and books subsequently cited in Wikipedia.[248][249][250]
In 2023, Jan Grabowski and Shira Klein published an article in the Journal of Holocaust Research in which they claim to have discovered a "systematic, intentional distortion of Holocaust history" on the English-language Wikipedia.[251] Analysing 25 Wikipedia articles and almost 300 back pages (including talk pages, noticeboards and arbitration cases), Grabowski and Klein believe they have shown how a small group of editors managed to impose a fringe narrative on Polish-Jewish relations, informed by Polish nationalist propaganda and far removed from evidence-driven historical research. In addition to the article on the Warsaw concentration camp, the authors claim that the activities of the editors' group had an effect on several articles, such as History of the Jews in Poland, Rescue of Jews by Poles during the Holocaust and Jew with a coin. Supposed nationalist editing on these and other articles allegedly included content ranging "from minor errors to subtle manipulations and outright lies", examples of which the authors offer.[251]
While Wikipedia policy requires articles to have a neutral point of view, there have been attempts to place a spin on articles. In January 2006 several staffers of members of the U.S. House of Representatives attempted to cleanse their respective bosses' biographies on Wikipedia, and to insert negative remarks on political opponents. References to a campaign promise by Martin Meehan to surrender his seat in 2000 were deleted, and negative comments were inserted into the articles on U.S. Senator Bill Frist and Eric Cantor, a congressman from Virginia. Numerous other changes were made from an IP address which is assigned to the House of Representatives.[252] In an interview, Jimmy Wales remarked that the changes were "not cool".[253]
On August 31, 2008, The New York Times ran an article detailing the edits made to the biography of Sarah Palin in the wake of her nomination as running mate of John McCain. During the 24 hours before the McCain campaign announcement, 30 edits, many of them flattering details, were made to the article by Wikipedia single-purpose user identity Young Trigg. This person later acknowledged working on the McCain campaign, and having several Wikipedia user accounts.[254][255]
Larry Delay and Pablo Bachelet write that, from their perspective, some articles dealing with Latin American history and groups (such as the Sandinistas and Cuba) lack political neutrality and are written from a sympathetic Marxist standpoint that treats socialist dictatorships favorably at the expense of alternative positions.[256][257]
In November 2007, libelous accusations were made against two politicians from southwestern France, Jean-Pierre Grand and Hélène Mandroux-Colas, on their Wikipedia biographies. Jean-Pierre Grand asked the president of the French National Assembly and the prime minister of France to reinforce the legislation on the penal responsibility of Internet sites and of authors who peddle false information in order to cause harm.[258] Senator Jean Louis Masson then requested the Minister of Justice to tell him whether it would be possible to increase the criminal responsibilities of hosting providers, site operators, and authors of libelous content; the minister declined to do so, recalling the existing rules in the LCEN law.[259]
In 2009, Wikipedia banned the Church of Scientology from editing any articles on its site, after articles concerning Scientology had been edited by members of the group to improve its portrayal.[260]
On August 25, 2010, the Toronto Star reported that the Canadian "government is now conducting two investigations into federal employees who have taken to Wikipedia to express their opinion on federal policies and bitter political debates."[261]
In 2010, Al Jazeera's Teymoor Nabili suggested that the article on the Cyrus Cylinder had been edited for political purposes through "an apparent tussle of opinions in the shadowy world of hard drives and 'independent' editors that comprise the Wikipedia industry." He suggested that after the 2009 Iranian presidential election and the ensuing "anti-Iranian activities", a "strenuous attempt to portray the cylinder as nothing more than the propaganda tool of an aggressive invader" was visible. According to his analysis of the edits made during 2009 and 2010, they represented "a complete dismissal of the suggestion that the cylinder, or Cyrus' actions, represent concern for human rights or any kind of enlightened intent", in stark contrast to Cyrus' own reputation among the people of Babylon as recorded in the Old Testament.[262]
Arab–Israeli conflict
In April 2008, the Boston-based Committee for Accuracy in Middle East Reporting in America (CAMERA) organized an e-mail campaign to encourage readers to correct perceived Israel-related biases and inconsistencies in Wikipedia.[263] Excerpts of some of the e-mails were published in the July 2008 issue of Harper's Magazine under the title of "Candid camera".[264]
CAMERA argued the excerpts were unrepresentative and that it had explicitly campaigned merely "toward encouraging people to learn about and edit the online encyclopedia for accuracy".[265] According to some defenders of CAMERA, serious misrepresentations of CAMERA's role emanated from the competing Electronic Intifada group; moreover, it is said, some other Palestinian advocacy groups have been guilty of systematic misrepresentations and manipulative behaviors but have not suffered bans of editors amongst their staff or volunteers.[266][267]
Five editors involved in the campaign were sanctioned by Wikipedia administrators.[268] Israeli diplomat David Saranga said that Wikipedia is generally fair in regard to Israel. When confronted with the fact that the entry on Israel mentioned the word "occupation" nine times, whereas the entry on the Palestinian people mentioned "terror" only once, he replied: "It means only one thing: Israelis should be more active on Wikipedia. Instead of blaming it, they should go on the site much more, and try and change it."[269]
Political commentator Haviv Rettig Gur, reviewing widespread perceptions in Israel of systemic bias in English-language Wikipedia articles, has argued that deeper structural problems create this bias: anonymous editing favors biased results, especially when those whom Gur calls "pro-Palestinian activists" organize concerted campaigns, as has putatively been done in articles dealing with Arab-Israeli issues, and current Wikipedia policies, while well-meant, have proven ineffective in handling this.[270]
On August 3, 2010, it was reported that the Yesha Council, together with Israel Sheli (My Israel), a network of pro-Israel activists committed to promoting Zionism online, was organizing a workshop in Jerusalem to teach people how to edit Wikipedia articles in a pro-Israeli way.[271][272][273] Around 50 people took part in the course.[273]
The project organiser, Ayelet Shaked, who has since been elected to Israel's parliament, was interviewed on Arutz Sheva Radio. She emphasized that the information added has to be reliable and meet Wikipedia's rules. She cited examples such as the use of the term "occupation" in Wikipedia entries, as well as the editing of entries that link Israel with Judea and Samaria and with Jewish history.[274]
"We don't want to change Wikipedia or turn it into a propaganda arm," commented Naftali Bennett, director of the Yesha Council. "We just want to show the other side. People think that Israelis are mean, evil people who only want to hurt Arabs all day."[275] "The idea is not to make Wikipedia rightist but for it to include our point of view," he said in another interview.[273]
A course participant explained that the course is not a "Zionist conspiracy to take over Wikipedia"; rather, it is an attempt to balance information about disputed issues presented in the online encyclopedia.
[T]he goal of this workshop was to train a number of pro-Israelis how to edit Wikipedia so that more people could present the Israeli side of things, and thus the content would be more balanced... Wikipedia is meant to be a fair and balanced source, and it is that way by having people from all across the spectrum contributing to the content.[276]
Following the course announcement, Abdul Nasser An-Najar, the head of the Palestinian Journalists Syndicate, said there were plans to set up a counter-group to ensure that the Palestinian view is presented online, as the "next regional war will be [a] media war."[275]
In 2011, Wikipedia founder Jimmy Wales stated in retrospect about the course organized by Israel Sheli, "we saw absolutely no impact from that effort whatsoever. I don't think it ever—it was in the press but we never saw any impact."[277]
Editing for financial rewards
In an October 2012 Salon story, Wikipedia co-founder Jimmy Wales stated that he was against the practice of paid editing of Wikipedia, as are a number of long-time members of Wikipedia's community. Nonetheless, a number of organizations do pay employees to edit Wikipedia articles; one writer, Soraya Field Fiorio, stated that she writes commissioned Wikipedia articles for writers and musicians for $30 an hour. According to Fiorio, her clients control the articles' content in the same way that they control press releases, which function as part of publicity strategies.[278] In January 2007, Rick Jelliffe claimed in a story carried by CBS[279] and IDG News Service[280][281] that Microsoft had offered him compensation in exchange for his future editorial services on OOXML. A Microsoft spokesperson, quoted by CBS, commented that "Microsoft and the writer, Rick Jelliffe, had not determined a price and no money had changed hands, but they had agreed that the company would not be allowed to review his writing before submission".
In a story covered by the BBC, Jeffrey Merkey claimed that in exchange for a donation his Wikipedia entry was edited in his favor. Jay Walsh, a spokesman for Wikipedia, flatly denied the allegations in an interview given to the Daily Telegraph.[282]
In a story covered by InformationWeek, Eric Goldman, an assistant law professor at Santa Clara University in California, argued that "eventually, marketers will build scripts to edit Wikipedia pages to insert links and conduct automated attacks on Wikipedia",[283] putting the encyclopedia beyond the ability of its editors to mount countermeasures, particularly because of a vicious circle in which the strain of responding to these attacks drives core contributors away, increasing the strain on those who remain.[284][nb 8]
Conflicts involving Wikipedia policy makers
In February 2008, British technology news and opinion website The Register stated that a prominent administrator of Wikipedia had edited a topic area where he had a conflict of interest to keep criticism to a bare minimum, as well as altering the Wikipedia policies regarding personal biography and conflict of interest to favour his editing.[285]
Some of the most scathing criticism of Wikipedia's claimed neutrality came in The Register, which in turn was allegedly criticized by founding members of the project. According to The Register: "In short, Wikipedia is a cult. Or at least, the inner circle is a cult. We aren't the first to make this observation. On the inside, they reinforce each other's beliefs. And if anyone on the outside questions those beliefs, they circle the wagons. They deny the facts. They attack the attacker. After our Jossi Fresco story, Fresco didn't refute our reporting. He simply accused us of 'yellow journalism'. After our Overstock.com article, Wales called us 'trash'."[286]
Charles Arthur in The Guardian said that "Wikipedia, and so many other online activities, show all the outward characteristics of a cult."[287]
In February 2015, a longstanding Wikipedia administrator was site-banned after Wikipedia's Arbitration Committee found that they had, over a period of several years, manipulated the content of Wikipedia articles to add positive content and remove negative content about the controversial Indian Institute of Planning and Management and its dean, Arindam Chaudhuri. An Indian journalist commented in Newsweek on the importance of the Wikipedia article to the institute's PR campaign and voiced the opinion that "by letting this go on for so long, Wikipedia has messed up perhaps 15,000 students' lives".[288][289]
Scientific disputes
The 2005 Nature study also gave two brief examples of challenges that Wikipedian science writers purportedly faced on Wikipedia. The first concerned the addition of a section on violence to the schizophrenia article, which one of the article's regular editors, neuropsychologist Vaughan Bell, regarded as little more than a "rant" about the need to lock people up; editing it stimulated him to look up the literature on the topic.[25]
The second dispute reported by Nature involved the climatologist William Connolley and protracted disputes between editors of climate change topics, in which Connolley was placed on parole and several opponents were banned from editing climate-related articles for six months;[25] a separate paper commented that this was more about etiquette than bias and that Connolley did "not suffer fools gladly".[290]
See also
Bourgeois v. Peters (2004), one of the earliest court opinions to cite and quote Wikipedia
^There were also disputes on the "George Floyd", "George Floyd protests", and "Murder of George Floyd" articles over whether they should mention Floyd's prior criminal charges, use the word riot (rejected because most reliable sources did not refer to the protests as riots), and rename the article from Death to Killing, respectively. While death was the more neutral term, editors felt that killing was the more accurate term and neutral by definition. As for the criminal charges, those in favour cited in support that Wikipedia is not censored, while those opposed cited weight policy, positing that the charges would be undue to include because his past criminal history had no relevance to his murder.[165]
^The letter was in response to a review of his book Quantum Information: An Overview, that had questioned "whether there is an audience for such encyclopedic texts, especially given the easy access to online sources of information such as the arXiv e-print server and Wikipedia."
^Between the two edits, the wrong information had stayed in the Clinton article while it was edited more than 4,800 times over 20 months.
^Wikipedia considers vandalism as "any addition, removal, or change of content in a deliberate attempt to compromise the integrity of Wikipedia". The Wikipedia page "Researching with Wikipedia" states: "Wikipedia's radical openness means that any given article may be, at any given moment, in a bad state: for example, it could be in the middle of a large edit or it could have been recently vandalized. While blatant vandalism is usually easily spotted and rapidly corrected, Wikipedia is certainly more subject to subtle vandalism than a typical reference work."
^It added the following erroneous information to the section titled "The fans": "A small but loyal group of fans are lovingly called "The Zany Ones"—they like to wear hats made from discarded shoes and have a song about a little potato."
^Wikipedia operates bots to aid in the detection and removal of vandalism, and uses nofollow and a CAPTCHA to discourage and filter additions of external links.
^ Viégas, Fernanda B.; Wattenberg, Martin; Dave, Kushal (2003). "History flow: results". research.ibm.com. IBM Collaborative User Experience Research Group. Archived from the original on November 2, 2006. Retrieved July 7, 2016.
^ Petiška, Eduard; Moldan, Bedřich (December 9, 2019). "Indicator of quality for environmental articles on Wikipedia at the higher education level". Journal of Information Science. 47 (2): 269–280. doi:10.1177/0165551519888607. ISSN 0165-5515. S2CID 214401940.
^ Reavley, N. J.; MacKinnon, A. J.; Morgan, A. J.; Alvarez-Jimenez, M.; Hetrick, S. E.; Killackey, E.; Nelson, B.; Purcell, R.; Yap, M. B. H.; Jorm, A. F. (2011). "Quality of information sources about mental disorders: A comparison of Wikipedia with centrally controlled web and printed sources". Psychological Medicine. 42 (8): 1753–1762. doi:10.1017/S003329171100287X. hdl:11343/59260. PMID 22166182. S2CID 13329595.
^Timmer, John (October 18, 2007). "Anonymous "good samaritans" produce Wikipedia's best content, says study". Ars Technica. Archived from the original on October 26, 2007. Retrieved October 27, 2007. Good samaritans with less than 100 edits made higher-quality contributions than those with registered accounts and equal amounts of content. In fact, anonymous contributors with a single edit had the highest quality of any group. But quality steadily declined, and more-frequent anonymous contributors were anything but Samaritans; their contributions generally didn't survive editing... The authors also recognize that contributions in the form of stubs on obscure topics might survive unaltered indefinitely, inflating the importance of single contributions...Objective ratings of quality are difficult, and it's hard to fault the authors for attempting to find an easily-measured proxy for it. In the absence of independent correlation, however, it's not clear that the measurement used actually works as a proxy. Combined with the concerns regarding anonymous contributor identity, there are enough problems with this study that the original question should probably be considered unanswered, regardless of how intuitively satisfying these results are.
^Gertner 2023: "While estimates of its influence can vary, Wikipedia is probably the most important single source in the training of A.I. models. ... In fact, no one I spoke with in the tech community seemed to know if it would even be possible to build a good A.I. model without Wikipedia."
^See author-acknowledged comments in response to the citation of the Nature study, at PLoS One, 2014, Citation of fundamentally flawed Nature quality 'study', in response to T. Yasseri et al. (2012), Dynamics of Conflicts in Wikipedia, published June 20, 2012, DOI 10.1371/journal.pone.0038869. Retrieved July 21, 2014. Archived January 16, 2016, at the Wayback Machine.
^Michael Kurzidim: Wissenswettstreit. Die kostenlose Wikipedia tritt gegen die Marktführer Encarta und Brockhaus an [Knowledge contest: the free Wikipedia takes on market leaders Encarta and Brockhaus], in: c't 21/2004, October 4, 2004, pp. 132–139.
^Dorothee Wiegand: "Entdeckungsreise. Digitale Enzyklopädien erklären die Welt" [Voyage of discovery: digital encyclopedias explain the world]. c't 6/2007, March 5, 2007, pp. 136–145. Original quote: "Wir haben in den Texten der freien Enzyklopädie nicht mehr Fehler gefunden als in denen der kommerziellen Konkurrenz" ["We found no more errors in the texts of the free encyclopedia than in those of its commercial competitors"].
^Bragues, George (April 2007). "Wiki-Philosophizing in a Marketplace of Ideas: Evaluating Wikipedia's Entries on Seven Great Minds". SSRN 978177.
^ PC Pro magazine, August 2007, p. 136, "Wikipedia Uncovered".
^Schönert, Ulf; Güntheroth, Horst (December 2007). "Wikipedia: Wissen für alle" [Wikipedia: Knowledge for Everyone]. Stern (in German). Vol. 2007, no. 50. pp. 30–44. Archived from the original on January 11, 2023. Retrieved January 11, 2023. Einige Wikipedia-Artikel sind für Laien schlicht zu kompliziert, viele zu weitschweifig, urteilten die Tester. [Some Wikipedia articles are simply too complicated for laypersons, many too long-winded, judged the testers.]
^Rector, Lucy Holman (2008). "Comparison of Wikipedia and other encyclopedias for accuracy, breadth, and depth in historical articles". Reference Services Review. 36 (1): 7–22. doi:10.1108/00907320810851998.
^Luyt, Brendan; Tan, Daniel (April 1, 2010). "Improving Wikipedia's credibility: References and citations in a sample of history articles". Journal of the American Society for Information Science and Technology. 61 (4): 715–722. doi:10.1002/asi.21304. hdl:10356/95416. ISSN 1532-2890.
^Brown, Adam R. (April 8, 2011). "Wikipedia as a Data Source for Political Scientists: Accuracy and Completeness of Coverage". PS: Political Science & Politics. 44 (2): 339–343. doi:10.1017/S1049096511000199. S2CID 154963796.
^Phillips, Jennifer; Lam, Connie; Palmisano, Lisa (July 1, 2014). "Analysis of the accuracy and readability of herbal supplement information on Wikipedia". Journal of the American Pharmacists Association. 54 (4): 406–14. doi:10.1331/JAPhA.2014.13181. PMID25063262.
^ Waldman, Simon (October 26, 2004). "Who knows?". The Guardian. London. Archived from the original on August 25, 2014. Retrieved February 3, 2011.
^"About Wikipedia". Trent University Library. Trent University. April 30, 2007. Archived from the original on December 4, 2005. Retrieved April 13, 2010.
^"I want my Wikipedia!". Library Journal. April 2006. Archived from the original on December 4, 2015. Retrieved October 23, 2015.
^Feng Shi; Misha Teplitskiy; Eamon Duede; James A. Evans (March 4, 2019). "The wisdom of polarized crowds" (PDF). Nature Human Behaviour. 3 (4): 329–336. arXiv:1712.06414. doi:10.1038/S41562-019-0541-6. ISSN 2397-3374. PMID 30971793. Wikidata Q47248083. They continued, "To explore whether political diversity has an upper bound beyond which polarization hampers performance, we re-estimated the regression models of quality with a quadratic polarization term. Estimates suggest that quality may eventually decline with increasing polarization, but the optimal level of polarization is above that realized by 95% of the teams in this study. For the 5% most polarized teams, there is no statistically significant pattern between polarization and quality. In other words, we do not find evidence that very high levels of political polarization hampers Wikipedia performance." (p. 11)
^Youngwood, Susan (April 1, 2007). "Wikipedia: What do they know; when do they know it, and when can we trust it?". Rutland Herald. Archived from the original on November 8, 2016. Retrieved May 16, 2019. Perhaps the most important thing to understand about Wikipedia—both its genius and its Achilles heel—is that anyone can create or modify an entry. Anyone means your 10-year-old neighbor or a Nobel Prize winner—or an editor like me, who is itching to correct a grammar error in that Wikipedia entry that I just quoted. Entries can be edited by numerous people and be in constant flux. What you read now might change in five minutes. Five seconds, even.
^Cohen, Noam (February 27, 2007). "Wikipedia on an academic hit list". NY Times News Service. Archived from the original on March 5, 2007. Retrieved April 16, 2007. Middlebury professor Thomas Beyer, of the Russian department, said: 'I guess I am not terribly impressed by anyone citing an encyclopedia as a reference point, but I am not against using it as a starting point.'
^Polk, Tracy; Johnston, Melissa P.; Evers, Stephanie (2015). "Wikipedia Use in Research: Perceptions in Secondary Schools". TechTrends: Linking Research & Practice to Improve Learning. 59 (3): 92–102. doi:10.1007/s11528-015-0858-6. S2CID 62595811.
^The study explains that "In the survey, all respondents under Condition 1 were asked if there were any mistakes in the article they had been asked to read. Only five reported seeing mistakes and one of those five reported spelling mistakes rather than factual errors. This suggests that 13 percent of Wikipedia's articles have errors." Thus 80% of the 13% related to factual errors and 20% of the 13% related to spelling errors. Chesney, Thomas (May 16, 2006). "An empirical examination of Wikipedia's credibility". First Monday. doi:10.5210/fm.v11i11.1413. Archived from the original on April 11, 2010. Retrieved January 20, 2010.
^Bailey, Matt (October 2, 2007). "Using Wikipedia". Lawrence McKinley Gould Library, Carleton College. Archived from the original on November 3, 2007. Retrieved October 31, 2007.
^What is Happening in the Educational System of the Contemporary World and How "The State Program on Reforms of the Higher Education System in the Republic of Azerbaijan for the Period of 2008–2012" May Best be Carried Out (in Azeri). Khazar University Press, 2008
^George, Yolanda S. & Malcolm, Shirley S. "Perspectives from AAAS" (PDF). American Association for the Advancement of Science. Archived from the original (PDF) on October 29, 2007. Retrieved October 27, 2007.
^ Lavsa, S. M.; Corman, S. L.; Culley, C. M.; Pummer, T. L. (2011). "Reliability of Wikipedia as a medication information source for pharmacy students". Currents in Pharmacy Teaching and Learning. 3 (2): 154–158. doi:10.1016/j.cptl.2011.01.007.
^Volsky, Peter G.; Baldassari, Cristina M.; Mushti, Sirisha; Derkay, Craig S. (September 2012). "Quality of Internet information in pediatric otolaryngology: A comparison of three most referenced websites". International Journal of Pediatric Otorhinolaryngology. 76 (9): 1312–1316. doi:10.1016/j.ijporl.2012.05.026. PMID22770592.
^Matheson, D., & Matheson-Monnet, C. (2017). Wikipedia as informal self education for clinical decision-making in medical practice. Open Medicine Journal, 4 (1), 15–25. https://doi.org/10.2174/1874220301704010015
^Yacob, M., Lotfi, S., Tang, S., & Jetty, P. (2020). Wikipedia in vascular surgery medical education: Comparative study. JMIR Medical Education, 6(1), e18076. https://doi.org/10.2196/18076
^Banchik LH, Gray B. What happened to my Index Medicus? Nutr Clin Pract. 2024 Aug;39(4):743-750. doi: 10.1002/ncp.11173. Epub 2024 Jun 12. PMID: 38864650.
^Orlowski, Andrew (December 12, 2005). "Who's responsible for Wikipedia?". The Register. Archived from the original on February 6, 2009. Retrieved June 30, 2009. The public has a firm idea of what an 'encyclopedia' is, and it's a place where information can generally be trusted, or at least slightly more trusted than what a labyrinthine, mysterious bureaucracy can agree upon, and surely more trustworthy than a piece of spontaneous graffiti—and Wikipedia is a king-sized cocktail of the two.
^ Reid Priedhorsky, Jilin Chen, Shyong (Tony) K. Lam, Katherine Panciera, Loren Terveen, John Riedl, "Creating, destroying, and restoring value in wikipedia", Proc. GROUP 2007, doi: ACM.org
^ Cohen, Martin (August 27, 2008). "Encyclopaedia Idiotica". Times Higher Education (August 28, 2008): 26. Archived from the original on September 6, 2011. Retrieved May 31, 2011.
^Stephen Colbert, The Colbert Report, episode 3109, August 21, 2007.
^Brophy-Warren, Jamin (June 17, 2007). "Oh, that John Locke". The Wall Street Journal (June 16, 2007): P3. Archived from the original on September 4, 2017. Retrieved August 8, 2017.
^Hendren, Johnny "DocEvil" (June 5, 2007). "The Art of Wikigroaning". Something Awful. Archived from the original on June 16, 2007. Retrieved June 17, 2007.
^Callahan, Ewa S.; Herring, Susan C. (October 2011). "Cultural bias in Wikipedia content on famous persons". Journal of the American Society for Information Science and Technology. 62 (10): 1899–1915. doi:10.1002/asi.21577. S2CID 14767483.
^Samoilenko, Anna; Yasseri, Taha (January 22, 2014). "The distorted mirror of Wikipedia: a quantitative analysis of Wikipedia coverage of academics". EPJ Data Science. 3 (1). arXiv:1310.8508. doi:10.1140/epjds20. S2CID 4971771.
^Sato, Yumiko (January 9, 2021). 日本語版ウィキペディアで「歴史修正主義」が広がる理由と解決策 [Reasons Why "Historical Revisionism" is Widespread on Japanese Wikipedia and Solutions for It]. Yumiko Sato's Music Therapy Journal (in Japanese). Retrieved August 23, 2021.
^Hillshafer, David (2013). "The Mass Murder Problem". Skeptic. 18 (1): 24–32.
^Lippard, Jim (2012). "The Decline and (Probable) Fall of the Scientology Empire!". Skeptic Vol. 17 No. 1. pp. 18–27. The citations in question are Citations 10, 14 and 16, as seen on page 27.
^Sheaffer, Robert (2014). "Between a Beer Joint and a Highway Warning Sign: The 'Classic' Cash-Landrum Case Unravels". "Psychic Vibrations". Skeptical Inquirer. 38 (2): 28.
^Goodwin, Jean. (2010). The authority of Wikipedia. In Juho Ritola (Ed.), Argument cultures: Proceedings of the Ontario Society for the Study of Argumentation Conference. Windsor, ON, Canada: Ontario Society for the Study of Argumentation. CD-ROM. 24 pp.
^"Mistakes and hoaxes on-line". Australian Broadcasting Corporation. April 15, 2006. Archived from the original on November 13, 2012. Retrieved April 28, 2007.
^O'Neil, Mathieu (March 2010). "Shirky and Sanger, or the costs of crowdsourcing". Journal of Science Communication. 9 (1). International School for Advanced Studies. Archived from the original on April 29, 2011. Retrieved May 31, 2011.