Wikipedia:Editorial oversight and control
This page summarizes the various processes and structures by which Wikipedia articles and their editing are editorially controlled, and the safeguards built into that model to ensure the quality of article content.
Rather than one sole form of control, Wikipedia relies upon multiple approaches, and these overlap to provide more robust coverage and resilience. This is similar to computer security, where multiple overlapping systems are considered essential to avoid creating a weak point.
Overview of the Wikipedia editorial structure
There are tens of thousands of regular editors, from expert scholars to casual readers. Anyone who visits the site can edit it, and this openness has encouraged the contribution of a tremendous amount of content. Mechanisms help community members watch for bad edits; a few hundred administrators have special powers to enforce good behavior; and a judicial-style arbitration committee considers the few situations that remain unresolved, deciding on the withdrawal or restriction of editing privileges or other sanctions when needed, after all other consensus remedies have been tried.
Wikipedia is a wiki: anyone can contribute, and everyone is encouraged to. Overall, Wikipedia gets hundreds of times more well-meaning editors than bad ones, so problematic editors rarely obtain much of a foothold. In the normal course of events, the primary control over editorship is the effective utilisation of the large number of well-intentioned editors to overcome issues raised by the much smaller number of problematic editors. It is inherent in the Wikipedia model that poor information can be added, but that over time those editing an article reach a strong consensus and quality improves in a form of group learning, so that substandard edits are very rapidly removed. This assumption is still being tested, and its limitations and reliability are not yet a settled matter – Wikipedia is a pioneer in communal knowledge-building of this kind.
The Wikipedia community is largely self-organising: anyone may build a reputation as a competent editor and become involved in any role they choose, subject to peer approval. Individuals often choose to specialise in particular tasks, such as reviewing articles at others' request, watching current edits for vandalism, or watching newly created articles for quality-control purposes. Editors who find that administrator responsibilities would benefit their ability to help the community may ask their peers for agreement to undertake such roles, a structure which enforces meritocracy and communal standards of editorship and conduct. At present, an approval rating of around 75-80% after community inquiry is considered the requirement for such a role, a standard which tends to ensure a high level of experience, trust and familiarity across a broad front of projects within Wikipedia.
(Such rights are stringently restricted, ensuring that editorial and administrative matters remain separated powers and only rarely give rise to editorial conflicts of interest.)
Wikipedia's editorial control process
Wikipedia has somewhat more formal systems of editorial control than are apparent to a newcomer, with ten main overlapping controls grouped into three areas:
- Core community-level controls
  - The degree of oversight possible with tens of thousands of bona fide editors.
  - The wiki system itself, which as operated appears to select strongly for the robust, collaborative knowledge of many people (even on contentious topics), rather than the unrepresentative viewpoint or negative impact of a few.
- Editorial panels and processes
  - Widely respected and enforced policies, which provide all editors with a solid basis to take matters into their own hands in addressing both deliberate and innocent bad edits.
  - A consensus-based ethos, which benefits the decision-making process.
  - Escalation processes, whereby poor conduct or problematically edited articles tend to come to the attention of a wider range of editors with the authority or willingness to act on them, making vandalism very short-lived and ultimately somewhat futile.
  - A wide range of fine-grained editorial processes, such as dispute resolution, mediation, and requests for comment and consultation within the wider Wikipedia community.
- Software-facilitated controls
  - Systems built into the editing software that make it easy for a large number of editors to watch for vandalism, monitor recent changes, and check for activity on selected watchlist articles in real time (a minimal sketch of this kind of monitoring follows this list).
  - Design decisions in the software that make identifying and reverting any number of bad edits possible at the click of a button, whereas vandalism itself takes longer to carry out.
  - The ability to set fine-grained software blocks on problematic editors, and to partially or fully protect targeted articles.
  - Standardized alerts, known as tags, which can be added to any fact or article, and which allow individual facts (or entire sections and articles) to be highlighted as questionable or brought immediately to others' attention.
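The software-facilitated controls above draw on the site's public recent-changes data. As a minimal illustrative sketch (not any official Wikipedia tool), the following Python code polls the standard MediaWiki API's recentchanges module, the same feed that watchlists and patrolling tools are built on, and prints each edit with its change in size; the printing loop and polling interval are illustrative assumptions.

```python
# Minimal sketch: poll the public MediaWiki recent-changes feed and print
# each edit with its size change. Illustrative only; real patrol tools
# apply heuristics and use live feeds rather than simple polling.
import time
import requests

API = "https://en.wikipedia.org/w/api.php"

def fetch_recent_changes(limit=25):
    """Return the most recent edits to the English Wikipedia."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|user|comment|timestamp|sizes",
        "rctype": "edit",
        "rclimit": limit,
        "format": "json",
    }
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["query"]["recentchanges"]

if __name__ == "__main__":
    while True:
        for change in fetch_recent_changes():
            delta = change["newlen"] - change["oldlen"]
            print(f'{change["timestamp"]}  {change["user"]:<20} {delta:+6d}  {change["title"]}')
        time.sleep(60)  # poll once a minute (assumed interval)
```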
User oversight
"Given enough eyeballs, all errors are shallow"
Wikipedia's primary editorial control, which ensures the bulk of its quality, is simply the sheer volume of well-intentioned editors who regularly and constantly watch over its articles. At any given time, a large number of the thousands of active Wikipedians will be using, checking, or editing the articles held. Each of these has their own watchlist, a user page that lists changes to the articles they have worked on or have otherwise chosen to watch. Hundreds of Wikipedians use automated software tools (described below) to watch edits en masse. On average, only a few minutes pass between a blatantly bad or harmful edit and some editor noticing and acting on it. Repeated bad edits tend to lead rapidly to escalation of the process, further safeguards and actions, and the involvement of others, including possible use of administrator powers or dispute resolution depending on the situation.
The primary control, therefore, is not so much that "only approved editors" can update and improve articles. It is more that even bad editors can edit, but any vandalism and errors they add rarely get much of a foothold; their bad edits are very rapidly spotted and reversed by others. This is different to traditional knowledge-gathering and publishing, which attempts to limit content creation to a relatively small circle of approved editors in an attempt to exercise strong hierarchical control.
A 2002 study by IBM found that, as a result of this process, most vandalism on the English Wikipedia is reverted within five minutes:
- "We've examined many pages on Wikipedia that treat controversial topics, and have discovered that most have, in fact, been vandalized at some point in their history. But we've also found that vandalism is usually repaired extremely quickly--so quickly that most users will never see its effects." [1] (Official results)
User collaborative knowledge-building
Unusually, Wikipedia relies for a large part of its editorial work upon editors drawn from the general public, who may well lack relevant qualifications in the subjects they edit. Experience suggests that the appearance of weakness this may create is deceptive.
The role of Wikipedia editors in general is guided by two principles. First, most editors choose to edit subjects where they have personal interest, knowledge, and familiarity. Second, the editorial role in Wikipedia is not to produce original research so much as to collate and source existing knowledge in encyclopedic form, under strict policies of neutrality of viewpoint and verifiability of the information added.
Attempts to add information which is of poor quality or questionable are easy to spot by the many other editors reviewing a given topic, who generally come to it with different viewpoints and understandings. For a fact to remain in an article requires consensus among the (often dozens or hundreds of) diverse editors with an interest in the article that the fact is agreed, neutrally and appropriately presented in a balanced manner, and that any statement considered to require citation is properly sourced. The editors of most articles will collectively cover a range of viewpoints on the subject, and will often include a number of specialists.
In addition, one should not overlook the effect of reader involvement: the millions of readers of articles are themselves encouraged to be bold and correct or improve any article they read.
Over time, experience suggests that as a result of this collaboration on a large scale, articles usually do rise to this general standard, and many long-standing articles, having survived this process of examination over the years, are stable, robust, and well written as a result. Controversial articles often highlight the success of this approach particularly well, since the process of developing a wording that satisfies a consensus of often-opposed editors is not a trivial one.
The Wiki structure
It is possible that this selectivity for collaboration is in part due to the wiki structure. Editors who disagree are unable to write alternative articles or versions to express their differing viewpoints: ultimately there is only one page upon which all must edit. Since other aspects of the editorial process tend to reduce sustained "edit warring", and strong, universally accepted principles describe how opposing views are to be neutrally included and presented, there is great pressure in the long term for a common agreed version to emerge on that one page. Once it has done so, it is the usual stance of the editors who have worked towards this goal, whatever their viewpoint, that it should only be replaced by a better version.
Respect for policies and principles
Rules and policies must strike a fine balance between enabling good and necessary practice and preventing abuse or game-playing, in order to be effective in dealing with would-be disruptive contributors. Wikipedia's policies reflect this dynamic tension quite strongly.
Edit monitoring and software facilitation
Wikipedia:List of Wikipedians by number of edits lists some statistics on editorial involvement. However, this page only lists edits made by the 3 million or so editors; it does not show editors' monitoring of articles and edits in cases where no correction was deemed necessary.
Reputable editors who decide to monitor recent edits more seriously often use software such as VandalProof, a program written for Wikipedia by AmiDaniel, as well as functionality that automatically flags changes by known problem editors, and use these to watch hundreds of recent edits as they happen. Other corrections, such as fixing bad links, typographic and spelling errors, and some forms of vandalism, are made automatically by bots: automated programs written by Wikipedians and operated with authorisation. There are also large user-groups dedicated to rapid reversal of vandalism, such as the Recent changes patrol and the Counter-Vandalism Unit.
As of 2007, approximately 600 editors use VandalProof alone, providing significant overlap in monitoring editorial quality.
Other tools and user-groups focussing on monitoring edits, either as they happen or subsequently, are listed at Category:Wikipedia counter-vandalism tools.
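As a rough, hypothetical illustration of the kind of heuristic such tools and bots rely on, the sketch below flags recent anonymous edits that remove a large amount of text, a common signature of page blanking. The threshold and flagging rule are invented for illustration and are not the actual logic of VandalProof or of any particular bot.

```python
# Hypothetical heuristic, for illustration only: flag anonymous edits that
# remove a large amount of text (a common signature of page blanking).
# The threshold is an assumed value, not a rule used by any real tool.
import requests

API = "https://en.wikipedia.org/w/api.php"
BLANKING_THRESHOLD = -2000  # bytes removed in one edit (assumed cutoff)

def suspicious_recent_edits(limit=100):
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|user|comment|sizes",
        "rcshow": "anon",   # only edits by unregistered users
        "rctype": "edit",
        "rclimit": limit,
        "format": "json",
    }
    changes = requests.get(API, params=params, timeout=30).json()["query"]["recentchanges"]
    flagged = []
    for c in changes:
        delta = c["newlen"] - c["oldlen"]
        if delta <= BLANKING_THRESHOLD:
            flagged.append((c["title"], c["user"], delta, c.get("comment", "")))
    return flagged

for title, user, delta, comment in suspicious_recent_edits():
    print(f"{delta:+7d}  {title}  (by {user})  {comment!r}")
```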
Tags
Articles can also be brought to others' attention by means of a wide range of inline and article tags, used to flag individual statements and citations, or articles as a whole, to request checking or citation, and to indicate to other editors and readers that a fact or presentation is unsupported or questionable as it stands. A number of editors deliberately look for such tagged articles to work on; for example, via Category:Pages needing expert attention from Culture experts and the assistance-with-neutrality user-group.
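Tagged articles are gathered into maintenance categories, which is how editors and tools find them. As a simple illustrative sketch, the following code lists the members of one such category via the standard MediaWiki categorymembers module; the category name is taken from the example above, and any maintenance category could be substituted.

```python
# Illustrative sketch: list articles collected into a maintenance category
# by their cleanup tags, using the MediaWiki "categorymembers" module.
import requests

API = "https://en.wikipedia.org/w/api.php"

def category_members(category, limit=50):
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=30).json()
    return [m["title"] for m in data["query"]["categorymembers"]]

# Category name taken from the example in the text above.
for title in category_members("Category:Pages needing expert attention from Culture experts"):
    print(title)
```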
References
- "Wikipedia is wide open. Why is it growing so fast? Why isn't it full of nonsense?", Kuro5hin, September 24, 2001. http://www.kuro5hin.org/story/2001/9/24/43858/2479
See also
For dealing with vandalism see Wikipedia:Vandalism. For editing Wikipedia yourself to fix obvious vandalism and errors, see the section Contributing to Wikipedia on the 'About' page.