The Wikipedia was initially created as a venue where writers could collaborate on drafts before submitting them to a peer review process, in response to complaints about the slowness of that review. Ironically, the criticism most widely leveled against the Wikipedia is the lack of accountability for authors and the potential to misinform readers. A large community around the Wikipedia project actively fixes errors as they are discovered, but an unending stream of vandals and spammers chips away at the good will of the volunteers who maintain the project for the collective good. We suggest that vandalism detection systems can help direct volunteer effort toward the changes most likely to be problematic, making more efficient use of the project's human resources.
We use edit distance to quantify the effort of authors, propose automated methods for evaluating the quality of that effort, and describe how these measures might be combined into an author reputation system. We want an author's reputation to correlate with the stability of the text they contribute: low reputation should predict that an author's future contributions will be edited or deleted. Reputation can then serve as an additional input to a vandalism detection system.
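As a rough illustration, the sketch below shows one way such pieces could fit together: word-level edit distance as a proxy for effort, the fraction of contributed words surviving a later revision as a proxy for stability, and a simple weighted update that moves reputation toward observed survival. The specific functions, weighting rule, and parameter values are illustrative assumptions, not the system described here.

```python
# Minimal sketch, assuming word-level edit distance for effort, text
# survival for stability, and an ad hoc weighted reputation update.
from difflib import SequenceMatcher

def edit_effort(old_text: str, new_text: str) -> int:
    """Approximate an author's effort as the number of word-level
    insertions and deletions between consecutive revisions."""
    old_words, new_words = old_text.split(), new_text.split()
    matcher = SequenceMatcher(a=old_words, b=new_words, autojunk=False)
    effort = 0
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag != "equal":
            effort += (i2 - i1) + (j2 - j1)  # words removed plus words added
    return effort

def survival_fraction(contributed: str, later_text: str) -> float:
    """Fraction of contributed words still present (in order) in a later
    revision, as a crude proxy for the stability of the contribution."""
    c_words, l_words = contributed.split(), later_text.split()
    if not c_words:
        return 1.0
    matcher = SequenceMatcher(a=c_words, b=l_words, autojunk=False)
    surviving = sum(block.size for block in matcher.get_matching_blocks())
    return surviving / len(c_words)

def update_reputation(reputation: float, effort: int, survival: float,
                      rate: float = 0.1) -> float:
    """Move reputation toward the observed survival of a contribution,
    weighting larger contributions more heavily (illustrative rule only)."""
    weight = rate * min(effort, 100) / 100.0
    return (1 - weight) * reputation + weight * survival
```

Under this kind of rule, an author whose large contributions are quickly reverted sees their reputation fall, while an author whose text persists across later revisions sees it rise, which is the correlation with stability we are after.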
Instead of measuring the “truth” of contributions, our quality measures capture the “group consensus” embodied in a piece of text. When an article's text stabilizes over time, we conclude that it has reached a form that most members of the community can reasonably agree on. As group collaboration grows in prominence on the Internet, we believe this research will open the door to new applications and quality measures.
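One simple, assumed operationalization of this notion of stabilization is to watch the word-level edit distance between consecutive revisions of an article and treat the text as having reached rough consensus once recent changes stay small. The window size and change threshold below are placeholders for illustration.

```python
# Minimal sketch, assuming stabilization means small edit distances
# between the last few consecutive revisions of an article.
from difflib import SequenceMatcher

def revision_distance(prev_text: str, curr_text: str) -> int:
    """Word-level insertions plus deletions between two revisions."""
    a, b = prev_text.split(), curr_text.split()
    matcher = SequenceMatcher(a=a, b=b, autojunk=False)
    return sum((i2 - i1) + (j2 - j1)
               for tag, i1, i2, j1, j2 in matcher.get_opcodes()
               if tag != "equal")

def has_stabilized(revisions: list[str], window: int = 5,
                   max_change: int = 10) -> bool:
    """True if every change within the last `window` revisions is small,
    which we read as the text approaching group consensus."""
    recent = revisions[-(window + 1):]
    return all(revision_distance(p, c) <= max_change
               for p, c in zip(recent, recent[1:]))
```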