Tuesday, April 28, 2009

Assignment #11 - Peter Clain

The Wikipedia article on my old high school can be fairly amusing at times. While the information is never drastically updated, the article is frequently edited by students looking to cause trouble. There was even a period of several months, in fact, when my brother and several of my friends were listed under the school’s notable alumni. Over the years, however, the article has been more strictly monitored. It offers less information than it did before, and many of the edits in the history are either minor updates or the removal of inappropriate material.

Updates to the article can be seen in the revision history, which clearly shows the changes made in each edit along with the author of those changes and the time they were made. When the author is not a registered user, an IP address is listed in place of a username. Revisions to previous changes are clearly noted, and when vandalism is detected, it is usually flagged in the description accompanying the edit.

In her article, Bryant details the importance of collaboration in controlling vandalism and the tools Wikipedia gives its users to do so. The watchlist, for example, “alerts Wikipedians to changes on pages that interest them, and they can review the changes. Vandalism can be reverted, and controversial changes can be addressed.” In the case of this article, vandalism is dealt with swiftly, as students are always making inappropriate changes. For example:

• 16:28, 30 June 2008 StaticGull (talk | contribs) m (3,960 bytes) (Reverted edits by 75.72.56.27 to last version by 24.127.164.111 (using Huggle)) (undo)
• (cur) (prev) 16:27, 30 June 2008 75.72.56.27 (talk) (15 bytes) (←Replaced content with 'SEX SEX SEX SEX') (undo)

However, many of these controversial changes are made by anonymous users. This makes it difficult to identify inappropriate changes based on the user, and the presence of “bots” and a group of dedicated registered users is essential to keeping the page content under control.

Changes could be made to Wikipedia to make collaboration easier, but certain aspects of the site should not be changed. While vandalism can be problematic, the barrier to entry should remain low to encourage users to edit. Therefore, changes should be directed at helping users detect vandalism rather than prevent it. Displaying all recent changes on one page and highlighting them, similar to SVN, would make it easier to spot problems.
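The SVN-style highlighting suggested above can be sketched with Python's standard difflib module, which computes the same kind of line-level diff that version-control tools display. This is a minimal illustration only; the two "revisions" below are invented stand-ins for article text, not actual Wikipedia content.

```python
import difflib

# Two hypothetical revisions of an article paragraph (invented text).
old_rev = [
    "Lincoln High School is a public school.",
    "It enrolls about 1,200 students.",
]
new_rev = [
    "Lincoln High School is a public school.",
    "SEX SEX SEX SEX",
]

# unified_diff prefixes removed lines with '-' and added lines with '+',
# the same change-highlighting idea the post borrows from SVN. A reviewer
# scanning this output spots the vandalism immediately.
for line in difflib.unified_diff(
        old_rev, new_rev,
        fromfile="revision by 24.127.164.111",
        tofile="revision by 75.72.56.27",
        lineterm=""):
    print(line)
```

Rendering every recent edit this way on a single page would let patrollers skim the highlighted additions and removals instead of opening each revision individually.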

3 comments:

  1. In high school, kids did the same thing to our Wikipedia page (I had forgotten about that). It seems like the community does a good job of monitoring itself on your high school's page, since inappropriate edits are constantly being removed. It is hard to maintain the balance between inspiring anyone to edit pages and maintaining accuracy on the site. I think it is important to have these bots and more experienced users who patrol Wikipedia.

  2. I like how you compare the Wikipedia system to SVN, since SVN software is designed specifically to help people collaborate on often highly intertwined material with many contributors. I agree that the barrier to contribution should remain low, as that follows the decentralized Wikipedia ideal and we should encourage a broad knowledge base even though we have higher risk of vandalism.
    Since tracking and reverting changes is made so easy by the Wikipedia platform, I almost feel that prevention is uncommon and unnecessary, as any type of vandalism could be fixed without much effort. Good post.
