

User:Hawstom/Wikipedia reputation


I am concerned about Wikipedia's reputation. I would appreciate your edits and discussion. I also recommend an insightful and better-developed proposal by David Gerard at User:David Gerard/1.0. David has the vision, friends. See also Category:Wikipedia 1.0 for other ideas to compare with.

Wikipedia's reputation


Wikipedia's popularity and size are now matters of fact. But our reputation is continually in doubt.

For those of us who care about accomplishing real good through Wikipedia, such as alleviating hunger, suffering and oppression, we must find a way not only to make Wikipedia a great encyclopedia, but to make it a widely reputable and respected encyclopedia of recourse for individuals, families, organizations, corporations, and governments everywhere. We who care about reaching the world can't afford to snub the needs of significant world audiences. We must enhance our reputation without compromising the wiki model that built us. I propose a simple method of adding disclaimers to low-confidence articles. This proposal is palatable and workable and fits us well. Yet it potentially adds considerably to our reputation without adding an onerous review process.

Confusion in terminology


I believe that the comments on this page confuse trust with confidence. Trust comes from experience and relationship; as a result, readers who have used Wikipedia in the past, or who have trusted colleagues who have recommended it, will develop their own view of the level of trustworthiness of Wikipedia.

Confidence in Wikipedia is another matter altogether. It lies in our belief that the processes, systems, controls, and checks and balances which give rise to it work, and work consistently. For many, belief in the good will and best intentions of contributors (human values) is enough. For others, contributors are not all to be trusted, and a third party must intervene in order to ensure confidence.

For me the biggest barrier to confidence is the lack of a form of audit trail: How has the entry developed? What has been altered, when, and how often? Has the argument swung back and forth? Knowing that I had access to this information would probably be enough to ensure that I never sought it, and it would give me full confidence in what I read, because I would know that contributors would be aware that I could gauge the quality of their contribution.
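
As an illustration of the kind of audit trail being asked for, here is a minimal Python sketch that pulls a page's revision history (timestamp, editor, edit summary) from the standard MediaWiki API; the page title and revision limit are arbitrary examples, not anything specified in this proposal.

    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def revision_history(title, limit=50):
        """Return (timestamp, user, comment) for the most recent revisions of a page."""
        params = {
            "action": "query",
            "prop": "revisions",
            "titles": title,
            "rvlimit": limit,
            "rvprop": "timestamp|user|comment",
            "format": "json",
        }
        data = requests.get(API, params=params).json()
        page = next(iter(data["query"]["pages"].values()))
        return [(r["timestamp"], r.get("user", ""), r.get("comment", ""))
                for r in page.get("revisions", [])]

    # Who touched the entry, when, and how often?
    for timestamp, user, comment in revision_history("Water"):
        print(timestamp, user, comment)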

Professor Eddie Obeng


Not Wikipedia 1.0


Wikipedia was not meant to be "complete". While we can and perhaps should release targeted versions of Wikipedia, all of those versions will be in a sense "Wikipedia Lite" or "Wikipedia Digest". The real Wikipedia is a living organism that will always be current and incomplete. Therefore I avoid adopting the term Wikipedia 1.0. Rather, I favor the idea of Wikipedia Selections for XXX 2009. And I am suggesting adding selectability attributes to every version of every article, to be tweaked as part of the wiki process. Below I develop how we could add and use the simplest attribute, Trust.

Approval vs. trust


Those who favor an approval model need only recall Nupedia.

Many have proposed approval models for Wikipedia. I agree with Jimbo Wales and others who have maintained that approval models will surely throw cold water on the fire of Wikipedia's success. Rather than an approval process, we simply need a way to tell the public "Wikipedia trusts this article a lot/little/none". That doesn't require experts, Wikipedia's phobia; it only requires trusted associates, Wikipedia's strength.

This article is highly trusted by Wikipedia, but it is still a work in progress.
Please help us improve it by editing this article now if you see needed work.

User trust model


If we want a mechanism that works right now to improve our credibility a little bit without sacrificing the principles we are all comfortable with, it must be simple, open, and based on a community trust model. There are many existing examples of community trust models, but I propose that we simply start with the existing technical hierarchy of anon/user/admin/bureaucrat/developer as our community trust model. While it is true we have expressly disclaimed any editorial authority for our trusted users, the implicit attitude of the community has been to give them that authority. This proposal simply recognizes the de facto arrangement. In the future, a new model based separately on editorial trust could be created, but I would not favor this added level of community complexity. Let's make the current model work for us.
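
As a hedged sketch only, the existing hierarchy could be expressed as an ordered scale like the Python snippet below; the numeric values and the mapping from MediaWiki user groups are illustrative assumptions, not an existing feature.

    from enum import IntEnum

    class TrustLevel(IntEnum):
        ANONYMOUS = 0
        USER = 1
        ADMIN = 2
        BUREAUCRAT = 3
        DEVELOPER = 4

    def trust_of(editor_groups):
        """Map an editor's existing user groups to the highest trust level held."""
        mapping = {
            "developer": TrustLevel.DEVELOPER,
            "bureaucrat": TrustLevel.BUREAUCRAT,
            "sysop": TrustLevel.ADMIN,   # MediaWiki's internal name for administrators
            "user": TrustLevel.USER,
        }
        held = [mapping[g] for g in editor_groups if g in mapping]
        return max(held, default=TrustLevel.ANONYMOUS)

    trust_of(["user", "sysop"])   # -> TrustLevel.ADMIN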

Editing


An article is only as trustworthy as the last hand that touched it.

We assign to every version of every article a confidence level equal to the trust level of the editor who saved it.

For high-trust users (admin and above), we put a check box or radio button on the editing page so that they may save interim work that still has unresolved credibility problems with artificially low trust. The key here is embarrassment avoidance: we want to disclaim embarrassing versions of articles.
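
A minimal sketch of that save-time rule, building on the TrustLevel scale sketched above; the data class and field names are assumptions for illustration, not a fixed design.

    from dataclasses import dataclass

    @dataclass
    class ArticleVersion:
        text: str
        confidence: TrustLevel   # trust recorded for this saved version

    def save_version(text, editor_trust, save_as_low_trust=False):
        """Each saved version gets a confidence equal to the saving editor's trust,
        unless a high-trust editor deliberately saves messy interim work as low trust."""
        if save_as_low_trust and editor_trust >= TrustLevel.ADMIN:
            confidence = TrustLevel.ANONYMOUS   # disclaim an embarrassing interim version
        else:
            confidence = editor_trust
        return ArticleVersion(text=text, confidence=confidence)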

Article presentation


We continue to show to the public the latest version. If the confidence level for that article version is lower than our established standard, we show a diplomatic disclaimer along with links to any more trusted article versions available.

Wikipedia's Seal of Trust:  This version of this article is Not Trusted [alternatively Somewhat Trusted, Trusted] by Wikipedia.  
To see a more trusted version, click here.  
To always see more trusted versions, click here.

We put in user preferences a selection for the level of credibility to show by default (Developer/Bureaucrat/Admin/User/Anonymous), and we enable anonymous users, via a cookie or session ID, to say, "Show only article versions with credibility level X or higher."
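
A minimal sketch of that presentation rule, again building on the sketches above; the returned structure is an assumption about what the page-rendering code would need, not an actual MediaWiki interface.

    def select_presentation(history, minimum=TrustLevel.USER):
        """history: list of ArticleVersion, oldest first.
        Show the latest version; if its confidence is below the reader's minimum,
        flag the disclaimer and point to the newest version that does meet the bar."""
        latest = history[-1]
        if latest.confidence >= minimum:
            return {"show": latest, "disclaimer": False, "more_trusted": None}
        more_trusted = next(
            (v for v in reversed(history) if v.confidence >= minimum), None)
        return {"show": latest, "disclaimer": True, "more_trusted": more_trusted}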

I really like this idea. I know of a system that has already implemented it: the CalSWIM [1] mashup, which fetches watershed-related articles from Wikipedia, is doing something similar. It suggests the most recent but reliable version of the page rather than the most recent page. Reliability is based on several features, including user reputation [2].

How it's simple


Wiki works because of immediate gratification.

Nobody has to take the time to review articles. The approval occurs naturally as a by-product of the way we already work.

How it improves our credibility


  • Disclaims precisely the article versions we don't trust.
  • Discourages vandalism and foolery by providing an immediate disincentive.
  • Places editorial responsibility on our trusted users.
  • Makes trusted versions available forever on request.

Since an article is only as trustworthy as the last hand that touched it and wiki works because of immediate gratification, only natural and open methods such as this can work at the Wikipedia.

What it means to you


Anonymous and new users


If you are an anonymous or new user, this proposal lets you continue to contribute to Wikipedia and see immediate results from your contributions. Because you are anonymous or new, your edits will always be disclaimed to some extent, though they will always be shown unless users have requested to see only trusted versions of articles.

Seasoned Wikipedians


If you are a seasoned Wikipedian, this proposal relieves some of the relentless pressure you have been under to guard Wikipedia's credibility against continual entropy. Whether the articles you watch are edited by well-meaning editors or by vandals, any untrusted edits will be flagged. If you go on vacation for a year and no seasoned Wikipedian checks your article in the interim, readers in perpetuity will still be able to elect to see your trusted versions of the articles. *Huge sigh of relief heard throughout community*

Developers


If you are a Wikipedia developer, this proposal is light on new requests for you: no programming of a new trust model. All that is required is the following (a rough sketch of the first two items appears after the list):

  • A new trust attribute for article versions.
  • Code to assign that trust attribute to new article versions based on the trust level of the saving editor. There is no need to go backwards into the database.
  • Presentation HTML to add the Wikipedia Seal of Trust level to articles.
  • Code to allow high-trust editors to save articles with artificially low trust when they have to leave an article in a messy state.
  • A new preferences option to allow users to choose to see only trusted article versions of a given level (optional "later" programming).
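
To make the first two items concrete, here is a rough, hypothetical Python sketch using SQLite; the table and column names (revision, rev_trust) are illustrative stand-ins, not MediaWiki's actual schema or migration path.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE revision (
        rev_id    INTEGER PRIMARY KEY,
        rev_text  TEXT,
        rev_trust INTEGER DEFAULT 0   -- new attribute: confidence of this version
    )""")

    def record_save(rev_id, text, editor_trust, save_as_low_trust=False):
        """Store a new revision with its trust attribute; old rows are never backfilled."""
        trust = 0 if save_as_low_trust else int(editor_trust)
        conn.execute(
            "INSERT INTO revision (rev_id, rev_text, rev_trust) VALUES (?, ?, ?)",
            (rev_id, text, trust))
        conn.commit()

    record_save(1, "First draft", editor_trust=2)                           # admin-level confidence
    record_save(2, "Messy rewrite", editor_trust=2, save_as_low_trust=True) # deliberately disclaimed
    print(conn.execute("SELECT rev_id, rev_trust FROM revision").fetchall())
    # [(1, 2), (2, 0)]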