Dear Andrew Lih,
dear scientific community,
I am a bit disappointed by the available material
that tries to measure the quality of Wikipedia articles.
The cited Wall Street Journal article, for example,
analyses only technical topics, but it would be a
risky claim to assume that quality is equally
distributed across the different fields and topics.
Yet that claim is a necessary condition for the method
of randomly picking articles and generalising to the rest.
There was another attempt to compare the quality of
Wikipedia with other encyclopedias, in the German computer
magazine c't, using the same random-sampling approach.
(1) There is a problem with randomly choosing
articles to compare or analyse. I see
difficulties in non-technical fields such as the soft
sciences (in social science, for example, every theory
of society redefines all concepts of society in its
own terms: how can an encyclopedia claim to have a single definition?).
(2) Political terms are sometimes very complex topics
where the NPOV may not work, because there is neither
right nor wrong. It is often a question of opinion
and majority, which sometimes changes reality.
I observed a discussion and an edit war on the article
about direct democracy (in the German Wikipedia:
article "Direkte Demokratie") that led to a loss
of quality: only a minimal and weak consensus
survived the conflicting opinions; the evolutionary
process did not improve quality in that case.
(3) The third problem is the tendency of specific groups
toward vandalism. There are groups that follow particular
values or ideologies and reject a neutral or scientific
view (moralists, religious groups, nationalists,
neo-capitalists, etc.). What about articles that are
important to these groups? Are those articles tendentious?
My questions: Is there a scientific study on the
quality of Wikipedia articles? Does anyone
work on these problems? What methods could be used
to analyse the quality?
Ingo Frost
(studies Wikipedia from a social systems science perspective)