2019:Research/Learning from experience - What can we learn from the Wikipedia Community when designing algorithmic systems?

From Wikimania


Abstract

In recent years, various scholarly disciplines have examined the importance and influence of algorithmic systems on our society from different perspectives. To analyze the social significance of algorithmic systems and to work towards a value-oriented handling of them, key questions concern the extent of transparency in their use, the degree of user participation in that use, and the possibilities for controlling them both technically and through regulation.

The Wikipedia community has already gained a wealth of experience with the design and use of algorithmic systems. Over the last fifteen years, a (mostly) productive socio-technical assemblage of human and algorithmic actors has developed, which accomplishes extensive tasks collectively and in a coordinated manner. However, the use of algorithmic actors, i.e., bots, has also been viewed critically, both socially and politically. Since their first use in 2002, bots have changed the culture of the community, and a system of algorithmic governance has emerged.

To shed light on the mechanisms of the bot community in Wikipedia, we use data from a study conducted by the authors in 2012 based on the English-language version of Wikipedia. Within the scope of this study, we collected the completed requests for approval (RfAs) for bots from the years 2002 to 2012. Of the 2,682 applications, we selected 575 that were positively decided and contained further contextual information in the form of links to rules, guidelines, or discussions. This selection was motivated by the goal of investigating bots that relate to the jointly defined rules and guidelines of Wikipedia.

We selected typical cases that illustrate why these tools are used so extensively in the Wikipedia community and how they help the community meet the complex task of collaboratively creating encyclopedic articles and assuring their quality. At the same time, these examples show how the community has learned to deal with technical difficulties or the problematic consequences of bots.

In our talk, we describe different mechanisms of algorithmic governance in Wikipedia to show how bots can efficiently handle large recurring tasks and how the community has been able to agree over time on the possibilities and limits of their use.

Finally, based on these experiences of the Wikipedia community, we derive design recommendations for algorithmic systems. These include clear identifiability of non-human systems, a jointly defined framework for action, the diversity of algorithmic implementations, an open infrastructure, value orientation, and impact assessment, as well as ensuring human capacity to act.

With our talk, we want to draw more attention to Wikipedia as an open peer production system that offers various ideas on how we should regulate commercial platforms in the future.

Authors

Claudia Müller-Birn (Freie Universität Berlin)

Relevance to Wikimedia Community

With this talk, I hope to engage with the Wikipedia community to initiate a reflective process on what they have achieved over the last years and on the extent to which their community processes can inform the design of other systems. My motivation for giving this talk stems largely from the very positive feedback I received from a Wikipedian after a talk I gave on this topic.

Furthermore, I hope that this talk can get more researchers interested in studying Wikipedia's bot community. I would love for Stuart Geiger and Aaron Halfaker to join me for this talk. I reached out to them, but unfortunately too late, and so far I have not heard back. However, if the talk is accepted, I hope we will all present our insights into the bot community.

Session type

22-minute presentation.

Participants [subscribe here!]