2019:Advocacy/Platform Regulation and Free Knowledge Advocacy
This is an Accepted submission for the Advocacy space at Wikimania 2019.
Title
Platform Regulation, Content Moderation, and Free Knowledge Advocacy
Description
Giant online platforms like Google, Facebook, and Twitter are being criticized for how they do (or don't) moderate content on their systems. The public, the press, and politicians worry that platforms are not being held responsible when users post harmful content online, and policymakers are considering making platforms liable if they don't moderate content better. Two concerns arise for Wikimedians: 1. There is no consensus on what "better" content moderation means; 2. Wikimedia operates platforms, and the community moderates those platforms' content.
This session will briefly describe some of the current debates around content moderation and platform responsibility, with examples from the European Union and the United States, and how they affect Wikipedia and other movement projects. The workshop will then turn to a roundtable discussion with attendees, seeking to come to a common understanding of the place of the movement in these debates.
Related questions include: How do proposals for commercial platforms affect Wikimedia? How does a platform's moderation of content affect its responsibilities, ethically or legally? What should the relationship be between content moderation and platform responsibility? What can it mean to moderate content "neutrally" and be committed to accurate, free knowledge?
Note: This session is the result of combining two previously-proposed sessions on platform liability:
It's the Platforms, Stupid!, Lilli Iliev (WMDE) & John Weitzmann (WMDE)
Platform regulation is all the rage among politicians at the moment, whenever a political issue touches the internet in any way. The European Union kicked off this trend around privacy and copyright, and is far from finished exploring new regulatory territory. This has also become a controversial source of inspiration for other regions, including governments of even the most autocratic kind. Wikimedia projects are – at least content-wise – always based on some kind of internet platform, and are thus necessarily part of the discussion, even if they are not always the main target of regulation or are sometimes even expressly exempt from it.
In this session we first want to explore what the implications of platform regulation can be for Wikimedia projects big and small. The notion of “content moderation”, for example, often implies requirements to use automated systems and filters in order to be compliant. How far can this impact our projects? What is acceptable for us, particularly given the open paradigm and its grounding in freedom of expression? The Wikimedia Foundation has a standing position on intermediary liability, but (how) do we need to expand on that, perhaps regionally and locally?
Secondly, we want to look at the prospects of engaging with the ongoing political debates on platform regulation. What messages can be successful where? What are the pitfalls for openness advocates around platform regulation as a topic?
Internet platforms and access to knowledge, Natalia Mileszyk (Centrum Cyfrowe, Poland)
Due to the rise of a few powerful companies such as Uber, Facebook, Amazon or Google, the term “platform” has moved beyond its initial computational meaning of technological architecture and has come to be understood as a socio-cultural phenomenon. Platforms are said to facilitate and shape human interactions, thus becoming important cultural, economic and social actors. While the companies offering platform services are increasingly the target of regulatory action, there is also much discretion left for them to decide on their business model. Increasingly rooted in the daily life of many individuals, many platforms monetise social interactions. Many sectors and social practices are being “platformised”, from public health to security, from news to entertainment services.
Wikipedia and other Wikimedia projects are platforms. However, the predominant discourse does not include them, or many other non-commercial projects.
In this session, we seek to address two main questions regarding Wikimedia projects and platformization.
The first is where movement projects fit in the developing regulatory landscape. In what ways do Wikimedia projects share the problems that regulators seek to solve with new regulation, and in what ways do the projects differ from them? We can both defend necessary aspects of online platforms, and also differentiate Wikimedia projects from many commercial platforms. What combination of these arguments yields the ideal policy results?
The second question has to do with how Wikimedia projects intersect with the increasing dominance of other online platforms. As readers reach Wikimedia projects through other platforms’ intermediation, how can content and data be made shareable and accessible in ways that support openness, addressed by platforms at a systemic level rather than only by individual users? What are the takeaways from the experiences of services such as Wikipedia and Wikidata? How should we address openness on big platforms? How can we incorporate the practices of platform cooperativism in this landscape? Finally, how could governments, markets, or norms regulate these actors and hold them responsible without over-policing content or relying on unaccountable automated tools for content moderation?
The workshop will include mapping the obstacles and challenges to making internet platforms fairer players in the online ecosystem. We will jointly discuss recommendations: what kinds of legislation and self-regulatory measures platforms should implement to enable access to knowledge.