Designing a scalable reporting system for a safe Wikipedia
I joined the Trust and Safety Tools team in July of 2022 to lead design for the incident reporting system, which would be a complex, multi-year project.
Trust and Safety Tools is the technical arm of the Trust and Safety department at Wikimedia. We design and build products, features, and tools that help the Wikipedia community enforce Trust and Safety policies that keep the community safe and the platform compliant with legal requirements.
The PM, tech lead, and I work cross-functionally with Trust and Safety to understand their priorities and propose tooling that meets the safety needs they identify.
- Data scientist
- Product manager
- Product designer 🙋🏾‍♀️
- Tech lead
- Developers (2)
- Program manager
- Community relations
Short term goals
- Build an MVP that allows editors to report harassment to Administrators
- Release and test on two pilot wikis
- Connect to existing systems
Long term vision
- All incidents: expanding the system beyond harassment to cover all incident types
- Private reporting: creating a private space for reporters and responders to interact
- Escalation: a smart system that routes cases to the correct entity for further support
- Build for the most underserved and vulnerable user. Experienced editors on Wikipedia already have workarounds for reporting incidents such as harassment. We wanted to build a system discoverable by a new editor, i.e., someone who was not well connected in the Wikipedia community and was therefore vulnerable to attacks. We knew that if this user could find and navigate the system, more experienced users certainly could.
- Meet the needs of reporters and responders. We also wanted to meet the needs of the people responding to reports, i.e., the Admins and T&S staff. So I needed to make sure the system gave responders enough information to make a decision without overwhelming them.
- Build for scale. As with most things on Wikipedia, we needed to ensure this system would work at scale. Yes, we were launching on two small pilot wikis, but the system would eventually need to work on large wikis like English, Spanish, and German Wikipedia.
- Build an extension that ports to any community. We wanted to ensure the system was customizable for any wiki, since each community is governed by a different set of Admins with its own cultural norms and behavioral policies.
Why do we need the Incident Reporting System?
- Universal Code of Conduct: In 2022, the Wikimedia community ratified the Universal Code of Conduct (UCoC), a community-led effort to create behavioral guidelines for all the wikis. As part of enforcing the UCoC, the UCoC committee and Wikimedia Trust and Safety proposed an incident reporting tool so contributors could report violations of the UCoC.
- Digital Services Act (EU): In 2022 the EU passed the Digital Services Act, which requires large online platforms, including Wikipedia, to have official systems in place for users to report policy violations.
- Meeting basic expectations: Big tech platforms like Reddit, Discord, Instagram, and Twitter have reporting systems that protect users from harm and the platform from regulatory action. Wikipedia's self-governing communities have devised various creative ways of moderating online social interactions, but there is no official, effective, and transparent reporting system accessible to all. For years the community has requested such a system.
Understanding the problem space