Session title: Parsoid is coming! How do I keep up? [[Wikitext]] in 2024.
- Session type: Lecture
- Track: Technology
- Language: en
Wikitext has a chance to evolve in 2024, as the Content Transform Team begins to deploy the next-generation Parsoid wikitext parser for article views on many Foundation wikis. This talk describes how to opt in to an early look at the new parser and outlines the testing tools and decision framework used to guide deployment timing and ensure wiki content remains unbroken. We’ll survey useful tools to check and migrate content on Wikimedia and third-party wikis. With the new parser come opportunities for new wikitext features, and we’ll touch on a few of those as well!
Description
MediaWiki articles are written in a markup language called wikitext, and then turned into HTML for display in a reader’s browser by a component called a wikitext parser. Parsoid is the next-generation wikitext parser, and in 2024 the Wikimedia Content Transform Team has started to enable it for article views on many Foundation wikis using the ParserMigration extension, which is also appropriate for third-party wikis to deploy. This talk will describe why we’re replacing the old wikitext parser, how to try out the new parser as a reader, editor, or site admin, and how we are working to make the transition smooth for communities.
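For readers who want an early look, the simplest opt-in on wikis where ParserMigration is enabled is (at the time of writing) appending ?useparsoid=1 to an article URL, or turning on the corresponding user preference. For a more systematic check, the sketch below fetches the same page rendered two ways on a Wikimedia wiki: via the public Action API (action=parse, which historically returns legacy-parser output) and via the REST endpoint that serves Parsoid HTML. It is a minimal illustration of the comparison idea, not the project’s actual testing tooling, and the wiki and page title are placeholders.

<syntaxhighlight lang="python">
# Sketch: fetch one page rendered by each parser and report whether the
# raw HTML differs. Parsoid HTML is structured differently from legacy
# output, so the markup usually differs even when the display matches.
import requests

WIKI = "https://en.wikipedia.org"   # assumption: any Wikimedia wiki exposing both APIs
TITLE = "Sandbox"                   # placeholder page title

def legacy_html(title: str) -> str:
    """HTML from the Action API (historically the legacy parser)."""
    r = requests.get(
        f"{WIKI}/w/api.php",
        params={
            "action": "parse",
            "page": title,
            "prop": "text",
            "format": "json",
            "formatversion": 2,
        },
        timeout=30,
    )
    r.raise_for_status()
    return r.json()["parse"]["text"]

def parsoid_html(title: str) -> str:
    """Parsoid HTML from the REST API."""
    r = requests.get(f"{WIKI}/api/rest_v1/page/html/{title}", timeout=30)
    r.raise_for_status()
    return r.text

if __name__ == "__main__":
    same = legacy_html(TITLE) == parsoid_html(TITLE)
    print("identical markup" if same else
          "markup differs (expected; what matters is whether the rendered page looks the same)")
</syntaxhighlight>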
Although almost all wikitext appears exactly the same whether rendered with the legacy parser or with Parsoid, there are some differences that editors and admins will need to keep up with. We’ll survey the tools used on Wikimedia projects to migrate content and the community maintenance efforts organized around them, and describe how you can help. We’ll also discuss ways that third-party wikis can use the Linter extension, tracking categories, bots, frameworks, and extensions like ReplaceText to automatically migrate content that uses deprecated wikitext markup, so that everyone can keep pace with the changes to the wikitext parser taking place in MediaWiki.
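As one concrete starting point for that migration work, the sketch below queries the Linter extension’s Action API module (list=linterrors) for pages flagged in a given lint category. The category name used here (obsolete-tag) is only an example; Special:LintErrors on your wiki lists the categories it actually tracks, and the output of a run like this would typically feed a bot or a ReplaceText pass rather than being an end in itself.

<syntaxhighlight lang="python">
# Sketch: list pages flagged by the Linter extension in one lint category,
# assuming the list=linterrors Action API module provided by Extension:Linter.
import requests

API = "https://en.wikipedia.org/w/api.php"   # assumption: point this at your own wiki's api.php

def lint_pages(category: str, limit: int = 50):
    """Yield (title, lint category) pairs for pages with lint errors."""
    r = requests.get(
        API,
        params={
            "action": "query",
            "list": "linterrors",
            "lntcategories": category,
            "lntlimit": limit,
            "format": "json",
            "formatversion": 2,
        },
        timeout=30,
    )
    r.raise_for_status()
    for err in r.json()["query"]["linterrors"]:
        yield err["title"], err.get("category", category)

if __name__ == "__main__":
    # Example lint category; see Special:LintErrors for the full list on your wiki.
    for title, cat in lint_pages("obsolete-tag"):
        print(f"{cat}\t{title}")
</syntaxhighlight>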
We’ll conclude by discussing some of the exciting possibilities enabled by the transition to the new parser.
Session recording: https://www.youtube.com/watch?v=LedziGgRqZU&t=8845
- How does your session relate to the event theme, Collaboration of the Open?
Wikitext is the medium through which much collaboration on projects using MediaWiki occurs: it is the format used when we create articles and when we talk with each other, and our workflows and governance mechanisms are often expressed using wikitext templates. As such, improvements to the power and expressivity of wikitext can open new opportunities for collaboration, and can increase the amount of information captured in the open. Conversely, ossified wikitext can (perceptibly or not) constrict the universe of solutions available as we try to work together. This talk is about collaboratively working to liberate wikitext (by maintaining the structures we have built on top of it and freeing them from ancient chains) so that we can further extend wikitext in the future in order to capture the collaborations and open knowledge of tomorrow.
- What is the experience level needed for the audience for your session?
Average knowledge about Wikimedia projects or activities
Resources
- https://docs.google.com/presentation/d/1ZCGD77num6YS9HlVEVDnzQfwc_zFPOAil0y5qkLX57A/edit?usp=sharing
Speakers
- C. Scott Ananian
- C. Scott Ananian began editing articles on Wikipedia in 2005, and since 2013 has been an employee of the Wikimedia Foundation, working on the Parsoid project. He also dabbles with LanguageConverter and real-time collaboration in VisualEditor.
- Previously, Dr. Ananian was a jack-of-all-trades for the One Laptop per Child Foundation (OLPC). He received his PhD in computer science from MIT, and before joining OLPC was a local activist and organizer for copyright issues. He organized Free Sklyarov Boston in July 2001, and in 2004 and 2005 was the lead programmer for the Election Incident Reporting System, which collected real-time data on elections across the US. He's also been a kernel hacker and part-time khipu researcher. Now he tries to build robust and reliable systems to allow everyone to discover, share, and learn.