Wikisource:Community collaboration/Monthly Challenge/FAQs
- Each month, the Wikisource community selects a few texts to proofread and validate.
- The texts are featured for a maximum of three months, with a few exceptions.
- The challenge builds Wikisource's core collection and helps introduce new users to Wikisource.
- Wikisource seeks to make free, scan-backed ebooks accessible to everyone.
Why the “Monthly Challenge”?
Although Wikisource is a community project, users often find themselves working alone or in small groups. This works well, because the end goal is to produce scan-backed texts that can be verified, corrected, and cited.
The Monthly Challenge aims to bring the community together, focus its energy on a set of texts, provide training for new users, and connect new members with long-standing contributors to build a strong community based on mutual aid, respect, and a sense of common purpose. It seeks to help new users learn how Wikisource functions and to provide a welcoming space for all.
How are texts selected?
Selecting texts is always a tricky proposition, and no selection can ever be perfect. Nevertheless, whenever possible, the aim is to choose texts that the community wishes to work on, that will help train new users, and that will attract new users to the site.
Overall, the Monthly Challenge seeks to create scan-backed ebooks of important texts. The precise definition of "important" varies across individuals, communities, and countries, and no person or list can ever be the final arbiter of importance. The nomination process exists to give users a chance to propose any texts they wish.
Although the selected texts vary in difficulty, shorter texts with simpler formatting are usually more likely to succeed than a long scientific work full of mathematical formulas and tables. The selection therefore tends to favor literature.
Multiple editions of a text can be featured over time as long as each edition has an independent value.
Are any texts prioritized?
Although no texts are specifically included in or excluded from the Monthly Challenge, certain factors may increase the likelihood of selection:
- Books that have recently entered the public domain (with the tag Celebrating the Public Domain).
- Books written by members of under-represented communities such as Black Writers, Indigenous Writers, Latinx Writers, Non-Western Writers, Women Writers, etc.
- Books that are commonly considered key texts in English and are used in education, because free ebooks of these texts help to break down barriers that maintain systemic inequality.
- Books that fill gaps in the Wikisource collection (for example, when there is no existing copy, or the existing copy is not scan-backed).
Some works have multiple editions that are not all suitable for inclusion in the Monthly Challenge. Roughly speaking, editions can be broken down into the following categories:
- Editions to which the author contributed are always suitable.
- Editions to which the author did not contribute but which were carefully edited are generally suitable. Evidence of careful editing must come either from the preface of the work or from a scholarly source.
- Editions to which the author did not contribute but which contain new illustrations may be suitable.
- Editions that are reprints without new illustrations or evidence of careful editing are generally unsuitable.
Are there special formatting rules?
No. Works in the Monthly Challenge are formatted the same as any other. However, because the Monthly Challenge (like WS:PotM) may attract more users per work than usual, it's important to be aware of how others are working on them. Remember to check the talk page of the Index for work-specific formatting conventions.
Can I "claim" a work to work on myself?
No. Works in the Monthly Challenge are worked on collaboratively, and all users are encouraged to get involved with any of the works on the list.
Why are the stats out-of-date?
Currently, the statistics for each index (the progress bars on the tile view), as well as the daily statistics, are updated by a script. This means the figures are only as accurate as the last script run. There may also be "replication lag" between the live database and the Toolforge database.
It is hoped that the index progress bars, at least, can eventually be generated live by the server. In the meantime, the progress bars for old challenges might not be updated once a challenge is complete.
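The "accurate as of the last script run" behavior can be illustrated with a minimal sketch. The names and data here are purely illustrative and are not the actual Monthly Challenge code: a periodic job recomputes the counts and caches them with a timestamp, and the tile view renders its progress bar from that cache, so the display lags behind live edits until the next run.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical snapshot of an index's proofreading progress,
# stamped with the time of the script run that produced it.
@dataclass
class IndexStats:
    proofread: int   # pages at least proofread (including validated)
    validated: int
    total: int
    as_of: datetime

_cache: dict[str, IndexStats] = {}

def refresh_stats(index: str, proofread: int, validated: int, total: int) -> None:
    """Simulates one periodic script run: recompute counts and store them."""
    _cache[index] = IndexStats(proofread, validated, total,
                               as_of=datetime.now(timezone.utc))

def progress_bar(index: str, width: int = 20) -> str:
    """Render a tile-view style bar from the cached (possibly stale) figures:
    '#' = validated, '+' = proofread only, '.' = remaining."""
    s = _cache[index]
    done = round(width * s.validated / s.total)
    part = round(width * s.proofread / s.total)
    return "#" * done + "+" * (part - done) + "." * (width - part)

refresh_stats("Example Index", proofread=15, validated=5, total=20)
print(progress_bar("Example Index"))  # → #####++++++++++.....
```

Pages edited after `refresh_stats` ran are invisible to `progress_bar` until the next run, which is exactly why the displayed figures can be out of date.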
How does the Monthly Challenge work at a technical level?
Much of the Monthly Challenge is automated, driven by a combination of templates, Lua scripts, and bot edits (the latter mostly for statistics). See How it works for the gory details, or Administration for information on how to administer the challenges without modifying the infrastructure.