Quality review/Description of systems

Encyclopedia of Life curation system

Source: Discussion with Robert Corrigan.

When an EOL user applies for curatorship, their credentials are reviewed by a member of EOL's "Species Pages Group". If approved, their name, credentials and declared field of expertise are published as a profile on the website, associated with their reviews.

They are granted broad curatorial privileges, but usually focus on topics they are familiar with; this self-discipline is encouraged by the fact that their name is associated with information they add, and actions they perform.

New curators are recruited through several channels. One of them is the "EOL Rubenstein Fellows program", a program to fund early-career scientists to improve EOL in their field of expertise. These Fellows usually remain active as authors and curators beyond their fellowship. EOL also reaches out to curatorial staff of their member institutions. A new process is being developed to allow amateurs to participate in the curation process, either under the supervision of a curator, or as a curator themselves.

An entry can have multiple reviews, but only the latest review counts. Little disagreement has been observed between curators.

The EOL review system features a binary trusted/untrusted flag and flags specific issues (misidentified, incorrect/misleading information, etc.) rather than assessing criteria on a Likert scale. The rationale behind this choice was to "force a conversation between the curator and the provider", and to acknowledge the constant flow of new information and its possibly differing interpretations.

Likert scales are used to rate objects (e.g. media files); in this case, object ratings are averaged.
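
As a concrete illustration, here is a minimal Python sketch of this model; all names are hypothetical, and only the behavior is taken from the description above (a binary flag with issue tags per entry, the latest review superseding earlier ones, and averaged Likert ratings for objects):

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class EntryReview:
    curator: str                  # the curator's published name
    trusted: bool                 # the binary trusted/untrusted flag
    issues: set = field(default_factory=set)  # flagged issues, e.g. "misidentified"

def current_review(reviews):
    # An entry can have multiple reviews, but only the latest one counts.
    return reviews[-1] if reviews else None

def object_rating(likert_scores):
    # Objects (e.g. media files) are rated on a Likert scale; ratings are averaged.
    return mean(likert_scores) if likert_scores else None

reviews = [
    EntryReview("A. Curator", trusted=False, issues={"misidentified"}),
    EntryReview("B. Curator", trusted=True),  # supersedes the earlier review
]
assert current_review(reviews).trusted
assert object_rating([4, 5, 3]) == 4.0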

EOL functions as an information aggregator from many providers, and as such doesn't assess the completeness of an entry.

Entries by curators are "trusted" by default, but are still "subject to later curatorial actions". Conflict of interest is not an issue for EOL, which considers curators "trusted experts" by definition.

Possible future improvements include the facilitation of dialogue between curators and providers, a queuing system, more community features and granular annotation.

Association for Psychological Science (APS) Wikipedia Initiative

Source: Discussion with Rosta Farzan.

The initiative is advertised on the APS website, and the review system is linked from it. E-mails will be sent to APS members, encouraging them to join and to invite their students as well. Anyone can use the system; there is no credentials verification yet. People who sign up are asked to volunteer information about how much they have edited Wikipedia, what their "educational attainment" is, and whether they are a student and/or a member of the APS. Lastly, they are asked to select the one topic that best matches their interest and expertise.
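
As a sketch of the data collected at sign-up, in Python (the field names are hypothetical; the questions themselves are those listed above):

from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewerProfile:
    # All fields are self-reported; there is no credentials verification yet.
    wikipedia_editing_experience: Optional[str] = None  # how much they've edited
    educational_attainment: Optional[str] = None
    is_student: Optional[bool] = None
    is_aps_member: Optional[bool] = None
    topic: Optional[str] = None  # the one topic matching their interest and expertise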

Multiple reviews are (or will be) possible.

The criteria and rating system were chosen after discussion with Erik Möller and Robert Kraut. A Likert scale was chosen because people were assumed to be familiar with it; a binary (good/bad) flag was thought not to be useful for their purposes.

An API to extract data from their database is not available yet, but one could be developed in PHP.

Possible improvements include more contextual help to provide reviewers with guidelines, as well as a list of issues they should pay attention to and provide feedback about.

Rfam / Pfam review process

Source: Discussion with Magnus Manske.

Reviewers are a very small group working for the Wellcome Trust Sanger Institute, with academic degrees or publications in the relevant science (RNA/protein biology).

Many of the articles included in this project were generated using data from their own database. The articles are then read from Wikipedia and included in their website (example), after the changes from Wikipedia have been reviewed.

Changes to articles they watch are reviewed on a weekly basis; they remove vandalism and correct erroneous information quickly, directly on Wikipedia.

A list of reviews is available in JSON format, for inclusion in Magnus Manske's database on the toolserver.
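
A minimal consumer of that feed might look like the following Python sketch; the URL and record structure are assumptions, since only the existence of a JSON list is stated above:

import json
from urllib.request import urlopen

# Hypothetical endpoint; the actual location of the review list is not given above.
REVIEW_LIST_URL = "https://example.org/rfam-pfam/reviews.json"

def fetch_reviews(url=REVIEW_LIST_URL):
    # Download and parse the JSON list of reviews.
    with urlopen(url) as response:
        return json.load(response)

# Each record presumably identifies a reviewed Wikipedia revision; the exact
# keys depend on the feed's actual schema.
for review in fetch_reviews():
    print(review)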

Magnus Manske's "sifter" tools

Source: Discussion with Magnus Manske.

The content of reviews is partly copied into a public table on the toolserver (u_magnus_sifter_p), which can be accessed through an API (example script). The data is updated by a cron job.

The table includes the following fields: partner, partner_id, partner_ts, partner_url, wikipage, revision, wikiproject, reviewer, reviewer_url.
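
For illustration, reading those fields might look like the Python sketch below; the host, credentials, and the table name inside the u_magnus_sifter_p database are placeholders, and only the database name and column names come from the description above:

import pymysql  # third-party MySQL client

conn = pymysql.connect(
    host="sql.toolserver.org",      # placeholder host
    db="u_magnus_sifter_p",         # the public database named above
    read_default_file="~/.my.cnf",  # credentials from a config file
)

with conn.cursor() as cur:
    cur.execute(
        """
        SELECT partner, partner_id, partner_ts, partner_url,
               wikipage, revision, wikiproject, reviewer, reviewer_url
        FROM reviews  -- hypothetical table name within the database
        WHERE wikipage = %s
        """,
        ("RNA",),  # example page title; any reviewed page works
    )
    for row in cur.fetchall():
        print(row)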

Future improvements may include the possibility of having "multiple review types (quality, length, easy-to-understand etc.) per article per revision per reviewer".
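
One way that extension could be modeled — a sketch, not a committed design — is to key each review by article, revision, reviewer and review type, in Python:

# One review per (wikipage, revision, reviewer, review_type) combination;
# the revision id and verdict values below are invented examples, and the
# review types are those quoted above.
reviews = {}

def add_review(wikipage, revision, reviewer, review_type, verdict):
    reviews[(wikipage, revision, reviewer, review_type)] = verdict

add_review("RNA", 424242, "ExampleReviewer", "quality", "good")
add_review("RNA", 424242, "ExampleReviewer", "length", "too short")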