Talk:Moderator Tools/Automoderator/Testing

What is your overall perception of Automoderator's accuracy?

  • I checked 5 batches (150 edits) for hrwiki. Can't complain about accuracy, but sensitivity is rather low: very (very!) few edits will ever be caught. Seems like a lot of unnecessary computation for little gain (in terms of patroller time). Note that hrwiki is heavily abuse-filtered nowadays, so many bad edits never reach the main namespace. ponor (talk) 13:20, 15 November 2023 (UTC)
    @Ponor Thanks for sharing your thoughts! I wanted to quickly note that we just fixed an issue in the revert rate calculations which was causing us to under-count reverts. That changes the hrwiki rate from 0–1 per day to 1–4 per day, depending on the threshold. Compared to the total number of reverts per day on hrwiki (~50), that's approximately the same fraction as on other wikis. Samwalton9 (WMF) (talk) 13:28, 22 November 2023 (UTC)
  • I checked one batch for dewiki so far and my initial perception is similar to Ponor's: I was surprised how cautious the tool was, even in „low caution“ mode. --Johannnes89 (talk) 18:16, 19 December 2023 (UTC)

Among the different caution levels, which did you feel was the best?

This will be a compromise between the number of edits Automoderator catches and its accuracy rate (a rough sketch of this tradeoff follows the comments below).

  • The medium level (0.98) seemed safe, but to get any useful action I'd go with 0.975. And I still think the most meaningful action would be to slow a vandal down (captcha, obligatory edit summary, throttling), rather than reverting one or two edits a day. ponor (talk) 13:26, 15 November 2023 (UTC)
  • I have evaluated 7 sheets (210 edits). 0.975 seemed to be a good compromise, with ~96% accuracy and recall around 50%. (Having 0.99 accuracy at 0.5 recall would be really awesome.) --Matěj Suchánek (talk) 13:29, 16 November 2023 (UTC)
  • In terms of not making false reverts: „somewhat cautious“ mode, although this seems to lead to fewer edits being reverted than I would have initially expected. --Johannnes89 (talk) 18:16, 19 December 2023 (UTC)
  • I would prefer the "Very cautious" or "Cautious" level (~100% accuracy) for the initial pilot phase on zhwiki, as I don't want to see good edits being reverted. Thanks. --SCP-2000 (talk) 05:29, 11 January 2024 (UTC)
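
A rough sketch of how this tradeoff can be measured on a labelled test batch, assuming (as the comments above suggest) that each caution level corresponds to a revert-risk score threshold and that Automoderator reverts an edit when its score exceeds that threshold. The scores, labels, and level-to-threshold mapping below are illustrative placeholders, not the actual model output or configuration; "accuracy" here follows the spreadsheet's apparent usage, i.e. the share of Automoderator's reverts the human reviewer agreed with.

  # Illustrative only: scores and the caution-level thresholds are assumptions.
  batch = [  # (revert-risk score, human judgement from the testing sheet)
      (0.991, "Yes"), (0.978, "Yes"), (0.982, "No"),    (0.951, "Yes"),
      (0.996, "Yes"), (0.973, "No"),  (0.988, "Maybe"), (0.902, "No"),
  ]

  caution_levels = {              # assumed mapping, for illustration only
      "very cautious": 0.990,
      "cautious": 0.985,
      "somewhat cautious": 0.980,
      "not cautious": 0.975,
  }

  for level, threshold in caution_levels.items():
      reverted = [label for score, label in batch if score > threshold]
      flagged = [label for _, label in batch if label == "Yes"]
      caught = sum(1 for label in reverted if label == "Yes")
      accuracy = caught / len(reverted) if reverted else 1.0
      recall = caught / len(flagged) if flagged else 0.0
      print(f"{level:>17}: reverts {len(reverted)} edits, "
            f"accuracy {accuracy:.0%}, recall {recall:.0%}")

Raising the threshold improves accuracy but lowers recall, which is exactly the compromise the question above asks about.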

Did you notice any patterns in the edits Automoderator decided to revert?

We're particularly interested in false positives, so that we can improve Automoderator's accuracy.

General discussion

Requests for data for other wikis

  • Because I'm most familiar with hrwiki recent changes and vandalism patterns, and am willing to do some tests, can you please generate data for my wiki? Thanks, ponor (talk) 11:32, 24 October 2023 (UTC)
    @Ponor Absolutely - I've filed T349606. Samwalton9 (WMF) (talk) 11:58, 24 October 2023 (UTC)
    Thanks, Samwalton9. Given that the helpfulness of Automoderator, if I'm reading it correctly, is below 1% (# Automoderator reverts / # daily reverts) for the four listed wikis, I'm thinking maybe you should focus more on wikis that do not have their edit stream already heavily filtered by AbuseFilter. I'd ask (at least) administrators on those wikis to help you with this. Reach out to shwiki, for example: I've heard they've had a recent increase in vandalism, and they don't have many filters in use.
    Also, I'd still like Automoderator to be able to slow down any suspected vandals (let them wait, let them fill in an edit summary), rather than revert their edits. Immediate reverts usually mean rage and more vandalism. ponor (talk) 12:10, 24 October 2023 (UTC)
    @Ponor Thanks for sharing your thoughts. One thing to note with this data is that 'reverts' captures many different kinds of edits which take a page back to an earlier state, so this number isn't exclusively anti-vandalism reverts. We did some analysis of existing anti-vandalism bots where we constrained the comparison to reverts which happen within 24 hours, as an estimate which might be closer to counting only anti-vandalism reverts. Comparing my data sources, it looks like 'fast' reverts are approximately 50% of daily reverts. That increases the percentage a little, but I think you're right that tools like AbuseFilter will be impacting this - thanks for the suggestion to reach out to wikis like shwiki, I think that's a great idea. Samwalton9 (WMF) (talk) 12:51, 24 October 2023 (UTC)
  • I'm interested in data for cswiki (500,000+ articles, 150–250 vandal edits a day). And I'm happy to bring in more people familiar with patrolling. --Matěj Suchánek (talk) 16:38, 24 October 2023 (UTC)
    @Matěj Suchánek: I've filed a Phabricator task for this at T349832. CLo (WMF) (talk) 15:33, 26 October 2023 (UTC)
    @Ponor and Matěj Suchánek: I have generated datasets for hrwiki and cswiki. Thank you for participating in the testing. --KCVelaga (WMF) (talk) 06:49, 14 November 2023 (UTC)
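
As a side note on the 'fast' revert estimate mentioned above, here is a minimal sketch of how such a share could be computed, assuming a hypothetical list of (edit time, revert time) pairs for reverted edits; this is not the analysis the Moderator Tools team actually ran.

  from datetime import datetime, timedelta

  # Hypothetical (edit timestamp, revert timestamp) pairs for reverted edits.
  reverted_edits = [
      (datetime(2023, 10, 23, 10, 5), datetime(2023, 10, 23, 10, 9)),   # reverted within minutes
      (datetime(2023, 10, 20, 14, 0), datetime(2023, 10, 24, 9, 30)),   # reverted days later
      (datetime(2023, 10, 23, 18, 45), datetime(2023, 10, 24, 7, 0)),   # reverted overnight
  ]

  # A 'fast' revert lands within 24 hours of the edit it undoes.
  fast = sum(1 for edited, reverted in reverted_edits
             if reverted - edited <= timedelta(hours=24))
  print(f"fast reverts: {fast}/{len(reverted_edits)} "
        f"({fast / len(reverted_edits):.0%} of all reverts)")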

Batch testing impressions

  • I've noticed quite a number of bot edits in the data for hrwiki. Because there are so many of them in comparison with user (esp. anonymous user) edits, is it possible that the training set is somewhat skewed? A bot fixing a word/pattern in thousands of articles is not an uncommon situation; I'd exclude bot edits from any consideration. ponor (talk) 13:40, 15 November 2023 (UTC)
    Just like a bot can fix a word/pattern, an anonymous user can too (though only in a few articles). Sure, Automod should ignore bots, but it's reasonable to include some bot edits in the training data as negative samples. --Matěj Suchánek (talk) 13:29, 16 November 2023 (UTC)
    @Ponor I don't believe that this impacts the training set used for the model itself, but you're right that approximately 1 in 5 edits in the hrwiki dataset are bot edits, which seems high. Perhaps we can regenerate the dataset with different sampling to include fewer bot edits. Samwalton9 (WMF) (talk) 13:23, 22 November 2023 (UTC)
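
For what it's worth, here is a sketch of one way a sample could be drawn while skipping bot edits, using the standard MediaWiki recent changes API; this is not how the test datasets above were actually generated, and the choice of wiki and limits is only illustrative.

  import requests

  # Fetch recent main-namespace edits from hrwiki, excluding edits flagged as bot edits.
  resp = requests.get(
      "https://hr.wikipedia.org/w/api.php",
      params={
          "action": "query",
          "list": "recentchanges",
          "rcshow": "!bot",                     # skip bot-flagged edits
          "rctype": "edit",                     # ordinary edits only, no log entries
          "rcnamespace": 0,                     # main namespace
          "rcprop": "ids|title|user|timestamp",
          "rclimit": 50,
          "format": "json",
      },
      timeout=30,
  )
  for change in resp.json()["query"]["recentchanges"]:
      print(change["revid"], change.get("user", "?"), change["title"])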

Internal configuration

Is Moderator Tools/Automoderator/Testing#Internal configuration just the configuration for your testing spreadsheet, or is it the tool's actual configuration? I would expect Automoderator not to take action on experienced users as well (e.g. users above a certain edit count or within user groups like en:WP:Extended confirmed or de:WP:Sichter (autoreview/editor)). --Johannnes89 (talk) 18:35, 19 December 2023 (UTC)

@Johannnes89 That is the current configuration of the testing spreadsheet, and what we'll start with for Automoderator (Automoderator's configuration hasn't been built yet). These internal configurations are the ones which will apply to all wikis, so we're trying to find data points which we know will improve accuracy across all communities. User groups are a bit tricky because (besides roles like sysop) they're customisable on each wiki. What we probably want to do instead is give each wiki the option of which roles to ignore, so that enwiki could decide not to take action on Extended confirmed users, or dewiki could skip autoreview/editor users. I think it would be more effort for us to attempt to configure that internally in a way which applies to all wikis and remains up to date.
One useful piece of context to add is that the Revert Risk model Automoderator will use already considers the user's user groups when scoring an edit, so it rarely scores an experienced editor highly. I've filed T353795 to see if we can quantify this for Extended confirmed users. Samwalton9 (WMF) (talk) 11:40, 20 December 2023 (UTC)
Thanks. Perhaps it would be more useful to exclude users based on a minimum number of edits (e.g. 200+) instead of certain user groups? Or will this be possible via local configuration as well? Given that Automoderator is a tool intended to fight vandalism, I don't think it should ever revert an experienced editor. Johannnes89 (talk) 21:40, 20 December 2023 (UTC)
@Johannnes89 To be honest we haven't made any final decisions about what configuration options will be provided to communities beyond the basics, so I'm very interested to hear which options you think would be most useful! Samwalton9 (WMF) (talk) 09:53, 21 December 2023 (UTC)
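
Since no configuration format existed at the time of this discussion, the following is purely a sketch of the kind of per-wiki exemption rules discussed above (exempt user groups and/or a minimum edit count); the structure, group names, and numbers are illustrative assumptions, not Automoderator's actual configuration.

  # Hypothetical per-wiki exemption rules, for illustration only.
  per_wiki_config = {
      "enwiki": {"exempt_groups": {"sysop", "bot", "extendedconfirmed"},
                 "min_edit_count": None},
      "dewiki": {"exempt_groups": {"sysop", "bot", "editor", "autoreview"},
                 "min_edit_count": 200},
  }

  def is_exempt(wiki, user_groups, user_edit_count):
      """Return True if Automoderator should leave this user's edits alone."""
      cfg = per_wiki_config[wiki]
      if cfg["exempt_groups"] & set(user_groups):
          return True
      min_edits = cfg["min_edit_count"]
      return min_edits is not None and user_edit_count >= min_edits

  print(is_exempt("dewiki", ["user"], 350))   # True: above the assumed 200-edit bar
  print(is_exempt("enwiki", ["user"], 350))   # False: no edit-count rule configured here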

Revert rates updated

We identified a bug in the code that generates data for the daily average Automoderator reverts, which was significantly undercounting the number of reverts that would be performed. I've just updated the totals with the correct data. Samwalton9 (WMF) (talk) 13:22, 22 November 2023 (UTC)

Testing spreadsheet update

We just published a new version of the testing spreadsheet.

  • The spreadsheet now supports data from 17 Wikimedia projects all in one sheet, so we no longer need to generate and maintain individual sheets for each project - new project data can easily be added on request.
  • It also now has better translation support - you can instantly change the sheet's language via the 'Select language' dropdown in the top right. We currently have partial translations for German and Japanese, but welcome more. If you would like to translate the sheet, please make a copy, fill out the strings on the 'String translations' tab, and send it back to us.
  • We also fixed a minor issue where self-reverts were being excluded from the dataset entirely, rather than being judged as 'No' at the model caution levels. This should result in a small increase in Automoderator's accuracy as reported by the spreadsheet for some datasets.

Links and instructions have been updated on the testing page. Let me know if you have any questions! Samwalton9 (WMF) (talk) 11:08, 6 December 2023 (UTC)
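
A toy illustration of the self-revert fix described above, assuming (as I read it) that Automoderator never acts on a user reverting their own edit, so restoring those edits to the dataset adds correct 'No' decisions; the counts below are made up.

  def is_self_revert(reverting_user, original_author):
      # An edit that undoes the same user's own earlier edit.
      return reverting_user == original_author

  print(is_self_revert("Alice", "Alice"))   # True: Automoderator would leave this alone

  judged = 30        # edits in a dataset before the fix
  correct = 28       # decisions the reviewer agreed with
  self_reverts = 2   # previously dropped; now all counted as correct 'No' decisions

  before = correct / judged
  after = (correct + self_reverts) / (judged + self_reverts)
  print(f"reported accuracy: {before:.1%} -> {after:.1%}")   # 93.3% -> 93.8%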

google?

Why is the .ods file only available via Google? In other words: why isn't it uploaded locally on this wiki? I don't like the idea that Google (or f*c*book or the like) gets to know who downloads the file. -- seth (talk) 12:33, 20 December 2023 (UTC)

@Lustiger seth Unfortunately .ods files can't be uploaded to Commons. I'd be happy to upload it elsewhere if you have any suggestions. Samwalton9 (WMF) (talk) 12:42, 20 December 2023 (UTC)
Oh, I didn't know that. That seems a bit ridiculous to me, but of course that's not your fault. I asked at w:de:WP:FZW#Hochladen von ods-Datei for a reasonable solution. -- seth (talk) 20:31, 20 December 2023 (UTC)
What does the file contain? Just a table? Then a csv file would be even better, because it could easily be imported by many more tools. -- seth (talk) 10:21, 23 December 2023 (UTC)
It's a little more complicated than that, with various formulae and support for translation, unfortunately. Samwalton9 (WMF) (talk) 13:43, 8 January 2024 (UTC)
You should be able to upload it to people.wikimedia.org. Taavi (talk!) 18:11, 23 December 2023 (UTC)
@Taavi Very neat, I didn't know we had access to this - thanks! @Lustiger seth you can now find a direct non-Google link to download this file here. Samwalton9 (WMF) (talk) 14:06, 8 January 2024 (UTC)
Ah, much better, thanks! :-) -- seth (talk) 17:02, 8 January 2024 (UTC)

dewiki

Just did a batch on dewiki. Looks good to me: no false positives came up until I switched to 'not cautious' (which I wouldn't recommend using in reality). Sensitivity is low, of course, e.g. regarding suspicious edits about ethnicity or nationality - it's an AI. It should act on clear vandalism only, thus taking some workload from us. - Keep up the good work! --MBq (talk) 12:52, 29 December 2023 (UTC)

Test Report: itwiki - dataset 1

I have just tested the tool on your spreadsheet.

These are the results:

  • Model: the model I took into consideration is the least cautious one that did not revert any edit I would certainly not have reverted. This turned out to be CAUTIOUS. If I had extended this to the "Maybe"s, it would have been VERY CAUTIOUS.
  • Effectiveness: out of 12 edits that I marked as "Yes" (to be reverted), only 3 would have been reverted by Automoderator. Out of 8 "Maybe"s, 1 would have been reverted.
  • Edit types: the types of edit that the tool would have reverted were:
  1. Clear vandalism (1)
  2. Copyright infringement (1)
  3. Unreferenced edit (1) -- this was marked as "Maybe"
  4. Text removal that caused errors (1)
The types of edit that I would have reverted but the tool would not have are:
  1. Missing file included (1)
  2. Clear vandalism (4)
  3. Unjustified content removal (3)
  4. POV/Comment (1)

Valcio (talk) 10:55, 9 February 2024 (UTC)