Extension talk:AbuseFilter

About this board

19 previous topics. Previous discussion was archived at Extension talk:AbuseFilter/Archive 1 on 2016-10-24.

Expire/delete old log entries

Tbleher (talkcontribs)

I successfully use AbuseFilter to block spam. Over ca. 10 years of usage, the filters have blocked more than 300,000 spam attempts, compared to ca. 15,000 legitimate edits. Unfortunately, AbuseFilter seems to keep details of ALL spam attempts, which significantly increases the database size. Is there a way to expire/delete old AbuseFilter entries after a certain time?

Something like a maintenance script that deletes the entries plus the referenced entries in the other tables?

I looked through the source code a bit, but didn't find a script like that.

Ciencia Al Poder (talkcontribs)

AbuseFilter stores data in External storage when External storage is set up. It would be easy if the extension allowed configuring a specific database for AF; it would then just be a matter of rolling a new store every few years and dropping the old one after a while. However, AF doesn't allow specifying the store :(

This is a nice suggestion, but AFAIK there doesn't seem to be a way to do that.

Reply to "Expire/delete old log entries"
Alex Mashin (talkcontribs)

New users creating their user pages seem not to be filtered by AbuseFilter at all (while testing a filter, those revisions are not in the list, even though changes not matching the filter conditions are shown). Is this normal?
Alexander Mashin talk 04:56, 18 September 2023 (UTC)

Ciencia Al Poder (talkcontribs)

It works for me. Note that testing filters retrospectively has a limitation when you check for page creation (usually by checking the page_id variable): by the time you re-test, the page has already been created, so page_id is no longer 0.
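For example, a filter targeting new-user-page creations might check something like this (a sketch; namespace 2 is the User namespace):

```
/* Matches user page creations at save time. When testing the filter
   retrospectively, page_id is no longer 0, so such edits won't match. */
page_id == 0 & page_namespace == 2
```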

Reply to "Creating new user pages"

Profanity Filter Creation

Fokebox (talkcontribs)

I aim to block offensive language for a better user experience. Could someone please guide me on how to create this filter?

Matěj Suchánek (talkcontribs)

First, you will need to decide for yourself whom the filter should concern: all users, new users, anonymous users. Also, which pages it should check. (All pages, topic pages, talk pages, ...)

Then, you will need to decide which text you want to check. Should it be wikitext, edit summary, user name...?

Last, compile a list of profanity words.

For example:

user_editcount < 10 /* users with fewer than 10 edits */
& ( page_namespace == 0 | page_namespace % 2 == 1 ) /* articles and talk pages */
& contains_any( string( added_lines ), 'badword1', 'badword2' ) /* placeholder profanity list */

Using regex (string(added_lines) rlike 'word1|word[23]') can be more compact if the profane words have many forms.
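Putting the pieces together, a regex variant could look like this (a sketch; the 'badword' forms are placeholders for your own list):

```
user_editcount < 10
& ( page_namespace == 0 | page_namespace % 2 == 1 )
& ( string( added_lines ) rlike '\bbadword(s|ed|ing)?\b' )
```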

Reply to "Profanity Filter Creation"

Master version error

I have a bug that causes errors in ServiceContainer.php, etc. when I install the master version of this extension on my server.

Ciencia Al Poder (talkcontribs)
Reply to "Master version error"

Rename to EditFilter?

Aaron Liu (talkcontribs)

As noted on Wikipedia, filters aren't necessarily for abuse, so maybe this should be renamed?

Dinoguy1000 (talkcontribs)

This probably isn't the best place to propose a rename; I'd suggest filing a task on Phabricator instead.

C.Syde65 (talkcontribs)

I'm not sure if extensions are typically renamed. Correct me if I'm wrong.

Though the system messages associated with the extension are sometimes renamed if need be.

Matěj Suchánek (talkcontribs)
Reply to "Rename to EditFilter?"

How to use "str_replace_regexp()" ?

Erik Baas (talkcontribs)

Is it possible to have an abuse filter replace unwanted text, like deprecated HTML tags and possibly other lint errors? My goal was to silently replace <br /> with <br> (just an example!!), without annoying the editors. I tried this:

new_wikitext = str_replace_regexp(new_wikitext,"\<\/?br\/\>","<br>")

but that's no good. Can it be done, and if so, how? - ~~~~

Lustiger seth (talkcontribs)

The filter never changes a page; it's a read-only tool.
String modification operations can be used to modify strings in the memory of the filter only. -- seth (talk) 08:07, 19 March 2023 (UTC)
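String functions are still useful for normalizing text before matching, though. A sketch (the variable name and 'spamword' are placeholders):

```
/* Lowercase and collapse whitespace in filter memory only;
   the saved page text is never altered. */
norm := str_replace_regexp( lcase( string( added_lines ) ), '\s+', ' ' );
norm contains 'spamword'
```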

Erik Baas (talkcontribs)

OK, thank you.

Squeak24 (talkcontribs)

Hi, I have just installed MediaWiki 1.39.1 and I am getting the following error when I try to log in:

[Y77sWZygTcebxETgC82f3QABkRQ] /wiki/index.php?title=Special:UserLogin&returnto=Special%3AVersion Error: Class 'Wikimedia\Equivset\Equivset' not found
from /home/user/public_html/wiki/extensions/AbuseFilter/includes/ServiceWiring.php(122)
#0 /home/user/public_html/wiki/vendor/wikimedia/services/src/ServiceContainer.php(447): Wikimedia\Services\ServiceContainer::{closure}(MediaWiki\MediaWikiServices)
#1 /home/user/public_html/wiki/vendor/wikimedia/services/src/ServiceContainer.php(411): Wikimedia\Services\ServiceContainer->createService(string)
#2 /home/user/public_html/wiki/includes/MediaWikiServices.php(301): Wikimedia\Services\ServiceContainer->getService(string)
#3 /home/user/public_html/wiki/vendor/wikimedia/services/src/ServiceContainer.php(419): MediaWiki\MediaWikiServices->getService(string)
#4 /home/user/public_html/wiki/extensions/AbuseFilter/includes/ServiceWiring.php(287): Wikimedia\Services\ServiceContainer->get(string)
#5 /home/user/public_html/wiki/vendor/wikimedia/services/src/ServiceContainer.php(447): Wikimedia\Services\ServiceContainer::{closure}(MediaWiki\MediaWikiServices)
#6 /home/user/public_html/wiki/vendor/wikimedia/services/src/ServiceContainer.php(411): Wikimedia\Services\ServiceContainer->createService(string)
#7 /home/user/public_html/wiki/includes/MediaWikiServices.php(301): Wikimedia\Services\ServiceContainer->getService(string)
#8 /home/user/public_html/wiki/vendor/wikimedia/services/src/ServiceContainer.php(419): MediaWiki\MediaWikiServices->getService(string)
#9 /home/user/public_html/wiki/vendor/wikimedia/object-factory/src/ObjectFactory.php(211): Wikimedia\Services\ServiceContainer->get(string)
#10 /home/user/public_html/wiki/vendor/wikimedia/object-factory/src/ObjectFactory.php(152): Wikimedia\ObjectFactory\ObjectFactory::getObjectFromSpec(array, array)
#11 /home/user/public_html/wiki/includes/auth/AuthManager.php(2487): Wikimedia\ObjectFactory\ObjectFactory->createObject(array, array)
#12 /home/user/public_html/wiki/includes/auth/AuthManager.php(2519): MediaWiki\Auth\AuthManager->providerArrayFromSpecs(string, array)
#13 /home/user/public_html/wiki/includes/auth/AuthManager.php(2203): MediaWiki\Auth\AuthManager->getPreAuthenticationProviders()
#14 /home/user/public_html/wiki/includes/specialpage/AuthManagerSpecialPage.php(275): MediaWiki\Auth\AuthManager->getAuthenticationRequests(string, User)
#15 /home/user/public_html/wiki/includes/specialpage/LoginSignupSpecialPage.php(144): AuthManagerSpecialPage->loadAuth(NULL)
#16 /home/user/public_html/wiki/includes/specialpage/LoginSignupSpecialPage.php(235): LoginSignupSpecialPage->load(NULL)
#17 /home/user/public_html/wiki/includes/specialpage/SpecialPage.php(701): LoginSignupSpecialPage->execute(NULL)
#18 /home/user/public_html/wiki/includes/specialpage/SpecialPageFactory.php(1428): SpecialPage->run(NULL)
#19 /home/user/public_html/wiki/includes/MediaWiki.php(316): MediaWiki\SpecialPage\SpecialPageFactory->executePath(string, RequestContext)
#20 /home/user/public_html/wiki/includes/MediaWiki.php(904): MediaWiki->performRequest()
#21 /home/user/public_html/wiki/includes/MediaWiki.php(562): MediaWiki->main()
#22 /home/user/public_html/wiki/index.php(50): MediaWiki->run()
#23 /home/user/public_html/wiki/index.php(46): wfIndexMain()
#24 {main}

Any idea what the issue may be?

Ciencia Al Poder (talkcontribs)

Looks like Composer wasn't run.
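When an extension's Composer dependency is missing (here Wikimedia\Equivset\Equivset), the usual approach is to merge the extension's composer.json via a composer.local.json in the MediaWiki root and re-run composer update. A sketch, assuming a default install layout:

```json
{
    "extra": {
        "merge-plugin": {
            "include": [
                "extensions/AbuseFilter/composer.json"
            ]
        }
    }
}
```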

Squeak24 (talkcontribs)

I have run php composer.phar update, but I still get the same issue.

This is the output:

[~/public_html/wiki]# php composer.phar update
> ComposerHookHandler::onPreUpdate
Loading composer repositories with package information
Info from https://repo.packagist.org: #StandWithUkraine
Updating dependencies
Nothing to modify in lock file
Installing dependencies from lock file (including require-dev)
Nothing to install, update or remove
Package phpunit/php-token-stream is abandoned, you should avoid using it. No replacement was suggested.
Generating optimized autoload files
59 packages you are using are looking for funding.
Use the `composer fund` command to find out more!
> ComposerVendorHtaccessCreator::onEvent
No security vulnerability advisories found

Squeak24 (talkcontribs)

OK, I have added:


to the composer.json file. That seems to have resolved the issue.

Considering this extension now comes bundled with MediaWiki core, surely this is a bug that needs to be looked into.

Ciencia Al Poder (talkcontribs)
Reply to "Error on new install"

Any way to monitor the throttling of users and IPs separately with one filter?

Dragoniez (talkcontribs)

Hi, is it possible to make a filter monitor the throttling of users and IPs separately? Let's say we set the rate limit to 1 edit in 60 seconds. If we specify the throttle groups as user, the filter is triggered when the same user attempts to make 2 edits within 60 seconds; if we specify the groups as ip, the same applies to the same IPs. But if the groups are set to user ip, the filter is triggered even when a user makes an edit and an IP makes another 10 seconds later.

What I'm trying to do is detect one user making a second edit within 60 seconds, and one IP making a second edit under the same condition. Is there any way to do this with one filter, or is it necessary to prepare two different filters for users and IPs? Any help would be appreciated.

Reply to "Any way to monitor the throttling of users and IPs separately with one filter?"

Size of records stored in `text`

Incnis Mrsi (talkcontribs)


I do technical work for a wiki which is undergoing a DoS attack of a special kind. Namely, the offender posts huge (more than 2 MiB) edits which are rejected by filters, but the extension stores the texts in the text table. This results in huge quantities of garbage in the database. I looked at the configuration variables, but found nothing about limiting the size (or truncation) of records.

I deem it unlikely that modern versions would permit such a brute-force attack to proceed. Might the loophole have been closed by developers at some point since 2012? Unfortunately, all the wiki software on that site is very old.

Ciencia Al Poder (talkcontribs)

If your MediaWiki version is from before 2012, you probably have a lot more problems than this particular attack. I'm not aware of any fix, though, such as removing the ability to check past actions.

On the server side, however, you can limit PHP's post_max_size or similar, and also client_max_body_size in nginx. Note that this would also limit file uploads to that size.
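For reference, those directives could look like this (a sketch; the 2M cap is an arbitrary example value):

```ini
; php.ini -- cap the request body size (also caps file uploads)
post_max_size = 2M
upload_max_filesize = 2M
```

```nginx
# nginx -- reject request bodies over 2 MiB with HTTP 413
client_max_body_size 2m;
```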

Reply to "Size of records stored in `text`"
Cojoilustrado (talkcontribs)

Jesus, the extension instructions are definitely not written for us common people.

Can someone help me allow youtu.be links on my wiki? Something step by step, please; that would be useful to anyone with a wiki, not just developers.

Sorry if I come across a bit heavy, but for someone like me, basically every single extension page might as well be in Aramaic. It takes weeks to figure some of these things out.

Is it even possible to allow youtu.be? Why would it be blocked but not youtube? If the idea is to block .be, then a rule that allows youtube is needed out of the box. It could be abused, but I think by now we know youtube is a safe site.

Dinoguy1000 (talkcontribs)

Most likely, you can link to youtube.com instead.

Cojoilustrado (talkcontribs)

Thanks, I figured out the whitelist thingy. The problem with youtube URLs is that they are long and messy; the youtu.be ones are nice and short, making them easier to deal with in code, especially if you have a bunch of them. Thanks for the replies.

Dinoguy1000 (talkcontribs)

If you're only linking to videos (and not even to particular timestamps), the only URL parameter you actually need is v (i.e. https://youtube.com/watch?v=[string]). Even if you're linking to other things, or to specific timestamps, etc., if the URLs you're using have many URL parameters, chances are good you can remove most of them and still have a URL that does what you need. If you're any good at all with scripting, it should be pretty easy to write a bit of code that will strip unneeded URL parameters for you; the hard part would just be figuring out which parameters are actually necessary when you're first writing the script (and only "hard" in the sense that you'd have to repeatedly remove parameters and load the resulting URL to check the result).
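As a sketch of such a script (the per-host allowlist here is an assumption; adjust it to the parameters your own links actually need), using only the Python standard library:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters worth keeping, per host. This allowlist is a placeholder
# example -- extend it for the sites and parameters you care about.
KEEP = {
    "www.youtube.com": {"v", "t"},
    "youtube.com": {"v", "t"},
}

def strip_params(url: str) -> str:
    """Remove query parameters that are not on the host's allowlist."""
    parts = urlparse(url)
    allowed = KEEP.get(parts.netloc, set())
    query = [(k, v) for k, v in parse_qsl(parts.query) if k in allowed]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_params("https://www.youtube.com/watch?v=abc123&feature=share&si=XYZ"))
# -> https://www.youtube.com/watch?v=abc123
```

Hosts not in the allowlist get every parameter stripped, which is a conservative default you may want to change.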

Reply to "youtu.be"