Extension talk:SpamBlacklist/From MediaWiki


Conflicting with ConfirmEdit Extension

Hi, it seems this extension is conflicting with the ConfirmEdit extension (link). On creation of a new page (where a captcha is thrown) I get the following error:

Warning: Missing argument 4 for wfspamblacklistvalidate() in /home/www/web125/html/rockinchina/wiki/extensions/SpamBlackList/SpamBlacklist.php on line 67
  • MediaWiki: 1.6.8, PHP: 4.4.6

Cross posting to: Extension_talk:ConfirmEdit -- Matsch 20:06, 23 August 2007 (UTC)

I too am seeing precisely the same error, but we don't have the ConfirmEdit extension installed.
MediaWiki: 1.6.9, PHP: 4.3.10-22 209.198.95.98 18:07, 27 August 2007 (UTC)



In order to correct the problem on my wiki, I made the following edit in the SpamBlacklist.php file:


 # removing the last two arguments to try and fix error problem
 # function wfSpamBlacklistFilter( &$title, $text, $section, &$hookErr, $editSummary ) { 
function wfSpamBlacklistFilter( &$title, $text, $section ) {

That fixed the problem for me. Allan Newsome 71.228.186.123 21:39, 19 March 2011 (UTC)
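
A less invasive variant, sketched against the old five-argument signature shown in the warnings above (the body is the one quoted further down this page; this is not an official fix): give the trailing parameters default values so the function works whether core passes three or five arguments.

 # sketch: defaults make the extra arguments optional instead of removing them
 function wfSpamBlacklistFilter( &$title, $text, $section, &$hookErr = null, $editSummary = '' ) {
 	$spamObj = wfSpamBlacklistObject();
 	$ret = $spamObj->filter( $title, $text, $section, $editSummary );
 	if ( $ret !== false ) EditPage::spamPage( $ret );
 	return ( $ret !== false );
 }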

Chinese Spam

I've found that [1] reduces spam postings from Chinese wikispammers. Are there other ways to minimize Chinese-language spam with Chinese characters, other than trying to add those character strings to the list? And does the blocking actually work on them, or does MediaWiki read them in a way that prevents it from blocking them correctly? --76.214.233.199 14:59, 2 September 2007 (UTC)

Sorry, I had to unlink your link because of the spam protection. -- seth 23:36, 19 April 2008 (UTC)

Line Error Message

So I loaded the files into my extensions directory, and at the top of my wiki it spits out

Warning: Call-time pass-by-reference has been deprecated; If you would like to pass it by reference, modify the declaration of [runtime function name](). If you would like to enable call-time pass-by-reference, you can set allow_call_time_pass_reference to true in your INI file. in /home/.lutece/mariainc/disapedia.com/extensions/SpamBlacklist/SpamBlacklist.php on line 103

Warning: Call-time pass-by-reference has been deprecated; If you would like to pass it by reference, modify the declaration of [runtime function name](). If you would like to enable call-time pass-by-reference, you can set allow_call_time_pass_reference to true in your INI file. in /home/.lutece/mariainc/disapedia.com/extensions/SpamBlacklist/SpamBlacklist.php on line 112

Warning: Call-time pass-by-reference has been deprecated; If you would like to pass it by reference, modify the declaration of [runtime function name](). If you would like to enable call-time pass-by-reference, you can set allow_call_time_pass_reference to true in your INI file. in /home/.lutece/mariainc/disapedia.com/extensions/SpamBlacklist/SpamBlacklist.php on line 112

How did I screw up? 24.5.195.51 23:14, 23 November 2007 (UTC)

I had the same problem running MediaWiki 1.10.1 until I installed using the 1.10 branch. --Tosfos 04:17, 31 December 2007 (UTC)

Losing content

I'm wondering if anybody knows how to customize this extension so that you don't lose all your edits if you unknowingly add a link that is on the spam list. Right now it simply spits you out to the main page. It would be great if this extension simply reloaded the editing page (with all the edits) and put the warning message at the top of the editing page, similar to the ConfirmEdit extension. Is this possible? Edward (March 28, 2008)

more detailed manual and suggestions

Hi!
Is there a more detailed description than Extension:SpamBlacklist#Usage? I have some questions/suggestions/feature requests:

  1. Where does the regexp try to match? AFAICS (I googled for "function getExternalLinks") a preparsing step is done which greps all external URLs and separates them by "\n".
    Because the s-modifier is not used, it wouldn't be necessary to avoid patterns like ".*" as the manual says; ".*" would only match until EOL. (See the sketch after this list.)
  2. Does (?:foo|bar) vs. (foo|bar) affect speed when the S-modifier is set? (Without the S-modifier I guess that a non-capturing pattern would be faster.)
  3. Why does the header of those blacklists say "Every non-blank line is a regex fragment which will only match hosts inside URLs", while that isn't true? It does not only match hosts; it matches the path, too.
  4. Actually the code !http://[a-z0-9\-.]*(line 1|line 2|...)!Si in the manual is not right, because "!" is not the delimiter character, so escaping like \! wouldn't be necessary to match a "!".
  5. Wouldn't it be better and faster to use just preg_replace('|\\\*/|', '\/', $build) instead of str_replace( '/', '\/', preg_replace('|\\\*/|', '/', $build) )?
  6. As there exists a log file for all changes on the spam blacklist, like meta:Spam_blacklist/Log, it would be a nice feature if MediaWiki:Spamprotectiontext would mention the reason noted in that log file. Is that possible?
    See bugzilla:4459 Mike.lifeguard 00:42, 21 May 2008 (UTC)
    Actually this is fixed now, as I wrote there (bugzilla:4459). -- seth 19:13, 19 February 2009 (UTC)
    That is not fixed.  — Mike.lifeguard | @meta 00:09, 20 February 2009 (UTC)
  7. The blacklist users should have the opportunity to choose whether a regexp should be blocked on articles only or on articles _and_ their talk pages. This could be done, for example, by a special parameter or a separate blacklist.
  8. A new entry in a blacklist should not cause a spam-protection intervention on existing links, but only when someone tries to add a new (forbidden) link to a page. (I guess this could be solved technically by simply counting the number of occurrences of forbidden URLs before and after the edit of a page: if diff != 0, then block.)
    Done by bugzilla:1505 in rev:34769 Mike.lifeguard 00:42, 21 May 2008 (UTC)
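
Here is the sketch referred to in point 1, assembled from the behaviour described there rather than copied from the extension: the blacklist fragments are joined into one alternation and tested against the newline-separated list of external links, which is why, without the s-modifier, a ".*" can never run past the end of one URL into the next.

 # $fragments stands for the non-blank lines of the blacklist (assumed input)
 $fragments = array( 'badspammer\.com', 'casino-[a-z]+\.example' );
 $links = implode( "\n", array(
 	'http://example.org/fine',
 	'http://www.badspammer.com/page',
 ) );
 $regex = '/https?:\/\/[a-z0-9\-.]*(' . implode( '|', $fragments ) . ')/Si';
 if ( preg_match( $regex, $links, $match ) ) {
 	echo 'blocked on: ' . $match[0]; # "." does not match "\n" here, so the URLs stay separate
 }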

Especially the realization of the last two feature requests would be very helpful.
-- seth 22:48, 17 April 2008 (UTC), 23:36, 19 April 2008 (UTC), 22:44, 1 May 2008 (UTC), 23:41, 12 July 2008 (UTC)

See also: bugzilla:14089, bugzilla:4459, bugzilla:14091, bugzilla:14092. -- seth 09:40, 12 May 2008 (UTC)

However, I just updated the manual, so some of my requests are completed. -- seth 20:34, 12 May 2008 (UTC)

See SpamBlacklist requests on Bugzilla.  — Mike.lifeguard | @meta 00:12, 20 February 2009 (UTC)

Warnings for wfSpamBlacklistFilter()

I'm getting the following warning messages when trying to save a spam address in an article:

Warning: Missing argument 4 for wfSpamBlacklistFilter(), called in /home/zzzzzzz4/public_html/wiki/includes/EditPage.php on line 651 and defined in /home/zzzzzzz4/public_html/wiki/extensions/SpamBlacklist/SpamBlacklist.php on line 75

Warning: Missing argument 5 for wfSpamBlacklistFilter(), called in /home/zzzzzzz4/public_html/wiki/includes/EditPage.php on line 651 and defined in /home/zzzzzzz4/public_html/wiki/extensions/SpamBlacklist/SpamBlacklist.php on line 75

  • MediaWiki: 1.11.0, PHP: 5.2.6

I have no idea what the problem is. Please help! --Nathanael Bar-Aur L. 00:26, 24 August 2008 (UTC)

I don't get why the hook function for $wgFilterCallback, wfSpamBlacklistFilter(), declares 5 parameters when the manual clearly states that the function is supposed to get only 3. Am I missing something? Apparently this June 19 edit is responsible: [2]. If I undo these changes the warning messages go away, but I don't know whether it will mess up something else. Please, someone check it out. --Nathanael Bar-Aur L. 02:13, 24 August 2008 (UTC)
I have this same problem! --74.67.44.255 02:32, 17 September 2008 (UTC)
Same problem here, with just a barebones 1.6 MediaWiki installation and this as the only additional extension. What gives? :-(

I'd recommend updating. I'm using r45914 with no issues.  — Mike.lifeguard | @meta 21:25, 27 January 2009 (UTC)

www required

I downloaded SpamBlacklist extension today.

I used the domain badspammer\.com for a test case:

The spam filter works for http://www.badspammer.com/ but does not work for http://badspammer.com/

Is there a patch to fix this?

--jwalling 06:31, 2 September 2008 (UTC)

Update 1:

The problem cleared itself without my intervention. Go figure!
--jwalling 07:24, 2 September 2008 (UTC)

Update 2:

The block of http://badspammer.com/ is erratic
Block works here: http://www.gustavwiki.com/wiki/Talk:Spam_Patrol
But not here: http://www.gustavwiki.com/wiki/Spam_Patrol#SpamBlacklist_Extension
--jwalling 07:43, 2 September 2008 (UTC)
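
For what it's worth, the merged-regex shape described in the manual matches both forms, with or without www; a quick stand-alone check (assuming that regex shape, run outside MediaWiki):

 $regex = '/https?:\/\/[a-z0-9\-.]*(badspammer\.com)/Si';
 var_dump( preg_match( $regex, 'http://www.badspammer.com/' ) ); # int(1)
 var_dump( preg_match( $regex, 'http://badspammer.com/' ) );     # int(1)

So the erratic blocking is more likely caused by something else, e.g. caching of the compiled blacklist.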

Warnings for wfSpamBlacklistFilterMerged()

PHP Warning: Missing argument 4 for wfSpamBlacklistFilterMerged() in PATH/SpamBlacklist/SpamBlacklist.php on line 84
PHP Notice: Undefined variable: editSummary in PATH/SpamBlacklist/SpamBlacklist.php on line 94

No spam protection takes effect. This occurs regardless of whether the source list is stored locally or online. System: MediaWiki 1.12.0, PHP 5.2.0-8+etch13 (cgi-fcgi). --Martin

"Hook wfSpamBlacklistFilterMerged failed to return a value; should return true to continue hook processing or false to abort."

I've installed this with the default settings. Why am I getting this?

Conflict with EditPage.php

Hi, perhaps someone can help me? I get the following error message:

Missing Warning: Missing argument 4 for wfSpamBlacklistFilter(), called in /includes/EditPage.php on line 651 and defined in /extensions/SpamBlacklist/SpamBlacklist.php on line 74

Missing Warning: Missing argument 4 for wfSpamBlacklistFilter(), called in /includes/EditPage.php on line 651 and defined in /extensions/SpamBlacklist/SpamBlacklist.php on line 74


EditPage.php, line 651, contains the following code:

 }
 if ( $wgFilterCallback && $wgFilterCallback( $this->mTitle, $this->textbox1, $this->section ) ) {
 	# Error messages or other handling should be performed by the filter function
 	wfProfileOut( $fname );
 	wfProfileOut( "$fname-checks" );
 	return false;

SpamBlacklist.php, line 74, contains the following code:

 /**
  * Hook function for $wgFilterCallback
  */
 function wfSpamBlacklistFilter( &$title, $text, $section, &$hookErr, $editSummary ) {
 	$spamObj = wfSpamBlacklistObject();
 	$ret = $spamObj->filter( $title, $text, $section, $editSummary );
 	if ( $ret !== false ) EditPage::spamPage( $ret );
 	return ( $ret !== false );

The lines mentioned in the error message are the $wgFilterCallback call in EditPage.php and the wfSpamBlacklistFilter() declaration in SpamBlacklist.php. Since I'm not specialized in PHP programming, I have no idea what has to be changed in the code so that it works correctly. The spam filter function itself works, but I want to get rid of the error messages and the server path information they disclose.

Thanks, Hajosch —The preceding unsigned comment was added by 85.179.136.31 (talkcontribs) 10:06, 18 November 2008 (UTC). Please sign your posts with ~~~~!

I am getting similar problems: when I edit a page and do not provide a summary for the change, the edit page comes back warning me about this, with the error mentioned below at the top of the page. If I do provide a summary, the change is committed and I see no error. I downloaded the latest revision but I am still seeing this error:
Missing argument 4 for wfSpamBlacklistFilterMerged() in <<removed>>/public_html/wiki/extensions/SpamBlacklist/SpamBlacklist.php on line 83
SkyLined - 74.125.121.49 11:15, 6 March 2009 (UTC)

Not working for me also

I have also tried all the options; can anyone help me out... URGENT... Suyash Jin


White List of Regular Expressions

The RTEMS Project has a MediaWiki installation (http://wiki.rtems.org) and uses this extension with no real problems. We have a user whose last name is "Sporner", which matches *porn*. He created an account with an umlaut for the "o" to get by the "porn" filter. But now, any time someone attempts to reference him by his name without the umlaut, that edit is rejected. I see a whitelist for URLs. Is there a similar capability for a whitelist regex so "Sporner" will be OK?

Alternatively, can someone suggest a modification to the regex which would let us use his name in an edit? :)

Thanks.

--JoelSherrill 14:58, 11 June 2009 (UTC)
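
One regex-level option, if the entry really is a bare porn fragment (a sketch; adjust it to however your list entry is actually written): a PCRE negative lookbehind excludes the name while still catching the bare word.

 # "(?<!s)porn" matches "porn" except when preceded by "s" (case-insensitive
 # with the /i flag the lists use), so "Sporner" no longer triggers it
 $regex = '/(?<!s)porn/i';
 var_dump( preg_match( $regex, 'Sporner' ) );   # int(0)
 var_dump( preg_match( $regex, 'porn-site' ) ); # int(1)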

This blocks external links and nothing else, so I don't really know what you're talking about.  — Mike.lifeguard | @meta 23:32, 28 June 2009 (UTC)

MediaWiki:Spam-blacklist or MediaWiki:Spamblacklist?

Is that hyphen correct? At least on Wikitravel (MW 1.11.2), the form MediaWiki:Spamblacklist is necessary. Jpatokal 08:16, 22 June 2009 (UTC)

The hyphen is correct. Wikitravel doesn't even have this extension installed, so I'm not sure what you're talking about. —Emufarmers(T|C) 02:08, 23 June 2009 (UTC)

Which version does this actually work with?

"The extension might work with MediaWiki version 1.6.0 or greater." is unhelpfully vague. I was getting the above "Missing argument 4 for wfSpamBlacklistFilterMerged()" error until I upgraded my MediaWiki install to 1.15.0 - I didn't note down the version number of my previous install, but it was certainly above 1.6.0 (I think it was around 1.8.0, I hadn't updated that particular wiki for a while). --Kevan 18:12, 6 July 2009 (UTC)Reply

Whitelist admin actions

I'm an admin at Wikimedia Commons, and we sometimes have the case that images are sourced from URLs which are blacklisted. This makes it hard or impossible to add a proper source link, and we have to resort to putting spaces in the URL, etc. Thus my request: could the blacklist maybe be overridden by admins? I don't think we have to fear admins mass-spamming their project ;) --Diebuche 11:02, 2 July 2010 (UTC)

Problem with gallery

So after running the cleanup.php script, all the galleries in the cleaned pages no longer work. When I edit any of the pages, make no change at all, and then save, the problem is solved for that page. When cleaning a large number of pages this is very bad - still better than having to do it manually, but not by much...

It seems that there is some missing operation or parameter when calling updateArticle().

The problem is that after the cleanup.php script is run, the photos in the gallery of the cleaned pages point to:

  • URL/index.php/File:Filename

instead of

  • URL/index.php?title=File:Filename

IOW, the photos in the gallery lose the "?title=" part.
If I open the page for editing and save, no change shows in the page history for my save, but the problem is fixed.
קיפודנחש 18:01, 10 October 2010 (UTC)


Found Out Why It Sometimes Doesn't Work

I was having the same problem others were (it just wouldn't load). I found that the problem was the name of the blacklist extension directory: I renamed it and found out that I had put a space at the end of the name, and with the space removed, it works like a charm. If you get an error with an extension such as "No such file or directory in .../LocalSettings.php", empty spaces appear to be saved as part of a folder/file name. Make sure there aren't any extra spaces at the end of your extension's folder name.

Expected Memory Use

I found one day that I had to raise memory_limit in php.ini to 64M so the spam blacklist would not exhaust memory - this is using the default black and white lists. What is the expected memory footprint?

Errors running cleanup.php

When trying to run cleanup.php I constantly get these errors, and haven't yet been able to resolve them:

PHP Notice:  Undefined variable: wgSpamBlacklistSettings in /var/www/<site>/extensions/SpamBlacklist/cleanup.php on line 77
PHP Warning:  Invalid argument supplied for foreach() in /var/www/<site>/SpamBlacklist_body.php on line 17
PHP Fatal error:  Call to undefined method SpamBlacklist::getBlacklists() in /var/www/<site>/extensions/SpamBlacklist/cleanup.php on line 81

Does anyone have advice on how to get past this? --Automagic 15:48, 12 May 2011 (UTC)

What is cleanup.php anyway? [3] Errectstapler 13:55, 9 July 2011 (UTC)

preg_match() Compilation failed

I am also getting errors while running cleanup, however, they are slightly different:

Warning:  preg_match() [<a href='function.preg-match'>function.preg-match</a>]: Compilation failed: missing ) at offset 1026 in /home2/server/wiki/extensions/SpamBlacklist/cleanup.php on line 106
Warning:  preg_match() [<a href='function.preg-match'>function.preg-match</a>]: Compilation failed: unmatched parentheses at offset 907 in /home2/server/wiki/extensions/SpamBlacklist/cleanup.php on line 106

If I do an array_splice of 1000 on the $regexes array in the cleanup.php script, it runs without issue. If I make it 2000 or more, it comes up with the errors listed above. My server is running PHP 5.2.X

Any ideas if this will be addressed? Zacharyz 22:36, 11 September 2011 (UTC)
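
That symptom is consistent with PCRE's limits on the size of a single compiled pattern: a regex built from too many lines at once simply fails to compile. A sketch of the chunking workaround described above (hypothetical code, not a patch to cleanup.php):

 # $regexes: the blacklist lines; $text: the text being checked (both assumed)
 foreach ( array_chunk( $regexes, 1000 ) as $batch ) {
 	$regex = '/https?:\/\/[a-z0-9\-.]*(' . implode( '|', $batch ) . ')/Si';
 	if ( preg_match( $regex, $text, $match ) ) {
 		echo "matched: " . $match[0] . "\n";
 	}
 }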

Only works on domain names?

So I've tried to edit my local MediaWiki:Spam-blacklist page. It seems it only blocks things if they're in the domain name, not anything that follows, despite the examples on the main page of this talk file. For example, www.badspammer.com is blocked, but www.example.com/badspammer.html is not. Any way to fix this, or is the extension just not as useful as I thought?
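
Assuming the merged-regex shape shown in the manual, that behaviour follows from the prefix [a-z0-9\-.]* not matching "/": a bare word can only land inside the host name. Writing the host and slash into the entry reaches into the path; a quick check outside MediaWiki:

 # the entry "badspammer" alone cannot match past the "/" ...
 $host = '/https?:\/\/[a-z0-9\-.]*(badspammer)/Si';
 var_dump( preg_match( $host, 'http://www.example.com/badspammer.html' ) ); # int(0)
 # ... but "example\.com/badspammer" can
 $path = '/https?:\/\/[a-z0-9\-.]*(example\.com\/badspammer)/Si';
 var_dump( preg_match( $path, 'http://www.example.com/badspammer.html' ) ); # int(1)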

Hook wfSpamBlacklistFilterMerged failed to return a value

Detected bug in an extension! Hook wfSpamBlacklistFilterMerged failed to return a value; should return true to continue hook processing or false to abort.

Backtrace:

#0 /customers/deadhorseinterchange.com/deadhorseinterchange.com/httpd.www/wiki/includes/EditPage.php(956): wfRunHooks('EditFilterMerge...', Array)
#1 /customers/deadhorseinterchange.com/deadhorseinterchange.com/httpd.www/wiki/includes/EditPage.php(2483): EditPage->internalAttemptSave(false, false)
#2 /customers/deadhorseinterchange.com/deadhorseinterchange.com/httpd.www/wiki/includes/EditPage.php(449): EditPage->attemptSave()
#3 /customers/deadhorseinterchange.com/deadhorseinterchange.com/httpd.www/wiki/includes/EditPage.php(340): EditPage->edit()
#4 /customers/deadhorseinterchange.com/deadhorseinterchange.com/httpd.www/wiki/includes/Wiki.php(510): EditPage->submit()
#5 /customers/deadhorseinterchange.com/deadhorseinterchange.com/httpd.www/wiki/includes/Wiki.php(63): MediaWiki->performAction(Object(OutputPage), Object(Article), Object(Title), Object(User), Object(WebRequest))
#6 /customers/deadhorseinterchange.com/deadhorseinterchange.com/httpd.www/wiki/index.php(116): MediaWiki->initialize(Object(Title), Object(Article), Object(OutputPage), Object(User), Object(WebRequest))
#7 {main}

What the hell does this mean, and how do I fix it? I'm sick of all these problems: SpamBlacklist does this, ReCaptcha keeps asking for keys even though it has them, and SimpleAntiSpam simply doesn't work.

What version of the extension are you using? —Emufarmers(T|C) 08:37, 28 May 2011 (UTC)
Version 1.15.X, corresponding with the version of my wiki. 86.18.165.200 15:39, 28 May 2011 (UTC)
I have the very same issue. I'm using version MW1.15-r48184. 70.53.2.75 16:19, 4 June 2011 (UTC)
Update: The most recent version (trunk-r88992) clears up this issue and works just fine with MediaWiki 1.15.x. 70.53.2.75 13:51, 5 June 2011 (UTC)
Thanks! This worked... Do you recommend updating to the newest version of MW for this to work better? Ciao

How Do I Make a List?

I plan on setting one up on a wiki I'm an admin on (part of Wikia), but I'm terrible at putting the list together. Can someone copy/paste the full text of the format with lines that say "Word 1," "Word 2," etc? Thanks in advance.

---****--- Roads 15:35, 1 August 2011 (UTC)
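
For illustration only (the words are placeholders): each non-blank line of MediaWiki:Spam-blacklist is treated as a regex fragment, and lines starting with # are comments, so a minimal list looks like this:

 # anything after a hash is a comment
 word1
 word2
 badspammer\.com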

Notice: Use of undefined constant PROTO_HTTP - assumed 'PROTO_HTTP' in extensions/SpamBlacklist/SpamBlacklist_body.php on line 38

I'm seeing this message when saving a page.

<br /> 
<b>Notice</b>:  Use of undefined constant PROTO_HTTP - assumed 'PROTO_HTTP' in <b>/home/httpd/mediawiki/extensions/SpamBlacklist/SpamBlacklist_body.php</b> on line <b>38</b><br /> 

Line 38 looks like this:

                $thisHttp = wfExpandUrl( $title->getFullUrl( 'action=raw' ), PROTO_HTTP );

This is PHP 5.2.10 (cli) (built: Jun 14 2011 19:45:53), MW 1.16 branch (Revision: 96874), SBL trunk (Revision: 96876).

Cheers, --Dmb 08:19, 15 September 2011 (UTC)

Yes, I'm getting this too, and it has taken the website down. A fix would be very welcome.
Line 38 of extensions/SpamBlacklist/SpamBlacklist_body.php
PHP 5.1.6, MediaWiki and SpamBlacklist revision 105765
Pgr94 17:03, 10 December 2011 (UTC)

In includes/GlobalFunctions.php, wfExpandUrl() appears to take only one argument, yet the call with PROTO_HTTP supplies two.

/**
 * Expand a potentially local URL to a fully-qualified URL.
 * Assumes $wgServer is correct. :)
 * @param string $url, either fully-qualified or a local path + query
 * @return string Fully-qualified URL
 */
function wfExpandUrl( $url ) {
        if( substr( $url, 0, 1 ) == '/' ) {
                global $wgServer;
                return $wgServer . $url;
        } else {
                return $url;
        }
}

Pgr94 17:13, 10 December 2011 (UTC)

The trunk of includes/GlobalFunctions.php has wfExpandUrl() defined to take two arguments. Conclusion: an update to SpamBlacklist assumes a more recent version of includes/GlobalFunctions.php and thus breaks running wikis. Pgr94 17:25, 10 December 2011 (UTC)

Broken in this edit:
r95663 | catrope | 2011-08-29 09:37:47 -0500 (Mon, 29 Aug 2011) | 8 lines
Pgr94 17:32, 10 December 2011 (UTC)
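
Until core and extension match up again, one possible stop-gap (a sketch, not an official fix) is to define the constant yourself before the extension uses it, e.g. near the top of LocalSettings.php; recent cores define PROTO_HTTP as 'http://':

 if ( !defined( 'PROTO_HTTP' ) ) {
 	define( 'PROTO_HTTP', 'http://' );
 }

The old one-argument wfExpandUrl() quoted above simply ignores the extra argument, so this silences the notice without changing behaviour.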

HTTP sources not working?

HTTP URLs (including m:Spam_blacklist) are not working for me. They have no effect (but no errors are reported.)

Local DB and file sources work fine.

Is there a known bug with the HTTP fetch function, or a clue I could look for in the logs or cache?

--Finnw (talk) 09:29, 4 September 2012 (UTC)

Same here.

After Upgrade to 1.19.2 will not allow editing of Main Page

I've been using this extension for years, and I upgraded to 1.19.2 tonight. Upon doing so, I found that I can edit all of the pages on my site with the exception of the Main Page.

When I attempt to do so, the result is:

Spam protection filter The text you wanted to save was blocked by the spam filter. This is probably caused by a link to a blacklisted external site.

The following text is what triggered our spam filter: spam

There are no URLs in my edit, I am the admin of the wiki, and I cannot understand what's going on. Any thoughts would be welcome! TeraS (talk) 04:10, 7 October 2012 (UTC)

PHP 5.0.2+ compatibility

I'm running PHP 5.0.x on my server, and I found that a single change makes SpamBlacklist work: add

   if ( !defined( '__DIR__' ) ) define( '__DIR__', dirname( __FILE__ ) );

in SpamBlacklist.php at line 9 (or, basically, at the very start of the file). In the meantime, I've added a note mentioning that PHP 5.3 is required, since __DIR__ appears to have been introduced in 5.3. --Romanski (talk) 16:10, 18 October 2012 (UTC)

Licensing

"SpamBlacklist was written by Tim Starling and is (deliberately) ambiguously licensed." So in other words, it's copyrighted. Leucosticte (talk) 23:13, 17 November 2012 (UTC)Reply

RBL code

Something I thought should be noted about the external blacklist servers: the code doesn't filter matches through the whitelist, so if one of the servers has a false positive, there's no easy way to override it. Has anyone looked at adding whitelisting logic to it? --Abates (talk) 08:15, 13 February 2013 (UTC)

Shared Whitelist

I have a wiki farm and wish to share a whitelist between them. This can be done either through a file that all wikis can access, or via a URL, similar to http://en.wikipedia.org/w/index.php?title=MediaWiki:Spam-blacklist&action=raw&sb_ver=1.

I am aware that $wgSpamBlacklistFiles can be used to share blacklists, but I wasn't able to find the equivalent for whitelists.

Is this possible?

--Mastergalen (talk) 15:49, 4 October 2013 (UTC)

Compatibility issues with Extension:LabeledSectionTransclusion

It looks like SBL has issues looking over the parser output when the LabeledSectionTransclusion extension is being used. Disabling either extension bypasses the issue temporarily.

Invalid marker: ?UNIQ3f2bd00b392e0653-section-00000006-QINU?

Backtrace:

#0 includes/parser/StripState.php(66): StripState->addItem('general', '?UNIQ3f2bd00b39...', '<section begin=...')
#1 includes/parser/Parser.php(3959): StripState->addGeneral('?UNIQ3f2bd00b39...', '<section begin=...')
#2 includes/parser/Preprocessor_DOM.php(1144): Parser->extensionSubstitution(Array, Object(PPFrame_DOM))
#3 includes/parser/Parser.php(4357): PPFrame_DOM->expand(Object(PPNode_DOM), 27)
#4 includes/parser/Parser.php(1180): Parser->formatHeadings('<h1>?UNIQ3f2bd0...', '={{{model}}} Ov...', true)
#5 includes/parser/Parser.php(383): Parser->internalParse('={{{model}}} Ov...')
#6 [internal function]: Parser->parse('={{{model}}} Ov...', Object(Title), Object(ParserOptions), true, true, NULL)
#7 includes/StubObject.php(79): call_user_func_array(Array, Array)
#8 includes/StubObject.php(99): StubObject->_call('parse', Array)
#9 includes/content/WikitextContent.php(299): StubObject->__call('parse', Array)
#10 includes/content/WikitextContent.php(299): StubObject->parse('={{{model}}} Ov...', Object(Title), Object(ParserOptions), true, true, NULL)
#11 extensions/SpamBlacklist/SpamBlacklistHooks.php(28): WikitextContent->getParserOutput(Object(Title))
#12 [internal function]: SpamBlacklistHooks::filterMergedContent(Object(RequestContext), Object(WikitextContent), Object(Status), '', Object(User), false)
#13 includes/Hooks.php(255): call_user_func_array('SpamBlacklistHo...', Array)
#14 includes/GlobalFunctions.php(3883): Hooks::run('EditFilterMerge...', Array)
#15 includes/EditPage.php(1305): wfRunHooks('EditFilterMerge...', Array)
#16 includes/EditPage.php(1612): EditPage->runPostMergeFilters(Object(WikitextContent), Object(Status), Object(User))
#17 includes/EditPage.php(1187): EditPage->internalAttemptSave(false, false)
#18 includes/EditPage.php(416): EditPage->attemptSave()
#19 includes/actions/EditAction.php(59): EditPage->edit()
#20 includes/actions/EditAction.php(86): EditAction->show()
#21 includes/Wiki.php(439): SubmitAction->show()
#22 includes/Wiki.php(305): MediaWiki->performAction(Object(Article), Object(Title))
#23 includes/Wiki.php(565): MediaWiki->performRequest()
#24 includes/Wiki.php(458): MediaWiki->main()
#25 index.php(59): MediaWiki->run()
#26 {main}

block statistics

Is there any way for me to get block statistics on how often someone gets blocked by the SpamBlacklist on the wiki I maintain? I.e., is there any way to get something to tell me "It's been 3 hours since the last time someone was blocked by the SpamBlacklist", or "Yesterday 17 edits were blocked by the SpamBlacklist"? (Ideally it would tell me how many times in the last month each item on my local blacklist had any effect, so I could prune the ones that are no longer used). --DavidCary (talk) 03:19, 9 November 2014 (UTC)

You can enable the SpamBlacklist DebugLogGroup. I see the extension has plenty of logging, so it should be rather easy to add more details in log lines if you need to: maybe try hacking the logging code locally and then submit a patch when you reach a level of detail you find balanced? --Nemo 08:08, 9 November 2014 (UTC)
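
A sketch of what enabling that looks like in LocalSettings.php, assuming the extension logs under the 'SpamBlacklist' group as described (the log path is an example):

 $wgDebugLogGroups['SpamBlacklist'] = '/var/log/mediawiki/spam-blacklist.log';

Counting hits per blacklist entry would then be a matter of grepping that file.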

Request, user whitelist or function

I would like to request a function which can exclude selected users or specific groups, such as sysop, from being checked by the SpamBlacklist extension. This would be very useful when you have an editor (or a group of editors) who is trusted. It can take a long time to check the lists, and a lot of waiting time is spent while editing (submitting). Personally, I have tested the issue by taking out SpamBlacklist, and the wiki speeds up tremendously. I thought that some trusted users should have some advantages. The function could be a list of users in an array, or a group of users specified in the same way by just using group names such as sysop or the like. Thanks. Peter Lee (talk) 11:22, 24 December 2014 (UTC)

MediaWiki 1.25.1

I am getting the following error on MediaWiki 1.25.1 with SpamBlacklist and SemanticForms:

 Invalid or virtual namespace -1 given.

What do I have to do? Jaider msg 23:50, 30 May 2015 (UTC)

Could regex batch be optimized?

I see in SpamRegexBatch.php:

$regex = 'https?://+[a-z0-9_\-.]*(' . implode( '|', $lines ) . ')';

I am concerned by the [a-z0-9_\-.]* part. Don't we have a lot of backtracking because of this?

It may be more efficient to do a first-pass match on just implode( '|', $lines ) to find possible matches, then filter these candidates by running the stricter, full regex on them only.

Od1n (talk) 00:50, 13 April 2018 (UTC)
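
A sketch of that two-pass idea (hypothetical, not the extension's code; assumes the fragments in $lines are already delimiter-escaped, as the batch builder does):

 $loose  = '/(' . implode( '|', $lines ) . ')/Si';
 $strict = '/https?:\/\/+[a-z0-9_\-.]*(' . implode( '|', $lines ) . ')/Si';
 # cheap scan first; pay for the host-prefix matching only on a likely hit
 if ( preg_match( $loose, $links ) && preg_match( $strict, $links, $m ) ) {
 	echo 'blacklisted URL: ' . $m[0];
 }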

On the other hand, we have an anchor with the starting part https?://, and removing it would negatively impact performance. It had slipped my mind that the regex is tested on the \n-imploded list of URLs, not on each URL individually. Still, some ways to optimize this extension should probably exist ;) Od1n (talk) 00:06, 15 April 2018 (UTC)

Unicode domain

This extension does not support Unicode domains, which are offered by many registrars and need to be detected with a Unicode-aware regex. I tried blacklisting some Unicode domains, but they can still be saved. The only thing you can do with this extension is convert the Unicode domains to Punycode. (But no one promotes their web site with Punycode, though.) Otherwise, you have to use the more flexible Extension:AbuseFilter. --Octahedron80 (talk) 04:21, 13 July 2018 (UTC)
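
For reference, a sketch of the Punycode conversion mentioned above, using PHP's intl extension:

 echo idn_to_ascii( 'bücher.example' ); # prints "xn--bcher-kva.example"

The xn-- form can then be blacklisted as an ordinary ASCII entry, e.g. xn--bcher-kva\.example.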
