Out of curiosity, do custom CSS and JavaScript run while an account is blocked?
If so, it could cause a user to be unable to turn off their custom code if they want to, since they are unable to edit it.
Never mind, I will try it out myself later. I just asked in the hope that someone who has already tried it could respond with a yes or no.
Out of curiosity, do custom CSS and JavaScript run while an account is blocked?
If so, it could cause a user to be unable to turn off their custom code if they want to, since they are unable to edit it.
Basically, the code can be accessed and executed by any www user around the world.
However, the code is not “running” if an account is not permitted to log in or to edit, since the automatic execution is bound to login.
Turning it off is not necessary as long as nobody starts the code, and starting it needs some kind of explicit transclusion if it is not executed by default after login for this account.
Thanks for the response.
Basically, the code can be accessed and executed by any www user around the world.
An on-site block obviously cannot affect CSS and JS code that is run using other tools such as a browser extension. I was referring to the custom CSS and JS pages that users can have in their user space (e.g. User:Example/common.js), which allow users to enter custom code that runs upon each page load.
However, the code is not “running” if an account is not permitted to log in or to edit, since the automatic execution is bound to login.
An ordinary block does not affect a user's ability to log in; it simply disables editing. I presume it also disables editing of the custom CSS and JS pages that the user has in their user space.
But does a block prevent the existing code in user space from being run?
The code is still there, and if a blocked person is still permitted to log in, they can personally execute any JS/CSS they want. There are no means to prevent this, and the code may be copied and executed via en:Greasemonkey or the browser's own JS/config. However, it does no harm to the world around it. Who cares?
If there are dangerous things implemented which might e.g. endanger privacy of other people invited to run this code, or manipulate their edits, every sysop is permitted to delete such code immediately without further JS/CSS permissions.
The code is still there, and if a blocked person is still permitted to log in, they can personally execute any JS/CSS they want.
Sorry, apparently I didn't word my question well enough. When "User:Example" is logged in, will MediaWiki automatically run "User:Example/common.js" in the browser of "User:Example" while the account "User:Example" is blocked?
I could test it myself, but I have no access to a MediaWiki server at my current location and didn't want to bother setting one up just to test this.
My apologies if I failed to see some notes about conditional script inclusion, but in a case where a browser requires a pre-IE 9 script (an additional JS script added only for that particular situation), how is one to work around this problem with ResourceLoader?
For example:
<!--[if lt IE 9]><script language="javascript" type="text/javascript" src="excanvas.js"></script><![endif]-->
I can register a separate module, but it should only be loaded when the necessary condition (pre-IE 9) is met.
$wgResourceModules['ext.srf.excanvas'] ...
Which method in the RL/JS could be used to load this module so that the condition [if lt IE 9] is met?
This is currently not supported by ResourceLoader. However, MediaWiki in general (the OutputPage class that builds the page output) does support it. You may use one of the following two methods:
OutputPage::addScript
in combination with Html::inlineScript
. For example:
$excanvasLoad = Html::inlineScript(
	ResourceLoader::makeLoaderConditionalScript(
		Xml::encodeJsCall( 'mw.loader.load', array( 'ext.srf.excanvas' ) )
	)
);
$out->addScript( '<!--[if lt IE 9]>' . $excanvasLoad . '<![endif]-->' );
OutputPage::addStyle( url, media, condition )
where url points to a file directly. For example:
$out->addStyle( 'modules/IE70Fixes.css', 'screen', 'IE 7' );
Alternatively you could use mw.loader.using( 'ext.srf.excanvas', function callback() { .. } );
from within your main module code, to load it on demand when you need it and the browser is IE. Like this:
-- modules/ext.James.display.js
var james = {
	init: function () {
		$( document ).ready( function () {
			/* do canvas stuff */
		} );
	}
};
var p = $.client.profile();
if ( p.name === 'msie' && p.versionNumber < 9 ) {
	mw.loader.using( 'ext.srf.fallback', james.init );
} else {
	james.init();
}
Thankfully the ini_set("zlib.output_compression","on"); works, so I can move on to more optimization. Actually I'm stuck on caching (keeping the output of RL in the browser).
Since RL delivers over load.php, it doesn't use the ExpiresByType declaration in .htaccess. According to https://www.mediawiki.org/wiki/Topic:Runnesgxgk2nc68f, $wgResourceLoaderMaxage should be set, but it doesn't make any difference in Google Insights (https://developers.google.com/speed/pagespeed/insights/) or https://webpagetest.org/ . Is there any way to set up leverage browser caching? --Gunnar.offel (talk) 15:08, 7 November 2019 (UTC)
Solved ... setting the PHP type in ExpiresByType works. The JS was not identified as JS since it is loaded over PHP. --Gunnar.offel (talk) 20:29, 7 November 2019 (UTC)
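For readers with the same problem, the fix described above might look like the following in an Apache config. This is a sketch only: the exact PHP MIME type string and the cache lifetime depend on how PHP is hooked into the server.

```apache
# Sketch: give the PHP-served output of load.php an Expires header too,
# since it is not matched by the text/javascript rule.
# "application/x-httpd-php" is an assumption about the server setup.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType application/x-httpd-php "access plus 1 week"
</IfModule>
```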
I have a script which makes certain DHTML modifications, but I don't know how to have it called again after a refresh—such as pushing [↻ newest changes]—in the new Special:Watchlist or RecentChanges.
“$( document ).ready( function () … )” doesn’t help. Also, I found no hooks related to Watchlist in https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.hook
I think you can attach your callback to the onclick event of the button in question (I can't say more right now, I'm on mobile... apologies).
Pushing [↻ newest changes] isn't the only way such pages are updated without reloading. Of course, I thought about redefining events manually, but that is crude and error-prone.
I'd also be interested in this. Mainly because I want to get this code which auto-expands grouped-entries, to be triggerable from a new button (which would be useful after clicking the [↻ newest changes] button). (See also phab:T176555 where I filed the above as a suggested feature)
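Since the thread never shows working code, here is a sketch of one way to re-run DOM tweaks after dynamic updates, assuming a MediaWiki version that fires the documented mw.hook('wikipage.content') hook whenever content is re-rendered. The selector and class name are made up for illustration.

```javascript
// Sketch (browser-only): re-run DOM modifications whenever MediaWiki
// re-renders page content, e.g. after the live update on the new
// Special:Watchlist / RecentChanges. Selector and class are hypothetical.
function onContentUpdated( $content ) {
	// $content is a jQuery object wrapping the updated content area
	$content.find( 'table.mw-changeslist' ).addClass( 'my-dhtml-tweak' );
}

// Guard so the snippet is inert outside a MediaWiki page
if ( typeof mw !== 'undefined' && mw.hook ) {
	mw.hook( 'wikipage.content' ).add( onContentUpdated );
}
```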
I'm happy with your exercise and I'm really looking forward to seeing the delivery. If it is not too late, I have one suggestion:
.load('[[User:Foo/helpful.js]]')
generates entries on "What links here", like the comment
// [[User:Foo/helpful.js]]
does, in order to get knowledge about other users.
This post was posted by Krinkle, but signed as PerfektesChaos.
Hi PerfektesChaos,
Thanks for taking the time to write up this feedback. It's most welcome!
To get right to the subject: right now ResourceLoader is not and cannot be used to load user scripts. User scripts are a (powerful) invention by the community that has shown great results and makes sharing really easy; however, they are currently not natively supported by the MediaWiki software. Scripts are stored on (technically article-like) normal wiki pages, and the method to load the raw wikitext (action=raw) is used to load a page without rendering; thus, when the url is inserted into an HTML <script> tag, it is executed as javascript.
To make this easier to do for the community, the developers have added the popular importScript()
function to the core software so that it doesn't have to be created from scratch on each wiki.
Also Extension:Gadgets was created, which makes writing scripts even easier and is ultimately how scripts should be managed. Since Gadgets are built as an extension and hook into the right MediaWiki hooks, all the native features (ResourceLoader support included) are available for Gadgets.
But what it can't do is create gadgets on a per-user level (only on a per-site level). This is a much-requested feature, but currently not possible in an efficient and scalable way (it requires a few other changes as well). But I expect that this will be implemented as part of Gadgets 3.0.
So to get back to your suggestion. Although you can use mw.loader.load() right now to load a url of choice (including, but not limited to, urls to the current wiki where your user script is stored), that is just loading it as a url (just like importScriptURI does); it is not really a feature of ResourceLoader. And as such, since user scripts technically don't exist as a feature, ResourceLoader can't create an alias for them in the existing system.
Since the environment in which user scripts are written has changed a lot in the past few months/years, I don't think it is a good idea to make any more changes that require (or motivate) a lot of changes (such as changing the format to .load('[[Page name]]')). Instead the focus is on keeping the environment stable, while developers work on a solution to migrate away from user scripts (the current scripts will continue to work, don't worry) to a solution that allows the same (and more) functionality in a better way... in other words: Gadgets 3.0, in which users can create their own fully-featured Gadgets!
So for now I'd recommend sticking with what users already know and use, and which will work fine:
// [[Namespace:Page name]]
importScript('Namespace:Page name')
/* or */
// [[Namespace:Page name]]
mw.loader.load('https://www.mediawiki.org/w/index.php?title=Namespace:Page_name&action=raw&ctype=text/javascript');
I'm developing an extension and I'm using the resource loader to load my stylesheet and script files. Now that I've developed on multiple branches for a while, I can see that the resource loader responses contain my stylesheet multiple times: Once in its current version and once in an old version or a version that may come from a different branch of my project. How can I purge the cache of the resource loader to remove that second sheet from the response?
Hi,
I'm trying to load all the static resources (js, css and image files) from a separate URL (a CDN actually), and using $wgStylePath has the images working just fine, but all the css and js loaded through the ResourceLoader still comes from the original domain.
Is it possible (now in 1.17.0 or in the future) to make these files load from a different URL?
Thanks,
-Dan
Wikipedia does this as well; it loads resources through http://bits.wikimedia.org/
To achieve this set $wgLoadScript
to where you want requests for load.php to end up.
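A minimal example of what that could look like in LocalSettings.php (the CDN hostname here is a placeholder, not a real endpoint):

```php
// LocalSettings.php sketch: send all ResourceLoader requests to a CDN
// endpoint that proxies load.php (hostname is hypothetical)
$wgLoadScript = 'https://cdn.example.org/w/load.php';
```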
Note that this means that your CDN needs to have access to your database, MediaWiki and your configuration.
Thanks for the answer.
From my testing, setting $wgLoadScript also means the CDN has to be able to execute PHP (load.php); in other words, it has to be a full-on webserver, not just a static file server (as 99% of all CDNs are). The fact that it also needs access to the db and LocalSettings.php means it probably has to be on the same subnet, etc.
I'm starting to wonder about this ResourceLoader: it goes to a lot of trouble to minify a ton of css and js on every single request, using php to do so, and hitting the main web server (in this case, memory-hungry Apache).
Wouldn't it be better to minify as many of those files as possible beforehand and just serve them as they are (and then they can also come from a static-file-server CDN somewhere else on the internet (ie. entirely different subnet, etc.) ?
Is there any way to achieve this?
Thanks again,
-Dan
Well, depending on your setup, it is obviously not intended to go through all the minification, combination, embedding, localization etc. on every request.
Because of various unique components in the URL, such as the version timestamps that ResourceLoader includes in requests to load.php, these requests are highly cacheable!
Because of the wide range of scenarios that MediaWiki has to support to be able to scale to a platform like Wikipedia, it needs access to the filesystem and the database. (Consider localization into over 300 languages including right-to-left and non-Latin scripts, support for user preferences that allow users to change the skin on a per-user basis (and thus change the modules to be loaded), gadgets (enabling custom scripts), and scripts from extensions loaded only under certain conditions.)
The way Wikipedia has this set up is by using a reverse proxy like Squid or Varnish that serves a static cache of all resources and, because of the timestamps, can cache these requests "forever". Whenever a module changes (which won't happen for an average MediaWiki install unless you upgrade MediaWiki, change configuration files or install/uninstall extensions), the new timestamp is used in the request for that module, thus changing the url to load.php. Then the bits-server will only initiate MediaWiki if there is no static cache for the url.
According to the stats as of August 2011, bits.wikimedia.org/../load.php has a cache hit ratio of 98.2%. For all those requests MediaWiki was not initialized, no database connection etc.
However, a simple static-file-server CDN does not suffice. It's not impossible to use a static-file-server CDN, but no implementation for that was made, as Wikipedia uses Varnish as a reverse-proxy cache. You could contact User:Catrope if you're interested in building support for a static-file-server CDN (e.g. somehow upload static files through FTP or something to that CDN when new ones need to be generated, and embed or 301-redirect to those urls directly).
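For concreteness, a reverse-proxy rule along the lines described above might look like this in Varnish. This is a hypothetical VCL 4 sketch, not Wikimedia's actual configuration; the path and TTL are assumptions.

```vcl
# Cache load.php responses long-term: the version timestamp in the
# query string changes the URL whenever a module changes, so stale
# content is never served under an old URL.
sub vcl_backend_response {
    if (bereq.url ~ "^/w/load\.php") {
        set beresp.ttl = 30d;
    }
}
```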
Thanks for the great explanation Krinkle, I understand the situation much better now. The timestamp-in-the-ResourceLoader-URLs approach makes a lot of sense for a reverse-proxy cache like Varnish.
I had always intended to put Varnish in front of MediaWiki/Apache at some point, so I'll focus my efforts there instead of trying to offload more stuff into a static file-serving CDN.
Thanks again!
-Dan
You meant 99.8% ;) . Also, all Wikimedia wikis run off four web servers with a total of nine Varnish servers (in two different data centers) in front of them. And those four Apaches barely get any load; their CPU usage is like 10% so in theory ResourceLoader could run off a single Apache. See also our OSCON presentation, and the slide with these numbers. --Catrope 19:14, 18 October 2011 (UTC)
From your description, we could set up anything for load.php to go to the CDN. If the CDN uses our server as the origin (and includes the query params as part of the cache key), then the very first request hits our server, and what it produces is eminently cacheable by the CDN (forever).
So why couldn't we use a CDN (with no Varnish, no access to the DB, etc.) with load.php?
Hi, sorry if this is the wrong place to ask ... in our install of MediaWiki 1.19 (http://wiki.geogebra.org/en/Special:Version ) the ResourceLoader stores some Javascript which contains timestamps:
[["site","1352754747",[],"site"],["noscript","1352754747",[],"noscript"],["startup","1352859900",[],"startup"],["user","1352754747",[],"user"],["user.groups","1352754747",[],"user"],["user.options","1352859900",[],"private"],["user.cssprefs","1352859900" ...
in the cache (the objectcache table in MySQL) with expiration date 2038. I guess such entries are never hit (as they contain timestamps, which are often unique for unregistered users), and because of the long expiration date they fill up the cache -- in a couple of weeks it grew to 690MB.
What's the best way to circumvent this problem?
Ouch, that's a problem. The cache keys in question probably contain the word 'minify' and an md5 hash, right? If so, it's the JS minifier cache, which caches the result of JS minification, and because the same original code always results in the same minified code, it's cached forever as it can never change.
I'm afraid that for now you'll have to periodically purge the objectcache table of keys with this pattern; I've filed a bug about it, and once it's fixed, there will be a patch that you can apply that addresses this behavior. But it may take up to a few months for this patch to get written, unfortunately. An alternative workaround is to use memcached instead of the DB cache; if you have enough users that you're producing >30MB/day in cache cruft, that might be a good idea anyway. memcached uses a fixed maximum size for its cache, and when the cache fills up, it will start throwing out (evicting) old entries, with the least recently used (LRU) entries getting evicted first. This is what we use on our production wikis (Wikipedia and sister sites) as well.
Thanks for the report, this is a quite nasty issue that we hadn't heard about before. We never experienced it ourselves because we use memcached, which is not affected by this problem as it automatically removes unused entries if the cache grows too large.
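The memcached setup suggested above can be enabled with a couple of lines in LocalSettings.php (the server address is a placeholder for your own memcached instance):

```php
// LocalSettings.php sketch: use memcached instead of the DB-backed
// objectcache table (address/port are placeholders)
$wgMainCacheType = CACHE_MEMCACHED;
$wgMemCachedServers = array( '127.0.0.1:11211' );
```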
Thanks for the fast reply! Yes, the keys contain "minify". We will consider switching to memcached, but for now I've just hacked ResourceLoader.php -- instead of
$cache->set( $key, $result );
we now use
$cache->set( $key, $result, time()+86400);
I guess this hack would be a bad thing to do on a large wiki, but it should be OK for our medium sized one (~5k pageviews per day).
Hi Catrope,
I was wondering what the bug ID number was for this issue. We too have been having this issue (MW 1.18) on our corporate site, where the objectcache table grows past a few GB (at one point it was 28GB). After a while I just go in there and run TRUNCATE TABLE objectcache;
to fix it up, however this is not the correct solution.
I recently checked the latest build, 1.20.4, to see if something like the suggested fix was placed in ResourceLoader.php, but did not see it. I was hoping to find $cache->set( $key, $result ); modified to something like $cache->set( $key, $result, time()+86400 );.
We love using MW here for our development and support personnel. This bug is our biggest show-stopper; we want to upgrade to the latest version, but I want to make sure before I take our site down for maintenance.
Thank you! and keep up the good work
This post was posted by Dsuess~mediawikiwiki, but signed as Dsuess.
https://bugzilla.wikimedia.org/show_bug.cgi?id=42094 is the bug number.
(Sorry for the late response, I don't monitor this space very well.)
I've noticed some issues loading resources where source files had byte order markers (\uFEFF) at the start.
Not sure how I got them there, but I had to use Notepad++ to strip them for ResourceLoader to work correctly; it might be worth checking for such characters before embedding in the resource script object.
Thanks for your bug report. This is tracked at https://phabricator.wikimedia.org/T119379
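For anyone hitting this before the fix lands, a check like the following can detect and strip a leading BOM from script text. This is an illustrative sketch, not the actual ResourceLoader patch.

```javascript
// Strip a leading UTF-8 byte order mark (U+FEFF) from script source
// before embedding it (illustrative only, not MediaWiki's real fix)
function stripBom( text ) {
	return text.charCodeAt( 0 ) === 0xFEFF ? text.slice( 1 ) : text;
}

console.log( stripBom( '\uFEFFvar x = 1;' ) ); // prints: var x = 1;
```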
Is a script or stylesheet assembled by ResourceLoader unambiguously defined by its URI, regardless of the user for whom it is served?
In other words, can one safely cache such resources with a caching proxy?
Hi Alex, this is not an authoritative answer, but AFAIK the answer is yes, and I certainly do so on my wiki (I use Varnish). RL modules are language-dependent, but not exactly user-dependent: user information, such as the config variable wgUserName, is injected into the HTML document itself, which should not be cached for logged-in users.
Wikimedia's own configuration files for Varnish can be found here, if you want to take a look: https://github.com/wikimedia/operations-puppet/tree/production/modules/varnish/templates/vcl