Manual talk:File cache

The following discussion has been transferred from Meta-Wiki.
Any user names refer to users of that site, who are not necessarily users of MediaWiki.org (even if they share the same username).

Invalidation does not work for me

Please help, I am trying to correct a translation on Wikipedia, but I am having problems invalidating the cache. I corrected the cur_text field in the database, but nothing changes. I have tried manually setting the cur_touched field to the current time and using $wgCacheEpoch to invalidate the cache, but nothing has worked.

Thanks for helping.

Please edit the article instead of the database; that will cause the caches to be updated. If the text is in a configuration file, you can delete every file in the file cache, which will leave no choice but to create new files. If you still have trouble, please give more details in an email to the MediaWiki mailing list. Jamesday

It would be helpful to have an example of how to set $wgCacheEpoch to the current time, and also more explanation of how one could go about deleting the cache. For example, where is the cache? Can I just find a cache directory on the hard drive and delete it? GregGarner
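A minimal sketch of setting $wgCacheEpoch in LocalSettings.php (the literal timestamp is a placeholder):

      # Invalidate anything cached before this moment; the format is YYYYMMDDHHMMSS.
      $wgCacheEpoch = '20070101000000';
      # Alternatively, tie the epoch to this file's modification time, so every
      # edit to LocalSettings.php invalidates the cache:
      $wgCacheEpoch = max( $wgCacheEpoch, gmdate( 'YmdHis', @filemtime( __FILE__ ) ) );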

need a definition of cache for less computer-savvy wiki users

Need definition. 68.92.156.227 17:14, 21 December 2006 (UTC)

Help clearing article cache?

Hi all. I'm working on a tag extension for uploaded images in which I make it possible to edit tags successfully on image description pages without having to make changes to the article itself. I did this by sending the tags to updateArticle() when posting and then bypassing the "if ( 0 != strcmp( $text, $oldtext ) )" check (which tests whether anything changed in the article) by putting my function addTags() after the end of the if (and else) brackets. This way the tags are added anyway, even when the article wasn't updated.

My problem is: when an article isn't updated the cache isn't cleared, requiring a hard refresh to see the new tags. How can I make sure the cache of that particular article is cleared even when it hasn't changed? Or should I just force a change in the article? Many thanks! Litso 21:41, 30 January 2008 (UTC)

Never mind, I solved it. Calling the Title::touchArray() function didn't work for only one article because it expects an array, so I just stripped the function down and used $dbw->update() to change the page_touched field to the current time. If anyone is interested, the following code worked for me:
				$dbw->update( 'page',
					array( 'page_touched' => $dbw->timestamp() ),
					array( 'page_namespace' => $this->mTitle->getNamespace(),
						'page_title'     => $this->mTitle->getDBkey() ),
					$fname );
where of course $this->mTitle represents the article's title in the function the code was placed in; I don't know if this is used consistently throughout the code. Litso 09:24, 31 January 2008 (UTC)

removing old cache items with a Linux command

find /path/to/cache -type f -mtime +5 -exec rm {} \;

This will delete all files that are more than 5 days old.

BE EXPLICIT WITH YOUR PATH. You don't want this to go wrong. You can change rm to rm -v to see what is being deleted as the command runs.

We use an /html folder inside the cache folder to store files: $wgFileCacheDirectory = "$IP/cache/html";
This separates the cached pages from the localisation files in the cache, although file caching creates some other subfolders as well; /resources and /history are a couple of them.
The command can be simplified even further with find /home/path/to/w/cache/html -type f -mtime +5 -delete
Use a cron job to execute it once a week, every five days, daily, or hourly (see the example crontab entry below).
Test with -print first to see the results: find /home/path/to/w/cache/html -type f -mtime +5 -print
It is better to find a path error before you run a delete command, and this way you can see exactly which files the find command matches.
Here are a few more useful commands to keep your cache up-to-date:
  • find /home/path/to/w/cache/html -type f -mtime +1 -delete delete files older than a day
  • find /home/path/to/w/cache/html -type f -mmin +720 -delete delete files older than 12 hours
  • find /home/path/to/w/cache/html -type f -mmin +60 -name Portal\* -delete delete all the Portal:Page files older than 1 hour to keep them fresh.
  • add >/dev/null 2>&1 to the end to suppress email output when the command runs from a cron job.
Hutchy68 (talk) 02:08, 29 July 2013 (UTC)
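For illustration, a crontab entry that runs the five-day cleanup every night at 03:00 might look like this (the path is a placeholder for your own cache directory):

      # Run daily at 03:00; delete cached files older than five days.
      0 3 * * * find /home/path/to/w/cache/html -type f -mtime +5 -delete >/dev/null 2>&1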

Cached Errors

I'm running into infrequent problems where an error occurs and the page is cached with the error message. (I noticed that this is listed as a known problem, and I've also got an open Bugzilla ticket for it.)

I'm trying to figure out a way to build a hook or something to not cache these pages when an error occurs. I've looked through the hooks, but I can't figure out which one(s) I could use to accomplish this. Any ideas? --Duke33 14:38, 8 July 2009 (UTC)

It looks like these error pages are no longer cached, as part of the 1.15 release. --Duke33 19:39, 28 October 2009 (UTC)

Make robots only able to access cached content

I don't want bots (such as web crawlers and search engines) to grab content that hasn't been previously cached. To stop them from accessing non-cached content I did the following:

Edit includes/HTMLFileCache.php

add this to the beginning of the function isFileCached():

      global $custom;
      if ( $custom->cache_only ) {
          if ( !file_exists( $this->fileCacheName() ) ) {
              echo "This page has not been cached. Only humans can see pages that are not cached.";
              exit;
          }
      }
In your LocalSettings.php file, check whether the user agent is a bot that is giving you trouble, and if so, set $custom->cache_only to true for that user agent. — Preceding unsigned comment added by 207.47.30.250 (talkcontribs)
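As a sketch, that LocalSettings.php check could look like the following ($custom is the object used above; the user-agent pattern is only an illustration):

      # Create the object read by the check added to HTMLFileCache.php.
      $custom = new stdClass();
      $custom->cache_only = false;
      # Treat these example crawlers as cache-only visitors; adjust the
      # pattern to match the bots that are giving you trouble.
      if ( isset( $_SERVER['HTTP_USER_AGENT'] )
          && preg_match( '/Googlebot|bingbot|Slurp/i', $_SERVER['HTTP_USER_AGENT'] ) ) {
          $custom->cache_only = true;
      }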

Bots won't know that this isn't the page's real content; you should send an HTTP status code that indicates you're not returning the requested resource (404 and 500 are the most obvious choices).
I don't think this is a very good idea in general, though, unless you have a significant long tail and lack the space to cache it. —Emufarmers(T|C) 03:44, 24 November 2009 (UTC)
I do; I have a rather large wiki farm (http://editthis.info), and a decent percentage of the wikis are dormant. I changed the message to a checkbox form, which forces the page to be shown and cached. I tried sending a 404 header, but I think that at this point in the code the headers have already been sent.

Caching for registered users?

Is there a setting or a code edit I could use to make it cache for regular users too? I'm just using the plain file cache. The problem is that very few people use my wiki anonymously, and my server doesn't have much bandwidth or computing power. -- Spencer8ab 04:41, 11 March 2010 (UTC)

Not really, as the file cache won't play well with preferences and such. Try using object caching (see Manual:Cache), which should improve performance for logged-in users. Bawolff 19:10, 24 January 2011 (UTC)
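For illustration, object caching with memcached is typically enabled in LocalSettings.php along these lines (the server address is a placeholder):

      # Use memcached as the main object cache.
      $wgMainCacheType = CACHE_MEMCACHED;
      $wgMemCachedServers = array( '127.0.0.1:11211' );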

Can '$wgUseGzip = true' be used with 'gzip on;' in nginx?

I have already set gzip on in the nginx conf file. Should/can I also set the $wgUseGzip option to true? Will nginx recompress the already-compressed cached files before sending them to users?

Enabling file cache doesn't turn off GZip

In my deployment (MW 1.16 on shared hosting), $wgUseFileCache = true; and $wgUseGzip = false; yet pages are still served gzipped (the cache files are not gzipped).

It may be your web server (e.g. Apache or nginx) that is doing the compression. --Superxain (talk) 05:19, 18 August 2012 (UTC)
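One way to check which layer is compressing is to inspect the response headers; a sketch with curl (the URL is a placeholder):

      # If some layer compresses the page, the response headers will include
      # Content-Encoding: gzip.
      curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip' 'http://example.com/index.php?title=Main_Page' | grep -i content-encoding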

category pages in cache do not get updated

Recently I upgraded from MediaWiki 1.13 to 1.17. Since then, the file cache for category pages does not work correctly. When I update a page which has categories, the corresponding category pages do not get deleted from the file cache, so a logged-out user sees the old copy of the related category pages. For normal pages, the file cache works correctly, i.e. after an update the cached page gets deleted, so that an anonymous user sees the fresh page.

What could be the cause of this problem?

Hi, I'm running MediaWiki 1.19.0 and I have the exact same problem. There is a bug report and another one. My main page doesn't change either, but that kind of makes sense because it doesn't get edited; the changes are just made through template rotation. I'm going to add a shell script (which deletes the category files and the main page from the file cache; see the sketch at the end of this thread) to crontab. Cheers, --Till Kraemer (talk) 11:42, 6 June 2012 (UTC)

I am having the same problem. Any solution? Or where can I find the script? Is there a way I can do it manually from FileZilla or something? Jake 18 April 2013 (UTC)

Hi, sorry it took me so long to respond. You can delete a file from the cache by adding &action=purge to the real URL (not the short URL); for example, if you have a category named Actors, use http://en.domain.com/w/index.php?title=Category:Actors&action=purge Cheers, --Till Kraemer (talk) 11:54, 10 July 2013 (UTC)
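A sketch of the kind of cleanup script mentioned above, assuming cached pages are stored in the file cache directory under their URL-encoded titles (the path and file-name patterns are placeholders for your own setup):

      #!/bin/sh
      # Remove cached category pages and the cached main page so that
      # logged-out readers get fresh copies on the next request.
      find /path/to/cache -type f -name 'Category*' -delete
      find /path/to/cache -type f -name 'Main_Page*' -delete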

how to ENABLE the cache for custom extensions

Hi, I am wondering how to force MediaWiki to use its file cache for custom special pages and/or transcluded special page content (from custom extensions). All I've seen are ways to turn it off (which it seems to do by default for any special page, or any page transcluding a special page, at least in 1.21.x). Thanks! --131.142.152.23 15:45, 3 September 2013 (UTC)

Webfonts

Hi, we use webfonts and it seems that they are not cached. Is that correct, and is there a way to get the right webfonts when using the file cache? --AdSvS (talk) 10:01, 14 May 2014 (UTC)

Does Varnish/Squid provide better performance compared to the file cache?

Does Varnish/Squid provide better performance compared to the file cache? --Zoglun (talk) 22:45, 3 September 2018 (UTC)

Yes, because caching is done in memory, and requests that hit the cache don't hit your app server, which means PHP isn't handling those requests and has more CPU time to do other things. --Ciencia Al Poder (talk) 09:14, 4 September 2018 (UTC)
I don't know how this stacks up against Varnish or Squid, but on a Linux machine you can create a virtual file system in memory and use it for your cache (file cache and message cache). See for instance this page (where it is incorrectly termed a RAM disk). Henryfunk (talk) 01:05, 24 September 2020 (UTC)
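A minimal sketch of that approach, assuming tmpfs and treating the size and paths as placeholders:

      # Mount a 256 MB in-memory file system over the wiki's cache directory:
      mount -t tmpfs -o size=256m tmpfs /var/www/wiki/cache
      # Then point MediaWiki at it in LocalSettings.php:
      $wgFileCacheDirectory = "/var/www/wiki/cache";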
With Varnish/Squid you also cache scripts and CSS, and you have control over how much memory the cache may consume. With a RAM file system you either have no limit (once you reach the available memory limit, your applications will start being killed) or you have to set a fixed limit that will cause errors when you reach it. --Ciencia Al Poder (talk) 16:46, 25 September 2020 (UTC)
Thanks, yes it's true, one has to monitor the cache to make sure it doesn't get "filled", but caching in RAM can be a solution if Varnish/Squid isn't an option due to hardware requirements etc. Henryfunk (talk) 13:35, 12 December 2020 (UTC)

Can the file cache be used together with Varnish?

Is it OK or possible to use the file cache and Varnish at the same time? Since Varnish uses memory, which is limited, if they are used together and a request misses the Varnish cache, can the page still be served from the file cache?

There should be no need to do this. See for instance this question and the first answer to it. Henryfunk (talk) 01:18, 24 September 2020 (UTC)