Is there a lossy or lossless compression tool for MediaWiki to improve page load times? Something like Smushit.com. Google's tool at https://developers.google.com is telling me to 'enable compression' for some images to speed up page load.
Topic on Project:Support desk
The HTTP protocol supports compression of any content.
Could you give me more information on how I can use this, or links to pages that could help? Thanks
LTech did not ask for a way to do HTTP compression; he actually wants to compress the images themselves. Smushit does a kind of minification that reduces the file size of the actual image. It does not do any HTTP compression (which would help in addition to already having smaller files right from the start).
When you upload an image to MediaWiki, you could compress it if you were able to run an ImageMagick command directly after the upload. Does MediaWiki support that?
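If the wiki has shell access and ImageMagick installed, one possible approach would be the `UploadComplete` hook in `LocalSettings.php`. The hook name is real, but the snippet below is only an untested sketch and is not verified against any particular MediaWiki version:

```php
<?php
// LocalSettings.php — sketch only. Assumes ImageMagick's "convert" is on PATH
// and that shell execution is permitted on this host.
$wgHooks['UploadComplete'][] = function ( $uploadBase ) {
	$file = $uploadBase->getLocalFile();
	$path = $file->getLocalRefPath();
	if ( preg_match( '/\.jpe?g$/i', $path ) ) {
		// "-strip" drops metadata (EXIF, color profiles) — a modest,
		// near-lossless size reduction. Re-encoding with a lower
		// "-quality" would save more, but lossily.
		wfShellExec( 'convert ' . wfEscapeShellArg( $path ) .
			' -strip ' . wfEscapeShellArg( $path ) );
	}
	return true;
};
```

Note that this rewrites the stored original in place, so already-generated thumbnails would not shrink until they are regenerated.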
In my test, Smushit reduced the size of a test image by only 3%; the other 97% were still there. A nice little start, but not very impressive. If you really want remarkably smaller image sizes, maybe you should think about lowering the value of $wgMaxUploadSize. That would let you "force" users to upload smaller files.
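For example, in LocalSettings.php (the value is in bytes; the 2 MB limit here is just an illustration):

```php
<?php
// LocalSettings.php: cap uploads at 2 MB (value in bytes; pick your own limit).
$wgMaxUploadSize = 2 * 1024 * 1024;
```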
Another aspect would be to use GZip to compress what is later transferred to the client. However, GZip is not very effective on images, because common formats like JPEG and PNG are already compressed internally. Usually, HTTP compression is applied to things like JavaScript and CSS files, not to images.
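You can see the difference yourself. The sketch below (plain Python, not MediaWiki-specific) gzips repetitive CSS-like text and then random bytes, which stand in for an already-compressed JPEG/PNG payload:

```python
import gzip
import os

# Repetitive text (like CSS/JS) compresses dramatically; random bytes stand in
# for JPEG/PNG payloads, which are already entropy-coded and barely shrink.
text = b"body { margin: 0; }\n" * 500   # 10,000 bytes of CSS-like text
image_like = os.urandom(10_000)         # 10,000 incompressible bytes

print(len(gzip.compress(text)))         # a small fraction of 10,000
print(len(gzip.compress(image_like)))   # around 10,000, often slightly more
```

This is why servers are typically configured to gzip text content types only and to send images as-is.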