Apache configuration

The Apache HTTP Server (httpd) is the most common web server used with MediaWiki.

Modules

PHP

PHP as an Apache module

MediaWiki works with PHP as an Apache module. PHP is probably configured as a module if your URLs look like this:

example.com/index.php/Main_Page

You can check your PHP configuration and version on the Special:Version page or on a phpinfo() page.

Install PHP, Apache, and the PHP module for Apache. Afterwards, make sure that the Apache service is running. For specific commands, refer to the documentation of your operating system or distribution.
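As a rough example, on a Debian- or Ubuntu-based system the packages and the service might be set up like this (package and service names are assumptions and vary by distribution and PHP version):

sudo apt install apache2 php libapache2-mod-php   # Apache, PHP, and the PHP module for Apache
sudo systemctl enable --now apache2               # make sure the Apache service is running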

PHP as CGI

If you run PHP as CGI, you will get "ugly" URLs by default, but you can still set up short URLs.

CGIWrap

If you run Apache on your own server and run PHP as CGI, you can install CGIWrap. With this tool, the Apache server can run CGI scripts as a different user.

This way, a dedicated account is created for the MediaWiki pages. Installing CGIWrap is beyond the scope of this documentation, especially since it has to be compiled for your own server. As a quick guide, you can follow these steps:

  • Create a wiki user:
useradd -M -s /sbin/nologin wikiuser
  • Have a cgi-bin folder that contains CGIWrap (for example /home/myuser/cgi-bin). After setup, keep only cgiwrap there and move the debugging version to another directory (if you need it at all). The cgiwrap file should be accessible only by Apache (chown and chmod):
chown apache:apache cgiwrap
chmod 500 cgiwrap
  • Create a symbolic link in the cgi-bin folder pointing to the wiki root:
ln -s /home/myuser/public_html/wiki /home/myuser/cgi-bin/wikilink
  • In your wiki's .htaccess file, add the following:
AddHandler php-wrapper .php
Action php-wrapper /cgi-bin/cgiwrap/wikiuser/wikilink
  • Finally, chown and chmod all the .php files of your wiki folder so that they are accessible solely by wikiuser:
find . -name \*.php -exec chown wikiuser:wikiuser {} \;
find . -name \*.php -exec chmod 500 {} \;

The files will be accessible as usual. You do not need to include any cgi-bin in your paths, as this is handled transparently for you.

I strongly suggest you start out with /cgi-bin/cgiwrapd/... as your php-wrapper, as it will show precisely what is currently working. I also strongly suggest you do not delete your CGIWrap source folder until everything works perfectly, as this is a real trial-and-error process that can take a long time. However, it is worth your time, as your MediaWiki will run in its own separate process, under its own uid, without being able to interfere with any other uid. The inverse is also true, except for root, which can read anything anywhere.
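For example, during setup the Action line from the .htaccess above could temporarily point at the debug wrapper (using the same hypothetical paths as in the quick guide):

Action php-wrapper /cgi-bin/cgiwrapd/wikiuser/wikilink

Once everything works, switch it back to the plain cgiwrap handler.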

mod_alias / mod_rewrite

The recommended method of beautifying URLs involves mod_alias. Other methods use mod_rewrite instead.
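As a minimal sketch of the mod_alias approach, assuming the wiki is installed under /var/www/html/w and articles should be served from /wiki (adjust all paths to your own installation):

# In httpd.conf or the wiki's VirtualHost block
Alias /wiki "/var/www/html/w/index.php"

The matching LocalSettings.php settings would typically be $wgArticlePath = "/wiki/$1"; and $wgScriptPath = "/w";.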

mod_security

ModSecurity has been known to cause problems with MediaWiki. If you get errors seemingly at random, check your error log to see whether mod_security is the cause.
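If the log shows that ModSecurity is rejecting legitimate MediaWiki requests, one possible workaround is to switch its rule engine off for the wiki directory only; the sketch below assumes ModSecurity 2.x and a wiki installed under /var/www/html/w:

<Directory "/var/www/html/w">
    # Disable ModSecurity request filtering for the wiki only
    SecRuleEngine Off
</Directory>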

VisualEditor and Subpages

In order to prevent errors contacting the Parsoid server, AllowEncodedSlashes NoDecode must be added to the wiki's VirtualHost config block (or to the general server config if VirtualHosts are not used).[1]
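For example, with a hypothetical host name and document root, the VirtualHost block might look like this:

<VirtualHost *:80>
    ServerName wiki.example.com
    DocumentRoot "/var/www/html"
    # Keep encoded slashes (%2F) intact so Parsoid/VisualEditor requests for subpages are not rejected
    AllowEncodedSlashes NoDecode
</VirtualHost>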

Thread stack size

The stack size for each Apache thread is configurable, and the default varies between operating systems. On Windows it may be necessary to increase the stack size if you run into problems, as the 1MB default is small and can cause stack overflows during PHP script execution. The following httpd.conf setting sets the stack size to 8MB (roughly the typical Linux default):

<IfModule mpm_winnt_module>
ThreadStackSize 8388608
</IfModule>

Spiders and bots

You really should use a robots.txt file to tell well-behaved spiders not to download dynamically generated pages (edit pages, for instance). This can reduce the load on your webserver, preserve your bandwidth, and prevent duplicate content issues with search engines. However, malicious bots could tie up your webserver and waste your bandwidth by downloading a large volume of pages extremely quickly. Request throttling can help protect against this.
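As a minimal robots.txt sketch for a short-URL setup where the script path is /w and readable pages are served from /wiki (adjust the paths to your own layout):

User-agent: *
Disallow: /w/

This keeps well-behaved crawlers away from index.php URLs such as edit and history pages, while leaving the article URLs under /wiki crawlable.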

See also

References