Manual:Wiki in site root directory

It is possible to install MediaWiki in the root directory and have URLs like example.com/pagename, although some people recommend against it.

Guides

See Manual:Short URL.

Considerations

The following are potential problems of a root-directory installation, each paired with a potential solution.

Potential problem: You might need special rules for "robots.txt" or "favicon.ico", as well as for wiki support files such as skin images, extensions that load content from the "/extensions/" folder (CSS, JS, or images), and root scripts like api.php, thumb.php, and img_auth.php.

Potential solution: On Apache, no special rules are needed if you use the following in .htaccess:
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-d
RewriteRule ^(.*)$ %{DOCUMENT_ROOT}/index.php [L]

This rule allows existing files to be accessed as normal, whether they are in the root directory (e.g. robots.txt) or in subdirectories. For example, a request for /robots.txt matches an existing file and is served directly, while a request for /Some_Page matches nothing on disk and is rewritten to index.php.

Potential problem: Bugs periodically surface with various configurations and root URLs, because MediaWiki was not designed for them. They are tracked in task T34620; the currently outstanding issues are listed below.
  • task T34621: "Root /Foo style article paths and action paths conflict"
Potential solution: It is possible to set up action paths on Apache with the following in LocalSettings.php:
// Give each action its own path, e.g. /edit/Some_Page.
// "$1" is left literal here; MediaWiki substitutes the page title for it.
$actions = array( 'view', 'edit', 'watch', 'unwatch', 'delete', 'revert', 'rollback', 'protect', 'unprotect', 'markpatrolled', 'render', 'submit', 'history', 'purge', 'info' );

foreach ( $actions as $action ) {
    $wgActionPaths[$action] = "/$action/$1";
}

With the following short URL settings in LocalSettings.php:

$wgScriptPath = "";
$wgArticlePath = "/$1";

and in .htaccess:

RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-d
RewriteRule ^(.*)$ %{DOCUMENT_ROOT}/index.php [L]

See Manual:$wgActionPaths for details.

  • task T40048: "Root article paths allow creating local-like links to external sites (bypass nofollow, attacks Special:Random)". If a page name begins with a slash (e.g. /example.com), it will be rendered as an external link: [[/example.com]] displays as /example.com but links to https://example.com/. This also means that Special:Random can redirect straight to the external page.

Similar bugs, not tracked under task T34620, are:

  • task T113160: "Super-short URLs are formed incorrectly for articles with titles starting with a slash".
  • task T141444: "Incorrect treatment of titles starting with a slash when URLs are ultra-short".
Potential solution: You can prevent pages from beginning with a slash by enabling Extension:TitleBlacklist, which comes bundled with MediaWiki, and adding the following line to MediaWiki:Titleblacklist on your site:
\/.*

This solves the Special:Random and page-creation problems, but it does not stop someone from adding an internal link to a (non-existent) page like /example.com. If you have subpages enabled (as on mediawiki.org's manual pages), this is still not a problem, because the link is treated as a subpage link. If you don't, the link will be rendered as an external link.
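
For reference, subpage handling is controlled per namespace. A minimal LocalSettings.php sketch, assuming you want subpages enabled in the main (article) namespace, would be:

// Treat "/" in main-namespace titles as a subpage separator, so that
// [[/example.com]] on an existing page resolves to a subpage link
// rather than an external link.
$wgNamespacesWithSubpages[NS_MAIN] = true;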

Potential problem: Many bots scan random websites for known vulnerabilities, usually checking a list of known URL paths and requesting many URLs without delays between them. If your pages start at the root directory, each such request hits MediaWiki instead of a plain HTTP 404 error page, which puts more strain on your server.

Potential solution: Some of the big performance problems have been mitigated (see reduced the DOS potential of 404 page floods), so this should not cause a significant amount of strain. Consider banning the IPs of known abuser networks (at the firewall level or with a deny rule) and implementing request throttling.
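
As a sketch of the deny-rule approach on Apache 2.4, the following refuses requests from one network range; the 192.0.2.0/24 range here is a reserved documentation placeholder, so substitute ranges taken from your own logs:

# Refuse requests from a known abuser network (placeholder range).
# Place this in the server or virtual host configuration, or in
# .htaccess where overrides are allowed.
<RequireAll>
    Require all granted
    Require not ip 192.0.2.0/24
</RequireAll>
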
If you want to be on the safe side, use the configuration used by Wikimedia (example.com/wiki/pagename). If example.com/pagename is your preference, apply the solutions above.