Requests for comment/Extensions continuous integration

Request for comment (RFC)
Extensions continuous integration
Component: General
Creation date:
Author(s): Antoine Musso (WMF)
Document status: accepted
Approved. Tim Starling (WMF) (talk) 22:02, 26 November 2014 (UTC)
See Phabricator.

Background

MediaWiki core depends on third-party libraries as of MediaWiki 1.24. For technical reasons they are held in a separate repository, mediawiki/vendor.git, which introduces several challenges from a continuous integration point of view.

When someone proposes a patch against the master branch of MediaWiki core, we can simply clone the master branch of vendor and run the tests. But how do we handle a patch sent to REL1_25, or some wmf branch? We need to check out the matching branch in both repositories.

We also had a long-standing issue with patches proposed to extensions' release branches: their tests ran against the master branch of MediaWiki core, which caused failures whenever the MediaWiki core API changed between the targeted release and master.

Both of these issues were solved during the summer of 2014 by introducing Zuul cloner. However, we want to improve this setup so that we can run tests with multiple extensions installed.

Detailed below:

  • How Zuul maintains the state of repositories for a given patchset.
  • How Zuul cloner makes it trivial to reproduce that state on multiple Jenkins slaves.
  • A proposal to extend this system to all MediaWiki extensions deployed at Wikimedia.

Zuul behavior

When a change is proposed for merging (by voting Code-Review+2), Zuul adds it to a shared queue and creates a unique git reference for the change. For each subsequent change, an additional reference is created that includes the changes ahead in the queue. This lets us test the speculative future state of the repositories before the changes ahead are actually merged. If you were to propose two patchsets in a row for mediawiki/core, Zuul initially assumes the first patch will pass the tests and land in the branch; for the second patchset, it therefore crafts a merge commit from the tip of the branch that already includes the first patch:

Change1: mediawiki/core + Change 1
Change2: mediawiki/core + Change 1 + Change 2

Zuul then triggers jobs for both states, which run in parallel. On success, both changes are merged in the order they entered the queue. If the tests for Change 1 end up failing, Zuul dequeues Change 1, re-enqueues Change 2 against the tip of the branch (i.e. without the merge of Change 1) and retriggers its tests.
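
For illustration, the state prepared for Change 2 is roughly what the following git commands would produce. This is only a sketch of the merger's behavior, not its actual implementation; the change references and the Z-identifier are made up.

$ git checkout -B speculative origin/master
$ git merge change-1        # the change ahead in the queue
$ git merge change-2        # the change under test
$ git update-ref refs/zuul/master/Zabc123 HEAD   # the reference test jobs will fetch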

This workflow is only used by Zuul when changes share the same internal merge queue, which happens if their pipelines share at least one common job. In September we added a dummy job, mediawiki-gate, to all MediaWiki-related repositories to make sure their changes end up in the same merge queue. That lets us ensure that mediawiki/core and mediawiki/vendor are tested properly.

The mediawiki/core and mediawiki/vendor repositories share a job, mediawiki-phpunit, so when two changes made against those repositories receive +2 votes, Zuul detects they are coupled and enters them serially into the same queue. If you were to +2 Change 1 against mediawiki/core @REL1_25 and then Change 2 against mediawiki/vendor @REL1_25, Zuul would craft the following git references:

When the first change against mediawiki/core enters the queue:

mediawiki/core    refs/zuul/REL1_25/Z1  (core @REL1_25 + Change 1)

Then for the second change against mediawiki/vendor:

mediawiki/core    refs/zuul/REL1_25/Z2  (core @REL1_25 + Change 1)
mediawiki/vendor  refs/zuul/REL1_25/Z2  (vendor @REL1_25 + Change 2)

Because the second change is behind a change that modifies mediawiki/core, Zuul builds the future state of mediawiki/core with Change 1 applied, since that is what the repository will ultimately look like.

Zuul then triggers jobs for each change, passing the change's project name and the Zuul reference. When testing Change 2, the Jenkins job clones both repositories and fetches the reference refs/zuul/REL1_25/Z2 in each of them.
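
Concretely, for each repository the Jenkins job does something along these lines, using the ZUUL_URL and ZUUL_REF parameters Zuul passes to the job (a sketch; the clone URL is illustrative and the fallback when the reference is missing is omitted):

$ git clone https://gerrit.wikimedia.org/r/mediawiki/core && cd core
$ git fetch $ZUUL_URL/mediawiki/core $ZUUL_REF   # ZUUL_REF=refs/zuul/REL1_25/Z2
$ git checkout FETCH_HEAD
# same for mediawiki/vendor, fetching the same refs/zuul/REL1_25/Z2 reference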

Antoine Musso has documented this behavior in the upstream documentation at http://ci.openstack.org/zuul/gating.html#cross-projects-dependencies, which you might want to read as well.

Introducing Zuul cloner

Cloning multiple repositories and checking out the proper Zuul reference is not straightforward. Other repositories may not have a matching branch, in which case one should fall back to a default branch. OpenStack uses a custom set of shell scripts to handle this, but they hardcode expectations that would not suit our use case.

Antoine developed Zuul cloner, a Python script that makes it easy to clone repositories and reproduce the Zuul-crafted state. Zuul cloner has been upstreamed to Zuul and has already been improved there. The script takes a list of Gerrit projects to clone and attempts to make each repository match the Zuul reference for the proposed patchset, falling back to the target branch of the proposed patchset, or ultimately to the 'master' branch.

For a patch sent to mediawiki/core @REL1_25, the Jenkins job receives parameters as environment variables from Zuul and runs:

$ zuul-cloner mediawiki/core mediawiki/vendor

Creating repo mediawiki/core
Fetched ref refs/zuul/master/Z123456 from mediawiki/core

Creating repo mediawiki/vendor
mediawiki/vendor in Zuul does not have ref refs/zuul/master/Z123456
Falling back to branch master

It then runs the tests.

Example runs can be found in the mediawiki-phpunit job.

To bring mediawiki/vendor into extension tests, this system has been applied to all extension jobs (mwext-*-testextension), ensuring the proper core and vendor states are used.

For extensions with dependencies (such as Flow requiring Mantle), this lets us ensure a Flow change is tested together with any Mantle change ahead of it in the queue. Such dependencies are manually defined in the Jenkins jobs and do not cover all potential interactions between our extensions. We need a better plan.
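
In practice such a manually defined dependency boils down to passing the extra repository to Zuul cloner in the extension's job. A sketch of what the Flow job effectively runs (the Gerrit base URL is illustrative):

$ zuul-cloner https://gerrit.wikimedia.org/r/p \
    mediawiki/core \
    mediawiki/vendor \
    mediawiki/extensions/Mantle \
    mediawiki/extensions/Flow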

Problems

Testing extensions together

The Wikimedia cluster runs 155 extensions and 5 skins. Any change made to one of these repositories can potentially impact the others, and we do not test for that. A change proposed for a skin does not trigger the tests of other extensions, nor would a change to Mantle run the tests of dependent extensions (e.g. Flow). To achieve that, we would need a single job triggered by all of the repositories and running all of the tests. Changes would then share the same gating queue, ensuring no change can break the tests of downstream repositories.
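
Such a shared job would essentially clone everything and run the whole test suite in one go. A very rough sketch, assuming the repositories are cloned into the workspace and wired into core (the wiring itself, LocalSettings.php and database setup, is omitted):

$ zuul-cloner https://gerrit.wikimedia.org/r/p \
    mediawiki/core mediawiki/vendor \
    mediawiki/skins/Vector \
    mediawiki/extensions/Mantle mediawiki/extensions/Flow
    # ...and so on for every deployed extension and skin
$ cd mediawiki/core && php tests/phpunit/phpunit.php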

Antoine had this idea back in December 2013 (task T60772, Common gating job for mediawiki core and extensions), which led to the creation of Zuul cloner in the summer of 2014.

Making tests pass together

While preparing the migration to HHVM, Ori Livneh crafted a MediaWiki-Vagrant role to easily clone all extensions and run their tests together. That highlighted a number of issues:

  • No support for SQLite as a backend.
  • Lack of a default PHP entry point such as MyExtension/MyExtension.php.
  • Extensions registering hooks that change core behavior, causing other extensions to fail (e.g. thumbnailBeforeProduceHTML, task T69302).
  • Obviously broken tests.

Most are tracked by task T69216 – Have unit tests of all wmf deployed extensions pass when installed together, in both PHP-Zend and HHVM. Only a handful of extensions are actually problematic; we could ignore those while they are being worked on.

Selecting extensions

We have a few challenges to solve in order to establish the list of repositories (extensions, skins) to clone when a patch is proposed for a given branch. A rather incomplete list:

  • We can't assume all repositories have the branch of the proposed patchset. Wikidata and Wikimedia fundraising use a 'production' branch which is updated outside of the main wmf release cycle. Zuul cloner does, however, support specifying a fallback branch other than 'master' (see the sketch after this list).
  • As seen in the previous section, some extensions interact badly with each other when they change core behavior, so we may have to blacklist some of them.
  • On the master branch of MediaWiki core, we might want to introduce extensions not deployed at Wikimedia.
  • MediaWiki wmf branches define dependencies via submodules, which might be ahead of the wmf branches in the original repositories. Thus we might end up testing an outdated branch instead of the one referenced as a submodule.
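
As an illustration of the first point, Zuul cloner can be told which branch to fall back to, either globally or per project. A sketch, assuming zuul-cloner's --branch and --project-branch options (their exact semantics should be double-checked against zuul-cloner --help):

$ zuul-cloner --branch REL1_25 \
    --project-branch mediawiki/extensions/Wikidata=production \
    https://gerrit.wikimedia.org/r/p \
    mediawiki/core \
    mediawiki/extensions/Wikidata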

Proposal

Selecting extensions

Our mediawiki/tools/release.git repository has utilities that carry some dependency information: make-wmf-branch and make-release.

make-wmf-branch

The make-wmf-branch/default.conf file defines what will be included in the next wmf branch. It contains a list of extensions and skins to cut branches for ($branchedExtensions and $branchedSkins) and a list of repositories for which we want an explicit branch ($specialExtensions).

We could give those extensions release and wmf branches as well. The Jenkins job would first clone the release repository and check out the proposed patchset's target branch in it. From there we can craft the list of repositories for Zuul cloner to clone, with project-specific branches as needed.

Antoine experimented with such a script and proposed it as Gerrit change 161225. When run, it inspects default.conf and crafts a list of parameters suitable for use with zuul-cloner. A Jenkins job triggered by a patch on the REL1_25 branch would thus look as follows:

$ zuul-cloner mediawiki/tools/release
Creating repo mediawiki/tools/release
Checked out REL1_25

$ zuul-cloner $(mediawiki/tools/release/zuulparams)
Creating repo mediawiki/core
Checked out REL1_25
Creating repo mediawiki/extensions/Foo
...

Other interesting use cases are bundles that are already being produced continuously. We could craft integration jobs to ensure the extensions in those bundles work fine together. Antoine is aware of at least two such bundles:

make-release

The make-release/make-release.yaml file defines a single bundle, for Semantic MediaWiki. Defining the language extension bundle would be as easy as adding a new definition listing its extensions. Additionally, the team in charge of MediaWiki core releases is interested in defining such bundles for the different supported MediaWiki versions.

A Jenkins job could clone the release repository first, inspect make-release.yaml, and then run Zuul cloner to fetch all the listed repositories and test the bundle.
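
A minimal sketch of such a job, assuming a small helper script (here called bundle-repos, which does not exist yet) that reads make-release.yaml and prints the repositories of a given bundle:

$ zuul-cloner https://gerrit.wikimedia.org/r/p mediawiki/tools/release
$ zuul-cloner https://gerrit.wikimedia.org/r/p \
    $(mediawiki/tools/release/make-release/bundle-repos semantic-mediawiki)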

Implementation / migration

With dependencies figured out, we can remove the mwext-*-testextension jobs and replace them with a new job that fetches dependencies (similar to the existing mediawiki-phpunit job, but on steroids). The branches cut before the summer of 2014 (REL1_22 and REL1_23) will need a lot of backports, and branches other than master will definitely need a lot of effort before we can get the job passing again.

Another use case is replacing the qunit jobs. Since they already run the QUnit tests for core plus the fetched dependencies, we could come up with a single job covering all those repositories.

Long term possibility

If we know that a set of HEADs passes together, we could add them all as submodules in a dedicated git repository. Each commit in that repository would thus be known to pass together, which is a good indication that they can be deployed on a production system. If we ever attain such a nirvana, we could most probably stop using wmf branches entirely in favor of deploying from that integrated repository. That would remove the need to cherry-pick changes to the wmf branch, since the continuous integration system would handle it for us.
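
A sketch of how such an integrated repository could be assembled once a set of HEADs is known to pass together (the repository name and URLs are illustrative):

$ git init mediawiki-integrated && cd mediawiki-integrated
$ git submodule add https://gerrit.wikimedia.org/r/mediawiki/core core
$ git submodule add https://gerrit.wikimedia.org/r/mediawiki/vendor vendor
$ git submodule add https://gerrit.wikimedia.org/r/mediawiki/extensions/Flow extensions/Flow
# ...one submodule per deployed extension and skin, pinned to the tested HEADs
$ git commit -m "Snapshot of HEADs known to pass CI together"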

At Wikimedia, we could also reuse that system to run tests for operations/mediawiki-config.git configuration changes against the deployed versions and the HEAD of the integrated repository.