Language Testing Plan/LanguageTestingWorkflow

THIS IS A PROPOSAL

The Wikimedia Language Engineering team presently uses Cucumber and Selenium to test implemented features in web browsers. Besides these automated tests, which are implemented in Ruby, the team also tests manually when automated tests are not feasible or not yet implemented. For all tests, the scenarios to be tested are written in the Given-When-Then (GWT) format. This can be explained as:

Given a Situation
When a certain event occurs
Then a certain result is expected

This document outlines the process and best practices for implementing tests that fully cover the user interface functionalities and can be run repeatedly.

Phase 0: Conceptualization

When a feature is identified for implementation, the details of its functionalities, interface design and interactions are ascertained by the Product Owner in consultation with other stakeholders and described in the design documents. These descriptions are the blueprints for the developers when they implement the functionalities in code. It is imperative that the functionalities are clearly described for all possible user activity. A means to this end is the conceptualization of individual Given-When-Then scenarios, which provide the developers with a description of the events that can be executed under different conditions and the intended reactions.

Phase 1: Scenario Building

The Given-When-Then scenarios are the smallest units of activity that can be put together to describe a functionality. It is extremely important to create the scenarios from the defined designs, including user interface mockups. This ought to be done by simulating user interactions with the dialogs, forms, buttons, input boxes and other user interaction elements described in the mockup.

The list of scenarios can become extremely elaborate at this stage. It is also preferable that the original concept creators are not consulted at this point, to avoid leading the scenario writer.
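
For illustration only (the feature and the interface elements here are hypothetical, not taken from an actual design document), a mockup showing a language-selection button that opens a selector dialog might yield scenarios such as:

Given I am on a wiki page
When I click the language-selection button
Then I should see the language selector dialog

Given the language selector dialog is open
When I type 'ta' into the language search box
Then I should see the languages whose names or codes match 'ta'

Given the language selector dialog is open
When I press the 'Esc' key
Then the language selector dialog should close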

Completed: A list of Given-When-Then scenarios of possible user activities for all functionalities is prepared
People Involved: Language Engineering QA

Phase 2: Feature Review and Consensus

The list of scenarios created in Phase 1 is the foundation for all tests that need to be run against the feature. This exercise in granularity may also identify missing functionalities or scenarios that were overlooked during the original design decisions. At this stage it is important to verify these inconsistencies with the Product Owner and/or UI Designer. This may or may not require the scenarios to be rewritten or modified.

In cases where the scenarios are written to extend already implemented features, the original developers may also be consulted.

Completed: The GWT scenario list is ratified for consistency with the intended feature
People Involved: Language Engineering QA, PO, UI Designer, (Developers)

Phase 3: GWT Scenario Refinement

In the next stage, the ratified scenarios need to be refined for actual use. How this is done depends upon how the scenarios will be used. For automated tests, the GWT scenario descriptions have to be reformatted and committed to the appropriate Gerrit repository as .feature files, suitable for writing the step definitions. Since this step may somewhat reduce the verbosity of a scenario, the original set of scenarios should remain unchanged to aid the manual tests. The text of the .feature files should be checked against the original GWT scenarios for any loss of information.
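
As a rough sketch of this reformatting (the feature name and exact wording are illustrative, not taken from an actual repository), a simple log-in scenario could become a .feature file along these lines:

Feature: Log-In

  Scenario: Opening the Log-In dialog from the Main Page
    Given I am on the Main Page of the wiki
    When I press the 'Log-In' link
    Then I should see the 'Log-In' dialog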

Completed: The GWT scenarios identified for the automated tests are refined and submitted to the repository as .feature files
People Involved: Language Engineering QA (Scenario writer and code writer)

Phase 4: Implementation

The test scenarios are now ready for implementation.

Automated

The process for writing the automated tests is described in detail in the QA document.
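
As a minimal sketch only (not the team's actual code), step definitions for a .feature scenario like the log-in example above could be written in Ruby with Cucumber and selenium-webdriver along the following lines; the base URL, link text and element id are assumptions made for illustration:

require 'selenium-webdriver'

# Hypothetical base URL of the test wiki; adjust to the actual test environment.
BASE_URL = 'http://localhost/wiki'

Given(/^I am on the Main Page of the wiki$/) do
  @browser = Selenium::WebDriver.for :firefox
  @browser.navigate.to "#{BASE_URL}/Main_Page"
end

When(/^I press the 'Log-In' link$/) do
  # The link text is an assumption; most MediaWiki skins label it "Log in".
  @browser.find_element(:link_text, 'Log in').click
end

Then(/^I should see the 'Log-In' dialog$/) do
  # The element id is illustrative, not a guaranteed MediaWiki id.
  wait = Selenium::WebDriver::Wait.new(timeout: 10)
  wait.until { @browser.find_element(:id, 'userlogin').displayed? }
end

# Close the browser after each scenario.
After do
  @browser.quit if @browser
end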

Completed: The code for the scenarios in the .feature files is written and set up in the build system
People Involved: Language Engineering QA (Code writer) and WMF QA team member (pairing)

Manual

The tests that are not slated for automation are written as manual test cases, with 'execution steps'. For example, the GWT scenario below translates into the manual test case that follows it:

Given I am on the Main Page of the wiki
When I press the 'Log-In' link
Then I should see the 'Log-In' dialog

Test Case 1
Step 1 : Go to URL-of-Main-Page-of-the-wiki
Step 2 : Click on the 'Log-In' link on this page

Do you see the 'Log-In' dialog?  - Yes/No

Completed: The scenarios for the manual tests are written as manual test cases, i.e. each contains the steps for executing the test and for reporting the result
People Involved: Language Engineering QA (Scenario writer)

Phase 5: Result Collection and Analysis

Automated

Manual

(WIP) The results of the tests are collected from the test case management system (TCMS). Failed tests are verified and bugs are filed.


Completed: Regressions, bugs, etc. are verified and filed in the bug-tracking system for prioritization
People Involved: Language Engineering QA