Design System Team/Codex Manual Testing Guidelines
Introduction
The following guidelines aim to provide simple and direct instructions for contributors who want to sign off, via manual testing, implementation tasks that introduced or modified Codex components.
In the context of the Codex project, manual testing is subsidiary to the automated functional, visual and accessibility testing processes integrated into the library. Nevertheless, the following practices are fundamental for maintaining the quality of components, as they can help identify first-hand issues that could otherwise remain undetected.
Testing approach and process
At any stage of implementation of a Codex task that involves visual or interactive changes, team members (Designer, Engineer or Test Engineer) might need to review the affected components against the defined scenarios, with the aim of identifying any missed requirements. The goal is to test as early and as much as possible, to reduce redesign efforts later in the development stage.
When a task needs to be manually signed off, evaluate the following:
- If a new component has been added to the Codex library: verify that the component follows the visual and interactive requirements provided in the Figma design specifications (see example), and that the acceptance criteria defined in the ticket are met.
- If a library component has been updated or fixed: verify that the changes meet the visual and interactive definitions provided in the specs and/or the task's acceptance criteria, and that no other properties were altered as a result of said changes.
To verify the functional behavior and evaluate the components' visual features against the mentioned design specs and acceptance criteria, designers, engineers and/or testers will interact with the components directly. They'll need to trigger the different states using any of their available devices, just as end users would. Nevertheless, it's recommended that manual testers also use the browser's inspection tools to verify the value of CSS styles, as a way of supporting visual detection.
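For example, computed styles can be read directly from the browser console and compared with the values in the spec sheet. Below is a minimal sketch; the .cdx-button selector is an illustrative assumption, so substitute the class of the component you are testing:

```typescript
// Read a few computed CSS properties from a rendered component and compare
// them against the values given in the Figma spec sheet.
// '.cdx-button' is an illustrative selector, not a requirement.
const component = document.querySelector( '.cdx-button' );
if ( component ) {
	const styles = window.getComputedStyle( component );
	console.log( styles.backgroundColor, styles.padding, styles.fontSize );
}
```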
Where to test Codex changes
🔵 Active patch
If the patch containing the changes to be tested hasn't been merged yet, you can access the staged Codex demo page in a Netlify build.
To access the build, add the patch number to be reviewed to the following URL (replacing "PATCHNUMBER"):
https://PATCHNUMBER--wikimedia-codex.netlify.app/
The patch number (which is appended to the patch's URL) can be easily found in Gerrit, or extracted from the link provided by gerritbot in the relevant Phabricator ticket.
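For example, for a hypothetical patch number 123456, the staged build would be available at https://123456--wikimedia-codex.netlify.app/.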
🟢 Merged patch
Testing on the Codex Demo site
Once a patch has been merged, all changes are automatically published to the Codex demo site and can be reviewed there. Engineers will add an interactive demo page for each new component as part of the component development process.
Testing in MediaWiki
New components should also be tested inside a MediaWiki instance that more closely resembles our production environment. The Design System Team maintains a MediaWiki extension called VueTest that can be used for this purpose. The same component demos that have been prepared for the Codex Demo site can also be used in a MediaWiki instance on the Special:VueTest page. To test out a new Codex component in MediaWiki, follow these steps.
- Developers will ensure that the new component demos are available in the VueTest extension prior to moving the component task over to the "QTE Sign-off" column. Even pre-released "experimental" components can be tested in this way once the relevant patch has been merged.
- Set up a new PatchDemo instance (this creates a custom wiki with the appropriate software installed). Under the "configuration preset" settings, choose "Custom" and select the options below. For convenience, you can also set the "Landing Page" option to Special:VueTest.
- Once the wiki has finished building, navigate to mywikiURL/wiki/Special:VueTest/codex, then to the appropriate component; here you can find the same interactive demos that exist on the Codex Docs site. A PatchDemo wiki persists for a while after creation and can be used by multiple testers (useful if you are testing different browser/OS/device combinations).
- If you need to test in a way that the VueTest demos don't cover, let DST engineers know and they'll help figure out a solution.
How to test: process
The manual testing process consists of the following steps:
Step 1: Gather and analyze requirements
Check the acceptance criteria included in the task and access the appropriate version of the component design specs in Figma (they should be linked in the task description).
Step 2: Create a test plan
The spec sheet is the source of truth for the visual and functional requirements of components. Document test cases from it that cover all of its specifications, and check the test case document (often linked in the Test Cases section of the task) to see if any previous test cases have been defined. Add any test cases that are specific to the component under review.
Test cases should be based on the goal of the technical task (i.e. the mentioned list of acceptance criteria), and they should allow testers to verify the result of applying any visual and interactive changes against the expected designs (i.e. the relevant component specs in Figma).
Verifying a component against the entirety of its Figma specifications will be especially relevant when the component in question is completely new to the library. Let the different sections of the spec sheet help you define the specific test cases for the component being verified.
Mapping the specification sections, typical test cases would cover the following:
- Visual
- Global stylistic properties: padding, sizes, fonts, etc.
- States: active, hover, disabled, etc. display the specified styles.
- Functionality
- Making sure the right states and actions are displayed when the component is clicked, selected, searched, receives text input, etc.
- Use cases
- Minimum and maximum examples (e.g. text overflow behavior).
- Responsiveness
- RTL behavior
- Accessibility and keyboard navigation specs
Test cases should both reflect the predefined component specs and properties, and attempt to explore how to break the component.
Step 3: Execute the test cases
- If a totally new component is being reviewed, we recommend executing all of the defined test cases in all the test scenarios listed below (browser compatibility, responsiveness, accessibility and internationalization).
- For changes to existing components, execute the test cases only in the relevant scenarios below (with regard to the scope of the change), and record any visual or interaction bugs detected.
The scope of the change (i.e. which component properties and behaviors were altered in the last implementation effort) will determine the type of test or tests that need to be performed as part of the verification process (e.g. if the color of an icon was updated, there's no need to test for responsiveness or internationalization). In case of doubt, you can reach out to your friendly test engineer to help you define the best manual testing strategy.
The following are the possible test approaches, to be performed based on their suitability to the task goals and acceptance criteria. Team members may decide on the most convenient combination of tests and their scope, based on their expertise and available resources.
Browser compatibility
When to test browser compatibility: Always. Providing basic browser support is essential for our project, so browser compatibility tests should be carried out in all cases.
Validate the defined visual and functional test cases in the following selection of browser versions:
- Chrome: Current and previous version
- Firefox: Current and previous version
- Safari: Current and previous version
Tests can be performed in any operating system. Simply open the relevant demo or Netlify link in each of the browser versions listed, either directly on your computer or in BrowserStack, and execute the tests.
The browser compatibility list is based on Wikimedia's browser support matrix, and has been refined with data from the Browser usage breakdown dashboard. The selection was simplified by keeping only one representative of each of the three most used browser engines.
Responsiveness
When to test components' responsiveness: specifically when new components are introduced to the Codex library. Also necessary when the task's goal was to modify the responsive behavior of the element being tested.
Test on the relevant device/screen sizes to evaluate whether the component adjusts and behaves as specified in the designs (i.e. as specified in the "Responsive behavior" section of the Figma specification sheets).
Overall responsive performance
Given that Codex's focus is on web and mobile web (basic responsiveness), it is sufficient to use the device tools provided by any of the specified browsers to test the responsiveness of components at the following breakpoints:
- 2000px
- 1200px
- 1000px
- 720px
- 320px
If specific media queries, new responsive behaviors or breakpoints were introduced, be sure to evaluate the component's behavior and visual appearance at each relevant screen size, as well as between them (i.e. check for responsive transition issues).
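A small console helper, sketched below, can make it easier to see which of the breakpoints above the current viewport satisfies while you resize the browser's device tools (the helper and its output format are illustrative, not part of Codex):

```typescript
// Console sketch: log the widest of the documented breakpoints that the
// current viewport width satisfies. Resize the viewport to see it update.
const breakpoints = [ 320, 720, 1000, 1200, 2000 ];

function currentBreakpoint(): string {
	const width = window.innerWidth;
	// Find the largest breakpoint that the viewport is at or above.
	const match = [ ...breakpoints ].reverse().find( ( min ) => width >= min );
	return match ? `>= ${ match }px (viewport: ${ width }px)` : `below 320px (viewport: ${ width }px)`;
}

window.addEventListener( 'resize', () => console.log( currentBreakpoint() ) );
console.log( currentBreakpoint() );
```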
In-depth mobile testing
For more in-depth mobile testing, we recommend executing the test cases in as many of the following mobile operating system and browser scenarios as possible:
| Browser | iOS | Android |
|---|---|---|
| Mobile Safari | Current and previous version | - |
| Chrome Mobile | Current and previous version | Current and previous version |
| Samsung Internet | - | Current and previous version |
The test matrix above is based on data from the Browser usage breakdown dashboard.
Accessibility
When to test components' accessibility: especially required when new components are added to the library. Also a must in cases where the component's interaction flow or keyboard shortcuts were modified, or to verify contrast in case of color adjustments.
Verify the accessibility test cases using assistive technology in order to make sure that the component can be navigated and fully interacted with using the keyboard shortcuts specified in the design spec sheet (see the relevant "Keyboard shortcuts" section in the Figma spec sheet). Check the Screen reader and keyboard navigation testing section for more information.
In general, accessibility testing might include any combination of the following optional and recommended practices:
Reviewing color contrast (Optional)
In general, designers aim to apply color combinations with sufficient contrast during the definition and specification of components. Nevertheless, oversights are possible. If you suspect that a text might not have enough contrast against its background (we aim to be compliant with an AA level of contrast; check Success Criterion 1.4.3 Contrast (Minimum) in WCAG 2.2), please report the issue as described in Step 4 of the How to test: process section.
Recommended online contrast-checking tools: WebAIM Contrast Checker, Colour Contrast Checker.
Please observe font size as a factor when evaluating contrast, as described in Success Criterion 1.4.3 Contrast (Minimum) of WCAG 2.2.
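If you want to double-check a suspicious color pair yourself, the sketch below implements the WCAG 2.x relative luminance and contrast ratio formulas for 6-digit hex colors; the sample values at the end are arbitrary placeholders, not Codex tokens:

```typescript
// WCAG 2.x contrast-ratio calculation for '#RRGGBB' colors.
// SC 1.4.3 (AA) requires at least 4.5:1 for normal text and 3:1 for large text.
function relativeLuminance( hex: string ): number {
	const [ r, g, b ] = [ 0, 2, 4 ].map( ( offset ) => {
		const channel = parseInt( hex.replace( '#', '' ).slice( offset, offset + 2 ), 16 ) / 255;
		// Linearize the sRGB channel value as defined by WCAG.
		return channel <= 0.03928 ? channel / 12.92 : Math.pow( ( channel + 0.055 ) / 1.055, 2.4 );
	} );
	return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio( foreground: string, background: string ): number {
	const lums = [ relativeLuminance( foreground ), relativeLuminance( background ) ].sort( ( a, b ) => b - a );
	return ( lums[ 0 ] + 0.05 ) / ( lums[ 1 ] + 0.05 );
}

// Arbitrary placeholder pair: dark gray text on a white background.
console.log( contrastRatio( '#54595d', '#ffffff' ).toFixed( 2 ) );
```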
Legibility (Optional)
Make sure that the text included in components or patterns is visible and recognizable, and that it can be read with ease. The following font properties play an important role in ensuring legibility and should be observed:
- Font size: Font sizes should match the design specifications, and never be smaller than 12px. Pay attention to compounding effects that might render font sizes smaller than intended in context. Users with low vision and other sight disabilities (around 3% of users) commonly set bigger default font sizes in their browsers in order to keep content readable. User-interface text should always adjust to these user settings. You can easily verify this by increasing the font size to 24px in the Appearance settings of the browser you're using to perform the tests (Firefox or Chrome).
- Color: Text should display a sufficient contrast ratio against its background to remain easily visible for all users (a minimum AA contrast level is expected). Check the Color contrast section for more details.
- Line height and spacing: The right spacing (defined by line-height) will prevent lines of text from appearing too close together or too far apart, conditions that would impact legibility. System font styles have predefined, optimal line heights applied to them, based on their size and hierarchy. Validating that these specifications are followed is key to ensuring correct vertical text spacing, and thus an optimal level of legibility.
- Line length: Lines of text should present an optimal length in order to ensure the readability of a paragraph. A maximum line length of 80 characters should generally be observed (40 characters for CJK languages) (Source: WCAG 2.2, Success Criterion 1.4.8 Visual Presentation).
Screen reader and keyboard navigation testing (Recommended)
Execute the defined test scenarios using screen reader technology in order to make sure that the right components, elements, states and content are announced (find recommended screen reader software in the section below).
While working with screen readers, make sure to interact with components using only your keyboard. This is helpful to verify that all content is accessible, and that all states can be triggered using the specified keyboard shortcuts (see the relevant "Keyboard shortcuts" section in the Figma component specs).
It is worth noting that this step is 100% voluntary during design sign-off processes: most accessibility testing will be performed during development and during the functional testing stage. Automated a11y testing introduced to Codex CI will serve as an additional step to catch current issues and, even more importantly, future breakages (which will be tested continuously). There is also the possibility of setting up testing with people with disabilities, with help from external contractors, in case more complex components or patterns need to be evaluated.
Recommended screen reader and browser combinations
There are dozens of possible combinations of browsers and screen readers. For pragmatic reasons, we have collected here the most widely used combinations, from which testers can choose at their convenience (sources: Accessibility Developer Guide, WebAIM's Screen Reader User Survey #9):
Testing assistive technology on desktop
Here are the suggested screen readers with which to conduct manual accessibility testing on desktop devices:
NVDA (Windows), the built-in Windows Narrator, or VoiceOver (macOS' built-in accessibility tool) on Chrome, Firefox and/or Safari.
Aside from trying to reproduce the regular test cases, please make sure to verify that the specified keyboard shortcuts (which can be found at the bottom of the component's specification sheet in Figma; see an example) have been correctly applied.
Testing accessibility on mobile
Here are the suggested screen readers with which to conduct accessibility testing on mobile devices:
VoiceOver (iOS' built-in accessibility tool) or TalkBack (Android's built-in accessibility tool) on Safari, Chrome or Firefox.
Internationalization
The Codex demo site allows you to toggle between LTR and RTL display of components. You can use this feature to check whether a given element follows the bidirectionality specifications provided in the Figma spec sheet (read the "RTL behavior" section).
When to test internationalization: especially required when new components are added to the library.
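When testing in an environment that lacks the demo site's direction toggle, one rough first-pass check (an approximation only; it does not replace testing with a real RTL interface language) is to flip the document's dir attribute from the browser console:

```typescript
// Rough console sketch: flip the page to RTL to preview directional styles.
// This does not change the interface language, so treat it only as a first pass.
document.documentElement.setAttribute( 'dir', 'rtl' );
```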
Step 4: Report bugs and visual fixes related to the patch
You can document the issues found either in a comment or in the description of the Phabricator task being reviewed (see example).
Make sure to provide a checklist of the issues, and add clear individual descriptions, context (device, operating system, browser) and visual media (screenshots, videos, GIFs) per problem if necessary. Don't forget to ping the engineer who worked on the changes.
- If unrelated bugs are found during the testing process, report them separately, using the bug report template in Phabricator. New bugs can be added to the Needs triage (Incoming request) column in Codex's Phabricator board.
- Test again once the bugs/design fixes have been implemented in order to re-verify the changes.
If any needed requirements were missing from the specs and weren't implemented, they should be included in a separate ticket tagged with Design, rather than added to the current task.
Step 5: Add your final approval message
Check items from the checklist as the individual fixes are applied, and add your final sign-off approval message to the relevant Phabricator ticket once there's nothing left to fix. Don't forget to mention which types of tests were performed and in which setup. Once signed off, the manually tested ticket should be moved to the "Functional testing" column in the Design System Sprint Phabricator board.
⨠Manual testing checklist
A simplified list of the steps involved in the manual testing process:
- [ ] Gather relevant resources such as the componentâs spec sheet
- [ ] Define the test cases (what needs to be tested) based on the acceptance criteria defined in the Phabricator task
- [ ] Access the relevant testing environment: this can be either the Netlify build of the Codex library (in case the patch is active) or the Codex demo page
- [ ] Execute the test cases and make sure to validate:
- [ ] That the component correctly displays the specified functional states and visual properties (depending on the scope of the task) in the current and previous versions of Chrome, Firefox and Safari on your operating system.
- [ ] That the component displays the correct responsive behavior, at least at the current breakpoints of 320px, 720px, 1000px and 2000px. More about mobile testing browsers and breakpoints.
- [ ] That the accessibility and keyboard navigation specs are followed: test using JAWS (Windows), NVDA (Windows) or VoiceOver (macOS' built-in accessibility tool) on Chrome, Firefox or Safari (one of these is recommended). Find out more about testing for accessibility.
- [ ] That the component follows the bi-directionality design specifications
- [ ] Report bugs: Provide a list of needed fixes in the shape of a checklist in the relevant task. Make sure to include individual descriptions and visual media if necessary. Don't forget to ping the engineer who worked on the changes.
- [ ] If unrelated bugs are found during the testing process, report them separately, using the bug report template in Phabricator. Add new bugs to the Needs triage (Incoming request) column in Codex's Phabricator board.
- [ ] Test again once the bugs/design fixes have been implemented in order to re-verify the changes.
- [ ] Check items from the checklist of fixes as they're solved, and add your final sign-off approval message to the relevant Phabricator ticket once there's nothing left to fix. Again, don't forget to mention which tests were performed and in which setup, and to ping the person assigned to the task.
- [ ] Once signed off, a manually tested ticket should be moved to the "Product Sign-Off" column in the Design-System-Sprint Phabricator board.