
Master Test Plan

Document: Master Test Plan
Author: Alena Galysheva
Version: 0.1
Date: 05.06.2023

General information

A Master Test Plan is a document that outlines the overall testing strategy, objectives, and scope for a particular software project. It includes information on the testing process, test environment, test schedules, and test resources, and is designed to provide a high-level view of how testing will be conducted throughout the project. The Master Test Plan also includes information on how testing results will be tracked, reported, and managed.

About the Test Target

Skill Collector is a web application designed for end users, specifically individuals from ICT companies in Finland. Its primary purpose is to enable these users to select skills that are relevant and important to them as industry professionals. The end-user workflow is to visit the Skill Collector website, log in with a unique hash, read the provided instructions, select the desired skills, submit the selection, review the selected skills, and finally confirm the submission.

Behind the scenes, two databases are at work. One database collects and stores the users' answers, while the other contains the SFIA and soft skills data used within the application. Delivery of the hash to the end user is automated with Power Automate: the automation retrieves the user's email address and hash from an Excel sheet, sends the hash to the end user, and records it in the database when it is added.
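
The core workflow above (log in with a hash, select skills, review, submit) is the main target for end-to-end checks. The following is a minimal sketch in Python with Selenium; the base URL, element locators, and test hash are placeholders rather than the application's real values, and the project's own automation is written with Robot Framework (see Test tools and software used).

```python
# Minimal end-to-end sketch of the Skill Collector user workflow.
# The base URL, element locators, and the test hash are assumptions,
# not the real application's values.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

BASE_URL = "http://localhost:3000"   # assumed local test deployment
TEST_HASH = "test-hash-0001"         # assumed seeded test user hash

driver = webdriver.Chrome()
wait = WebDriverWait(driver, 10)
try:
    # 1. Visit the site and log in with the unique hash.
    driver.get(BASE_URL)
    driver.find_element(By.ID, "hash-input").send_keys(TEST_HASH)   # assumed locator
    driver.find_element(By.ID, "login-button").click()              # assumed locator

    # 2. Select a skill and submit the selection.
    wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".skill-item"))).click()
    driver.find_element(By.ID, "submit-selection").click()           # assumed locator

    # 3. Review the selection and confirm the final submission.
    wait.until(EC.visibility_of_element_located((By.ID, "review-view")))
    driver.find_element(By.ID, "confirm-submit").click()             # assumed locator

    # 4. Verify that the application acknowledges the submission.
    wait.until(EC.visibility_of_element_located((By.ID, "thank-you-message")))
finally:
    driver.quit()
```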

Test Goals and Primary Needs

  • Validate the functionality of Skill Collector data handling to ensure it works as intended.
  • Verify that the service web UI is functional and complies with accessibility standards.
  • Test the user management and email automation processes to ensure they operate smoothly.
  • Validate that Power BI utilizes up-to-date data and generates accurate and representative visualizations for the process owner.

Schedule

Agile testing is not treated as a separate phase but rather as an ongoing activity carried out in collaboration with developers. For more details, refer to the Testing Guidelines. The following testing approach is followed:

Epic Level Testing

  • Identify specific testing targets at the beginning of the sprint and initiate test planning.
  • Perform regression testing for user stories completed in the previous sprint.
  • Conduct a comprehensive review of all features at the end of the sprint.
  • Execute smoke testing to ensure the stability of the Skill Collector application.
  • Automate the regression and smoke testing processes (a minimal smoke-check sketch follows this list).
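
A smoke check can be as small as verifying that the service responds at all before the heavier regression suite is run. The sketch below uses Python with pytest and requests; the URLs and the expected page content are assumptions, and the actual automation is implemented with Robot Framework.

```python
# Minimal smoke-check sketch: verify the Skill Collector service is up
# before running the regression suite. The routes and page content are assumptions.
import pytest
import requests

BASE_URL = "http://localhost:3000"  # assumed local test deployment

@pytest.mark.parametrize("path", ["/", "/login"])  # assumed routes
def test_page_responds(path):
    response = requests.get(f"{BASE_URL}{path}", timeout=10)
    assert response.status_code == 200

def test_landing_page_mentions_service_name():
    response = requests.get(BASE_URL, timeout=10)
    assert "Skill Collector" in response.text  # assumed page content
```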

Feature Level Testing

  • Test the feature once all related user stories are completed.
  • Perform integration and end-to-end testing.
  • Validate the feature against the specified requirements and acceptance criteria.
  • Conduct accessibility testing to ensure inclusivity.

User Story Level Testing

  • Validate each user story against its acceptance criteria.
  • Manually test each user story.
  • Include at least one acceptance criterion for scenarios where the expected outcome is "this should not happen" or "this should not work" (a negative-path sketch follows this list).
  • Ensure that each user story related to the Skill Collector service UI has at least one accessibility-related acceptance criterion.
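
As an example of a "this should not work" criterion, logging in with an invalid hash must fail. The sketch below expresses that criterion as an automated check in Python with Selenium; the locators, route name, and error element are assumptions about the UI.

```python
# Negative-path sketch: logging in with an invalid hash must not succeed.
# Locators, the error element, and the route name are assumptions about the UI.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

BASE_URL = "http://localhost:3000"  # assumed local test deployment

def test_invalid_hash_is_rejected():
    driver = webdriver.Chrome()
    try:
        driver.get(BASE_URL)
        driver.find_element(By.ID, "hash-input").send_keys("not-a-valid-hash")  # assumed locator
        driver.find_element(By.ID, "login-button").click()                      # assumed locator
        # The user must stay on the login view and see an error message.
        WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located((By.ID, "login-error"))            # assumed locator
        )
        assert "skill-selection" not in driver.current_url                      # assumed route name
    finally:
        driver.quit()
```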

Features

The following table lists the features and their status.

| Feature | Implemented | Tested | Deployed | Acceptance test |
| --- | --- | --- | --- | --- |
| FEA03-Customer-Feedback-system 1.0 | In Progress | No | No | - |
| FEA04-GDPR-Info 1.0 | In Progress | No | No | - |
| FEA06-Service-Containerized | Yes | Yes | Yes | - |
| FEA07-Service-Regression-Test-Automatized | In Progress | No | No | - |
| FEA20-skill-info-view | Yes | Yes | Yes | FEA20 Test Results |
| FEA21-skill-selection-view | Yes | Yes | Yes | FEA21 Test Results |
| FEA29-Softskill-selection-view | Yes | Yes | Yes | FEA29 Test Results |
| FEA30-Progress-Bar | Yes | Yes | Yes | FEA30 Test Results |
| FEA-31-data-importer-from-SFIA-database | Yes | Yes | Yes | - |
| FEA-32-navigation-bar | Yes | Yes | Yes | FEA32 Test Results |
| FEA-40-language-selection-option | In Progress | No | No | - |
| FEA-41-branding-option | In Progress | No | No | - |
| FEA-42-power-bi-service-analytics | In Progress | No | No | - |
| FEA-43-power-automate-emailing | Yes | Yes | Yes | FEA43 Test Results |
| FEA-45-CSC-development-environment | Yes | Yes | Yes | - |
| FEA-47-skill-collector-database | Yes | Yes | Yes | - |
| FEA-48-power-automate-update-user-management | Yes | Yes | Yes | FEA48 Test Results |

Testing Environments

The testing environment for the project has the following hardware requirements:

  • The development, testing, and production servers are identical
  • Operating System: Ubuntu 22.04
  • Processor: 4 Virtual CPUs (VCPUs)
  • Memory: 7.8 GB RAM
  • Storage: 80 GB disk space

Resources and responsibilities

| Resource | Responsibility | Notes |
| --- | --- | --- |
| Alena Galysheva | Test engineer | Design tests, manage tests, execute tests, report results, implement test cases, document results, write test automation scripts, prepare test data. |
| Test server | A testing environment for the service. | The production server is refurbished for testing activities. |
| Testing tools | Tools used to automate behavior-driven tests. | Robot Framework with Selenium and Browser libraries. |
| Network | Network connection used to conduct testing. | Localhost with Docker. |

Testing Levels

Acceptance Testing

  • Verify that the feature meets the acceptance criteria.

System Testing

  • Test the application as a whole.
  • Conduct black-box testing to evaluate the overall behavior and compliance with functional and non-functional requirements.
  • System testing should be performed after integration testing and before acceptance testing.
  • Conducted on completed features.

System Integration Testing

  • Combine and test multiple subsystems or components of a software system.
  • Ensure they work together as intended.
  • Conduct black-box testing to verify interaction and data exchange between subsystems and compliance with requirements.
  • Ensure components work together and data flows as designed.

Unit Testing

  • Test individual software components or modules in isolation.
  • Conduct white-box testing to verify functional and technical correctness of each component or module.
  • Unit testing is the responsibility of the developers (an illustrative example follows this list).
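
Although unit testing is owned by the developers, the example below illustrates the intended granularity: one function tested in isolation with pytest. The validate_hash helper is hypothetical and only stands in for a real component under test.

```python
# Illustrative unit-test sketch (pytest). The validate_hash function is a
# hypothetical stand-in for a real component; its rules are assumptions.
import re
import pytest

def validate_hash(value: str) -> bool:
    """Hypothetical helper: accept only 32-character lowercase hex strings."""
    return bool(re.fullmatch(r"[0-9a-f]{32}", value))

@pytest.mark.parametrize("value, expected", [
    ("a" * 32, True),            # valid hash
    ("A" * 32, False),           # wrong case
    ("a" * 31, False),           # too short
    ("", False),                 # empty input
    ("a" * 31 + "!", False),     # illegal character
])
def test_validate_hash(value, expected):
    assert validate_hash(value) is expected
```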

Testing and Troubleshooting Processes

General Acceptance Criteria

  • The feature is properly implemented.
  • The feature is accessible, especially for UI elements.
  • The feature maintains a high level of security.
  • The feature functions as intended without any critical failures.
  • The feature is capable of handling and processing various inputs, including unusual or unexpected data.

General Rejection Criteria

  • The feature is not implemented at all.
  • The feature is incomplete or unfinished.
  • The feature fails to meet user, technical, or security requirements.
  • The feature has poor visual design or lacks accessibility for UI elements.
  • The feature has vulnerabilities that could lead to application breaches or data theft.
  • The feature breaks or malfunctions when used according to its intended functionality.
  • The feature does not properly sanitize input data, posing a security risk (see the input-handling sketch after this list).
  • The feature breaks or displays incorrect behavior when the screen is resized.
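
The input-handling and sanitization criteria can be exercised with a small set of unusual or hostile payloads against the survey submission. The sketch below uses Python with requests; the endpoint, payload shape, and expected status codes are assumptions about the API.

```python
# Input-handling sketch: unusual or hostile payloads must be rejected or stored
# safely, never crash the service. Endpoint, payload shape, and status codes
# are assumptions about the API.
import pytest
import requests

SUBMIT_URL = "http://localhost:3000/api/answers"  # assumed endpoint

@pytest.mark.parametrize("payload", [
    {},                                             # empty submission
    {"skills": []},                                 # no skills selected
    {"skills": ["<script>alert(1)</script>"]},      # script injection attempt
    {"skills": ["x" * 10_000]},                     # oversized value
    {"skills": "not-a-list"},                       # wrong type
])
def test_unusual_input_is_handled(payload):
    response = requests.post(SUBMIT_URL, json=payload, timeout=10)
    # Assumed behavior: a controlled client error rather than a server crash.
    assert response.status_code in (400, 422)
```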

Chosen test strategy

Scope

Critical business functions

  • Generating a unique hash for users (a hash-uniqueness check sketch follows this list)
  • Sending the generated hash to users
  • Allowing users to log in using their hash
  • Providing a survey for users to fill out
  • Storing survey answers in the database
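
Hash generation and delivery happen outside the web application (Power Automate and the user-management sheet), but hash uniqueness can still be verified against an export of the issued hashes. The sketch below assumes the hashes are available in a CSV file with a hash column; both the file name and the column name are assumptions.

```python
# Hash-uniqueness sketch: every issued hash must be unique and well formed.
# The CSV file name and the 'hash' column are assumptions about the export.
import csv
import re

def test_issued_hashes_are_unique_and_well_formed():
    with open("issued_hashes.csv", newline="", encoding="utf-8") as f:
        hashes = [row["hash"].strip() for row in csv.DictReader(f)]

    assert len(hashes) == len(set(hashes)), "duplicate hashes were issued"
    for value in hashes:
        # Assumed format: non-empty string without whitespace.
        assert value and not re.search(r"\s", value)
```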

Components in scope

  • Implementation of hash sending functionality using Power Automate.
  • Web login system for users to access the Skill Collector with their hashes.
  • Skill selection view.
  • Skill selection functionality.
  • Sending and storage of survey results.
  • CRUD operations for interacting with databases.
  • UI accessibility in accordance with WCAG 2.1.
  • Integration of Power BI for data visualization purposes.

Components out of scope

  • Docker containers
  • CI/CD pipeline

Testing type

Manual

  • Integration testing
  • End-to-end testing
  • Acceptance testing
  • Usability testing
  • Accessibility testing

Automated

  • Smoke testing
  • Regression testing

If Time Permits

  • API testing
  • Performance testing

Testing logistics

  • Testing will be performed by the test engineer.
  • Test automation will be handled by the test automation engineer.
  • Testing will be conducted when user stories and features are delivered or updated.

Test tools and software used

Functional Testing

  • SquashTM: Used for testing documentation.
  • Robot Framework: Employed for test automation.
  • Postman: Utilized for API testing.

Accessibility Testing

  • WCAG 2.1 guidelines: Followed to meet Level A accessibility standards.
  • WAVE: Used for online accessibility testing (an automated scan sketch follows this list).
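
WAVE is used interactively; an automated complement is to run an axe-core scan in the same browser session, assuming the axe-selenium-python package is available. The sketch below illustrates this; the URL is a placeholder, and WCAG 2.1 Level A conformance still requires manual review on top of any automated scan.

```python
# Automated accessibility scan sketch using axe-core via axe-selenium-python.
# This complements, not replaces, manual WAVE/WCAG 2.1 review. The URL is an assumption.
from selenium import webdriver
from axe_selenium_python import Axe

BASE_URL = "http://localhost:3000"  # assumed local test deployment

def test_no_accessibility_violations_on_landing_page():
    driver = webdriver.Chrome()
    try:
        driver.get(BASE_URL)
        axe = Axe(driver)
        axe.inject()                 # inject the axe-core script into the page
        results = axe.run()          # run the accessibility scan
        assert results["violations"] == [], axe.report(results["violations"])
    finally:
        driver.quit()
```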