One of the key strengths of Apertis is the testing done on its images, which allows regressions to be caught early in the development cycle. The QA process relies heavily on LAVA to perform automated tests on the reference hardware, with the results reported on the QA website.
Running tests on real hardware allows developers and product owners to ensure that products meet their expectations on the target platform. Thanks to these integration tests, the combination of hardware, kernel, base system and applications is stressed in different ways to ensure the quality of the final product.
Apertis currently runs:
- Automated tests on a daily basis for daily builds
- Automated and manual tests on released images
This section is for the Quality Assurance team. It includes links to QA services, documentation, tests and any other information related to the different QA processes in the project.
Apertis issues
Apertis uses the Apertis issues board to keep track of the currently known issues with the project, as well as any proposed enhancements from the community.
Community members are encouraged to contribute to it by reporting any issues they may find while working with Apertis, or by suggesting improvements to the project.
Services
Tools
Tools used for QA tasks and infrastructure.
- apertis-test-cases: Source of automated and manual test cases. Used to generate the test case site and to perform automated tests.
- apertis-test-cases-web: Source used to generate the index page for the test case site.
- Apertis infrastructure dashboard: Tracking of potential packaging issues.
- lqa: Submits the automated test jobs to LAVA. It also offers a LAVA API wrapper that can be used by other scripts.
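
To illustrate what a tool like lqa does under the hood, here is a minimal sketch of queuing a test job through LAVA's XML-RPC interface (`scheduler.submit_job`). This is not lqa's actual code: the server URL, token, device type and job fields below are illustrative placeholders, and a real Apertis job definition carries many more actions.

```python
# Hedged sketch: build a minimal LAVA job definition and submit it over
# XML-RPC, the interface that helper tools such as lqa build on.
import xmlrpc.client


def build_job_definition(device_type: str, job_name: str) -> str:
    """Render a minimal LAVA job definition as YAML text.

    A real job definition would also list deploy/boot/test actions;
    this placeholder keeps only the skeleton fields.
    """
    return (
        f"device_type: {device_type}\n"
        f"job_name: {job_name}\n"
        "timeouts:\n"
        "  job:\n"
        "    minutes: 15\n"
        "priority: medium\n"
        "visibility: public\n"
        "actions: []\n"
    )


def submit_job(server_url: str, definition: str) -> int:
    """Queue the job on a LAVA instance and return the new job id."""
    server = xmlrpc.client.ServerProxy(server_url)
    return server.scheduler.submit_job(definition)


if __name__ == "__main__":
    definition = build_job_definition("qemu", "smoke-test")
    print(definition)
    # The URL below is a placeholder; a real instance needs user:token auth:
    # submit_job("https://user:token@lava.example.org/RPC2", definition)
```

In practice the job definition is kept in version control (as in apertis-test-cases) rather than generated inline, and submission is authenticated with a per-user LAVA token.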