Fedora 9 Installation Test Plan
Revision history
Date | Revision | Comment
10 December 2007 | 0.1 | Initial version
Introduction
This document describes the tests that will be created and used to verify the installation of Fedora 9.
The goals of this plan are to:
- Organize the test effort
- Communicate the strategy, scope, and priorities of the planned tests to all relevant stakeholders for their input and approval
- Serve as a basis for test planning for future Fedora releases
Test Strategy
Instead of outlining all possible installation inputs and outputs, this test plan focuses on defining inputs and outputs at different stages in anaconda. This also allows different tests to be performed independently during a single installation. For example, one may exercise kickstart delivery via HTTP, RAID 0 partitioning using three physical disks, and a minimal package installation on a para-virtualized Xen guest all in a single installation. Scenarios where the stages are dependent will be indicated as such in the test case.
Where possible, SNAKE will be used to automate tests and aid reproducibility.
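As a sketch of how such independent stages combine, the HTTP delivery, RAID 0, and minimal package-set inputs mentioned above could all be expressed in a single kickstart file along these lines (the server URL and disk names are placeholders, and exact option syntax varies between anaconda releases):

```
# Illustrative kickstart fragment -- URL and disk names are placeholders
install
url --url http://example.com/fedora/9/i386/os
# RAID 0 across three physical disks
part raid.01 --size=1 --grow --ondisk=sda
part raid.02 --size=1 --grow --ondisk=sdb
part raid.03 --size=1 --grow --ondisk=sdc
raid / --level=0 --device=md0 raid.01 raid.02 raid.03
# Minimal package set
%packages --nobase
```

Each stanza exercises a different anaconda stage, so a failure in one can be diagnosed independently of the others.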
Test Priority
This test plan will use a three-tier classification for test execution priority.
Tier 1 is intended to verify that installation is possible on common hardware using common use cases. Verification includes:
- Common boot media
- Common installation sources
- Installation using default installation options
- Default partitioning
Tier 2 takes this a step further to include more use cases. Tier 2 verification consists of:
- All boot media
- All installation sources
- All kickstart delivery methods
- Some architecture-specific verification
Lastly, Tier 3 captures the remaining identified use cases:
- More exhaustive partitioning schemes
- More complex networking scenarios
- More architecture-specific verification
  - Network devices
  - Storage devices
- Upgrade testing
Scope
Testing will include:
- Various methods of booting the installation program
- Manual and kickstart execution of the installation program
- System setup performed by the installation program (networking, modprobe.conf, bootloader, runlevel)
- Booting the installed system
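One simple verification of the system setup performed by the installer is confirming the configured default runlevel. A minimal sketch of such a check follows; the inittab content and the helper name `default_runlevel` are illustrative, not taken from any real test script:

```python
import re

# Sketch of a post-install check: parse the initdefault entry from
# an /etc/inittab-style file. The sample content is illustrative.
SAMPLE_INITTAB = """\
# inittab excerpt
id:5:initdefault:
si::sysinit:/etc/rc.d/rc.sysinit
"""

def default_runlevel(inittab_text):
    """Return the default runlevel declared in inittab, or None."""
    match = re.search(r"^id:(\d):initdefault:", inittab_text, re.MULTILINE)
    return int(match.group(1)) if match else None

print(default_runlevel(SAMPLE_INITTAB))  # -> 5
```

In practice the check would read the installed system's real /etc/inittab and compare the result against the runlevel requested at install time.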
Items outside the scope of this test plan include:
- Functional verification of software installed on the system
- Installation from media not generated by Fedora Release Engineering
Test Pass/Fail Criteria
- Entrance criteria
  - Trees must be generated using release engineering tools (not hand-crafted)
  - There must be no unresolved dependencies for packages included in the installation tree
  - There must be no dependency conflicts for packages included in the installation tree
  - Any changes in the composition of the installation tree are explainable by way of Bugzilla
- Alpha criteria
  - Entrance criteria have been met
  - All tier 1 tests have been executed
- Beta criteria
  - Alpha criteria have been met
  - All tier 1 tests pass
  - All tier 2 tests have been executed
- GA criteria
  - Beta criteria have been met
  - All test tiers must pass
  - Any open defects have been documented as release notes
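The dependency checks in the entrance criteria are normally run with a tool such as repoclosure; conceptually, they amount to verifying that every requirement in the tree is satisfied by some package's provides. A rough sketch, using hypothetical package metadata rather than real repository data:

```python
# Sketch of the "no unresolved dependencies" entrance check.
# Real trees are checked with repoclosure; this metadata is hypothetical.

def unresolved_deps(packages):
    """Return {pkg: [missing requirements]} for a tree's metadata."""
    provided = set()
    for pkg in packages.values():
        provided.update(pkg.get("provides", []))
    missing = {}
    for name, pkg in packages.items():
        gaps = [req for req in pkg.get("requires", []) if req not in provided]
        if gaps:
            missing[name] = gaps
    return missing

tree = {
    "bash": {"provides": ["bash", "/bin/sh"], "requires": ["glibc"]},
    "glibc": {"provides": ["glibc"], "requires": []},
    "anaconda": {"provides": ["anaconda"], "requires": ["bash", "python"]},
}

print(unresolved_deps(tree))  # -> {'anaconda': ['python']}
```

An empty result means the tree meets the dependency-closure entrance criterion; any non-empty result should be explainable via Bugzilla before testing begins.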
Test Deliverables
- This test plan
- A test summary document for each major milestone
- A list of defects filed
- Any test scripts used for automation or verification
Test Cases (Functional)
Test Cases (Non-Functional)
Tier 1
Install Source
Package Sets
Partitioning
User Interface
Tier 2
Boot Methods
Installation Source
Kickstart Delivery
Package Sets
Partitioning
Storage Devices
User Interface
Tier 3
Boot Methods
Installation Source
Kickstart Delivery
Package Sets
Partitioning
Storage Devices
User Interface
Recovery
Test Environment/Configs
- Hardware
- i386
- ppc
- x86_64
- Hardware (subject to secondary arch availability)
- ia64
- s390x
Responsibilities
Schedule/Milestones
Risks and Contingencies
- What might go wrong and how it will be handled (to be determined)
Approvals
Date | Approver | Comment
10 December 2007 | JamesLaska | I approve this message
References
Appendices
- Outstanding issues
  - How do we collect test feedback?
    - Option 1: privileged users can modify the wiki directly
    - Option 2: email (eeew) ... probably going to have a bit of this
    - ... ?
  - How do we present test results?
    - A separate wiki page / test plan ... a test summary report?
    - An application to store and query test results?