This page records base validation test results for the Fedora (VERSION) (PRE-RELEASE) release.

How to test

  1. Download the ISO images for testing: please use either a DVD installer image or a nightly live image. Tests on this page should not be desktop-dependent, so which live image you select should not matter. Delta_ISOs for installer images are also available here. Some tests may specify use of either a traditional installer image (the DVD image or a net install image) or a live image; please follow these specifications.
  2. Perform one or more of the test cases and add your results to the table below.
  3. If a test fails, file a bug report, and propose the bug as a blocker for the appropriate release (see blocker bug process). If you notice a problem during a test which does not constitute a complete failure of the test, you should still file a bug report, but it may not be appropriate to propose it as a blocker. Use your judgment in deciding this, with reference to the Fedora_Release_Criteria, which these tests are intended to verify. If you are unsure, err on the side of proposing the bug as a blocker.
  4. Don't install updates before performing any of the tests: when you are testing pre-releases, available updates are not part of the proposed release package set.
Virtual machine testing
In most cases, testing in a virtual machine is OK.

Add or Remove a Test Case

  1. Request review for your changes by posting your proposed test case to test@lists.fedoraproject.org.
  2. Once reviewed, make your changes to any current documents that use this template (e.g. QA:Base validation results template).
  3. Lastly, update QA:Base validation results template with the same changes; a rough sketch of what a new results row can look like in wiki markup is shown below.
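The following is a rough, hypothetical sketch of a new results row in MediaWiki table markup, using the column layout of the matrix on this page (Release Level, Test Case, Workstation, Server, ARM, Cloud, References). QA:Testcase_base_example is a placeholder name, and the actual results template may structure its rows differently, so it is safer to copy an existing row from the template and adjust it.

  |-
  | Alpha
  | [[QA:Testcase_base_example]]
  | {{result|none}}
  | {{result|none}}
  | {{result|none}}
  | {{result|none}}
  |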

Key

See the key below for sample test result formats. All test results are posted using the format specified in Template:Result.


none
    Untested - This test has not been run, and is available for anyone to contribute feedback.
    Code entered: {{result|none}}

Pass robatino
    Passed - The test has been run and the tester determined that it met the expected results.
    Code entered: {{result|pass|robatino}}

Inprogress adamwill
    Inprogress - Often used for tests that take a long time to execute. An inprogress result should be temporary and later change to pass, fail, or warn.
    Code entered: {{result|inprogress|adamwill}}

Fail jlaska [1] [2]
    Failed - Indicates a failed test. A link to a bug must be provided. See Template:Result for details on providing bug information.
        1. RHBZ #XYZ
        2. RHBZ #ZXY
    Code entered: {{result|fail|jlaska|XYZ|ZXY}}

Warning rhe [1]
    Warning - The test completed and met the expected results, but other issues were encountered during testing that warrant attention.
        1. Brief description about the warning status
    Code entered: {{result|warn|rhe}} <ref>Brief description about the warning status</ref>

Pass hongqing, Warning kparal
    Multiple results - More than one person can provide results for a single test case.
    Code entered: {{result|pass|hongqing}} {{result|warn|kparal}}

Pass previous <build> run
    Result from previous test run - This result was moved directly from the test run of the previous <build>.
    Code entered: {{result|pass|previous <build> run}}

Unsupported
    An unsupported test or configuration. No testing is required.
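
To record your own result, edit the cell for the relevant test case and product, replacing {{result|none}} with your result in one of the formats above. For example (the username jdoe and bug number 1234567 are placeholders, not real results):

  Untested cell:            {{result|none}}
  After a successful test:  {{result|pass|jdoe}}
  After a failed test:      {{result|fail|jdoe|1234567}}
  Pass with a side issue:   {{result|warn|jdoe}} <ref>Brief description about the warning status</ref>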


Test Matrix

Release Level | Test Case                              | Workstation | Server | ARM  | Cloud | References
Alpha         | QA:Testcase_base_initial_setup         | none        | none   | none |       |
Alpha         | QA:Testcase_base_startup               | none        | none   | none | none  |
Alpha         | QA:Testcase_base_system_logging        | none        | none   | none | none  |
Final         | QA:Testcase_Services_start             | none        | none   | none | none  |
Final         | QA:Testcase_base_selinux               | none        | none   | none | none  |
Final         | QA:Testcase_base_service_manipulation  | none        | none   | none | none  |