No longer current: The compose for which this page contains results is no longer the current one. See Test Results:Current Installation Test for the results for the current compose.
This page records installation validation test results for the Fedora 30 20190214.n.0 nightly compose.
Which tests to run
Test coverage page: The test coverage page provides information about coverage for the tests on this page across all the composes for the current release; it can help you see which test cases most need to be run.
Tests with a Milestone of Basic, Beta or Final are the most important; Optional tests are less critical, but still useful to run. The milestone indicates the earliest release for which the test must have been run against the release candidate build before that release can be approved (so for Beta to be approved, all Basic and Beta tests must have been run, for instance). However, it is important to run the tests for all milestones as early and as often as possible. Please refer to the test coverage page linked above and try to find and run tests which have not been run at all, or not recently, for the current release.
How to test
1. Download one or more media for testing.
2. Perform one or more of the test cases and add your results to the table below.
- You can submit results by editing the page directly, or by using relval with the relval report-results command, which provides a simple text interface for reporting test results (see the example after this list).
3. If a test fails, file a bug report. You may propose the bug as a release blocker or freeze exception bug for the appropriate release - see blocker bug process and freeze exception bug process.
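As an example, a minimal relval session (a sketch; this assumes relval is installed, for instance from the Fedora repositories with dnf install relval) is started with:

  relval report-results

Roughly speaking, the tool then prompts you for the release, compose, test page, test case and result, and submits the result to the wiki for you.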
Some tests must be run against particular Products or images - for example, the #Default boot and install tests. If no particular product or image is specified either on this page or on the test case page, you can use any appropriate image. For example, you can run most of the #General Tests with the Workstation live image, or either of the Server install images.
If you notice a problem during a test which does not constitute a complete failure of the test, you should still file a bug report, but it may not be appropriate to propose it as a release blocker bug. Use your judgment in deciding this, with reference to the Fedora_Release_Criteria, which these tests are intended to verify. If you are unsure, err on the side of proposing the bug as a blocker.
Don't install updates
Don't install updates before performing any of the tests: when you are testing pre-releases, the available updates are not part of the proposed release package set.
Results summary page
The Test Results:Fedora 30 Rawhide 20190214.n.0 Summary page lists the results from this page and all the other validation pages for the same compose together, providing an overview.
Add, Modify or Remove a Test Case
- Please request review for your changes by posting your test case to the test mailing list and/or the appropriate working group mailing list (e.g. server, cloud, or desktop).
- Once reviewed, make your changes to any current documents that use the template (e.g. Test_Results:Current_Installation_Test).
- Lastly, update Template:Installation_test_matrix with the same changes.
Key
See the table below for a sample format for test results. All test results are posted using the result template.
Test Result | Explanation | Code Entered
none | Untested - This test has not been run, and is available for anyone to contribute feedback. | {{result|none}}
pass robatino | Passed - The test has been run and the tester determined that it met the expected results. | {{result|pass|robatino}}
inprogress adamwill | Inprogress - An inprogress result is often used for tests that take a long time to execute. Inprogress results should be temporary and change to pass, fail or warn. | {{result|inprogress|adamwill}}
fail jlaska | Failed - Indicates a failed test. A link to a bug must be provided. See Template:Result for details on providing bug information. | {{result|fail|jlaska|XYZ|ZXY}}
warn rhe [1] | Warning - This test completed and met the expected results, but other issues encountered during testing warrant attention. The <ref> text appears as a footnote: "↑ Brief description about the warning status". | {{result|warn|rhe}} <ref>Brief description about the warning status</ref>
pass hongqing warn kparal | Multiple results - More than one person can provide results for a single test case. | {{result|pass|hongqing}} {{result|warn|kparal}}
fail pboy | Failed - Same issue with LVM again. | {{result|fail|pboy|2246871|2244305}}
pass previous <build> run | Result from previous test run - This result is carried over directly from the test run of the previous <build>. | {{result|pass|previous <build> run}}
| Unsupported - An unsupported test or configuration. No testing is required. |
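As a purely illustrative sketch (this exact row does not appear on this page; the test case and results are reused from the examples above), a row in one of the results tables looks roughly like this in the wiki source, with the milestone, the test case link, and then one results cell per column:

|-
| Basic
| [[QA:Testcase_Boot_default_install]]
| {{result|pass|robatino}} {{result|warn|kparal}} <ref>Brief description about the warning status</ref>

Adding your result to a cell usually just means appending another {{result|...}} entry after any that are already there.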
Test Matrix
Please click [show] in each table to view the tests for each installation medium, and click [edit] to post your test results using the syntax in the Key section.
Image sanity
- ↑ LXDE live i386, size 1285537792, max 734003200
- ↑ Mate live i386, size 1971978240, max 1500000000
- ↑ SoaS live i386, size 939524096, max 734003200
- ↑ LXDE live x86_64, size 1340866560, max 734003200
- ↑ Mate live x86_64, size 2009333760, max 1500000000
- ↑ SoaS live x86_64, size 990904320, max 734003200
Default boot and install (x86_64)
Single test table: In all of these tests, the test case used is QA:Testcase_Boot_default_install; that is where the links point. The same test needs to be run for multiple images, target platforms, and install media. Note that the non-installer-based ARM disk images are covered by the later #ARM disk images section. The VM columns are for results from testing in a virtual machine. The CD/DVD columns are for results from testing on a real system with the image written to a real CD or DVD. The USB columns are for results from testing on a real system with the image written to a USB stick.
Expected coverage: For Beta, we expect a reasonable sampling of tests across the table, with at least some testing for the VM and USB boot methods, both firmware types, and each major class of deliverable (netinst, live and DVD). Optical boot testing is optional at this stage. For Final, we expect full coverage of the Basic / Final rows with the VM and USB boot methods. Optical boot testing at Final is mandatory for supported images.
Default boot and install (i386)
Below is the table for i386 tests, collapsed by default (click 'show' in the Milestone column to show it). Since Fedora 24, i386 is no longer a release-blocking architecture, and these tests are all optional.
Fedora Media Writer
Milestone | Test Case | Fedora 28 | Fedora 29 | Fedora 30 | Windows 7 | Windows 10 | macOS
Beta / Optional | QA:Testcase_USB_fmw | none | none | none | none | none | none
ARM disk images
Single test table: In all of these tests, the test case used is QA:Testcase_arm_image_deployment; that is where the links point. The same test needs to be run for multiple images and target platforms.
PXE boot
Virtualization
Storage devices
Guided storage configuration
Guided storage shrinking
For this test, the column headings refer to the storage volume type to be shrunk, not the one chosen to replace it for the new installation.
Environments
Custom storage configuration
User interface
Installation repositories
Package sets
Kickstart
Upgrade
Internationalization and Localization
Miscellaneous