
Introduction

The QA group co-ordinates release validation test events before each Fedora pre-release and final release is made, to ensure they meet the Fedora_Release_Criteria - see the QA:Release_validation_test_plan. This page provides guidance on arranging and announcing a release validation test event. For any concerns, please contact the QA group.

Nightly validation events

The pre-Alpha nightly validation test events are usually created automatically by a bot/script running relval. If it becomes necessary to create one manually for some reason, the relval nightly sub-command can be used, e.g. relval nightly --username user --release 42 --date 20241225 --build Rawhide --email user@fedoraproject.org to create an event for the 2024-12-25 Rawhide nightly and send an announcement email from user@fedoraproject.org.

Look up the date

Validation test events are held for each candidate build: test compose or release candidate. Several of these builds are made in the weeks before the release date for any given milestone (Alpha, Beta and Final). QA is also responsible for requesting these composes, under the compose request SOP - you may also take responsibility for that work, or co-ordinate with the person who does.

See the quality task schedule for the planned TC1 dates, listed as Test 'version' Test Compose, then make the following preparations with reference to those dates. There is no specific date for the first release candidate: it is built as soon as possible after the milestone freeze (as soon as no unaddressed blocker bugs remain).

Track image creation tickets

Find the image creation ticket from this list. There is a separate ticket for each milestone (Alpha, Beta, Final) of each release, so three tickets per release. Add yourself to the CC list, then keep tracking the ticket until the images are available. If the images are not posted on schedule or have a critical bug, inform the rel-eng team by adding comments to the ticket.

Create test result pages and categories (using relval)

Test result pages are needed to gather installation, desktop and other test results for Fedora candidate builds. The recommended way to create the result pages for an event is to use the relval tool.

Installing relval

Currently relval is not in the Fedora repositories. To install it, place this repository file in /etc/yum.repos.d and run su -c 'yum install relval'.
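
A minimal sketch of the installation, assuming the repository file from the link above has been saved as relval.repo (the actual file name may differ):

# copy the downloaded repository file into the yum repo directory
su -c 'cp relval.repo /etc/yum.repos.d/'
# then install relval from that repository
su -c 'yum install relval'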

Normal relval usage

To use relval to generate pages for Fedora 42 Beta RC1, you would run: relval compose --release 42 --milestone Beta --compose RC1 --username (fas_user_name)

Test_Results:Current_(testtype)_Test redirects and CurrentFedoraCompose page

relval usually updates the Test_Results:Current_(testtype)_Test redirect pages and the CurrentFedoraCompose template, so if you are creating pages for an event which should not yet be considered the 'current' event, pass the --no-current option. If you want to update the current redirects later, run the command again without that option: as long as you do not pass --force, the existing result pages will not be overwritten and only the Current redirects will be updated.
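
For example, a sketch based on the options described above (the release, milestone, compose and username values are illustrative):

# create the result pages without updating the Current redirects or CurrentFedoraCompose
relval compose --release 42 --milestone Beta --compose RC1 --username fasuser --no-current
# later, re-run without --no-current (and without --force): the existing result pages
# are left alone and only the Current redirects and CurrentFedoraCompose are updated
relval compose --release 42 --milestone Beta --compose RC1 --username fasuser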

Results

relval will print the pages it is creating as it works. It will always add the created pages to the test results category for the release and milestone - so in our example, you could find the created pages in Category:Fedora 42 Beta Test Results.

Create test result pages and categories (manually)

Manual creation strongly discouraged
Creating the pages manually is now strongly discouraged; relval will ensure the pages are created consistently and correctly, and is much easier than manual creation. In all normal circumstances, please use relval. The manual process is documented only as a fallback in case of problems with relval.

Create the pages

To do the creation manually, you would create pages for each of the 'test types' in this category first. For instance, for Fedora 42 Beta RC1, you might create the pages:

  • Test_Results:Fedora 42 Beta RC1 Installation
  • Test_Results:Fedora 42 Beta RC1 Base
  • Test_Results:Fedora 42 Beta RC1 Cloud
  • Test_Results:Fedora 42 Beta RC1 Desktop
  • Test_Results:Fedora 42 Beta RC1 Server
  • Test_Results:Fedora 42 Beta RC1 Security_Lab

The release, milestone and compose should be changed appropriately, of course.

Create the page content

Give each page only this content:

{{subst:Validation_results|testtype=Installation|release=42|milestone=Beta|compose=RC1}}

Set the parameters - testtype, release, milestone and compose - appropriately, and save the page. Valid choices for testtype are those in the Category:QA_test_matrix_templates category. This should generate the complete results page. If you mistype or leave out a parameter, do not attempt to fix the generated page by hand - simply edit the page, delete the entire contents, and try the generation again. A mistake causes no harm if it's promptly fixed, so don't worry.
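
For instance, the Desktop page listed above would be created with the same template, changing only the testtype (assuming Desktop is one of the valid choices in that category):

{{subst:Validation_results|testtype=Desktop|release=42|milestone=Beta|compose=RC1}}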

If you edit the page after creating it, you will see it looks very different! The content you now see in the edit dialog is a 'static' copy of the page and will not change if the templates which enable this generation magic are later changed. In particular, adding new test cases to the test matrix templates does not make them magically appear in existing test result pages; they must be added manually if appropriate.

Categories

Often you will need to deal with categories. For each release, there should be a category named Fedora (Release) Test Results. For each milestone, there should be a category named Fedora (Release) (Milestone) Test Results.

All of these category pages can be created using the Template:Validation_results_milestone_category template, as described in that template's documentation. For both, you pass the template the release parameter. For a milestone category page, you also pass it the milestone parameter.

The template will put appropriate explanatory text into the category page and ensure it is a member of the correct parent category.
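
For example, a sketch assuming the parameter names described above (check the template's documentation for the exact invocation, such as whether subst: is needed):

{{Validation_results_milestone_category|release=42}}

for the Fedora 42 Test Results category page, and

{{Validation_results_milestone_category|release=42|milestone=Beta}}

for the Fedora 42 Beta Test Results category page.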

Summary page

If you create the pages with relval, a Summary page will be created. There is no template to create this manually; the easiest way to do it is probably to copy a previous page and make the appropriate adjustments.

Current redirects

One or two days before the test event, update the Test_Results:Current_(testtype)_Test redirect pages and the items below, and make sure they point to the pages you created:

CurrentFedoraCompose page

The CurrentFedoraCompose template is the canonical reference for the nightly or TC/RC compose which is 'currently' being tested. If you are creating an event manually, you should update it, following the existing syntax; refer to the page history for examples. When you create an event with relval, it is updated automatically as long as --no-current is not passed.

Change IRC topic

Also change the topic of #fedora-qa (freenode) to the current test event, if you have permission to do so.

Make announcements

Currently, we announce validation test events to the test-announce mailing list. The announcement mail should provide enough information about the test focus areas. See an example; the announcement generally includes:

  • Introduction of this test event (date, what to test, which release criteria to meet, etc)
  • For composes after TC1, a note of changes from the last compose or a link to the compose request trac ticket which provides this info
  • How and where to add test results
  • Contact information of QA members who are available during the test event and can help testers who encounter problems
  • Anything else that should be emphasized

Download links should not be included in the announcement, since in general there are several download options, and these are already documented on the result pages. The announcement should direct people to those pages for download instructions.

Please announce this event on the mailing list at least one or two days in advance. This should give testers sufficient time to arrange their calendars and prepare a test environment. It is also a good idea to send a reminder e-mail the day before the test. Try to take timezones into account, to maximize convenience for testers from different regions or countries.

Supplemental announcements

It can be helpful to send supplemental announcements to other interested groups. It is a very good idea to notify the Product working groups - Workstation, Server and Cloud - of the release and request help with the parts of testing that are relevant to their products. Relevant lists are the desktop, server and cloud lists.

It can also be helpful to notify the major desktop groups - GNOME, KDE, Xfce and LXDE - of each candidate build, and refer them to the desktop testing matrix. Here is an example announcement. If possible, it is a good idea to do a 'smoke test' on each desktop live image - ensure it boots successfully to a working desktop - before sending the announcement, to avoid wasting the time and bandwidth of desktop group members downloading untestable images.

Provide help during test event

During the test event, many people will participate, including experienced users and newcomers. Make sure the QA folks whose contact information was announced to the mailing list for this test event are available during the testing period. They will provide assistance to those who encounter issues, for example in #fedora-qa (freenode).

When a new candidate build is available

If a new candidate build is made available for testing, such as TC2, TC3, RC3, you should re-do the entire process: create new test result pages for it, send out a new announcement, and update the redirects. The announcement should highlight the specific changes from the previous candidate build, as in this example.
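
For example, following the relval invocation shown earlier (values illustrative), the pages for a second release candidate could be created with:

# create result pages for the new candidate build; pass --no-current if it should
# not yet become the 'current' event
relval compose --release 42 --milestone Beta --compose RC2 --username fasuser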

Report and Summary

After testing has been completed, send a test summary to the test-announce mailing list. The test summary is intended to keep testers informed as to what was accomplished by their testing, whether there are any remaining tasks and to recognize key contributors. A sample report is available at https://lists.fedoraproject.org/pipermail/test-announce/2010-October/000164.html.

The test summary should include the following information:

Test status
A summary of what has been tested, what remains to be done, what issues were encountered during testing, and whether the testing aims were achieved
Bugs reported
List the bugs and issues that were found and filed during testing, including bug ID, status, summary, etc.
The following command can be used to generate a list of bugs:
curl --stderr /dev/null "https://fedoraproject.org/wiki/Test_Results:Current_Installation_Test" \
   | grep -o "bugzilla\.redhat\.com.*[=\/][0-9]\{7\}" \
   | grep -o "[0-9]\{7\}" | tr '\n' ',' | sed 's|,$||' \
   | xargs bugzilla query --outputformat="%{bug_id} %{bug_status} %{resolution} - %{short_desc} - %{blocked}" -b
Got python-bugzilla?
This command requires the python-bugzilla package be installed. You may need to weed out any 'false' references to pre-existing bugs which are mentioned on the page for some reason.
Analyze results and assess risk
Analyze the results and provide an assessment of where additional risk areas may exist
Thanks to all testers
Testers have volunteered their time; it never hurts to be thankful for their contributions.
To generate a list of testers who contributed to the wiki, you can use the following command:
curl --stderr /dev/null "https://fedoraproject.org/w/index.php?title=Test_Results:Fedora_14_Final_RC1_Install&action=raw" \
   | grep -A9999999 "^=.*Test Matrix.*=$" \
   | sed -e "s|{{\(result[^}]*\)}}|\n\1 |g" | grep "^result" \
   | gawk 'BEGIN{FS="[| ]"} $3{print $3} !$3{print "UNTESTED"}' \
   | sort | uniq -c | sort -bgr