From Fedora Project Wiki


Latest revision as of 13:12, 15 December 2023

Introduction

The QA group co-ordinates release validation test events before each Fedora release and pre-release is made, to ensure each one meets the Fedora_Release_Criteria - see the QA:Release_validation_test_plan. This page provides guidance on arranging and announcing a release validation test event. For any concerns, please contact the QA group.

This process is now automated
The creation of release validation test events is now automated. Wiki pages are created and an announcement email sent by relvalconsumer. Please contact Adam Williamson for any details about this process. This SOP can still be followed in any unusual case where manual validation event creation is still required.

Nightly compose validation

A validation event can be created for any nightly compose, but events should not be created too frequently, nor when the new compose does not differ significantly from the one currently nominated for testing.

'Candidate' compose validation

Validation test events are held for each 'candidate' compose. These builds may be made in the weeks before the release date for any given milestone (Alpha, Beta and Final). QA is also responsible for requesting these composes, under the compose request SOP - you may also take responsibility for that work, or co-ordinate with the person who does.

Track image creation tickets

Find the image creation ticket from this list. There is a separate ticket for each milestone (Alpha, Beta, Final) of each release, so three tickets per release. Add yourself to the CC list, then keep tracking the ticket until the images are available. If the images are not posted on schedule, or have a critical bug, inform the rel-eng team by adding a comment to the ticket.

Create test result pages and categories (using relval)

Test result pages are needed to gather installation, desktop and other validation test results against Fedora candidate composes. The recommended way to create the result pages for an event is to use the relval tool. To install it, just run dnf install relval.

Normal relval usage

To use relval to generate pages for e.g. Fedora 42 Beta 1.1, you would run: relval compose --release 42 --milestone Beta --compose 1.1 --username (fas_user_name).

To generate pages for a nightly compose event, you can use e.g. relval compose --release 42 --milestone Branched --compose 20160425.n.0 (for a Branched nightly) or relval compose --release 42 --milestone Rawhide --compose 20160425.n.0 (for a Rawhide nightly).

Test_Results:Current_(testtype)_Test redirects and CurrentFedoraCompose page

relval usually updates the Test_Results:Current_(testtype)_Test redirect pages and the CurrentFedoraCompose template, so if you are creating pages for an event which should not yet be considered the 'current' event, pass the --no-current option. If you want to update the Current redirects later, run the command again without that option; as long as you do not pass --force, the existing result pages will not be overwritten - only the Current redirects will be updated.

Results

relval will print the pages it is creating as it works. It will always add the created pages to the test results category for the release and milestone - so in our example, you could find the created pages in Category:Fedora 42 Beta Test Results.

Create test result pages and categories (manually)

Manual creation strongly discouraged
Creating the pages manually is now strongly discouraged; relval will ensure the pages are created consistently and correctly, and is much easier than manual creation. In all normal circumstances, please use relval. The manual process is documented only as a fallback in case of problems with relval. Note also the page Wikitcms, which documents the conventions relating to release validation result storage in the wiki as a notional 'test management system'.

Create the pages

To do the creation manually, first create pages for each of the 'test types' in this category. For instance, for Fedora 42 Beta 1.1, you might create the pages:

  • Test_Results:Fedora 42 Beta 1.1 Installation
  • Test_Results:Fedora 42 Beta 1.1 Base
  • Test_Results:Fedora 42 Beta 1.1 Cloud
  • Test_Results:Fedora 42 Beta 1.1 Desktop
  • Test_Results:Fedora 42 Beta 1.1 Server
  • Test_Results:Fedora 42 Beta 1.1 Security_Lab

The release, milestone and compose should be changed appropriately, of course.
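As a sketch, the full set of page names for such an event can be printed with a small shell loop; the release, milestone and compose values below are illustrative, not prescriptive:

```shell
# Sketch: generate the manual result-page names for one candidate compose.
# release/milestone/compose are illustrative example values.
release=42
milestone=Beta
compose=1.1
for testtype in Installation Base Cloud Desktop Server Security_Lab; do
  echo "Test_Results:Fedora ${release} ${milestone} ${compose} ${testtype}"
done
```

Adjust the three variables for the compose actually under test.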

Create the page content

Give each page only this content:

{{subst:Validation_results|testtype=Installation|release=42|milestone=Beta|compose=1.1}}

Set the parameters - testtype, release, milestone and compose - appropriately, and save the page. Valid choices for testtype are those in the Category:QA_test_matrix_templates category. This should generate the complete results page. If you mistype or leave out a parameter, do not attempt to fix the generated page by hand - simply edit the page, delete the entire contents, and try the generation again. A mistake causes no harm if it's promptly fixed, so don't worry.
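A sketch of generating the one-line initial content for every page of an event (values again illustrative; paste each printed line into the corresponding page):

```shell
# Sketch: print the initial {{subst:...}} line for each test type.
# release/milestone/compose are illustrative example values.
release=42
milestone=Beta
compose=1.1
for testtype in Installation Base Cloud Desktop Server Security_Lab; do
  printf '{{subst:Validation_results|testtype=%s|release=%s|milestone=%s|compose=%s}}\n' \
    "$testtype" "$release" "$milestone" "$compose"
done
```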

If you edit the page after creating it, you will see it looks very different! The content you now see in the edit dialog is a 'static' copy and will not change if the templates which enable this generation magic are later changed. In particular, adding new test cases to the test matrix templates does not make them magically appear in existing test result pages; they must be added manually if appropriate.

Categories

Often you will need to deal with categories. For each release, there should be a category named Fedora (Release) Test Results. For each milestone, there should be a category named Fedora (Release) (Milestone) Test Results.

All of these category pages can be created using the Template:Validation_results_milestone_category template, as described in that template's documentation. For both, you pass the template the release parameter. For a milestone category page, you also pass it the milestone parameter.

The template will put appropriate explanatory text into the category page and ensure it is a member of the correct parent category.
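The category naming convention above can be sketched as follows (release and milestone values illustrative):

```shell
# Sketch: the two category page names for a release and milestone.
# release/milestone are illustrative example values.
release=42
milestone=Beta
echo "Category:Fedora ${release} Test Results"              # per-release category
echo "Category:Fedora ${release} ${milestone} Test Results" # per-milestone category
```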

Summary page

If you create the pages with relval, a Summary page will be created. There is no template to create this manually; the easiest way to do it is probably to copy a previous page and make the appropriate adjustments.

Current redirects

One or two days before test event day, update the following redirect pages and make sure they point to the pages you created:

  • Test_Results:Current_Installation_Test
  • Test_Results:Current_Base_Test
  • Test_Results:Current_Cloud_Test
  • Test_Results:Current_Desktop_Test
  • Test_Results:Current_Server_Test
  • Test_Results:Current_Security_Lab_Test
  • Test_Results:Current_Summary
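Each Current redirect is an ordinary MediaWiki redirect page. A minimal sketch of the wikitext such a page contains, assuming hypothetical Fedora 42 Beta 1.1 target pages (#REDIRECT is standard MediaWiki redirect syntax):

```shell
# Sketch: print the redirect wikitext for each Current_(testtype)_Test page.
# release/milestone/compose are illustrative example values.
# Test_Results:Current_Summary follows the same pattern, pointing at the
# Summary page for the event.
release=42
milestone=Beta
compose=1.1
for testtype in Installation Base Cloud Desktop Server Security_Lab; do
  echo "Test_Results:Current_${testtype}_Test:"
  echo "#REDIRECT [[Test_Results:Fedora ${release} ${milestone} ${compose} ${testtype}]]"
done
```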

CurrentFedoraCompose page

The CurrentFedoraCompose template is the canonical reference for the nightly or TC/RC compose which is 'currently' being tested. If you are creating an event manually, you should update it, trying to follow the correct syntax; refer to the page history for examples. When you create an event with relval it is automatically updated so long as --no-current is not passed.

Make announcements

Currently, we announce validation test events on the test-announce mailing list. The announcement mail should provide enough information about the test focus areas. See an example, which generally includes:

  • Introduction of this test event (date, what to test, which release criteria to meet, etc)
  • A note of changes from the last compose or a link to the compose request trac ticket which provides this info
  • How and where to add test results
  • Contact information of QA members who are available on test event and can help testers who encounter problems
  • Others to be emphasized
  • Reference of ways to communicate at Communicate

Download links should not be included in the announcement: in general there are several download options, and they are already documented on the result pages. The announcement should direct people to those pages for download instructions.

Please announce this event on the mailing list at least one or two days in advance. This should give testers sufficient time to arrange their calendars and prepare a test environment. It is also a good idea to send a reminder e-mail the day before the test. Try to take timezones into account, to maximize convenience for testers from different regions or countries.

Supplemental announcements

It can be helpful to send supplemental announcements to other interested groups. It is a very good idea to notify the Product working groups - Workstation, Server and Cloud - of the release and request help with the parts of testing that are relevant to their products. Relevant lists are the desktop, server and cloud lists.

It can also be helpful to notify the major desktop groups - GNOME, KDE, Xfce and LXDE - of each candidate build, and refer them to the desktop testing matrix. Here is an example announcement. If possible, it is a good idea to do a 'smoke test' on each desktop live image - ensure it boots successfully to a working desktop - before sending the announcement, to avoid wasting desktop group members' time and bandwidth on downloading un-testable images.

Provide help during test event

During the test event, many people will participate, including experienced users and newcomers. Make sure the QA folks whose contact information was announced to the mailing list for this test event are available during the testing period, to assist anyone who encounters problems.

When a new candidate compose is available

For each new candidate compose made available for testing, such as 1.2, 1.3 or 1.4, you should re-do this entire process: create new test result pages for it and send out a new announcement (though of course this is now usually all automated - see the note at the top of the page). The announcement should note the specific changes from the previous candidate compose, as in this example.

Report and Summary

The testcase_stats pages provide an ongoing summary of testing status throughout a cycle. Every so often, it is a good idea to use relval's user-stats function to produce statistics on individual user contributions to the testing process, and to publish a post on a blog, Fedora Magazine, or a similar venue thanking the contributors. Here is an example of such a post.