Introduction
The QA group co-ordinates release validation test events before each Fedora pre-release and final release is made, to ensure it meets the Fedora_Release_Criteria - see the QA:Release_validation_test_plan. This page provides guidance on arranging and announcing a release validation test event. For any concerns, please contact the QA group.
Note: this process is now automated. Release validation test events are normally created automatically, with wiki pages created and an announcement email sent by relvalconsumer (https://pagure.io/fedora-qa/relvalconsumer). Please contact Adam Williamson for any details about this process. This SOP can still be followed in any unusual case where manual validation event creation is required.
Nightly compose validation
A validation event can be created for any nightly compose, but events should not be created too frequently, or when the new compose does not differ significantly from the one currently nominated for testing.
'Candidate' compose validation
Validation test events are held for each 'candidate' compose. These builds may be made in the weeks before the release date for any given milestone (Alpha, Beta and Final). QA is also responsible for requesting these composes, under the compose request SOP - you may also take responsibility for that work, or co-ordinate with the person who does.
Track image creation tickets
Find the image creation ticket from this list. There is a separate ticket for each milestone (Alpha, Beta, Final) of each release, so three tickets per release. Add yourself to the CC list, then keep tracking the ticket until the images are available. If the images are not posted on schedule or have a critical bug, please inform the rel-eng team by adding comments to the ticket.
Create test result pages and categories (using relval)
Test result pages are needed to gather the results of installation, desktop and other validation testing against Fedora candidate builds. The recommended way to create the result pages for an event is to use the relval tool. To install it, just run dnf install relval.
Normal relval usage
To use relval to generate pages for e.g. Fedora 42 Beta 1.1, you would run: relval compose --release 42 --milestone Beta --compose 1.1 --username (fas_user_name).
To generate pages for a nightly compose event, you can use e.g. relval compose --release 42 --milestone Branched --compose 20160425.n.0 (for a Branched nightly) or relval compose --release 42 --milestone Rawhide --compose 20160425.n.0 (for a Rawhide nightly).
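As a quick reference, here is a minimal sketch of the commands described above, assuming release 42, the example compose labels used on this page, and a placeholder FAS username 'jdoe' (substitute your own values):

  # install the tool (as root, or via sudo)
  dnf install relval

  # candidate compose event, e.g. Fedora 42 Beta 1.1
  relval compose --release 42 --milestone Beta --compose 1.1 --username jdoe

  # Branched nightly event
  relval compose --release 42 --milestone Branched --compose 20160425.n.0

  # Rawhide nightly event
  relval compose --release 42 --milestone Rawhide --compose 20160425.n.0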
Test_Results:Current_(testtype)_Test redirects and CurrentFedoraCompose page
relval usually updates the Test_Results:Current_(testtype)_Test redirect pages and the CurrentFedoraCompose template, so if you are creating pages for an event which should not yet be considered the 'current' event, pass the --no-current option. If you want to update the current redirects later, run the command again, this time leaving out that option; as long as you do not pass --force, the existing pages will not be overwritten and only the Current redirects will be updated.
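For example, a plausible two-step workflow under the same assumptions (release 42, Beta 1.1, placeholder username 'jdoe') might look like this:

  # create the event pages without touching the Current redirects or CurrentFedoraCompose
  relval compose --release 42 --milestone Beta --compose 1.1 --username jdoe --no-current

  # later, when the event should become current, re-run without --no-current;
  # since --force is not passed, existing result pages are left alone and only the
  # Current redirects (and CurrentFedoraCompose, per the section on that template below) are updated
  relval compose --release 42 --milestone Beta --compose 1.1 --username jdoe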
Results
relval will print the pages it is creating as it works. It will always add the created pages to the test results category for the release and milestone - so in our example, you could find the created pages in Category:Fedora 42 Beta Test Results.
Create test result pages and categories (manually)
Warning: creating the pages manually is now strongly discouraged; relval will ensure the pages are created consistently and correctly, and is much easier than manual creation. In all normal circumstances, please use relval. The manual process is documented only as a fallback in case of problems with relval. Note also the Wikitcms page, which documents the conventions relating to release validation result storage in the wiki as a notional 'test management system'.
Create the pages
To do the creation manually, you would create pages for each of the 'test types' in this category first. For instance, for Fedora 42 Beta 1.1, you might create the pages:
- Test_Results:Fedora 42 Beta 1.1 Installation
- Test_Results:Fedora 42 Beta 1.1 Base
- Test_Results:Fedora 42 Beta 1.1 Cloud
- Test_Results:Fedora 42 Beta 1.1 Desktop
- Test_Results:Fedora 42 Beta 1.1 Server
- Test_Results:Fedora 42 Beta 1.1 Security_Lab
The release, milestone and compose should be changed appropriately, of course.
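For a nightly compose event, the same naming pattern applies with the nightly milestone and compose label; for instance (an illustrative name following the pattern above, not taken from a real event), a Branched nightly Installation page would be:

  Test_Results:Fedora 42 Branched 20160425.n.0 Installation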
Create the page content
Give each page only this content:
{{subst:Validation_results|testtype=Installation|release=42|milestone=Beta|compose=1.1}}
Set the parameters - testtype, release, milestone and compose - appropriately, and save the page. Valid choices for testtype are those in the Category:QA_test_matrix_templates category. This should generate the complete results page. If you mistype or leave out a parameter, do not attempt to fix the generated page by hand - simply edit the page, delete the entire contents, and try the generation again. A mistake causes no harm if it's promptly fixed, so don't worry.
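For instance, under the same Fedora 42 Beta 1.1 assumptions, the Server page would contain only the first line below (only testtype changes), while a Base page for the example Branched nightly would contain the second (both are illustrative parameter combinations):

  {{subst:Validation_results|testtype=Server|release=42|milestone=Beta|compose=1.1}}
  {{subst:Validation_results|testtype=Base|release=42|milestone=Branched|compose=20160425.n.0}}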
If you edit the page after creating it, you will see it looks very different! The content you now see in the edit window is a 'static' part of the page and will not change if the templates which enable this generation magic are changed. In particular, adding new test cases to the test matrix templates does not cause them to magically appear in existing test result pages; they must be added manually if appropriate.
Categories
Often you will need to deal with categories. For each release, there should be a category named Fedora (Release) Test Results. For each milestone, there should be a category named Fedora (Release) (Milestone) Test Results.
All of these category pages can be created using the Template:Validation_results_milestone_category template, as described in that template's documentation. For both, you pass the template the release parameter. For a milestone category page, you also pass it the milestone parameter.
The template will put appropriate explanatory text into the category page and ensure it is a member of the correct parent category.
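As an illustration (assuming release 42 and the Beta milestone; check the template's documentation for the exact expected form), the release-level and milestone-level category pages would contain something like:

  {{Validation_results_milestone_category|release=42}}
  {{Validation_results_milestone_category|release=42|milestone=Beta}}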
Summary page
If you create the pages with relval, a Summary page will be created. There is no template to create this manually; the easiest way to do it is probably to copy a previous page and make the appropriate adjustments.
Current redirects
One or two days before test event day, update the following redirect links and make sure they point to the pages you created (a sketch of the redirect content follows the list):
- Test_Results:Current_Installation_Test
- Test_Results:Current_Base_Test
- Test_Results:Current_Cloud_Test
- Test_Results:Current_Desktop_Test
- Test_Results:Current_Server_Test
- Test_Results:Current_Security_Lab_Test
- Test_Results:Current_Summary
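Each of these is an ordinary MediaWiki redirect page; for example, pointing the Installation redirect at the hypothetical Fedora 42 Beta 1.1 event would mean giving it just this content:

  #REDIRECT [[Test_Results:Fedora 42 Beta 1.1 Installation]]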
CurrentFedoraCompose page
The CurrentFedoraCompose template is the canonical reference for the nightly or TC/RC compose which is 'currently' being tested. If you are creating an event manually, you should update it, trying to follow the correct syntax; refer to the page history for examples. When you create an event with relval it is automatically updated so long as --no-current is not passed.
Make announcements
Currently, we announce validation test events on the test-announce mailing list. The announcement mail should provide enough information about the test focus areas. See an example; the announcement generally includes:
- Introduction of this test event (date, what to test, which release criteria to meet, etc)
- A note of the changes from the last compose, or a link to the compose request Trac ticket which provides this information
- How and where to add test results
- Contact information of QA members who are available on test event and can help testers who encounter problems
- Any other points to be emphasized
Download links should not be contained in the announcement, since in general there are several download options, and these are already documented on the result pages. The announcement should direct people to go to these pages for download instructions.
Please announce this event on the mailing list at least one or two days in advance. This should give testers sufficient time to arrange their calendars and prepare a test environment. It is also a good idea to send a reminder e-mail the day before the test. Try to take timezones into account, to maximize convenience for testers from different regions or countries.
Supplemental announcements
It can be helpful to send supplemental announcements to other interested groups. It is a very good idea to notify the Product working groups - Workstation, Server and Cloud - of the release and request help with the parts of testing that are relevant to their products. Relevant lists are the desktop, server and cloud lists.
It can also be helpful to notify the major desktop groups - GNOME, KDE, Xfce and LXDE - of each candidate build, and refer them to the desktop testing matrix. Here is an example announcement. If possible, it is a good idea to do a 'smoke test' on each desktop live image - ensure it boots successfully to a working desktop - before sending the announcement, to avoid wasting the time and bandwidth of desktop group members on downloading untestable images.
Provide help during test event
During the test event, many people will participate, including experienced users and newcomers. Make sure the QA folks whose contact information was announced to the mailing list for this test event are available during the testing period. They will provide assistance to those who encounter issues. QA people should be available at:
- IRC: #fedora-qa on irc.freenode.net
- Mailing list: test list
- Reference of ways to communicate at Communicate
When a new candidate compose is available
For each new candidate compose made available for testing, such as 1.2, 1.3, 1.4, you should re-do this entire process: create new test result pages for it, and send out a new announcement (except of course that this is now usually all automated - see the note at top of page). The announcement should note the specific changes from the previous candidate build, as in this example.
Report and Summary
The testcase_stats pages provide an ongoing summary of testing status throughout a cycle. Every so often, it is a good idea to use relval's user-stats function to produce statistics on individual user contributions to the testing process, and to publish a post on a blog, Fedora Magazine or a similar venue thanking the contributors. Here is an example of such a post.
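A hedged sketch of how this might look, assuming statistics are wanted for the Fedora 42 Beta cycle (the --release and --milestone options shown here are assumptions about the user-stats sub-command, not confirmed from its documentation; check its --help output first):

  # list the options the user-stats sub-command actually accepts
  relval user-stats --help

  # assumed invocation: contribution statistics for one release/milestone's events
  relval user-stats --release 42 --milestone Beta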