QualityAssurance

In this section, we cover the activities of the QA team[1]. For more information on the work of the QA team and how you can get involved, see the Joining page[2].

Contributing Writer: Adam Williamson

Test Days

The Fedora 15 Test Day track is now finished, and the main Fedora 16 Test Day track has not yet started. If you would like to propose a main track Test Day for the Fedora 16 cycle, please contact the QA team via email or IRC, or file a ticket in QA Trac[1]. At the weekly group meeting of 2011-07-18[2], the group agreed to delay the Fedora 15 on Amazon EC2 Test Day planned for 2011-07-19, as the images would not be ready in time. Adam Williamson pencilled in the X Test Week for 2011-08-30 to 2011-09-01[3], and Jaroslav Škarvada proposed a power management Test Day for 2011-09-29[4]. Adam sent out a call for Test Days[5].

The Fedora 15 on Amazon EC2 Test Day was eventually held on 2011-08-04[6]. Turnout was modest, but the five testers present were able to confirm that the provided AMIs mostly worked well, and to expose a few bugs.

Fedora 16 Alpha preparation

Throughout the last few weeks, the team has been working to prepare for the Fedora 16 Alpha release. A first acceptance test run was attempted by Tao Wu on 2011-07-19[1], and ran into a critical early failure in the installer. A second attempt was made on 2011-07-26, and failed similarly[2]. The first (and only) test compose was released behind schedule on 2011-08-02[3], and again contained significant bugs. Adam Williamson started a post-TC1 strategy discussion[4] to decide what to do in case it proved impractical to produce a release candidate in a reasonable timeframe, but in the event, all TC1 blockers were thought to be addressed by 2011-08-06, and a release candidate was produced[5].

In the meantime, blocker bug review meetings were held each Friday (2011-07-22[6], 2011-07-29[7] and 2011-08-05[8]) to review the substantial volume of blocker bugs that had been identified.

oVirt node spin review and testing

At the 2011-07-18 weekly meeting, the group held an initial discussion of the proposed oVirt node spin[9], from the standpoint of whether to grant it QA approval. Athmane Madjoudj volunteered to work on making sure the necessary testing framework was in place. By 2011-07-22, he had a draft validation matrix[10] ready for review[11]. The draft matrix was reviewed at the weekly meeting of 2011-07-25[12], and the group agreed Athmane's validation matrix was good and the oVirt spin should be granted QA approval.

QA group meeting SOP

James Laska announced[1] that he had put the draft group meeting SOP (see FWN #282) into production[2].

Separation of release validation and feature processes

At the FESCo meeting of 2011-07-18[1], FESCo approved the group's proposal (see FWN #282) to formalize the separation between the release validation and feature processes. Adam Williamson subsequently announced that he had made the necessary changes to the wiki[2].

Fedora 16 Alpha RATs run

Tao Wu announced the completion of the first RATs (Rawhide Acceptance Tests) automated installation testing run for Fedora 16 Alpha[1]. He reported that the testing failed due to a major bug in installation[2].

Instalatron anaconda testing framework

Sergio Rubio of FrameOS[1] wrote to let the group know[2] of the release of Instalatron[3], a testing framework for anaconda based around VirtualBox input automation and ImageMagick image comparison. James Laska replied to thank Sergio for reaching out, and to point out the similar work being done by Tao Wu and Hongqing Yang to automate the Fedora installation validation matrix[4]. Tim Flink asked some questions about the design of Instalatron[5], and Sergio provided some answers[6]. Eric Blake noted that KVM had recently gained the ability to inject keyboard scancodes[7], the capability Sergio had cited as his main reason for choosing VirtualBox. David Cantrell gave a heads-up that the design of anaconda would soon change quite drastically[8], and James recommended the use of AT-SPI in preference to image analysis[9]. Sergio thanked everyone for their feedback[10].
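
The general approach is to drive the guest by injecting keystrokes and to decide which installer screen is on display by comparing screenshots against reference images. A minimal Python sketch of that idea (not Instalatron's own code) might look as follows; the VM name, scancodes and reference image are purely illustrative, and it assumes VirtualBox's VBoxManage and ImageMagick's compare tools are available on the path.

 import subprocess
 
 VM = "f16-alpha-test"      # hypothetical VirtualBox VM name
 ENTER = ["1c", "9c"]       # PS/2 set-1 scancodes for Enter: press, release
 
 def screenshot(path):
     # Capture the guest display to a PNG file.
     subprocess.check_call(["VBoxManage", "controlvm", VM, "screenshotpng", path])
 
 def press_enter():
     # Inject raw keyboard scancodes into the guest.
     subprocess.check_call(["VBoxManage", "controlvm", VM, "keyboardputscancode"] + ENTER)
 
 def screen_matches(reference, fuzz="5%"):
     # Compare the current guest screen against a reference image. ImageMagick's
     # 'compare -metric AE' exits non-zero when the images differ beyond the fuzz.
     screenshot("current.png")
     status = subprocess.call(
         ["compare", "-metric", "AE", "-fuzz", fuzz, reference, "current.png", "null:"])
     return status == 0
 
 # Advance past a known installer screen once it appears; the reference image
 # name is illustrative, not part of Instalatron itself.
 if screen_matches("anaconda-welcome.png"):
     press_enter()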

Release criteria updates

James Laska followed up his initial survey of ways to handle secondary architecture release criteria (see FWN #281) with a draft[1] of the preferred approach[2].

James also proposed some changes to the criteria following from the second Alpha blocker bug review meeting[3]. Rui He adjusted a test case to reflect the proposed change[4]. Adam Williamson suggested a change to James' proposed shutdown criterion[5], which prompted some discussion. Ultimately James updated the criteria with the revised proposals[6], and proposed a test case to enforce the shutdown criterion[7].

Release criteria and validation testing

Rui He continued adjusting installation validation test cases in response to Adam Williamson's release criteria / validation test concordance survey. She added a test for uncategorized packages[1], updated some test cases to check that unattended installations work[2], added a test for the 'use existing Linux partitions' partitioning method[3], updated the rescue mode test case[4], and added test cases for btrfs and xfs installations[5].

Acceptance testing SOP

Rui He proposed the creation of an SOP for the rawhide acceptance testing events[1]. Tao Wu worked on a draft SOP[2], and Adam Williamson provided feedback. Eventually, Tao, Adam and James Laska progressed to a broader discussion on the nature of RATs events, and whether they should simply be folded into the Test Compose / Release Candidate process.

Security testing scripts

Steve Grubb announced[1] some scripts for testing the security of Fedora[2]. Adam Williamson thanked him for the work, and wondered if any of the scripts would be suitable for incorporation into AutoQA[3]. Kamil Paral highlighted some issues with integrating third party tests in the current state of AutoQA[4].

AutoQA

James Laska wondered if it would be possible to run depcheck tests on EPEL packages[1]. Kamil Paral said it had not been tried yet, and had some questions about the benefits. He summarized that "Overall it should be doable, but it requires quite some work and resources"[2]. James said he would check if it was the EPEL SIG or individual maintainers who were interested[3].
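
As background, depcheck verifies that pushing a set of updates would not leave broken dependencies in the target repository. The sketch below is only a toy Python illustration of that dependency-closure idea, not the actual AutoQA test (which works against real repository metadata); the package names and data are invented.

 # Toy model of the dependency-closure idea behind depcheck: given what each
 # package provides and requires, report any requirement that nothing in the
 # repository satisfies. The package data below is invented for illustration.
 
 def unresolved(packages):
     provided = set()
     for pkg in packages.values():
         provided.update(pkg["provides"])
     return [(name, req)
             for name, pkg in packages.items()
             for req in pkg["requires"]
             if req not in provided]
 
 repo = {
     "foo-1.0": {"provides": {"foo", "libfoo.so.1"}, "requires": {"libbar.so.2"}},
     "bar-2.0": {"provides": {"bar", "libbar.so.2"}, "requires": set()},
 }
 
 print(unresolved(repo))   # [] -- the dependency closure is complete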

Josef Skladanka posted[4] a "brain dump" of ideas he and Kamil had come up with around depcheck[5].

Kamil proposed (and later carried out) the inclusion of a NEWS file in the AutoQA source[6], and provided a draft[7].

The group continued to work on several tasks related to making AutoQA output more attractive and legible[8] [9] [10].