
This page provides a high-level roadmap for implementing the Is_anaconda_broken_proposal project. More detailed tasks can be found in the AutoQA TRAC roadmap. The following steps define the methods by which testing is initiated.


First, in order to provide a consistent and documented test approach, the existing Fedora Install test plan [1] will be revisited. The test plan will be adjusted to ensure proper test coverage for the failure scenarios listed above, and existing test cases will be reviewed for accuracy. New test cases will be created using the Template:QA/Test_Case template. Finally, the test plan will be aligned with the improved Fedora Release Criteria [2], including adjusting test case priorities to match the milestone criteria.

Next, in order to reduce setup and execution time, improve efficiency, and provide test results on a more consistent basis, a subset of test cases will be chosen for automation. Tests will be written in Python and will be developed and executed on a system supporting KVM virtualization. Test scripts will be responsible for preparing a virtual install environment, initiating a kickstart install, and validating the results. Once an initial batch of tests exists, they will be formally integrated into the AutoQA project.
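The kickstart-driven automation described above can be sketched in Python. This is a minimal, illustrative example only: the guest name, tree URL, disk path, and boot arguments are assumptions, not values mandated by the project, and a real AutoQA test would add result validation on top of launching the install.

```python
import shlex


def build_virtinstall_cmd(name, tree_url, kickstart_path,
                          ram_mb=1024,
                          disk_path="/var/lib/libvirt/images/test.img",
                          disk_gb=8):
    """Build a virt-install command line that boots an unattended
    kickstart install in a new KVM guest.

    All paths and sizes here are illustrative placeholders.
    """
    args = [
        "virt-install",
        "--name", name,
        "--ram", str(ram_mb),
        "--disk", "path=%s,size=%d" % (disk_path, disk_gb),
        "--location", tree_url,             # install tree (e.g. a Rawhide compose)
        "--initrd-inject", kickstart_path,  # copy the ks.cfg into the initrd
        "--extra-args", "ks=file:/ks.cfg console=ttyS0",
        "--graphics", "none",
        "--noreboot",                       # stop after install so results can be checked
    ]
    return " ".join(shlex.quote(a) for a in args)
```

A wrapper script would run this command (e.g. via `subprocess`), wait for the guest to shut down, and then inspect the guest disk or serial console log to validate the install.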

Last, a method will be developed for collecting test results into a single test result matrix. Results may be posted to the wiki directly, or a custom TurboGears application may be needed to display results [3]. The results will be easily accessible to testers and the installer development team.
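If results are posted to the wiki directly, the aggregation step could look roughly like the following sketch. The field names and the MediaWiki-table rendering are assumptions for illustration; the real schema would come from the AutoQA harness.

```python
def result_matrix(results):
    """Fold individual (test case, run, outcome) records into a
    {test case -> {run -> outcome}} matrix."""
    matrix = {}
    for case, run, outcome in results:
        matrix.setdefault(case, {})[run] = outcome
    return matrix


def to_wiki_table(matrix, runs):
    """Render the matrix as MediaWiki table markup, one row per test case."""
    lines = ['{| class="wikitable"',
             "! Test case !! " + " !! ".join(runs)]
    for case in sorted(matrix):
        row = [matrix[case].get(run, "") for run in runs]
        lines.append("|-")
        lines.append("| %s || %s" % (case, " || ".join(row)))
    lines.append("|}")
    return "\n".join(lines)
```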


The project will be divided into several phases.

Phase#1 - proof of concept

  • Pass Revise Fedora Install test plan to ensure adequate test coverage exists for failure scenarios listed above
  • Pass Select a small, but representative, subset of test cases from the install test plan to automate
    The following test cases are selected:
    • Rawhide Acceptance Test Plan [[1]]
    • DVD.iso Installation
    • Boot.iso/Netinst.iso Installation
    • Live.iso Installation
    • URL Installation
    • upgrade an existing system
    • system with basic video driver
    • Rescue installed system
    • Memory test [[2]]
  • Pass Create Python scripts to prepare KVM-based virtual environments for testing, initiate kickstart installs, and validate results
  • Pass Investigate methods for leveraging GUI automation to aid in automating applicable test cases

Phase#2 - implementation

Implement the selected test cases

  • Pass Rawhide Acceptance Test Plan [[3]]
    • Pass repodata validity [[4]]
    • Pass comps.xml validity [[5]]
    • Pass Core package dependency closure [[6]]
    • Pass Core package existence [[7]]
    • Pass installer image existence [[8]]
    • Pass Kernel boot [[9]]
    • Pass Anaconda loader fetching stage2 [[10]]
    • Pass Anaconda stage2 disk probe [[11]]
    • Pass Anaconda package install [[12]]
  • Inprogress DVD.iso Installation
    • Inprogress mediakit ISO size [[13]]
    • Inprogress mediakit ISO checksums [[14]]
    • Inprogress mediakit repoclosure [[15]]
    • Inprogress mediakit file conflicts [[16]]
    • Inprogress boot methods [[17]]
    • Inprogress install source [[18]]
  • Boot.iso/Netinst.iso Installation
    • mediakit ISO size[[19]]
    • mediakit ISO checksums[[20]]
    • boot methods[[21]]
    • install source[[22]]
  • Live.iso Installation
    • media ISO size[[23]]
    • mediakit ISO checksums[[24]]
  • upgrade an existing system [[25]]
    • perform a default installation of the previous release
    • install the current release
  • system with basic video driver[[26]]
  • Rescue installed system[[27]]
  • Memory test [[28]]

Identify and automate remainder of test cases from the install test plan

  • Rawhide Acceptance Test Plan
    • Anaconda bootloader setup[[29]]
    • X startup/basic display configuration[[30]]
    • X basic input handling [[31]]
    • basic network connectivity[[32]]
    • yum update functionality[[33]]
  • DVD installation
    • additional http repository[[34]]
    • additional ftp repository[[35]]
    • additional mirrorlist repository[[36]]
    • additional nfs repository[[37]]
  • Boot.iso/netinst.iso installation
    • http repository[[38]]
  • Live.iso installation
    • install source live image[[39]]
  • URL installation
    • Anaconda bootloader setup[[40]]
    • X startup/basic display configuration[[41]]
    • X basic input handling [[42]]
    • basic network connectivity[[43]]
    • yum update functionality[[44]]

Implement the general test cases

  • Anaconda user interface graphical [[45]]
  • Anaconda user interface basic video driver[[46]]
  • Anaconda user interface text [[47]]
  • Anaconda user interface VNC [[48]]
  • Anaconda user interface cmdline [[49]]
  • parse different repositories: cdrom, http, ftp (anonymous, non-anonymous), nfs, nfsiso, hard drive [[50]] [[51]]
  • parse different kickstart delivery methods: http, file, hard drive, nfs [[52]] [[53]] [[54]] [[55]]
  • different package selections: default, minimal [[56]] [[57]]
  • different partitioning: autopart, autopart encrypted, autopart shrink install, autopart use free space, ext4 on native device, ext3 on native device, no swap, software raid
  • rescue mode
    • update.img via URL / installation source / local media [[58]] [[59]] [[60]]
    • Anaconda save traceback to remote system / Bugzilla / disk / debug mode [[61]] [[62]] [[63]] [[64]]
  • upgrade
    • new boot loader[[65]]
    • skip boot loader[[66]]
    • update boot loader [[67]]
    • encrypted root[[68]]
    • skip boot loader text mode[[69]]
    • update boot loader text mode[[70]]
  • preupgrade
    • preupgrade[[71]]
    • preupgrade from older release[[72]]

Identify Anaconda's built-in unit tests and integrate them into the test suite

Identify test event triggers which will be used to automatically initiate testing

Identify test cases where GUI automation will be required

Phase#3 - integration

  • Create appropriate control files and test wrappers to allow for scheduling tests through AutoQA (see Writing_AutoQA_Tests)
  • Develop or update AutoQA test event hooks to accommodate new test events (see Writing_AutoQA_Hooks)
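Scheduling a test through AutoQA involves, among other things, an autotest-style control file alongside the test wrapper. The fragment below is a non-runnable, illustrative sketch only; the test name, metadata values, and keyword argument are placeholders, and Writing_AutoQA_Tests documents the real conventions.

```python
# control -- illustrative AutoQA/autotest control file (all names are placeholders)
AUTHOR = "Fedora QA"
NAME = "install_sanity"
DOC = "Kickstart-driven install sanity test for a Rawhide tree."
TIME = "LONG"
TEST_TYPE = "CLIENT"

# autotest supplies the `job` object at runtime; this schedules the
# wrapper class of the same name defined in install_sanity.py
job.run_test("install_sanity", tree="http://example.org/rawhide/x86_64/os")
```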
  • Implement initial test result dashboard intended to eventually replace the wiki test matrix. The dashboard will also support FAS user test result submission. This will likely rely on