#34 optimize test cases
Closed: Fixed. Opened 15 years ago by liam.

Try to optimize the test cases to reduce the number of entries in the test matrix. For example, after executing a default DVD install, we have completed these cases:
QA/TestCases/BootMethodsDvd
QA/TestCases/InstallSourceDvd
QA/TestCases/PackageSetsDefaultPackageInstall
QA:Testcase_Anaconda_autopart_install
QA/TestCases/PartitioningRootfsOnLvmDevice
QA/TestCases/PartitioningSwapOnLvmDevice
QA/TestCases/UserInterfaceGraphical

But we have to fill in 7 results. If possible, try to combine and reduce these cases.


This is largely by design of the test plan (see [https://fedoraproject.org/wiki/QA:Fedora_12_Install_Test_Plan#Test_Strategy QA:Fedora_12_Install_Test_Plan#Test_Strategy]). I've found in the past that specifying a single "DVD install" test case wasn't fine-grained enough for our install testing. For example, in the event of a failure during DVD install, we'd see:

  • QA:Testcase install from DVD - '''FAIL'''

There were so many different aspects of a DVD install that could fail, all with different levels of severity and impact, so we chose to divide the different steps of the installer into separate test cases:

  • QA:Testcase boot from DVD - '''PASS'''
  • QA:Testcase install source from DVD - '''PASS'''
  • QA:Testcase autopart install - '''PASS'''
  • QA:Testcase graphical user interface - '''PASS'''
  • QA:Testcase yum repo from media - '''FAIL'''

We gain improved granularity with this method, but it does make data entry more difficult. This is definitely something I'd expect the [https://fedoraproject.org/wiki/Is_anaconda_broken_proposal Is Anaconda Broken] project to address when the automated tests post results (much like [http://jlaska.fedorapeople.org/irb.png israwhidebroken does now]), leaving basically little (to no) manual data entry for these tests.

Can we combine the intent of this ticket with the goals of the [https://fedoraproject.org/wiki/Is_anaconda_broken_proposal Is Anaconda Broken] project?

For the test cases you've listed, my thoughts are noted below:

  • QA/TestCases/BootMethodsDvd
      • Attempts to address media composition problems, and syslinux or kernel failures
  • QA/TestCases/InstallSourceDvd
      • Validates anaconda loader detection of the stage#2 install.img on local media, as well as package repositories on local media
  • QA/TestCases/PackageSetsDefaultPackageInstall
      • Conflict/dependency testing with the default package set. I'd be comfortable seeing this test change to something more scripted using yum repoclosure.
  • QA:Testcase_Anaconda_autopart_install
  • QA/TestCases/PartitioningRootfsOnLvmDevice
      • Let's move this to [https://fedoraproject.org/wiki/Category:Obsolete_Test_Cases Category:Obsolete_Test_Cases]; it's a duplicate of QA:Testcase_Anaconda_autopart_install
  • QA/TestCases/PartitioningSwapOnLvmDevice
      • Let's move this to [https://fedoraproject.org/wiki/Category:Obsolete_Test_Cases Category:Obsolete_Test_Cases]; it's a duplicate of QA:Testcase_Anaconda_autopart_install
  • QA/TestCases/UserInterfaceGraphical
      • Targets install-environment X setup issues, and X itself.
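As a rough sketch of what a scripted repoclosure check might look like (the mirror path and repo id here are placeholders I've assumed for illustration, not details from the ticket), using the repoclosure tool from yum-utils:

```shell
#!/bin/sh
# Hypothetical sketch: check the default package set's repo for unresolved
# dependencies using repoclosure. REPO_PATH is an assumed local mirror path.
REPO_PATH="/srv/mirror/fedora/12/x86_64/os"

if command -v repoclosure >/dev/null 2>&1; then
    # --repofrompath registers an ad-hoc repo; --repoid restricts the check to it
    repoclosure --repofrompath=dvd,"$REPO_PATH" --repoid=dvd
else
    echo "repoclosure not installed; would run: repoclosure --repofrompath=dvd,$REPO_PATH --repoid=dvd"
fi
```

A non-zero exit (or a list of packages with unresolved dependencies in the output) would replace the manual conflicts/deps test result.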

Replying to [comment:1 jlaska]:
It's hard to find a balance between dividing and combining the install cases, but I agree that dividing the steps helps identify issues in different areas. Since you mentioned it's very important, we'll keep it, and we'll also make use of this advantage in Is Anaconda Broken. About your thoughts on the listed cases: do you mean we should add these notes to the Description of each test case to help identify issues in different areas, or something else?

Sorry, I missed your question. For the listed tests, I was just trying to outline what the function of each test was. If you think the test cases themselves could use improved descriptions, we can tackle that as well.

No cases need optimizing, so closing this as fixed.

Metadata Update from @adamwill:
- Issue untagged with: test review
- Issue tagged with: test cases

7 years ago