Software Test Automation .com

Test Automation Context

1.2 Testing & Test Automation are different
- role definition
- a good test case comprises:
  - Detection Effectiveness - whether or not it can find errors
  - Exemplary - tests more than 1 thing, reducing the no. of test cases
  - Economical - how long to perform; cost of analysis; cost of debugging
  - Evolvable - how easy to maintain; how easy to extend

1.3 V Model
- Requirements <> Acceptance Test
- Functions <> System Test
- Design <> Integration Test
- Code <> Unit Test
- RAD utilises small successive uses of V model instances

1.4 Tool support
- load testing
- static analysis - McCabe measurements etc
- code coverage
- performance

1.6 Common problems of test automation
- unrealistic expectations - optimistic on cost/dividends
- poor testing practice - poorly designed/documented tests - aim for effectiveness before efficiency
- expectation that automation will find lots of new faults
- false sense of security
- maintenance cost as s/w evolves
- technical problems - tools not keeping up with changes
- organisation - need for mgmt commitment

1.7 Test activities
- core activities:
  - Identify - determine test conditions & what can be tested
  - Design - determine how the test cases will test
  - Build - implement scripts, data
  - Execute - run the tests
  - Compare - compare outcomes vs expected
- ideally have organisational objectives for testing
- ideally have a test strategy
- test planning - identify the effort needed etc

1.9 Limitations of automating s/w testing
- doesn't replace manual testing
- manual tests find more defects than automated tests
- greater reliance on the quality of the tests
- automation doesn't improve effectiveness
- may limit s/w development
- tools have no imagination

1.8 Automate test design
- test input generation tools:
  - code based
  - mutation
  - interface based
  - GUI
  - specification based

5. Testware Architecture
- comprises all elements that form the test environment:
  - documentation
  - test data
  - scripts
  - results
  - reports
  - expected results
- key issues to manage in the testware architecture:
  - Scale
    - the number of different artifacts
    - breadth of skills & knowledge in the team
    - establishing consistent structuring of materials
  - Re-use
    - re-using data and scripts
    - good automation takes advantage of re-use
    - avoid duplication, thus managing maintenance costs
    - script design has to lend itself to re-use
    - to achieve re-usability:
      - re-usable components locatable within 2 mins
      - quick determination of how to use them
      - ability to expand the library of components
  - Multiple versions
    - scripts able to be used or adapted for multiple release versions
    - ability to retain previous versions for testing patches against earlier releases
    - track test scripts against product versions
  - Platform & environment independence
    - make the test environment portable across all possible target platforms
    - accommodate how O/S issues may impact results
- An approach
  - describes a suggested strategy for implementing a testware architecture
  - the guidance here is much the same as for well-structured s/w development
- Basic concepts
  - divide test materials logically
  - test sets containing 1 or more test cases
  - test case allocation to a test set follows some logical guidance
  - test suites
  - testware sets

Metrics
- Calculating ROI
  - cost of manual testing vs cost of tools + maintenance
  - gain in coverage
  - competitive edge through gaining agility
  - 60-80% of defects found during test development
- Measuring testing & automation
  - trends in metrics give warning of potential issues
  - chance to check that the cost/benefit of testing is maintained
  - Gilb's Law - anything can be made measurable in some way, which is superior to not measuring it at all
- possible measures
  - for the product:
    - SLOC
    - function points
    - number of developers
    - object code size
    - defects found in test vs deployed
    - build cost - time or effort
    - decision points
  - for the tests:
    - number of tests
    - tests planned, run & passed
    - time/effort cost
    - number of defects found
    - coverage
- test effectiveness can be measured by:
  - Defect Detection Percentage (DDP) = defects found in testing / total known defects
  - Defect Fix Percentage = defects fixed before release / all defects found
    - from the book 'Applied Software Measurement' by Capers Jones (1991); Jones refers to this as Defect Removal Efficiency
    - Marnie Hutcheson refers to it as Performance of the Test Effort
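The two percentages above are simple ratios. A minimal sketch follows; the functions and the figures used in it are illustrative assumptions, not taken from the source.

# Sketch of the test-effectiveness measures above (made-up numbers).

def defect_detection_percentage(found_in_testing: int, total_known: int) -> float:
    """DDP = defects found in testing / total known defects (testing + later)."""
    return 100.0 * found_in_testing / total_known

def defect_fix_percentage(fixed_before_release: int, all_found: int) -> float:
    """Defect Fix Percentage = defects fixed before release / all defects found."""
    return 100.0 * fixed_before_release / all_found

if __name__ == "__main__":
    # e.g. 90 defects found in testing, 30 more reported after release
    print(f"DDP: {defect_detection_percentage(90, 90 + 30):.1f}%")  # 75.0%
    print(f"DFP: {defect_fix_percentage(85, 90):.1f}%")             # 94.4%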
Coverage
- in terms of:
  - statements
  - decision & branch points
  - conditions
  - module calls
  - Linear Code Sequences and Jumps (LCSAJ)

Other issues
- testing non-functional requirements
  - performance
  - abnormal loads
- what to automate first
  - the most important tests
  - a set of breadth tests
  - tests for the most important functions
  - the easiest to automate
  - the quickest payback
  - the most often run tests

Scripting Techniques
- same principles as programming
- good script characteristics:
  - lower quantity of scripts per test case
  - short & well annotated
  - a clear single purpose
  - documented
  - re-usable
  - easy to understand - facilitating maintenance
  - easy to maintain - doesn't need rewriting for a small change
- test case design & implementation
  - a test case is a series of 1 or more tests that can be meaningfully distinguished from another series
  - there is not always a direct relationship between test case & automated script
  - giving each test case its own single script doesn't lead to efficient or effective test automation

Script techniques

Linear scripts
- the product of recording a test case performed manually
- a record of all actions, executed from a single script
- advantages:
  - no up-front work or planning needed
  - quick start to automation
  - audit trail of the operations done
  - the user doesn't need programming skills
  - good for demos & training
- disadvantages:
  - labour intensive - 2-10 times longer for the automated test
  - everything tends to be done from scratch every time
  - test inputs & comparisons tend to be hardwired
  - no script re-use
  - more vulnerable to s/w changes
  - more expensive to change, as changes tend towards a rewrite

Structured scripting
- like programming, uses control structures
- scripts can be chained & common elements used
- better chance of maintainability

Shared scripts
- scripts shared by more than 1 test case
- advantages:
  - similar tests take less effort
  - maintenance cost lower than linear scripts
  - eliminates obvious repetition
  - can afford to build in more intelligence
- disadvantages:
  - more scripts to track & manage
  - test-specific scripts are still required for each case
  - often specific to 1 part of the system under test

Data-driven scripts
- store test inputs in separate files rather than in the script
- separating the data means one script can be used to drive different tests
- advantages:
  - similar tests can be added very quickly
  - new tests can be added without programming skills
  - no script maintenance for subsequent tests
- disadvantages:
  - needs programming skills
  - initial set-up takes longer
  - expensive if the volume of tests is small

Keyword-driven scripts
- in effect more sophisticated data-driven scripts
- move the intelligence over logic flow from the script into the data (see the sketch below)
- advantages:
  - more test coverage for a lower no. of scripts
  - lower script maintenance cost
  - non-programmers can contribute more
- Hans Buwalda (CMG) - method of scripting called Action Words
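To illustrate the data-driven and keyword-driven ideas above: test steps live in a data file as keyword-plus-argument rows, and a small driver maps each keyword to an action function. This is a minimal sketch; the file name, keywords and action functions are hypothetical, not from the source or any particular tool.

import csv

# Hypothetical actions; in a real framework these would drive the application.
def login(user, password):
    print(f"logging in as {user}")

def add_item(name, quantity):
    print(f"adding {quantity} x {name}")

def check_total(expected):
    print(f"checking total == {expected}")

ACTIONS = {"login": login, "add_item": add_item, "check_total": check_total}

def run_keyword_file(path):
    """Read rows of 'keyword,arg1,arg2,...' and dispatch to the matching action."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row or row[0].startswith("#"):
                continue  # skip blank lines and comments in the data file
            keyword, *args = row
            ACTIONS[keyword](*args)

# run_keyword_file("smoke_test.csv")  # e.g. a file containing: login,alice,secret

Non-programmers add or change rows in the data file; only the small set of action functions needs programming and maintenance.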
Automated Comparison
- verification - the process of checking whether or not the s/w has produced the correct outcome
- important decisions in planning automated comparisons:
  - how much data - will affect performance
  - think out verification in advance of automation
  - advance consideration improves the chances of a good balance of trade-offs
  - the tester's understanding of what is correct
- types:
  - dynamic
    - comparison performed during execution
    - tends to draw more intelligence into the test
    - requires more programming-like experience
    - able to terminate tests as soon as failures occur - improves efficiency
    - intelligence may help improve resilience
    - more complexity - higher maintenance cost
  - post-execution
    - results analysis after the run
    - not so well supported by tools
    - fewer tools - may feel like more work to automate
    - able to be more selective about which comparisons are performed
    - actual outputs are saved - helps problem resolution
    - can be done offline - spread the load across machines
  - reference testing - results are created and manually evaluated, then compared against in future runs
  - ad hoc/informal - hard to confirm what the tester expects to be correct
  - planned/formal - identification in advance of testing
  - simple (aka dumb comparison)
    - look for an identical match of actual & expected
    - easy to define - so less error prone
    - low maintenance due to simplicity
    - low development cost
    - expensive to analyse the results of a mismatch
  - complex (aka intelligent comparison)
    - capable of ignoring certain differences (see the sketch after this section)
    - common complex comparisons can handle:
      - differences in date & time
      - generated IDs
      - values scoped to a range
      - variations in the rendering of text
- best approach influenced by:
  - magnitude of the expected result (data volume)
  - whether or not the result can be predicted (use of live data?)
  - availability of the s/w under test - unable to generate reference results early enough
  - quality of verification - better to predict than to verify
- why automate?
  - the most automatable task in testing
  - automated execution can produce lots of outcomes
- what to compare
  - changes made to the database have been actioned
  - data sent to printer, email, across networks
  - improve testing quality & the range of evaluations performed
- limitations
  - manual checking may cover more things than can quickly be automated
  - automation does provide consistency
  - manual regression is more flexible
  - comparing actual & expected doesn't mean the result is correct
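As a minimal sketch of a "complex" post-execution comparison, the snippet below masks dates, times and generated IDs before comparing actual and expected output. The regular expressions are assumptions about the output format, not from the source; they would be adjusted to the system under test.

import re

# Patterns for volatile fields the comparison should ignore (assumed formats).
MASKS = [
    (re.compile(r"\d{4}-\d{2}-\d{2}"), "<DATE>"),  # e.g. 2024-01-31
    (re.compile(r"\d{2}:\d{2}:\d{2}"), "<TIME>"),  # e.g. 12:34:56
    (re.compile(r"order-\d+"), "<ORDER-ID>"),      # hypothetical generated ID
]

def normalise(text: str) -> str:
    """Replace volatile fields with fixed tokens so only real differences remain."""
    for pattern, token in MASKS:
        text = pattern.sub(token, text)
    return text

def compare(actual: str, expected: str) -> bool:
    """Intelligent comparison: identical after masking counts as a match."""
    return normalise(actual) == normalise(expected)

print(compare("order-1234 shipped 2024-01-31", "order-9876 shipped 2023-12-01"))  # True

The trade-off noted above still applies: each mask added makes the comparison cheaper to maintain against expected variation, but also widens what it will silently ignore.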
7 Building Maintainable Tests
- number of test cases
  - just adding test cases may appear good, but the maintenance cost must be considered
  - more tests = more maintenance cost
  - more tests increase the chance of duplication & redundancy
  - can be combatted by periodic weeding of test cases
  - any test case where maintenance cost > value provided should not be automated
- quantity of test data
  - more test data = more maintenance
  - plus the cost of back-up & data transfer
  - the more data to go through, the longer it takes to fix a defect
- format of test data
  - make data as portable as possible
  - simpler common formats are easier to maintain & manipulate
- time to run test cases
  - longer tests may save set-up/tear-down time BUT slow the debug/retest cycle
  - so keep tests short and focused
- debug-ability of test cases
  - need to be able to determine the cause of failure - post-run analysis means you don't see when things stopped
  - when developing, keep in mind: what would I like to know about a failure?
- interdependencies between tests
  - can be an optimisation
  - if the 1st test fails then subsequent tests will fail
  - discourage, but don't ban outright - cascades can be solved by using dumps of start states
- naming conventions
  - same reason as normal development
- test complexity
  - more complex = harder to understand
  - harder to understand = more time for maintenance
  - technical people can get carried away with technology
- test documentation
  - same reason as normal development

The conspiracy - how tools & the drive to automate can lead to more maintenance issues
- tools can make it easier to do the wrong thing
- making things easier can result in short cuts/easy approaches = more maintenance
- initial enthusiasm
- ROI is 80/20 - don't commit to the last 20%, which costs the most for the least return

Strategies & tactics that can help contain maintenance costs
- define preferred values & standards
  - stick to the standards & only relax them when justified
- provide tool support
  - make use of means to measure
  - tools can make jobs easier
- automate updates
  - test data updates should be expected & planned for
  - try to automate updates, making the process more consistent and quicker
- schedule periodic weeding (see the sketch below)
  - eliminate duplicate & redundant tests
  - accurate & good test docs will facilitate & speed this
- maintenance utilities
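As a minimal sketch of tool support for periodic weeding, the script below flags test data files with identical content so a reviewer can decide whether they are redundant. The directory name and file pattern are assumptions for illustration, not from the source.

import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicate_test_data(root="testware", pattern="*.csv"):
    """Group test data files by content hash; groups with >1 file are weeding candidates."""
    groups = defaultdict(list)
    for path in Path(root).rglob(pattern):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        groups[digest].append(path)
    return [paths for paths in groups.values() if len(paths) > 1]

if __name__ == "__main__":
    for duplicates in find_duplicate_test_data():
        print("possible redundant tests:", ", ".join(str(p) for p in duplicates))

A report like this only identifies candidates; the decision to retire a test still rests on whether its value exceeds its maintenance cost, as noted above.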
Other Sections
- automating pre & post processing
- 8 Metrics
- 9 Other Issues
- 10 Choosing a tool to automate testing
- 11 Implementing tools within the organization

Useful Sites
- James Bach
- Uni notes
- Quality Engineering