Software Testing Portfolio



12/12/2013
CITP 140 – Introduction to Software Testing
By Rick A. VanVolkinburg

TABLE OF CONTENTS
What I Learned
Assignment 1: Fundamental Test Process
Assignment 5: Quick Test Plan
Assignment 4: Test Case
Assignment 3: Equivalence Class Partitioning, Boundary Value Analysis, and Decision Tables
Assignment 2: Defect Report – Dodge Charger
Defect Report – Company Website
Defect Report – Windows 7 Paint

What I Learned about Software Testing / Change of Perspective

Week 1: Software bugs cost billions of dollars, and software testing is a new and developing field.

Week 2: The characteristics of software quality are: 1. Functionality 2. Reliability 3. Usability 4. Efficiency 5. Maintainability 6. Portability. There is no known software that is free from bugs.

Week 3: Testing shows that defects are present, not that they are absent. We cannot test for everything, and even with bugs present, we could still have useful software.

Weeks 4 & 5: Component testing verifies the functionality of the code by testing units individually; integration testing checks that all components work together; system testing asks whether the system meets the requirements of the user; acceptance testing asks whether the customer's requirements are being met. The Heuristic Test Strategy Model is helpful for designing a test strategy and serves as a sort of "cheat sheet" of what a tester should take into consideration when testing.

Week 6: Functional testing covers what the system does; nonfunctional testing covers how well it does what it was intended to do. Regression testing is repeating other tests to uncover any new bugs that may have been created by changes.

Week 7: The six steps in the review process are: 1. Planning 2. Overview 3. Preparation 4. Review meeting 5. Rework 6. Follow-up. Reviews can fail because required persons are not available, the wrong reviewers are chosen, or management support is lacking, to name just a few of the reasons.
I also learned there are four types of reviews: walkthrough, inspection, technical review, and informal review. Static analysis, by contrast, is performed without actually executing the program, usually using an automated tool.

Week 8: The importance of configuration management (making sure that the individual components fit together in the system; a symptom of insufficient configuration management is developers overwriting each other's changes) and incident management (recognizing, investigating, and taking action on events that may point to a defect). I also learned about test logs – comparing the actual results with the expected results. In the test log, I learned how to classify the severity of an incident, from "severe" to "mild." I also learned about bug advocacy – report any bugs promptly, and no bug is too small, but know the difference between bugs that are "severe" and which ones need the most priority.

Week 9: Dynamic analysis – evaluating the behavior (e.g., CPU usage) of the system or component during execution. The goal is to expose as many failures or bugs as possible. The biggest thing I learned was the difference between black-box (functional) and white-box testing. Black-box testing is not concerned with the internal structure. White-box testing looks at the "inside" of the system – it tests the internal code, and the tester needs to understand the source code.

Week 10: Decision tables – using causes and effects, where every combination can be considered a test case – and how to create a use case. For example: if the user enters a valid user name and password, allow access to the account; if they are invalid, display an error message and ask the user to try again; after three invalid attempts, close the application.

Week 11: The difference between scripted and exploratory testing – in scripted testing the test cases are made up in advance; in exploratory testing you make them up as you go along.

Week 12: How to build a good test case.
A good test case should include the ID, description, preconditions, the steps needed, the expected results, and any notes. To create a good test case, it is a good idea to observe users to see how they actually use the system/software.

Week 13: Test planning – since there are more than 150 different types of testing, it is impossible to do them all. Therefore, it is best to figure out what the risks are, how likely each is to occur, and how to mitigate each risk.

Week 14: A test strategy defines the objectives of the project and needs to consider the quality of the software, how far along the development process is, and the qualifications of the people working on the project. This determines how much effort will be applied and what the costs are.

Week 15: There are over 100 testing tools that can be used for repetitive tasks or for defects that may be difficult to find manually.

My perspective has changed from the beginning, when I assumed that any problems with the software would simply be fixed by the developer(s). Generally speaking, software was "tested" just to see if it was working. But the software tester's role is to demonstrate not only how the software works under normal conditions, but also what happens when conditions are not normal.

Fundamental Test Process

Planning and Control:
Test planning and control starts at the beginning of the software development project and must be checked regularly, updated, and adjusted. Test planning takes into account the feedback from monitoring and control activities, which includes:
- Verify the mission – what is going to be tested
- Define the objectives – how these will be achieved
- Specify the test activities – how the activities will be done and who will do them
Test planning is also where we define the test completion criteria. Completion criteria are how we know when testing is finished – for example, 100% requirement coverage, or no more than 50 known faults remaining.
Test control, on the other hand, is what we do when the activities don't match the plan. During test control we:
- Compare actual progress against the plan – measuring and analyzing results
- Report status (including any deviations from the plan) – monitoring and documenting progress, test coverage, and the exit criteria
- Take the actions necessary to meet the mission and objectives of the project – initiating any corrective actions
- Monitor throughout the project – making decisions that include adjusting plans to meet targets, where possible

Analysis and Design:
Analysis and design is the activity where general testing objectives are transformed into tangible test conditions and test designs. It has the following major tasks:
- Review the test basis – ensure that all required resources, requirements, architecture, design, and interfaces, which collectively comprise the test basis, are at hand
- Identify the test conditions, requirements, and data based on analysis of the test items, specification, behavior, and structure
- Design and prioritize test cases – how the test conditions are going to be exercised
- Evaluate the testability of the test basis and test objects based upon the requirements and the system
- Design the test environment set-up and identify any required infrastructure and tools

Implementation and Execution:
This is where test conditions are transformed into test cases and testware, and the environment is set up. It involves running tests and checking the test environment before testing begins. The most important test cases should be executed first.
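Executing the most important cases first amounts to sorting the backlog by priority. A minimal sketch; the test-case names and priority labels below are made up for illustration, following the High/Medium/Low scheme used elsewhere in this portfolio:

```python
# Hypothetical test cases tagged with a priority label.
test_cases = [
    ("TC-03 print document", "Low"),
    ("TC-01 open file", "High"),
    ("TC-02 save file", "Medium"),
]

# Map labels to ranks so that High-priority cases run first.
rank = {"High": 0, "Medium": 1, "Low": 2}
execution_order = sorted(test_cases, key=lambda tc: rank[tc[1]])

for name, priority in execution_order:
    print(priority, name)
```

Because Python's sort is stable, cases sharing a priority keep their original relative order, so a secondary ordering (e.g., by risk) can be applied beforehand.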
The major tasks in this phase include:
- Preparing test cases – developing and prioritizing test cases, creating test data, writing test procedures, and, optionally, preparing test harnesses and writing automated test scripts
- Creating test suites from the test cases for efficient test execution
- Verifying that the test environment has been set up correctly
- Executing the test cases according to the planned sequence, either manually, using test execution tools, or with a test running tool, which requires a scripting (i.e., programming) language
- Logging the outcome of test execution and recording the identities and versions of the software under test:
  - date and time of test execution
  - result of the test (pass or fail)
  - the name of the tester
  - schedule for re-tests
  - defect reference for failed tests
- Comparing actual results with expected results
- Reporting any discrepancies as incidents and analyzing them to establish their cause (a defect in the code, incorrectly specified test data in the test document, a mistake in test execution, etc.)
- Repeating test activities as a result of action taken for each discrepancy. This includes re-execution of a test that previously failed in order to confirm a fix (retesting), execution of a corrected test, and execution of previously passed tests to check that new defects have not been introduced (regression testing)

Evaluation of Exit Criteria and Reporting:
The evaluation of exit criteria and reporting is the activity where test execution is assessed against the defined objectives. It should be done for each test level. At the end of test execution, the test manager checks whether the exit criteria were met. For example, we said earlier that our completion criteria were 100% requirement coverage or no more than 50 known faults remaining. Suppose instead we have 85% requirement coverage and 60 known faults remaining. There are two possible actions: change the exit criteria, or run more tests.
It is even possible that, although the preset criteria were met, more tests would still be required. This phase comprises the following major tasks:
- Checking the test logs to determine whether the exit criteria have been met
- Assessing whether more tests are needed or whether the specified exit criteria need to be changed
- Writing a test summary report for stakeholders

Test Closure Activities:
The final phase of the fundamental test process is the test closure activities. These activities collect data from completed test activities. Test closure concentrates on making sure that everything is tidied away: reports written, defects closed, and defects deferred to another phase clearly marked as such. It can take place when a software system is released, a test project is completed or cancelled, a maintenance release has been completed, etc., and includes the following major tasks:
- Checking which planned deliverables have actually been delivered and that all incident reports have been resolved
- Closure of incident reports, or raising of change records for any that remain open
- Documenting acceptance of the system
- Finalizing and archiving testware, such as scripts and test environments, for reuse later
- Handover of testware to the maintenance organization for software support
- Analyzing lessons learned for future releases and projects

References
Fundamental Test Process. (2008, November 8). Testing Excellence. Retrieved September 9, 2013.
Fundamental Test Process. (n.d.). Intelligent Test Method (iTM) Demonstration Version. Retrieved September 8, 2013.
Reddy, G. (2012, September 30). Fundamentals of Testing. Manual Testing. Retrieved September 10, 2013.
What is fundamental test process in software testing? (n.d.). ISTQB Exam Certification.
Retrieved September 10, 2013.

Quick Test Plan

Short Name for Plan: MS NotePad
Version: 6.1
Date: 11-14-2013
Revision #: 1.0
Edited by: Rick VanVolkinburg

Project Name: MS NotePad Test
Project Number: 100.1
Date: 11/18/2013
Prepared By: Rick VanVolkinburg

Project Description: This is a master test plan for the MS NotePad version 6.1 application designed for Windows 7. Notepad is a basic text editor mostly used to create simple documents, but many users also find Notepad a simple tool for creating Web pages. This project will test the functionality of the application as well as its compatibility with the web browsers Internet Explorer, Google Chrome, and Mozilla Firefox. Microsoft has contracted BrightStreet Group, LLC, based in Grand Rapids, to oversee the testing of the project.

Scope:
Items in the scope of testing:
- Menu items: File, Edit, Format, View, and Help
- Internet browsers: Internet Explorer, Google Chrome, and Mozilla Firefox
- Windows operating systems: XP, Windows 7, and Windows 8
Items outside the scope of testing (with reasoning):
- No other web browsers will be tested, because 93.0% of internet users use one of the three browsers above ("Browser Statistics", n.d.).
- No other operating system will be tested: nearly 86% of users run one of the three Windows versions above, and only 3.63% use Vista ("Browser Statistics", n.d.). NotePad is also a Windows-based application,
and NotePad does not treat newlines in Unix- or Mac-style text files correctly.

Documentation used as inputs for the test object:
- Windows 8 NotePad version 6.1
- Windows-based web browsers:
  - Internet Explorer versions 8, 9, and 10 (less than 1% use another version)
  - Google Chrome versions C29, C30, and C31 (90% of Chrome users use one of those versions)
  - Mozilla Firefox versions FF25, FF24, and FF23 (80% of Firefox users use one of those versions)
- Basic HTML coding

We will use a risk-based testing philosophy: each test will be prioritized as High, Medium, or Low, with the High-priority tasks scheduled first. We will take into consideration some exceptions, which may include:
- Some low-priority test cases may be executed early because they run quickly or use very few resources.
- Conflicts in schedules may come up due to illnesses, vacations, etc., causing low-priority tests to be scheduled while a team member is unavailable.
- Some lower-priority tests may be a prerequisite for tests of higher priority – expensive, high-priority tests might need inexpensive, lower-priority tests to pass first.
- Navigational and other functional tests may be scheduled first when comprehensive requirements are lacking, because this gives testers the opportunity to become familiar with the application.

Because the application is small and does not have a large number of users, testing will be limited to manual testing. Due to a lack of available tools and time, only basic information for hours, executed test cases, and incidents (bugs) will be kept. No attempt will be made to track more complex and involved information.

Types of testing to be performed:
- Agile testing – emphasizes testing from the perspective of the customer who will use the system.
- Black-box testing – verifies the function of the application without specific knowledge of the application's code/internal structure.
Tests are based upon requirements and functionality.
- Compatibility testing – covers the various Windows operating systems as well as browser compatibility (Chrome, Firefox, and Internet Explorer). This is necessary not only to make sure the application is compatible with the selected web browsers, but also to check whether the Windows 7-based NotePad is compatible with a previously released version (i.e., Windows XP) and how it performs with a version released later (i.e., Windows 8).
- Comparison testing – a technique that compares the product's strengths and weaknesses with previous versions.
- Integration testing – the phase in software testing in which individual software modules are combined and tested as a group. It is usually conducted by testing teams.
- Regression testing – a type of software testing that seeks to uncover errors after changes to the program (e.g., bug fixes or new functionality) have been made, by retesting the program. It is performed by the testing teams.
- User interface testing – a type of testing performed to check how user-friendly the application is. It is performed by testing teams.

Test Techniques:
Our testing technique will use functional testing to verify that MS NotePad provides the output required by Microsoft, by comparing the application's function with the requirements Microsoft set forth. We will check the software for usability, such as making sure the menu items work properly and the application is compatible with the selected web browsers. We will also verify that the Windows 7 version of the application is compatible with previous Windows versions (i.e., Windows XP) as well as with Windows versions released later (Windows 8).

Test Schedule: 30 calendar days, Monday – Friday, 8:00 AM to 5:00 PM, for a total of 640 hours.
Because of the approaching holiday season, testing will not begin until the week of January 6, 2014.

Test Location: Test center provided by the client.

Test Participants:
- Test Manager – Rick VanVolkinburg
- 3 testers with the following skills:
  - general knowledge of NotePad
  - familiarity with Microsoft Windows
  - basic knowledge of Internet Explorer, Google Chrome, and Mozilla Firefox

Test Environment: 1 tester will test the NotePad functions; the other 2 testers will perform the HTML testing for the three different web browsers.

Equipment Environments:
- 3 PCs with Windows XP, 3 PCs with Windows 7, and 3 PCs with Windows 8
- XP – 1 computer will test Internet Explorer 8, Google Chrome C29, Firefox FF23
- XP – 1 computer will test Internet Explorer 9, Google Chrome C30, Firefox FF24
- XP – 1 computer will test Internet Explorer 10, Google Chrome C31, Firefox FF25
- When XP testing is complete, Windows 7 will be loaded and the above tests performed, then Windows 8.
- All PCs will need access to the internet.
- All PCs will need a black-and-white printer.

Problem Identification:
1. Identify the problem.
2. The Test Manager is alerted.
3. If validated by the Test Manager, a defect/bug log entry is filled out and entered.
4. Forward the defect for fixing/correcting.
5. Discuss the fix with the person who carried it out.
6. Determine any re-testing that may be required.
7. The test plan is drawn/redrawn, or the test needs to be repeated.
8. The test is carried out.
9. If the test is successful: update the bug/defect log and sign off on the fix.
10. If the test is unsuccessful: update the bug/defect log and return the defect to the responsible person, marking it as still needing fixing/correcting.

Sign-off procedures:
Testing will be considered complete when the following conditions are met:
- the test plan has been completed
- no defects classified as "critical" or "high" exist
- fewer than 4 "medium"-priority defects exist

Testing Deliverables:
- Test strategy and test plan
- Testing scripts
- Test results
- Bug reports
- Test execution and summary report when testing is complete

Risks and Contingencies:
- Identified risk: requirements change. Mitigation strategy: reduce
the tests performed, or decide not to implement low-priority testing.
- Identified risk: personnel loss. Mitigation strategy: resources will be added by working overtime or bringing in additional testers.

Testing Tasks and Schedule:
- Test plan – 5 hours – completed 11-20-2013
- Training, if necessary – 5 hours – by 1/1/2014
- Test cases – 240 hours – by 1-15-2014. As mentioned in the test plan, testing will not begin until 1/6/14. Assumes 4 people working on the project at 40 hours a week, for a total of 160 hours per week (32 hours a day).
- Test execution – 360 hours – by 2-1-2014 (includes meetings and reports)
- Software release – 30 hours – by 2-7-2014
- Total: 640 hours

Test System/Database/Environment Needs:
- Hardware: PCs with minimum requirements: Intel Pentium 4 or AMD Athlon x64; 1 GB RAM; 10/100 Fast Ethernet adapter; 40 GB hard drive with at least 15% free space; current firewall and anti-virus software installed. If current systems do not meet the requirements, recommend consulting a website that lists the top-selling laptops.
- Operating systems: Windows XP, 7, and 8
- Software: MS Office; NotePad version 6.1
- Printer: any working printer is acceptable, but if a printer needs to be purchased, the HP Photosmart 5520 is recommended as the most popular printer according to IntelliReview.

Responsibilities:
- Test Manager – Rick VanVolkinburg. Responsibilities include, but are not limited to:
  - prepare the test plan
  - oversee the testing department
  - allocate resources to the projects
  - review the testers' reports and take any necessary action
  - provide support to the testing team
  - attend any necessary meetings
  - review the test plan and test cases
- Testers – Valerie Pung, Anthony Page, Omar Cruz. Responsibilities include, but are not limited to:
  - understand the requirements of the project
  - prepare and update the test case document
  - conduct the testing
  - attend any necessary meetings
  - verify and log the defects

Training Requirements:
- Microsoft NotePad 6.1 – requires general knowledge of MS Office products.
The Test Manager will act as facilitator for the testers. A classroom setting will be needed, which will entail watching various YouTube videos. Required by 1/1/14.
- Building a program with NotePad – several YouTube videos are available on how to program in NotePad. Required by 1/1/14.

Signoff
The following project stakeholders have reviewed this document and accepted the test plan as stated:
- BrightStreet Group, LLC – Jason Welch (signature, date)
- BrightStreet Group, LLC – Pete Richardson (signature, date)

References
Browser Statistics. (n.d.). Retrieved November 15, 2013.

Assignment 4 – Test Case: attached zip folder.

Equivalence Class Partitioning, Boundary Value Analysis, and Decision Tables

Part 1: Equivalence Partitioning

1.1 Month Function
Take the example of a function which takes a parameter "month". The valid range for the month is 1 to 12, representing January to December. In this example there are two further partitions of invalid ranges:

    ... 0 | 1 ............ 12 | 13 ...
  invalid |      valid        | invalid

1.2 Savings Account Interest
For example, in a savings bank account:
- 3% interest is given if the balance in the account is in the range of $0 to $100,
- 5% interest is given if the balance in the account is in the range of $100 to $1000,
- and 7% interest is given if the balance in the account is $1000 and above.

  ... -$0.01 | $0.00 – $100.00 (3.0%) | $100.01 – $999.99 (5.0%) | $1000.00+ (7.0%)
     invalid |                      valid partitions

Part 2: Equivalence Class Partitioning and Boundary Value Analysis
Build a table that describes the equivalence class partitioning and boundary value analysis for determining the interest paid on the vehicle loan described below.
New, Used, Refinanced Vehicles – APR*:
- Up to 36 months (2009–2012 only): as low as 2.250%
- Up to 36 months: as low as 2.500%
- 37 to 48 months (2009–2012 only): as low as 2.500%
- 37 to 48 months: as low as 2.750%
- 49 to 60 months: as low as 3.000%
- 61 to 72 months: as low as 4.000%
*APR not to exceed 12.98%

Up to 36 months (2009–2012 only):
- Invalid partition 1: months 1–11; year 2008
- Valid: months 12–36; years 2009–2012
- Invalid partition 2: months 37–72
- Inputs to test valid: 12, 36; 2009, 2012
- Inputs to test invalid partitions 1 and 2: 11, 37; 2008
- APR inputs: 2.249%, 2.250%, 12.98%, 12.99%

Up to 36 months:
- Invalid partition 1: months 1–11
- Valid: months 12–36
- Invalid partition 2: months 37–72
- Inputs to test valid: 12, 36
- Inputs to test invalid partitions 1 and 2: 11, 37
- APR inputs: 2.49%, 2.50%, 12.98%, 12.99%

37 to 48 months (2009–2012 only):
- Invalid partition 1: months 0–36; year 2008
- Valid: months 37–48; years 2009–2012
- Invalid partition 2: months 49–72
- Inputs to test valid: 37, 48; 2009, 2012
- Inputs to test invalid partitions 1 and 2: 36, 49; 2008
- APR inputs: 2.49%, 2.50%, 12.98%, 12.99%

37 to 48 months:
- Invalid partition 1: months 1–36
- Valid: months 37–48
- Invalid partition 2: months 49–72
- Inputs to test valid: 37, 48
- Inputs to test invalid partitions 1 and 2: 36, 49
- APR inputs: 2.74%, 2.75%, 12.98%, 12.99%

49 to 60 months:
- Invalid partition 1: months 37–48
- Valid: months 49–60
- Invalid partition 2: months 61–72
- Inputs to test valid: 49, 60
- Inputs to test invalid partitions 1 and 2: 48, 61
- APR inputs: 2.99%, 3.00%, 12.98%, 12.99%

61 to 72 months:
- Invalid partition 1: months 37–60
- Valid: months 61–72
- Invalid partition 2: months 73 and up
- Inputs to test valid: 61, 72
- Inputs to test invalid partitions 1 and 2: 60, 73
- APR inputs: 3.99%, 4.00%, 12.98%, 12.99%

Part 3: Cause and Effect with Decision Table for an Auto Loan
A program is being built with the following possible inputs for a car loan. The following information is needed to make the loan: income and credit score. The loan will be automatically approved if the person's income is over $50,000. If the person's credit score is over 750, the loan is also automatically approved. I have built the table for you. Please complete the decision table options below.

                     Test 1   Test 2   Test 3   Test 4
  Causes
  Income > $50,000     Y        N        N        Y
  Credit > 750         N        Y        N        Y
  Effects
  Approved             Y        Y        N        Y
  Declined             N        N        Y        N

SAMPLE DEFECT REPORT 1
Write up a defect for our 2013 Dodge Charger Build & Price website problem, as shown in the Attachments section below.

Title: Dodge Charger displays AWD twice in the header of its Build & Price page
ID Number: 101
Version/Build Number: 1.4.3.68
Severity: Trivial – feature request
Severity scale – Critical: application crash, loss of data. Major: major loss of function. Minor: minor loss of function. Trivial: some UI enhancements.
Enhancement: request for a new feature or an enhancement to an existing one.
Priority: Low to Medium
Assigned to: Jason Welch
Reporter: Rick VanVolkinburg (RICKVANVOLK@)
Reported On: October 13, 2013
Operating System: Windows 8, Internet Explorer 10, HP 2000 laptop
Status: New
Description: "AWD" prints twice in the Build & Price heading for the 2013 Dodge Charger.
Steps to Reproduce:
1. Go to the Dodge website.
2. Point to Vehicles on the Dodge menu bar, left of the Shopping Tools menu selection. A drop-down box appears. Select Charger (second on the list). See Figure 1 in the attachments for steps 1 and 2.
3. On the Charger page, select Build & Price (see Figure 2 in the attachments).
4. Click the "+" in the "Please Select a Model" section for R/T Plus (5th selection, with a base price of $34,495).
5. The Select Drive option appears. Choose the "AWD" radio button.
6. When it is selected, AWD displays twice in the Build & Price header (see Figure 3 in the attachments).
Expected Result: "Build & Price 2013 Charger R/T Plus AWD" displayed in the heading.
Actual Result: "Build & Price 2013 Charger R/T Plus AWD AWD" displayed in the heading.
Attachments:
Figure 1: Selecting Charger from the Vehicles menu
Figure 2: Step 3
Figure 3: AWD appears twice in the Build & Price header

SAMPLE DEFECT REPORT 2
Write up a bug for accessing this hyperlink:

Title: Unable to open company website – cannot locate the internet server or proxy server
ID Number: 202
Version/Build Number: 1.0
Severity: Critical
Severity scale – Critical: application crash, loss of data. Major: major loss of function. Minor: minor loss of function. Trivial: some UI enhancements.
Enhancement: request for a new feature or an enhancement to an existing one.
Priority: 1 – High
Assigned to: Jess Lancaster
Reporter: Rick VanVolkinburg (RICKVANVOLK@)
Reported On: October 15, 2013
Operating System: HP 2000 laptop, Windows 8
Status: New
Description: clicking on the link to open the website produces an error message that the website cannot be opened.
Company is not a valid extension.
Steps to Reproduce:
1. Hover over the link; Ctrl + click to follow the link.
2. An error message appears saying the website cannot be opened (as seen in the attachment).
Expected Result: the webpage opens.
Actual Result: an error message that the webpage is unable to open – unable to open the company website (see attachment).
Attachment:

SAMPLE DEFECT REPORT 3
Write up a bug for the following Microsoft Paint issue: The Windows 7 version of Paint has a long-standing defect involving an inability to scroll the window when editing in Zoom view over 100%. In addition, when inserting text in Zoom view, the user cannot move the text beyond the zoomed view while the text window is in edit mode with either the mouse or keyboard because… why? You tell me what the defect is.

Title: Windows 7 Paint program – scroll (arrow) keys zoom instead of scrolling
Product:
ID Number: 301
Version/Build Number: Version 6.1 (Build 7601: Service Pack 1) – minor build
Severity: Major – annoyance
Severity scale – Critical: application crash, loss of data. Major: major loss of function. Minor: minor loss of function. Trivial: some UI enhancements.
Enhancement: request for a new feature or an enhancement to an existing one.
Priority: 1 – High
Assigned to: Stephen Sereday
Reporter: Rick VanVolkinburg (RICKVANVOLK@)
Reported On: October 15, 2013
Operating System: Windows 7, HP 2000 laptop
Status: New
Description: scrollbars are disabled when editing at any zoom level other than 100%.
Steps to Reproduce:
1. Open the Paint program in Windows 7.
2. Paste any picture in 100% zoom view, as seen in Figure 1 in the attachments.
3. Press the up and down arrow keys; the picture moves up and down (Figure 2 in the attachments).
4. Change the view to anything other than 100% by selecting the View button (Figure 3 in the attachments).
5. Select either Zoom In or Zoom Out (Figure 4 in the attachments).
6. The up and down arrow keys (scroll bar) become disabled.
Expected Result: the picture scrolls up and down when using the arrow keys.
Actual Result: the arrow keys (scroll bar) are disabled.
Attachments:
Figure 1: Open Paint in 100% zoom view
Figure 2: Screen shot of the image moving up and down
Figure 3: Select View to change the view from 100%
Figure 4: Selecting either Zoom In or Zoom Out disables the arrow keys (scroll bar)

In addition, when inserting text in Zoom view, the user cannot move the text beyond the zoomed view while the text window is in edit mode with either the mouse or keyboard because… why? You tell me what the defect is.

Title: Windows 7 Paint program – text box will not go past the scroll bar
Product:
ID Number: 302
Version/Build Number: Version 6.1 (Build 7601: Service Pack 1) – minor build
Severity: Major – annoyance
Severity scale – Critical: application crash, loss of data. Major: major loss of function. Minor: minor loss of function. Trivial: some UI enhancements.
Enhancement: request for a new feature or an enhancement to an existing one.
Priority: 1 – High
Assigned to: Stephen Sereday
Reporter: Rick VanVolkinburg (RICKVANVOLK@)
Reported On: October 15, 2013
Operating System: Windows 7, HP 2000 laptop
Status: New
Description: when inserting a text box in a picture, the text box will not go past the scroll bar, and text wraps at the scroll bar.
Steps to Reproduce:
1. Open the Paint program in Windows 7.
2. Paste a picture.
3. Select View on the menu bar.
4. Change the view to 200% by selecting Zoom In.
5. Click Home.
6. Select the "A" on the tools menu for Text.
7. Click and drag a text box on the picture, dragging the box past the vertical scroll bar on the right.
8. Type text in the box. Keep typing until the text wraps to at least line 2 (as seen in Figure 1 of the attachments).
9. Click View on the menu bar.
10. Select Zoom Out – the text continues to wrap (as seen in Figure 2 of the attachments).
Expected Result: the text continues across the picture regardless of the view.
Actual Result: the text wraps, stopping at the vertical scroll bar.
Attachments:
Figure 1: The text box and inserted text will not go past the scroll bar
Figure 2: After changing the view, the text wraps rather than going across the picture