OEM Ready Software Qualification Tool
Instruction Manual

Machine Setup
1. Perform a clean install of Microsoft Windows Vista SP1 Ultimate 32-bit edition.
2. Install the Microsoft OEM Ready Qualification Tool.

Database configuration
1. Launch the Microsoft OEM Ready Qualification Tool. Feel free to read the "Getting Started with Microsoft OEM Ready Qualification Test Suite" guide to gain a better general understanding of the qualification process.
2. If Microsoft SQL Server is not installed, install Microsoft SQL Express 2005 SP2.
3. From the Qualification Tool's main menu, select Tools > Preferences.
4. In the Preferences dialog, modify the default SQL Server settings to match your database settings. For example, if Microsoft SQL Express is installed on the same machine where the Qualification Tool is running, specify the following settings:
   Server Name: (local)\sqlexpress
   Authentication: select "Windows Authentication" from the drop-down list.
5. Click the Connect button. A confirmation dialog should pop up informing you that the test connection was successful. If you did not receive this confirmation, verify your connection string and your SQL installation.
6. Close the dialog by clicking OK.
7. If this is the first time you are using the Qualification Tool on this machine, select the "Create Database" option, type the name of the database you would like to create, and click the "Create" link. You should be greeted by a message confirming that the database was created successfully; close the confirmation dialog by clicking OK.
8. If you have already used the Qualification Tool and would like to add data to an existing database, choose the "Select Database" option and select the database you would like to work with from the drop-down list.

Once the database is configured, the Preferences dialog should look similar to the image below.

Figure 1: Preferences dialog after establishing a connection to the CleanRun database (in this case, a local database of Microsoft SQL Express).
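If the Connect test fails repeatedly, it can help to try the same settings outside the tool. Below is a minimal sketch, assuming Python with the pyodbc package and a SQL Server ODBC driver are available (neither ships with the Qualification Tool); it attempts the same (local)\sqlexpress, Windows Authentication connection configured above.

    # Minimal sketch: test the SQL Express settings from the Preferences
    # dialog outside the Qualification Tool. Assumes pyodbc and a SQL
    # Server ODBC driver are installed; neither is part of the tool itself.
    import pyodbc

    SERVER = r"(local)\sqlexpress"  # same value as "Server Name" in Preferences

    try:
        conn = pyodbc.connect(
            # Trusted_Connection=yes corresponds to "Windows Authentication".
            "DRIVER={SQL Server};SERVER=" + SERVER + ";Trusted_Connection=yes;",
            timeout=5,
        )
        print("Test connection succeeded.")
        conn.close()
    except pyodbc.Error as exc:
        print("Connection failed; verify the server name and SQL installation:", exc)

If this script connects but the tool does not, the problem is likely in the tool's Preferences values rather than in the SQL installation.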
Log and report location configuration
From the Preferences dialog, enter the path to the folder where the report and log files will be located. It is recommended that this folder name include the name of the application to be tested. For instance, if foo.exe is going to be checked, the target folder could be named c:\FooCertification.

You can either type the folder path into the edit box or navigate to the target folder using the Browse button. With the Browse approach, a new folder can be created by navigating to the directory where it should be located and clicking the "Make New Folder" button.

Figure 2: Everything configured in the Preferences dialog: the SQL Server and database settings, as well as the folder path where all logs and reports for the Foo app will be placed.

Click OK to close the Preferences dialog.

Test tool configuration
From the Getting Started pane (on the right), click the Setup tab. Click the "Preparing the test environment" button and follow the instructions to download and install all needed tools. Then, on the right pane of the Getting Started page, click the Tool Box tab.

The path needs to be set for every test tool in the application. To do so, follow these steps:
4.4.1 Click a tool under the "Name" column (in the picture below, Orca is selected as an example). On the Getting Started pane, under Misc, it is possible to modify the default settings of the tool.
4.4.2 Click the Path row on the right.
4.4.3 Click the Browse button under the Path row.
4.4.4 Navigate to the folder where the binaries for the specific tool are located.
4.4.5 Select the executable and click Open.
4.4.6 The path for that test tool is now defined.
Repeat steps 4.4.1 to 4.4.6 for each tool in Table 1, which gives the typical file path for each binary. Please note that these paths will be slightly different on 64-bit operating systems.

Application Name          File Path
Orca                      C:\Program Files\Orca\Orca.exe
Log Parser                C:\Program Files\Log Parser 2.2\LogParser.exe
Application Verifier      C:\WINDOWS\system32\appverif.exe
Signtool                  C:\tools\signtool.exe
SigCheck                  C:\tools\sigcheck.exe
Driver Verifier           verifier.exe
Windows Debugger          C:\Program Files\Debugging Tools for Windows (x86)\windbg.exe
AppVerifier Logs Parser   C:\Program Files\Microsoft\Certification Tool\bin\Scripts\AppVerifierLogsParser.exe
Restart Manager Test      C:\Program Files\Microsoft Corporation\Logo Testing Tools for Windows\Restart Manager\x86\RMTool.exe

Table 1: Paths to binaries for the required test tools.

Creating the New Qualification Project
On the left pane, from the "Start a new Certification" window, click the New button. The New Qualification dialog should open. In the Certification Name field, type a name for the project. It is recommended that you give the project a name related to the application to be tested.

Figure 4: The "New Certification" dialog. In this case, the certification name is set to "Foo App".

The logs for this qualification project will be located in <the given folder name>\<name of this qualification project>_Logs. For example, if the folder name was defined as c:\FooCertification and the certification project name as "Foo App", then all logs and reports related to the tested application Foo will be found under c:\FooCertification\Foo App_Logs.

Product details configuration
Under the Product details section, three different options are available:

6.1.1 Automatically detect files installed during installation
Select this option if you would like to run automatic test cases to verify the tested application. The application should already be installed when carrying out this type of testing. The full path, including the name of the manifest file, should be specified.

By default, the manifest file is created in the same folder where the logs and reports will be placed; the default manifest file name is "submission.xml". To change the file path of the manifest, click the Browse button and navigate to the desired location. Please note that the name of the manifest file is not a configurable option: it will always be called submission.xml. It is recommended that you select the same folder that was previously chosen for logs and reports, to keep all application/qualification-related information in the same place.

The manifest file will include all the logs, test results, and other information about the installed application that is gathered during the collection phase.

6.1.2 Provide an existing manifest
If you have already run a test that generated a manifest file and you need to run the tool again, you can reuse the existing manifest. In this case, provide the full path to the manifest file; it has to be the same path that was given for the manifest file in step 6.1.1. To sanity-check an existing manifest before reusing it, see the sketch below.
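This manual does not document the schema of submission.xml beyond naming its sections (executables, drivers, catalogs, shortcuts), so the sketch below makes no assumptions about element names: it simply prints every element whose text looks like a file path. Plain Python; the manifest path is the illustrative Foo example.

    # Hedged sketch: dump the contents of an existing submission.xml before
    # reusing it. The manifest schema is not published in this manual, so no
    # tag names are assumed; any element whose text looks like a Windows
    # file path (drive letter or UNC prefix) is printed with its tag.
    import xml.etree.ElementTree as ET

    MANIFEST = r"c:\FooCertification\submission.xml"  # illustrative path

    tree = ET.parse(MANIFEST)
    for elem in tree.getroot().iter():
        text = (elem.text or "").strip()
        if text[1:3] == ":\\" or text.startswith("\\\\"):
            print(f"{elem.tag}: {text}")

If the listing includes files that no longer exist, or misses files the application installs, prefer regenerating the manifest via option 6.1.1 over editing it by hand.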
6.1.3 Run all test cases manually
Select this option if you plan to run all test cases manually and do not wish to take advantage of any of the automated tests. Note: by selecting this option, it will not be possible to run any automated test cases for this qualification project. If you later decide you do want to run automated tests, a new qualification project must be created.

Once you have selected the option that best suits your needs, click the OK button.

If option 6.1.1 (Automatically detect files installed during installation) was selected, follow these steps:
1. The Application Selection Tool wizard should now open. Click the Next button.
2. The tool will collect information about the applications installed on your machine. This process may take a few minutes to complete.
3. Once the collection process has finished, a list of the applications installed on this machine will be displayed. Select the application to be tested from the list and click the Next button.
4. The list of files that belong to the selected application will be shown. If any related files are not listed, the missing files will need to be added manually. KNOWN LIMITATION: if an application contains drivers, each .sys and .cat file must be added manually.
5. To manually add files, click the "Add a file" button. Navigate to the file that should be added, select it, and click the Open button to add it to the list.
6. If a file was added by mistake, select it in the Application Selection Tool dialog and click the "Remove file" button. Please note that only files that were added manually can be removed.
7. Once all binaries relating to the application are listed, click the "Export XML" button. The Application Selection Tool dialog will close.

Once the file collection process is complete (if option 6.1.1 was selected), or immediately (if option 6.1.2 or 6.1.3 was selected), the left pane will list the test cases to run. If option 6.1.1 or 6.1.2 was selected, test cases 1.1.1, 1.2.1, 1.2.3, and 3.1.1 can be executed automatically. If option 6.1.3 was selected, all test cases must be executed manually.

Running the Automation
Once the XML file is generated, the Qualification Tool will display the OEM Ready tests.

Select a test case to run from the left pane. NOTE: currently, only test cases 1.1.1, 1.2.1, 1.2.3, and 3.1.1 are automated. These test cases correspond to tests 1, 2.1, 2.3, and 7 from the OEM Ready Program Documentation.

From the right pane, click the Test Steps tab. The right pane contains the manual steps required to perform the test case, as shown in the figure below.

Figure 5: Test case 1.1.1 was selected to run.

To run the test case automatically, click the green arrow button, as shown in the figure below.

Figure 6: The green arrow button that runs automated test case 1.1.1.

Some of the automated test cases will launch tools that display interactive dialogs. It is imperative that you do NOT interact with the machine until the test case is complete.
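An automated run can look idle for a while. The FAQ at the end of this manual gives a rule of thumb of roughly 30 seconds per file listed in the manifest; the sketch below (plain Python, with the illustrative file counts from the Foo example) turns that into a wait budget before you should suspect a hang.

    # Sketch: a wait budget for one automated test case, using the FAQ's
    # rule of thumb of ~30 seconds per file listed in the manifest.
    # The counts below are the Foo example's; substitute the numbers from
    # your own submission.xml.
    SECONDS_PER_FILE = 30

    executables, drivers, shortcuts = 5, 2, 3
    total_files = executables + drivers + shortcuts  # 10 files in the manifest

    budget_seconds = total_files * SECONDS_PER_FILE  # (5 + 2 + 3) * 30 = 300
    print(f"Suspect a hang only after ~{budget_seconds // 60} minutes.")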
At the end of the test case, a message box will appear with information on whether the test case passed or failed. The results are automatically uploaded to the Qualification Tool's database, along with the path to the latest log file.

To display the results in the Qualification Tool, click the Test Result tab in the left pane. The path to the log file is located in the Logs/Other files section at the bottom of the window.

Figure 7: In this example, test case 1.1.1 failed. The path to the log file is at the bottom (c:\FooCertification\Foo App_Logs\Foo App_Test Case_1.1.1_Log.xml).

Paths to additional logs/files can be added to the Logs/Other files section as needed; for example, supporting documents could be added. Every file whose path is listed in this section will be included in the submission package.

When another test case is selected from the right pane, the title of the previously executed test case will be displayed in green or red, corresponding to a pass or fail result.

Verifying and Analyzing Logs
To analyze the logs in a more user-friendly format, use SvcTraceViewer.exe. This tool is included with the Windows SDK.

All logs can be found under <the folder path>\<Qualification Project name>_Logs. All automated test cases check each executable and report their results to the log file. If any of the executables/drivers that belong to the application fails the test, the entire test fails.

If the automation is run multiple times, logs from previous runs are archived in <the folder path>\<Qualification Project name>_Logs\BackupLogs. For instance, in our "Foo" application example, the backup directory is c:\FooCertification\Foo App_Logs\BackupLogs.
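The path conventions above can be summarized in a few lines. In this sketch, both example values come from this manual's Foo walkthrough; only the small helper function is new.

    # Sketch of the log-location conventions described above: logs live in
    # <folder path>\<project name>_Logs, per-test-case logs follow the
    # pattern seen in Figure 7, and earlier runs are archived in BackupLogs.
    import os

    folder_path = r"c:\FooCertification"  # "folder path" from the Preferences dialog
    project = "Foo App"                   # the qualification project name

    logs_dir = os.path.join(folder_path, project + "_Logs")
    backup_dir = os.path.join(logs_dir, "BackupLogs")

    def test_case_log(case_id: str) -> str:
        """Path of the log file the tool writes for one test case (Figure 7)."""
        return os.path.join(logs_dir, f"{project}_Test Case_{case_id}_Log.xml")

    print(test_case_log("1.1.1"))
    # -> c:\FooCertification\Foo App_Logs\Foo App_Test Case_1.1.1_Log.xml
    print(backup_dir)
    # -> c:\FooCertification\Foo App_Logs\BackupLogs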
Running the Automation Multiple Times
It is possible to run the automation multiple times for the same application as needed. The results for the same test case may differ between runs if the list of application files has changed since the last run or if the manifest file (submission.xml) was changed. In such cases, the results in the database are overwritten.

If the Qualification Tool was closed and there is a need to continue with an existing project, launch the Qualification Tool and click on the project name in the right pane. The test case tree will open; if test cases were previously completed, they will appear with their results.

Generating the Report and Submission Package
To generate the report for the qualification project:
1. From the main menu, select Reporting > Generate Report.
2. A save dialog will open, pointing by default to the folder that was previously defined. It is recommended to keep the report in the same folder as the log files that belong to the qualification project, i.e. <the folder path>\<Qualification Project name>.
3. The default file name for a report is OEMReadyProgram-<Qualification Project name>.html. Feel free to change the name as desired.
4. Click the Save button. The report will be generated and placed in the folder previously defined.

To generate a submission package:
1. From the main menu, select Reporting > Generate Submission Package.
2. A save dialog will open, pointing by default to the folder that was previously defined. It is recommended to keep the submission file in the same folder as the log files that belong to the qualification project, i.e. <the folder path>\<Qualification Project name>.
3. The default file name for a submission package is OEMReadyProgram-<Qualification Project name>-SubmissionPackage.xml. Feel free to change the name as desired.
4. Click the Save button. The submission package will be generated and placed in the folder previously defined.

The submission package will include all logs that were uploaded to the Qualification Tool for automatic and manual results, plus the final report.

NOTE: Paths to log files for test cases that were run automatically are added to the Qualification Tool database immediately upon completion; however, paths to log files that were created during manual execution must be added manually. It is recommended to store these logs in the previously defined path for the project.

NOTE: The submission package will include only those files/logs that were defined in the Qualification Tool in the Logs/Other files section.

NOTE: To access the Logs/Other files section:
1. Select any test case.
2. In the right pane, click Test Results.
3. Go to the bottom of the pane; the last section is Logs/Other files.
4. To add files, click the Browse button and add them.

FAQ

Q: How can I find out where the log files are saved?
A: From the Qualification Tool menu, click File > Close. This closes the current project. Then, from the menu, click Tools > Preferences. The Preferences dialog displays all settings; under "folder path", the root folder for all logs and reports is shown.

Q: Reviewing the results/logs, I found that some executables that were tested do not belong to my application. How can I remove them from the results?
A: Open the manifest file (in our example, c:\FooCertification\Foo App\submission.xml; see Figure 4). This file lists the executables, drivers, catalogs, and shortcuts that belong to the tested application. "Noise" during the collection process could have caused some redundant entries. Remove these redundant entries carefully, without modifying the XML tags. Once the file has been updated, you must re-run the automated test cases. If files are missing, add paths to the missing files in the appropriate sections: .exe files go in the Executables section, .sys files in the Drivers section, .cat files in the Catalogs section, and .lnk files in the Shortcuts section. In general, it is recommended that you avoid direct modification of the manifest file; if you do edit it, a hedged sketch of such a cleanup follows.
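The cleanup described in the answer above can be scripted, but treat this as a last resort. The schema of submission.xml is not documented in this manual, so the element matching below is an assumption; keep the backup the script makes, and re-run the automated test cases afterwards.

    # Hedged sketch of the manifest cleanup described in the FAQ answer
    # above. ASSUMPTION: entries are elements whose text is the file path;
    # the real schema of submission.xml is not documented in this manual.
    import shutil
    import xml.etree.ElementTree as ET

    MANIFEST = r"c:\FooCertification\Foo App\submission.xml"  # path from the FAQ
    UNWANTED = {"notmyapp.exe"}  # hypothetical "noise" binaries to drop

    shutil.copyfile(MANIFEST, MANIFEST + ".bak")  # safety copy first

    tree = ET.parse(MANIFEST)
    root = tree.getroot()

    # Snapshot parents first so removals cannot confuse the tree walk, then
    # drop any child element whose text ends with an unwanted file name.
    for parent in list(root.iter()):
        for child in list(parent):
            text = (child.text or "").strip().lower()
            if any(text.endswith(name) for name in UNWANTED):
                parent.remove(child)

    tree.write(MANIFEST, encoding="utf-8", xml_declaration=True)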
Q: It seems like the Qualification Tool hangs during test case execution. What should I do?
A: Test execution should not take longer than <number of files listed in the manifest file> * 30 seconds. For example, if the Foo application installs 5 executables, 2 drivers, and 3 shortcuts, then the automation for each test case should not take longer than (5 + 2 + 3) * 30 sec = 5 minutes. If the Qualification Tool hangs for longer than this, close it and kill its process in Task Manager. Then launch the Qualification Tool again, reopen the project you were working on, and run the tests again.

Q: There are a lot of message dialogs opening and closing during automated testing. Is this normal?
A: Yes, it is normal. For some test cases, each executable that belongs to the application is launched and closed. Also, cmd windows are opened for each executable during the test. Do NOT interact with the machine while any tests are running; doing so may adversely affect the test results.

Q: After running some test cases, open dialogs may be left on the machine. Is this normal?
A: Yes, it is normal. To perform some test cases, executables that belong to the application are launched, and in some cases not all of them are killed at the end of the test. Once testing is complete (once the notification dialog is displayed), these leftover dialogs can be closed.