Migrating Databases to SQL Azure



Hands-On Lab: Migrating Databases to SQL Azure
Lab version: 1.0.3
Last updated: 12/22/2009

Contents

Overview
Exercise 1: Moving an Existing Database to the Cloud
    Task 1 – Exporting a Script from SQL Server 2008 (Optional)
    Task 2 – Modifying the Exported Script from SQL Server 2008 (Optional)
Exercise 2: Using BCP for Data Import and Export
    Task 1 – Moving Data Out of SQL Azure with BCP
    Task 2 – Deleting Data from the Existing Database
    Task 3 – Moving Data into SQL Azure Using BCP
Exercise 3: Using SSIS for Data Import and Export
    Task 1 – Deleting Data from an Existing Database
    Task 2 – Flowing Data from SQL Azure to SQL Server
Summary

Overview

SQL Azure Database is based on Microsoft SQL Server, which makes it relatively easy to move existing SQL Server databases to SQL Azure. In this hands-on lab, you will learn how to move a schema and data to SQL Azure by using the Generate Script Wizard in SQL Server Management Studio (SSMS).
You will also learn how to use BCP and SQL Server Integration Services (SSIS) to copy data into and out of SQL Azure.

Objectives

In this Hands-On Lab, you will learn how to:

- Use SQL Azure as a cloud-hosted database platform for your applications.
- Migrate an existing database.
- Back up data from the cloud.

Exercises

This Hands-On Lab comprises the following exercises:

1. Migrating an on-premise database to the cloud
2. Using BCP for Data Import and Export
3. Using SSIS to move data from the cloud to an on-premise database

Estimated time to complete this lab: 60 minutes.

Exercise 1: Moving an Existing Database to the Cloud

In this exercise, we will move an existing on-premise database from SQL Server to the cloud. We will do this by using scripts. To help you complete the exercise in a timely fashion, and to avoid a dependency on SQL Server and SQL Server Management Studio on your local machine, these scripts have been pre-generated.

SQL Azure supports a subset of the features found in the on-premise version of SQL Server. We will identify some of these differences by comparing the original exported SQL script with a modified, SQL Azure-ready copy.

The goal of this exercise is to give you a primer in readying exported SQL scripts for SQL Azure. This exercise will not cover all the key differences between SQL Server and SQL Azure. For a complete list of commands not supported in SQL Azure, please see the SQL Azure documentation.

Note: This exercise makes use of the HoLTestDB database that was created in Exercise 1 of the lab Introduction to SQL Azure. If you have not yet created this database, please complete Exercise 1 of that lab.

Tasks 1 and 2 of this exercise are optional. If you do not have SQL Server 2008 and SQL Server Management Studio installed, you may proceed directly to Task 2 and use the prebuilt scripts.

Note: As of the October CTP, all accounts are created on the Production cluster.
Databases and servers that were created during the August CTP can be accessed via the portal at . Additionally, previous documentation instructed all connections to point to <server>.ctp.database. As of the October CTP, connections should be made to the Production cluster at <server>.database. Some screenshots still incorrectly show connections to the ctp endpoint.

Task 1 – Exporting a Script from SQL Server 2008 (OPTIONAL)

Note: Within the PDC timeframe, a version of SQL Server Management Studio will be available that supports SQL Azure. In particular, the Generate Script Wizard will have an option for generating SQL Azure-compatible scripts, eliminating the need for much of the script cleanup work identified in this lab. Additionally, thanks to the hard work of George Huey, there is a SQL Azure Migration Wizard available on CodePlex () which simplifies the migration process.

In this task, we will generate scripts from an on-premise installation of the SQL Server 2008 sample database AdventureWorksLT2008. This task assumes that you have a SQL Server 2008 or SQL Express 2008 instance available, that you have SQL Server Management Studio, and that you have the AdventureWorksLT2008 database attached. AdventureWorksLT2008 can be downloaded from here: . Future versions of SQL Server Management Studio will have specific support for SQL Azure, avoiding the need to clean up the exported scripts.

A script that has been exported according to the steps below can be found in the Ex01-MovingExistingDb/begin/ folder for this lab, named AdventureWorks2008LT_Original.sql.

1. If you do not have SQL Server Management Studio open, then open it now.
2. Right-click the AdventureWorksLT2008 database. Choose Tasks > Generate Scripts.

Figure 1: Generating a Script in SQL Server Management Studio

3. Press Next on the first screen of the wizard, then choose the AdventureWorksLT2008 database and check the Script all objects in the selected database checkbox.
4. Press Next.

Figure 2: Wizard step

5. We will need to edit the script by hand after it is generated, but we can save some work by modifying some of the default script options. The following list specifies the changes to the default options and the justification for each:

- Convert UDDTs to base types: True. SQL Azure does not support User Defined Data Types. Use this option to convert any user-defined types into their underlying base types.
- Script extended properties: False. SQL Azure does not support extended properties, so we do not need to script them out.
- Script USE DATABASE: False. SQL Azure does not support the USE DATABASE command for changing database context.
- Script Data: True. For the purposes of this lab, we want to script not only the database schema but also the data contained therein. You may choose not to script data in some scenarios.

6. After setting all the properties, press Next.
7. Leave the default Script to New Query Window radio button selected. Press Next. Press Finish.
8. You should see the progress dialog. Wait for this to complete and press Close.

Figure 3: Script Progress Wizard Step

Task 2 – Modifying the Exported Script from SQL Server 2008 (OPTIONAL)

In this task, we will edit the script generated by SQL Server 2008 to make it compatible with SQL Azure. You can find a copy of the script that has already been edited according to the instructions below in the Ex01-MovingExistingDb/end/ folder, named AdventureWorks2008LT_Azure.sql.

It is worth spending the time at least reading through this task, as it provides insight into many of the common unsupported features in SQL Azure. You may also choose to perform each step; many of these steps will need to be completed if you are moving your own database schema to SQL Azure.

Note: If you decide not to perform these steps manually, ensure that you run the AdventureWorks2008LT_Azure.sql script found in the Ex01-MovingExistingDb/end/ folder against your SQL Azure account.
This database will be used by a number of different demos and labs.

1. If it is not already open, open the script you created in Task 1, or the pre-generated version from the Ex01-MovingExistingDb/begin/ folder named AdventureWorks2008LT_Original.sql.
2. Click the New Query button. This will open a similar dialog requesting logon information.

Figure 4: Creating a New Query Window

3. If the first statement of the script matches the following one, delete it. SQL Azure does not support Windows Authentication, and we have already created the user account that we will be using.

SQL
/****** Object: User [NT AUTHORITY\NETWORK SERVICE] Script Date: 07/20/2009 09:26:19 ******/
CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE] WITH DEFAULT_SCHEMA=[dbo]
GO

4. Next, we will remove all the user-defined type statements from the script. Press Ctrl-F and type CREATE TYPE in the Find what text box. Then press Bookmark All.

Figure 5: Finding CREATE TYPE

SQL Server Management Studio will then bookmark all the instances of the CREATE TYPE commands. While we specified the scripting option to convert all UDDTs to base types, this only changes the type used in the table and procedure creation commands; the CREATE TYPE statements themselves are still output in the script.

5. Delete each of the instances of CREATE TYPE. You can press Ctrl-K, Ctrl-N to go to the next bookmark each time.
6. Press Ctrl-Home to go back to the top of the document. At present, the script will be trying to create objects on a particular filegroup. We are not able to specify filegroups in SQL Azure. Press Ctrl-H to open the Find and Replace dialog again, and this time:

Find: ON [PRIMARY]
Replace with:

Press Replace All.

Figure 6: Finding ON [PRIMARY]

It should find and replace 44 instances.

7. Press Ctrl-Home to return to the top of the document. SQL Azure does not support the NOT FOR REPLICATION column constraint. We need to remove this.
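For context, an occurrence of this constraint typically looks like the following. This is an illustrative column definition, not copied verbatim from the generated script; the table and column names are hypothetical.

```sql
-- Before: IDENTITY column marked NOT FOR REPLICATION (not supported in SQL Azure)
CREATE TABLE [SalesLT].[ExampleOrder](
    [OrderID] [int] IDENTITY(1,1) NOT FOR REPLICATION NOT NULL,
    [OrderDate] [datetime] NOT NULL
)

-- After: the same definition with the constraint removed
CREATE TABLE [SalesLT].[ExampleOrder](
    [OrderID] [int] IDENTITY(1,1) NOT NULL,
    [OrderDate] [datetime] NOT NULL
)
```

Because the constraint only affects replication behavior, removing it does not change how the IDENTITY values are generated.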
8. Press Ctrl-H to open the Find and Replace dialog.

Find: NOT FOR REPLICATION
Replace with:

Press Replace All.

Figure 7: Finding NOT FOR REPLICATION

You should find three instances.

9. There are a number of options on indexes that are not supported as well. We will replace the WITH clause for various indexes, removing the options PAD_INDEX, ALLOW_ROW_LOCKS, and ALLOW_PAGE_LOCKS. Press Ctrl-H to open the Find and Replace dialog.

Find: WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
Replace with: WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF)

Press Replace All.

Figure 8: Finding the unsupported index options

You should find and replace 26 occurrences.

10. Press Ctrl-Home to return to the top of the document. SQL Azure does not support the creation of an XML SCHEMA COLLECTION. Just below the top of the document is a statement that you will need to delete.

SQL
/****** Object: XmlSchemaCollection [SalesLT].[ProductDescriptionSchemaCollection] Script Date: 07/20/2009 09:26:23 ******/
CREATE XML SCHEMA COLLECTION [SalesLT].[ProductDescriptionSchemaCollection] AS N'<xsd:schema xmlns:xsd="" xmlns.....'
GO

11. There is also still a reference to the XML SCHEMA COLLECTION on one of the tables. Press Ctrl-F to open the Find dialog.

Search for: content

Press Find Next.

Figure 9: Finding the XML Schema Collection reference

Change the line that you find from:

[CatalogDescription] [xml](CONTENT [SalesLT].[ProductDescriptionSchemaCollection]) NULL,

to:

[CatalogDescription] [xml] NULL,

12. Finally for the XML changes, there is an XML index that is not supported. Press Ctrl-F to open the Find dialog.

Find: CREATE PRIMARY XML INDEX

Press Find Next.

Figure 10: Finding the XML Index

Remove the statement:

SQL
CREATE PRIMARY XML INDEX [PXML_ProductModel_CatalogDescription] ON [SalesLT].[ProductModel]
(
    [CatalogDescription]
)
WITH (DROP_EXISTING = OFF)
GO

13. The ROWGUIDCOL column property is not supported in SQL Azure.
Press Ctrl-H to open the Find and Replace dialog.

Find: ROWGUIDCOL
Replace with:

Press Replace All.

Figure 11: Finding references to ROWGUIDCOL

You should find 10 occurrences to replace.

14. SQL Azure does not support the SET option ANSI_NULLS ON. Press Ctrl-H to open the Find and Replace dialog.

Find: SET ANSI_NULLS ON
Replace with:

Press Replace All.

Figure 12: Finding SET ANSI_NULLS ON

You should find 25 occurrences.

15. Press Ctrl-S to save our progress to date.
16. Let's now test our progress in preparing our script for SQL Azure. You will need to ensure that your query window has an active connection to SQL Azure and is connected to the HoLTestDB database. If the connection has timed out, you can press Execute a couple of times to be prompted for a login. Press the Parse button on the toolbar.

Figure 13: Parsing the Query text

SQL Azure will now parse the query. This may take some time. No objects will be created.

Figure 14: Parse errors

There are a number of unsupported keywords still in the script.

17. First, we will resolve the continued existence of the PAD_INDEX option on some indexes. Press Ctrl-F to open the Find dialog.

Search for: pad_index

Press Bookmark All.

Figure 15: Finding the pad_index option

18. Press Ctrl-K, Ctrl-N to move through each bookmark. Remove the following options within each index definition, making sure you keep the appropriate commas separating the remaining options: SORT_IN_TEMPDB, PAD_INDEX, ALLOW_ROW_LOCKS, ALLOW_PAGE_LOCKS.
19. Save your work again by pressing Ctrl-S.
20. Press the Parse button on the toolbar once again.

Figure 16: Unsupported index option

There is still an index option that is causing errors. Find and remove it using the Find dialog. Press Ctrl-F to open the Find dialog.

Find: sort_in_tempdb

Press Find Next. Remove the option.

21. Press the Parse button on the toolbar once again. You should have an error-free parse this time.
22. There is one more change that will only become evident when we execute the script. To save time, we will just make the change now.
If you remember the first exercise, you will know that every table in SQL Azure must have a clustered index before data is inserted into it. We need to add a clustered index to the BuildVersion table. Find this table in the script. Press Ctrl-F to open the Find dialog.

Find: BuildVersion

Press Find Next.

Figure 17: Finding the BuildVersion table

23. Add a clustered index to the CREATE TABLE statement for the BuildVersion table. It should now look like this:

T-SQL
CREATE TABLE [dbo].[BuildVersion](
    [SystemInformationID] [tinyint] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    [Database Version] [nvarchar](25) NOT NULL,
    [VersionDate] [datetime] NOT NULL,
    [ModifiedDate] [datetime] NOT NULL
)

24. Save your work by pressing Ctrl-S.
25. Execute the script by pressing F5 or the Execute button on the toolbar.

Figure 18: Execute the script

Go and grab a caffeinated beverage of your choosing. It takes a while for the script to execute.

Note: You might want to record the time it takes to complete the script. SQL Server Management Studio shows the elapsed time in its status bar, in the lower right corner of the window. You can use this value later to compare with the time it takes to load the same data using the Bulk Copy Program (BCP) in Exercise 2.

26. Switch to the Messages tab to watch the messages as your objects are created.

Figure 19: Script execution

27. Scroll down and look for any errors. If you have errors, go back and check that you haven't missed a step. You may need to drop and recreate your HoLTestDB database. You will identify one error in the output.

Figure 20: Error Creating Index on View

This is the failure in the creation of an index on a view. This is not supported in SQL Azure, but for our purposes we can ignore it; it is a non-fatal error. It simply means this object will not be created.

28. To test your newly created database, execute the following query.
SELECT COUNT(*) FROM SalesLT.Customer

Figure 21: Getting the count of the Customer table

29. Execute the following query to output the contents of the sys.objects catalog view.

SELECT * FROM sys.objects

Figure 22: Contents of sys.objects

Exercise 2: Using BCP for Data Import and Export

The Bulk Copy Program (BCP) is a utility for copying large amounts of data into or out of SQL Server. BCP is mainly used to copy data from SQL Server for use with other programs, to upload data files from different databases, and to copy information from one SQL Server to another.

The goal of this exercise is to show how we can use BCP to transfer data to and from SQL Azure databases very quickly using a simple procedure. First, we will export the data contained in a database into data files on your system. Next, we will delete the existing data. Finally, we will re-load the information into SQL Azure from the data files.

Note: The SQL Server setup installs BCP.EXE in its Tools directory. The commands described in this exercise assume that this utility is available on your system and that its location is in your path. Additionally, the BCP utility can be copied into a Windows Azure Web or Worker Role and used to move data within the Windows Azure platform.

Task 1 – Moving Data Out of SQL Azure with BCP

In this task, we use the Bulk Copy Program (BCP) to export the data contained in the HolTestDB database into text files stored on your local hard disk. First, we will export a single table to become acquainted with the procedure to export data using BCP. Next, we will export the remainder of the tables using a batch file.

1. Open a command prompt window and change the current directory to %TrainingKitInstallationFolder%\Labs\MigratingDatabasesToSQLAzure\Source\Ex02-ImportExportWithBCP. You will place the data downloaded from SQL Azure into this folder.
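If you are not sure whether BCP is on your path, you can check from the same prompt before continuing. These are standard Windows commands; the exact version output depends on your SQL Server installation.

```bat
rem Locate bcp.exe on the path (prints its full path if found)
where bcp

rem Report the BCP version
bcp -v
```

If `where bcp` reports that the file cannot be found, add the SQL Server Tools directory to your path first.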
2. Type the following command, replacing [YOUR_ASSIGNED_SERVER] with the name of the server assigned to your account and [YOUR_USERNAME] and [YOUR_PASSWORD] with your account credentials. Press ENTER to execute the command.

Command Prompt
bcp HolTestDB.SalesLT.Customer out Customer.dat -S [YOUR_ASSIGNED_SERVER].database. -U [YOUR_USERNAME]@[YOUR_ASSIGNED_SERVER] -P [YOUR_PASSWORD] -n -q

Note: You control the operation of the utility by specifying options on its command line. You can review the available options by executing bcp -? in a command prompt window. The general form used here is:

bcp [database_name].[schema].[table_name] out [data_file] -S [server_name] -U [login_id] -P [password] -n -q

The following arguments are used in this example:

- [database_name].[schema].[table_name]: name of the source table
- out [data_file]: direction of the bulk copy and the data file to write
- -S [server_name]: instance of SQL Server to which to connect
- -U [login_id]: login ID used to connect to SQL Server
- -P [password]: password for the login ID
- -n: use the native (database) data types of the data
- -q: execute the SET QUOTED_IDENTIFIERS ON statement in the connection

For more information on the Bulk Copy Program (BCP) command utility, see the bcp Utility documentation.

Figure 23: Using BCP to export a single table from SQL Azure

3. Next, we will export the remaining tables in the HolTestDB database. Because the procedure is similar for every table, we will launch a batch file that executes the corresponding BCP command for each one. Type the following command, replacing [YOUR_ASSIGNED_SERVER] with the name of the server assigned to your account and [YOUR_USERNAME] and [YOUR_PASSWORD] with your account credentials. Press ENTER to execute the command.

Command Prompt
ExportData.cmd [YOUR_ASSIGNED_SERVER].database. [YOUR_USERNAME]@[YOUR_ASSIGNED_SERVER] [YOUR_PASSWORD]

Figure 24: Using BCP to export multiple tables from SQL Azure

4. List the contents of the directory to verify that the data was exported successfully.
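For example, from the same prompt (a standard Windows directory listing, shown here for convenience):

```bat
rem List only the exported data files, one name per line
dir /b *.dat
```

Each exported table should appear as one .dat file in the listing.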
It should contain ten files with the .dat extension, one for each table exported from the SQL Azure database.

Figure 25: Data files exported from SQL Azure

Task 2 – Deleting Data from the Existing Database

In this task, we will clear the existing data in the HolTestDB database tables. This is necessary because we will load the data again, this time using BCP. To clear the tables, we execute a batch of T-SQL statements that delete every row in each of the tables in the database.

1. If you have not opened SQL Server Management Studio, then do so now.
2. Press CTRL + O, browse to the %TrainingKitInstallationFolder%\Labs\MigratingDatabasesToSQLAzure\Source\Ex02-ImportExportWithBCP folder, select ClearTables.sql and click Open.
3. If prompted, enter your credentials and connect to the HoLTestDB database in SQL Azure.
4. Press CTRL + E to execute the batch of DELETE statements. This will remove every row in each of the tables in the database.

Figure 26: Clearing the tables in the database

Task 3 – Moving Data into SQL Azure Using BCP

In this task, we will load data back into the HolTestDB database tables using the BCP utility. First, we will import a single table to become acquainted with the procedure to import data using BCP. Next, we will import the remainder of the tables using a batch file.

1. Open a command prompt window and change the current directory to %TrainingKitInstallationFolder%\Labs\MigratingDatabasesToSQLAzure\Source\Ex02-ImportExportWithBCP. This directory contains the data files that you previously exported from SQL Azure.
2. Type the following command, replacing [YOUR_ASSIGNED_SERVER] with the name of the server assigned to your account and [YOUR_USERNAME] and [YOUR_PASSWORD] with your account credentials. Press ENTER to execute the command.

Command Prompt
bcp HolTestDB.SalesLT.Customer in Customer.dat -S [YOUR_ASSIGNED_SERVER].database.
-U [YOUR_USERNAME]@[YOUR_ASSIGNED_SERVER] -P [YOUR_PASSWORD] -n -q

Figure 27: Using BCP to import a single table into SQL Azure

3. Next, we will import the remaining tables into the HolTestDB database. Because the procedure is similar for every table, we will launch a batch file that executes the corresponding BCP command for each one. Type the following command, replacing [YOUR_ASSIGNED_SERVER] with the name of the server assigned to your account and [YOUR_USERNAME] and [YOUR_PASSWORD] with your account credentials. Press ENTER to execute the command.

Note: You might want to take note of how long it takes to complete and contrast this value with the time it took to import the same data using a script, as you did in Exercise 1. Using BCP, the procedure should complete significantly faster.

Command Prompt
ImportData.cmd [YOUR_ASSIGNED_SERVER].database. [YOUR_USERNAME]@[YOUR_ASSIGNED_SERVER] [YOUR_PASSWORD]

Figure 28: Using BCP to import multiple tables into SQL Azure

4. We will now verify that the data was loaded successfully. To do this, open a new query window in SQL Server Management Studio. If prompted, enter your credentials and connect to the HoLTestDB database in SQL Azure.
5. Paste the following T-SQL statements into the query window and press CTRL + E to execute the query. You should obtain the count of records loaded in each table.

T-SQL
SELECT COUNT(*) FROM SalesLT.SalesOrderDetail
SELECT COUNT(*) FROM SalesLT.CustomerAddress
SELECT COUNT(*) FROM SalesLT.SalesOrderHeader
SELECT COUNT(*) FROM SalesLT.ProductModelProductDescription
SELECT COUNT(*) FROM SalesLT.Product
SELECT COUNT(*) FROM SalesLT.Address
SELECT COUNT(*) FROM SalesLT.ProductDescription
SELECT COUNT(*) FROM SalesLT.ProductModel
SELECT COUNT(*) FROM SalesLT.ProductCategory
SELECT COUNT(*) FROM SalesLT.Customer

Exercise 3: Using SSIS for Data Import and Export

In this exercise, we will create a simple SQL Server Integration Services (SSIS) package to migrate data back from the Sales Order tables in SQL Azure.
This will demonstrate how simple it is to create data migration packages for SQL Azure. Other options include scripted INSERT statements or using BCP to move data to and from SQL Azure.

SSIS is a platform for building enterprise-level data integration and data transformation solutions, known as Extract, Transform, and Load (ETL). You use SSIS to solve complex business problems by copying or downloading files, sending e-mail messages in response to events, updating data warehouses, cleaning and mining data, and managing SQL Server objects and data. Packages can work alone or in concert with other packages to address complex business needs. Integration Services can extract and transform data from a wide variety of sources such as XML data files, flat files, and relational data sources, and then load the data into one or more destinations.

Integration Services includes a rich set of built-in tasks and transformations; tools for constructing packages; and the Integration Services service for running and managing packages. You can use the graphical Integration Services tools to create solutions without writing a single line of code, or you can program the extensive Integration Services object model to create packages programmatically and code custom tasks and other package objects.

Note: This exercise assumes that Exercise 1 of this lab has been completed, with the AdventureWorksLT2008 database schema created in the SQL Azure HoLTestDB database. To complete this lab you will need to have SSIS installed.

Task 1 – Deleting Data from an Existing Database

1. Open SQL Server Business Intelligence Development Studio: Start > All Programs > SQL Server 2008 > SQL Server Business Intelligence Development Studio.
2. Open the New Project dialog.
File > New > Project

3. Select Business Intelligence Projects from the Project types list.
4. Select Integration Services Project from the Templates.
5. Enter the name CloudSalesImport.
6. Click OK to create the new project.

Figure 29: Creating a new SSIS Project

Note: If you don't have the Business Intelligence Projects template, that is probably an indicator that you have not installed SQL Server Business Intelligence Development Studio (BIDS) or SQL Server Integration Services, which can be found as part of the SQL Server 2008 Trial and Developer Editions.

7. If Package.dtsx is not open, double-click the file in Solution Explorer to view it.
8. From the Maintenance Plan Tasks category in the toolbox, drag and drop the Execute T-SQL Statement Task onto the designer.
9. Click the new task on the designer and press F2 to rename the task to Clear Sales Order Details.

Figure 30: Adding a T-SQL Statement Task to the Canvas

10. Right-click the task and click Edit.
11. Type the SQL statement DELETE FROM SalesLT.SalesOrderDetail into the T-SQL statement box.

Figure 31: Execute T-SQL Dialog

12. Click the New button to create a new connection.
13. Fill in the connection properties for localhost.

Figure 32: Creating a connection for localhost

14. Click OK to complete the connection properties screen.
15. Click OK to save the task.

Note: When a connection is created via the previous dialogs, the database is not set. This will mean the Transact-SQL will not work when run.
To remedy this, follow the steps below.

16. Right-click the LocalDatabase connection in the Connections section and click Edit.

Figure 33: Editing the LocalDatabase connection

17. Select the AdventureWorksLT2008 database.

Figure 34: Configuring the database to connect to

18. Click Test Connection to check that the connection is working.
19. Click OK to close the connection properties.
20. Drag another Execute T-SQL Statement Task onto the package designer.
21. Rename the new task to Clear Sales Order Headers.
22. Link the original task with the new task by selecting the original task, clicking the green arrow, and dragging the arrow to the new task.

Figure 35: Linking two steps in the control flow

23. Edit the Clear Sales Order Headers task by right-clicking the task and selecting Edit.
24. Fill in the T-SQL statement with DELETE FROM SalesLT.SalesOrderHeader.

Figure 36: T-SQL to delete header rows

25. Click the OK button to save the changes.

Task 2 – Flowing Data from SQL Azure to SQL Server

1. From the Control Flow Items category in the toolbox, drag a Data Flow Task onto the package designer.
2. Link the Clear Sales Order Headers task with the data flow task by selecting the Clear Sales Order Headers task and dragging the green arrow to the Data Flow Task.

Figure 37: Connecting the Data Flow task in the control flow canvas

3. Rename the data flow task to Copy across headers.
4. Edit the data flow task.

Figure 38: Editing the Data Flow task

The Data Flow tab should now be displayed.

5. Drag on an ADO NET Source.

Note: The SQL Azure CTP does not currently support OLE DB connections, so instead we'll use an ADO.NET connection.

6. Rename the source to Cloud Sales Headers.

Figure 39: Editing the Data Flow task

7. Right-click Cloud Sales Headers and click Edit. The ADO.NET Source Editor should be displayed. Click New.

Figure 40: ADO.NET Source Editor

The Configure Connection Manager dialog should now be displayed. Click New.

Figure 41: Configure Connection Manager

8. Fill in the connection properties with the HoLTestUser login to the SQL Azure database, and type in HolTestDB as the database.

Note: You will be unable to
select the database using the combo box; instead, you will need to type the name of the database in.

Figure 42: Creating a new Connection

Note: The database connection details will differ according to your Azure account. If you completed the Introduction to SQL Azure lab, you can use the HoLTestUser login for this connection.

9. Click Test Connection to ensure the connection is working.
10. Click the OK button to close the Test Connection dialog.
11. Click the OK button to save the connection.
12. Select the SQL Azure connection and click the OK button to select the connection for the ADO NET source.
13. Select the SalesLT.SalesOrderHeader table and click the OK button.

Figure 43: Creating a new Connection

Note: To allow inserting of the identity columns, the OLE DB destination must be used, as it has an option for Identity Insert. The following steps will demonstrate adding an OLE DB destination to the data flow.

14. From the Data Flow Destinations category in the toolbox, drag on an OLE DB Destination.
15. Rename the destination to Local Sales Headers.

Figure 44: Creating a new Connection

16. Link the new destination to its source by selecting Cloud Sales Headers, clicking the green arrow, and dragging the arrow to Local Sales Headers.
17. Edit the Local Sales Headers destination.
18. Click the New button to create a new OLE DB connection.

Figure 45: Creating an OLE DB Destination

19. Click the New button to create a new data connection.
20. Fill in the local connection details and select the AdventureWorksLT2008 database.

Figure 46: Creating a new OLE DB Connection

21. Click the OK button to save the connection settings.
22. Select the local connection and click OK to select the connection for the OLE DB destination.

Figure 47: Selecting the new OLE DB Connection

23. Set the Data access mode to Table or view – fast load, and check Keep identity.

Figure 48: Configuring the OLE DB Destination

Note: Checking Keep identity will allow the primary key values to be inserted into the database.
This is important in order to maintain referential integrity.

24. Select Mappings to automatically generate the mappings, then click OK.

Figure 49: Using the Default Mappings

25. Open the Error List to view the current errors: View > Error List.

Figure 50: Package error list

Note: You will notice that there are errors relating to the string data type. The next steps show how to work around the issue that causes this error.

26. Drag on a Data Conversion transformation.
27. Select the link between the source and the destination and delete it.

Figure 51: Un-joining existing steps

28. Create a link between the source and the conversion.

Figure 52: Joining the Data Conversion Transformation

29. Edit the conversion.
30. Add an input column to the list by checking the checkbox beside CreditCardApprovalCode, and change the output alias to NewCreditCardApproval and the Data Type to string [DT_STR].

Figure 53: Configuring the Data Conversion Transformation

Note: SQL Azure returns Unicode strings, which are not compatible with the corresponding column in the local database; they need to be converted to non-Unicode strings before being inserted.

31. Click the OK button to save the transformations.
32. Link the Data Conversion with Local Sales Headers.

Figure 54: Joining the Data Conversion to the OLE DB Destination

33. Edit the Mappings for the Local Sales Headers destination.
34. Delete the link for CreditCardApprovalCode and re-link it to our new column.

Figure 55: Editing the Mappings for the OLE DB Destination

35. Click the OK button to save the new mappings.

Note: This completes the copy of the Cloud Sales Headers.
You will now need to bring the Sales Order Details from SQL Azure back to the local database.

36. Click on the Control Flow tab to view the control flow again.
37. Drag on another Data Flow Task.

Figure 56: Adding another Data Flow Task

38. Rename the task to Copy across details.
39. Link the two data flows.

Figure 57: Connecting the two data flow tasks

40. Edit the data flow.
41. Drag a new ADO NET source from the source section of the toolbox into the Data Flow designer.
42. Rename the source to Cloud Sales Details.
43. Edit the source, and set the connection to the cloud connection and the table to SalesLT.SalesOrderDetail.

Figure 58: Configuring the Source

44. Click the OK button to save the changes to the ADO NET source.
45. Drag on an OLE DB Destination from the destination section of the toolbox.
46. Rename the destination to Local Sales Details.

Figure 59: Renaming the OLE DB Destination

47. Link the source with the destination by dragging the green arrow from the source to the destination.

Figure 60: Connecting the source and destination

48. Edit the destination, setting the table to SalesLT.SalesOrderDetail, and check Keep identity.

Figure 61: Configuring the OLE DB Destination

49. Select Mappings and click OK to save the changes to the destination.

Figure 62: Using the default mappings

50. Navigate back to the Control Flow.
51. Click Start Debugging or press F5 to test. If the package is working correctly, the four boxes in the control flow should change color to green.

Note: If any of the boxes are red after running, this means there was an error. The details of the error can be viewed in the Output window in Visual Studio. If the Output window is not showing, go to View > Output.

Figure 63: Successfully executed package

Summary

This lab provided examples of how we can use SSMS, BCP, and SSIS to interact with SQL Azure.

In the first exercise, we took a moderately complex pre-existing database, exported it as a T-SQL script, and re-created the database in SQL Azure.
There were a number of unsupported features that we had to edit out of the script. The second exercise showed how to move data out of and back into SQL Azure using BCP, and the third exercise showed how to bring data from the cloud across to a local server easily using SSIS.
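As a closing convenience (not part of the lab scripts), the per-table row-count checks from Exercise 2 can be combined into a single result set. The tables named below are a subset of the SalesLT tables used throughout this lab:

```sql
-- Row counts for several of the lab's SalesLT tables in one result set
SELECT 'SalesOrderDetail' AS TableName, COUNT(*) AS RowCnt FROM SalesLT.SalesOrderDetail
UNION ALL SELECT 'SalesOrderHeader', COUNT(*) FROM SalesLT.SalesOrderHeader
UNION ALL SELECT 'Customer',         COUNT(*) FROM SalesLT.Customer
UNION ALL SELECT 'Product',          COUNT(*) FROM SalesLT.Product
ORDER BY TableName
```

Running this against both the SQL Azure database and the local database makes it easy to confirm that the two copies match after a migration.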