ASP.NET - Best Practices

ASP.NET is much more powerful than Classic ASP; however, it is important to understand how to use that power to build highly efficient, reliable, and robust applications. In this article I have tried to highlight the key tips you can use to maximize the performance of your ASP.NET pages. The list could be much longer; I am only emphasizing the most important ones.

1. Plan and research before you develop: Research and investigate how .NET can really benefit you. .NET offers a variety of solutions at each level of application design and development, so it is imperative that you understand your situation and the pros and cons of each approach supported by this rich development environment. Visual Studio is a comprehensive development package that offers many ways to implement the same logic; examine each option and find the best solution for the task at hand. Use layering to logically partition your application logic into presentation, business, and data access layers. It will not only help you create maintainable code, but also permits you to monitor and optimize the performance of each layer separately. A clear logical separation also offers more choices for scaling your application. Finally, try to reduce the amount of code in your code-behind files to improve maintenance and scalability.

2. String concatenation: If not handled properly, string concatenation can really decrease the performance of your application. You can concatenate strings in two ways. The first is to use the String class and append the new string to an existing one. This operation is really expensive (especially if you are concatenating within a loop): because strings are immutable, the Framework allocates a new string and copies both the existing and the new data into it, leaving the old string in memory for the garbage collector. This can be very time consuming and costly in lengthy concatenation operations. The second and better way to concatenate strings is the StringBuilder class, which appends to an internal buffer. Below is an example of both approaches. If you are considering doing any kind of string concatenation, please do yourself a favor and test both routines separately. You may be surprised at the results.

'Concatenation using String Class

Response.Write("String Class")

Dim str As String = ""

Dim startTime As DateTime = DateTime.Now

Response.Write(("Start time:" + startTime.ToString()))

Dim i As Integer

For i = 0 To 99999

str += i.ToString()

Next i

Dim EndTime As DateTime = DateTime.Now

Response.Write(("End time:" + EndTime.ToString()))

Response.Write(("# of time Concatenated: " + i.ToString))

Results: Took 4 minutes and 23 seconds to complete 100,000 concatenations.

String Class

Start time:2/15/2006 10:21:24 AM

End time:2/15/2006 10:25:47 AM

# of time Concatenated: 100000

'Concatenation using StringBuilder

Response.Write("StringBuilder Class")

Dim strbuilder As New StringBuilder()

Dim startTime As DateTime = DateTime.Now

Response.Write(("Start time:" + startTime.ToString()))

Dim i As Integer

For i = 0 To 99999

strbuilder.Append(i.ToString())

Next i

Dim EndTime As DateTime = DateTime.Now

Response.Write(("Stop time:" + EndTime.ToString()))

Response.Write(("# of time Concatenated: " + i.ToString))

Results: Took less than a Second to complete 100,000 Concatenations.

StringBuilder Class

Start time:2/15/2006 10:31:22 AM

Stop time:2/15/2006 10:31:22 AM

# of time Concatenated: 100000

This is one of many situations in which ASP.NET provides extremely high performance benefits over classic ASP.

3. Avoid round trips to the server: You can avoid needless round trips to the Web server by using the following tips.

• Implement an Ajax UI whenever possible. The idea is to avoid full page refreshes and only update the portion of the page that needs to change. Scott's article gives great information on how to implement Atlas Ajax controls.

• Use client-side scripts. Client-side validation can help reduce the round trips required to process a user's request. In ASP.NET you can also use the validation controls to validate user input on the client side.

• Use the Page.IsPostBack property to ensure that you only perform page initialization logic when a page is loaded for the first time and not in response to client postbacks.

If Not IsPostBack Then

LoadJScripts()

End If

• In some situations performing full postback event handling is unnecessary. You can use client callbacks to read data from the server instead of performing a full round trip.

4. Save ViewState only when necessary: ViewState is used primarily by server controls to retain state on pages that post data back to themselves. The information is round-tripped to the client in a hidden field and read back on postback. ViewState is unnecessary overhead for pages that do not need it, and as the ViewState grows larger, it affects the performance of garbage collection. You can optimize the way your application uses ViewState by following these tips:

Situations when you don't need ViewState. ViewState is turned on in ASP.NET by default. You might not need ViewState because your page is output-only or because you explicitly reload data on each request. You do not need ViewState in the following situations:

• Your page does not post back. If the page does not post information back to itself, if the page is only used for output, and if the page does not rely on response processing, you do not need ViewState.

• You do not handle server control events. If your server controls do not handle events, and if your server controls have no dynamic or data bound property values, or they are set in code on every request, you do not need ViewState.

• You repopulate controls with every page refresh. If you ignore old data, and if you repopulate the server control each time the page is refreshed, you do not need ViewState.

Disabling viewstate. There are several ways to disable viewstate at various levels:

• To disable ViewState for a single control on a page, set the EnableViewState property of the control to false.

• To disable ViewState for a single page, set the EnableViewState attribute in the @ Page directive to false, i.e. <%@ Page EnableViewState="false" %>.

• To disable ViewState for a specific application, use the following element in the Web.config file of the application: <pages enableViewState="false" />

• To disable ViewState for all applications on a Web server, configure the <pages> element in the Machine.config file the same way: <pages enableViewState="false" />

Determine the size of your ViewState. By enabling tracing for the page (<%@ Page Trace="true" %>), you can monitor the ViewState size for each control. You can use this information to decide whether the ViewState size is reasonable or whether there are controls for which it can be disabled.

5. Use session variables carefully: Avoid storing too much data in session variables, and make sure your session timeout is reasonable, as session state can use a significant amount of server memory. Keep in mind that data stored in session variables can persist long after the user closes the browser; too many session variables can bring the server to its knees. Disable session state if you are not using session variables in a particular page or application.

• To disable session state for a page, set the EnableSessionState attribute in the @ Page directive to false, i.e. <%@ Page EnableSessionState="false" %>.

• If a page requires access to session variables but will not create or modify them, set the EnableSessionState attribute in the @ Page directive to ReadOnly, i.e. <%@ Page EnableSessionState="ReadOnly" %>.

• To disable session state for a specific application, use the following element in the Web.config file of the application: <sessionState mode="Off" />

• To disable session state for all applications on your Web server, use the following element in the Machine.config file: <sessionState mode="Off" />

6. Use Server.Transfer: Use the Server.Transfer method to redirect between pages in the same application; it avoids an unnecessary client-side round trip. However, you cannot always simply replace Response.Redirect calls with Server.Transfer: if you need authentication and authorization checks during redirection, use Response.Redirect instead, because the two mechanisms are not equivalent. When you do use Response.Redirect, use the overload that accepts a Boolean second parameter and pass false so that an internal exception is not raised. Also note that you can only use Server.Transfer to transfer control to pages in the same application; to transfer to pages in other applications, you must use Response.Redirect.
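As a minimal sketch of the two approaches (the page names here are illustrative, not from the original article):

```csharp
// Server.Transfer hands the request to the new page on the server,
// with no extra round trip to the browser (same application only):
Server.Transfer("OrderConfirmation.aspx");

// Response.Redirect issues an HTTP 302 to the client. Passing false as
// the second argument skips the internal Response.End call, avoiding
// the ThreadAbortException it would otherwise raise:
Response.Redirect("Login.aspx", false);
```

Note that with Server.Transfer the browser's address bar still shows the original URL, which can surprise users who bookmark the page.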

7. Use server controls when appropriate and avoid creating deeply nested controls: The HTTP protocol is stateless; server controls provide a rich programming model that manages state between page requests by using ViewState. Nothing comes for free, however: server controls require a fixed amount of processing to establish the control and all of its child controls, which makes them relatively expensive compared to HTML controls or static text. When you do not need rich interaction, replace server controls with an inline representation of the user interface you want to present. It is better to replace a server control if:

• You do not need to retain state across postbacks.

• The data that appears in the control is static, or the control displays read-only data.

• You do not need programmatic access to the control on the server-side.

Alternatives to server controls include simple rendering, HTML elements, inline Response.Write calls, and inline code-render blocks (<% %>). It is essential to balance your tradeoffs: avoid over-optimization if the overhead is acceptable and your application is within the limits of its performance objectives.

Deeply nested hierarchies of controls compound the cost of creating a server control and its child controls. Deeply nested hierarchies create extra processing that could be avoided by using a different design that uses inline controls, or by using a flatter hierarchy of server controls. This is especially important when you use controls such as Repeater, DataList, and DataGrid because they create additional child controls in the container.

8. Choose the data viewing control appropriate for your solution: Depending on how you choose to display data in a Web Forms page, there are often significant tradeoffs between convenience and performance. Always compare the pros and cons of controls before you use them in your application. For example, you can choose any of these three controls (DataGrid, DataList and Repeater) to display data, it’s your job to find out which control will provide you maximum benefit. The DataGrid control can be a quick and easy way to display data, but it is frequently the most expensive in terms of performance. Rendering the data yourself by generating the appropriate HTML may work in some simple cases, but customization and browser targeting can quickly offset the extra work involved. A Repeater Web server control is a compromise between convenience and performance. It is efficient, customizable, and programmable.

9. Optimize code and exception handling: To optimize expensive loops, use For instead of For Each in performance-critical code paths. Also, do not rely on exceptions in your code; write code that avoids them. Exceptions cause performance to suffer significantly, so you should never use them to control normal program flow. If you can detect in code a condition that would cause an exception, do so rather than catching the exception after the fact. A database connection that fails to open is an exception, but a user who mistypes his password is simply a condition that needs to be handled. Common scenarios include checking for null, validating a string before parsing it into a numeric value, and checking for specific values (such as zero) before applying math operations. The following example shows code that could cause an exception and code that tests for the condition instead; both produce the same result.

'Unnecessary use of exception

Try

value = 100 / number

Catch ex As Exception

value = 0

End Try

' Recommended code

If number <> 0 Then

value = 100 / number

Else

value = 0

End If

Check for null values. If it is possible for an object to be null, check that it is not null rather than catching an exception. This commonly occurs when you retrieve items from ViewState, session state, application state, or cache objects, as well as query string and form field variables. For example, do not use the following code to access session state information.

'Unnecessary use of exception

Try

value = HttpContext.Current.Session("Value").ToString

Catch ex As Exception

Response.Redirect("Main.aspx", False)

End Try

'Recommended code

If Not HttpContext.Current.Session("Value") Is Nothing Then

value = HttpContext.Current.Session("Value").ToString

Else

Response.Redirect("Main.aspx", False)

End If

10. Use a DataReader for fast and efficient data binding: Use a DataReader object if you do not need to cache data, if you are displaying read-only data, and if you need to load data into a control as quickly as possible. The DataReader is the optimum choice for retrieving read-only data in a forward-only manner. Loading the data into a DataSet object and then binding the DataSet to the control moves the data twice, and also incurs the relatively significant expense of constructing a DataSet. In addition, when you use the DataReader, you can use the specialized type-specific methods to retrieve the data for better performance.
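A sketch of forward-only access with typed getters follows; the connection string variable and the Products table are illustrative assumptions, not from the article:

```csharp
// Read-only, forward-only data access with a SqlDataReader.
using (SqlConnection con = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT ProductID, ProductName FROM Products", con))
{
    con.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        int idCol = reader.GetOrdinal("ProductID");
        int nameCol = reader.GetOrdinal("ProductName");
        while (reader.Read())
        {
            // Type-specific getters avoid conversions and boxing.
            int id = reader.GetInt32(idCol);
            string name = reader.GetString(nameCol);
        }
    }
    // Alternatively, bind the reader directly to a control:
    // grid.DataSource = cmd.ExecuteReader(); grid.DataBind();
}
```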

11. Use paging efficiently: Allowing users to request and retrieve more data than they can consume puts an unnecessary strain on your application resources, causing increased CPU utilization, increased memory consumption, and decreased response times. This is especially true for clients on slow connections. From a usability standpoint, most users do not want to see thousands of rows presented as a single unit. Implement a paging solution that retrieves only the desired data from the database and reduces back-end work on the database; you should optimize the number of rows returned by the database server to the middle-tier Web server. For more information, read this article on implementing paging at the database level. If you are using SQL Server 2000, please also look at this article.
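On SQL Server 2005, database-level paging can be sketched with ROW_NUMBER(); the table and column names below are illustrative:

```sql
-- Page 3 with a page size of 10: return rows 21-30 only.
SELECT ProductID, ProductName
FROM (
    SELECT ProductID, ProductName,
           ROW_NUMBER() OVER (ORDER BY ProductName) AS RowNum
    FROM Products
) AS Paged
WHERE RowNum BETWEEN 21 AND 30;
```

Only the requested page crosses the wire to the Web server, instead of the whole result set.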

12. Explicitly Dispose or Close all resources: To guarantee resources are cleaned up when an exception occurs, use a try/finally block and close the resources in the finally clause; this ensures they are disposed even if an exception occurs. Open your connection just before needing it, and close it as soon as you're done with it. Your motto should always be "get in, get/save data, get out." Whatever objects you use, make sure you call the object's Dispose method, or its Close method if one is provided. Failing to call Close or Dispose prolongs the life of the object in memory long after the client stops using it, defers cleanup, and can contribute to memory pressure. Database connections and files are examples of shared resources that should be explicitly closed.

Try

_con.Open()

Catch ex As Exception

Throw ' Rethrow without resetting the stack trace

Finally

If Not _con Is Nothing Then

_con.Close()

End If

End Try
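The same try/finally pattern can be written more concisely with a Using block in VB 2005; the connection string variable here is an assumption for illustration:

```vb
' Dispose (and therefore Close) runs even if an exception occurs.
Using con As New SqlConnection(connectionString)
    con.Open()
    ' ... get/save data, get out ...
End Using
```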

13. Disable tracing and debugging: Before you deploy your application, disable tracing and debugging; both can cause performance issues and are not recommended while your application is running in production. You can disable them in the Machine.config and Web.config files using the syntax below.

<trace enabled="false" />
<compilation debug="false" />

14. Precompile pages and disable AutoEventWireup: By precompiling pages, users do not have to experience the batch compilation of your files on first request, which improves the performance your users experience.

Also, setting the AutoEventWireup attribute to false in the Machine.config file (or per page with <%@ Page AutoEventWireup="false" %>) means that the page will not match method names to events and hook them up (for example, Page_Load). If page developers want to use these events, they will need to override the methods in the base class (for example, override Page.OnLoad for the page load event instead of using a Page_Load method). Disabling AutoEventWireup gives your pages a slight performance boost by leaving the event wiring to the page author instead of performing it automatically.

15. Use stored procedures and indexes: In most cases you can get an additional performance boost by using compiled stored procedures instead of ad hoc queries.

Make sure you index your tables, and choose your indexes wisely. Try using Index Tuning Wizard and have it report to you what it thinks the best candidates for indexes would be. You don't have to follow all of its suggestions, but it may reveal things about your structure or data that will help you choose more appropriate indexes.

• In SQL Server Management Studio (SQL Server 2005), highlight your query. Now from the Query menu, click Analyze Query in Database Engine Tuning Advisor.

• You can do something similar in SQL Server 2000 to run the Index Tuning Wizard. In Query Analyzer, highlight your query. From the Query menu, click Index Tuning Wizard.

1) Avoid the use of ArrayList. Any object added to an ArrayList is stored as System.Object, and value types must be boxed on the way in and unboxed when retrieved. It is therefore recommended to use strongly typed collections instead of ArrayList. .NET provides a strongly typed collection class for strings in System.Collections.Specialized, namely StringCollection. For other types, rename the _ClassType attribute in the attached file to your required type.
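The boxing cost described above can be sketched as follows:

```csharp
using System.Collections;
using System.Collections.Specialized;

// ArrayList stores items as System.Object, so value types are boxed:
ArrayList list = new ArrayList();
list.Add(1);              // boxes the int on the heap
int first = (int)list[0]; // unboxes it again on retrieval

// A strongly typed collection avoids the cast and the boxing round trip:
StringCollection names = new StringCollection();
names.Add("Northwind");
string name = names[0];   // no cast required
```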

2) Reconsider the use of Hashtable; instead, try another dictionary type such as StringDictionary, NameValueCollection, or HybridDictionary. A Hashtable is only worthwhile when a large number of values must be stored; for small collections the specialized dictionaries are cheaper.

3) Always declare constants for the string literals you use instead of embedding the literals inline.

//AVOID


MyObject obj = new MyObject();

obj.Status = "ACTIVE";

//RECOMMENDED

const string C_STATUS = "ACTIVE";

MyObject obj = new MyObject();

obj.Status = C_STATUS;

4) Do not compare strings by converting them to uppercase or lowercase; use String.Compare instead, which can ignore case while comparing.

   Ex: 

 

const string C_VALUE = "COMPARE";

if (String.Compare(sVariable, C_VALUE, true) == 0)

{

Console.Write("SAME");

}

5) Avoid string concatenation using the + operator; instead use StringBuilder for concatenation.

//AVOID

String sXML = "<packet>";

sXML += "<child>";

sXML += "Data";

sXML += "</child>";

sXML += "</packet>";

//RECOMMENDED

StringBuilder sbXML = new StringBuilder();

sbXML.Append("<packet>");

sbXML.Append("<child>");

sbXML.Append("Data");

sbXML.Append("</child>");

sbXML.Append("</packet>");

6) If you are only reading from the XML object, avoid using XmlDocument; instead use XPathDocument, which is read-only and so improves performance.

//AVOID

XmlDocument xmld = new XmlDocument();

xmld.LoadXml(sXML);

txtName.Text = xmld.SelectSingleNode("/packet/child").InnerText;

.

.



//RECOMMENDED

XPathDocument xmldContext = new XPathDocument(new StringReader(oContext.Value));

XPathNavigator xnav = xmldContext.CreateNavigator();

XPathNodeIterator xpNodeIter = xnav.Select("packet/child");

iCount = xpNodeIter.Count;

xpNodeIter = xnav.SelectDescendants(XPathNodeType.Element, false);

while(xpNodeIter.MoveNext())

{

sCurrValues += xpNodeIter.Current.Value+"~";

}

7) Avoid declaring objects/variables inside loops; instead, declare the variable once outside the loop and initialize it inside.

//AVOID
for(int i=0; i<10; i++)
{
    SomeClass objSC = new SomeClass(); // SomeClass is a hypothetical type, for illustration
    // ...
}

//RECOMMENDED
SomeClass objSC = null;
for(int i=0; i<10; i++)
{
    objSC = new SomeClass();
    // ...
}

3. Always compile in Release mode before deployment: Build --> Configuration Manager --> set the configuration option of the project to "Release" mode.

 

4. Disable the ViewState:

With the automatic state management feature, server controls re-populate their values without your writing any code, but this affects performance. Always set EnableViewState="false" when it is not required.

For a control: <asp:DataGrid EnableViewState="false" runat="server" ... />

For a page: <%@ Page EnableViewState="false" %>

5. Use Caching to improve the performance of your application.

Output caching enables your page to be cached for a specific duration and invalidated based on various parameters that you specify. For that duration, requests do not go to the server and are served from the cache.

Do not assign cached items a short expiration: items that expire quickly cause unnecessary turnover in the cache and frequently create more work for cleanup code and the garbage collector. If your page has static as well as dynamic sections, try to use partial caching (fragment caching) by breaking the page into user controls and specifying caching only for those controls that are more-or-less static.
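As a sketch, output caching is declared with the @ OutputCache directive; the duration and parameter name below are illustrative, and the same directive placed in an .ascx user control enables fragment caching:

```
<%-- Cache the rendered page for 60 seconds, one copy per value of "id" --%>
<%@ OutputCache Duration="60" VaryByParam="id" %>
```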

6. Use appropriate Authentication Mechanism.

Following are the Authentication Modes.

• None

• Windows

• Forms

• Passport

7. Validate all Input received from the Users.

Validate all input received from users on the client side where possible to avoid server round trips; always re-validate on the server as well, since client-side checks can be bypassed.

8. Use a Finally block to release resources.

 

Always use the finally block to release resources such as database connections and open files.

 

9. Always use StringBuilder to concatenate strings.

 

The memory representation of a string is an array of characters, and strings are immutable: on re-assignment a new character array is created and the reference changes, leaving the old string in memory for the garbage collector to dispose of. This slows the application down. Always use StringBuilder for repeated concatenation.

 

10. Enable Web gardening on multiprocessor computers:

 

The ASP.NET process model helps enable scalability on multiprocessor machines by distributing the work to several processes, one for each CPU, each with processor affinity set to its CPU. The technique is called Web gardening, and it can dramatically improve the performance of some applications.
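Web gardening is configured on the processModel element in Machine.config; the cpuMask value below is an illustrative bitmask (0x0000000F = binary 1111, i.e. the first four CPUs):

```xml
<!-- One worker process per CPU selected by the mask -->
<processModel enable="true" webGarden="true" cpuMask="0x0000000F" />
```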

 

11. Set Enlist="false" in connection string:

 

Enlist=true indicates that the SQL Server connection pooler automatically enlists the connection in the creation thread's current transaction context. If you do not need that behavior, set Enlist=false in the connection string to avoid the overhead.

 

12. Avoid recursive functions / nested loops

13. Always set option strict to "on"

 

14. Try to avoid Throwing Exceptions. 

Caching: SQL Cache Dependency With SQL Server 2000

Introducing Cache Dependencies:

As time passes, the data source may change in response to other actions. However, if your code uses caching, you may remain unaware of the changes and continue using out-of-date information from the cache. To help mitigate this problem, ASP.NET supports cache dependencies. Cache dependencies allow you to make a cached item dependent on another resource, so that when that resource changes the cached item is removed automatically. ASP.NET includes three types of dependencies:

• Dependencies on other cache items.

• Dependencies on files or folders.

• Dependencies on a database query.

Introducing SQL Cache Notifications:

SQL cache dependencies are one of the most wonderful new features in ASP.NET 2.0: the ability to automatically invalidate a cached data object (such as a DataSet or a custom data type) when the related data is modified in the database. This feature is supported in both SQL Server 2005 and SQL Server 2000, although the underlying plumbing is quite different.

Cache Notifications in SQL Server 2000:

Before you can use SQL Server cache invalidation, you need to enable notifications for the database. This task is performed with the aspnet_regsql.exe command-line utility, which is located in the c:\[WinDir]\Microsoft.NET\Framework\[Version] directory. To enable notifications, you need to use the -ed command-line switch. You also need to identify the server (use -E for a trusted connection and -S to choose a server other than the current computer) and the database (use -d). Here's an example that enables notifications for the Northwind database on the current server:

aspnet_regsql -ed -E -d Northwind

After executing this command, a new table named AspNet_SqlCacheTablesForChangeNotification is added to the Northwind database. It has three columns: tableName, notificationCreated, and changeId, and it is used to track changes. Essentially, when a change takes place, a record in this table is updated, and ASP.NET polls the table to detect the change. A set of stored procedures is added to the database as well; see the following table.

|Procedure Name |Description |
|AspNet_SqlCacheRegisterTableStoredProcedure |Sets a table up to support notifications by adding a notification trigger to the table, which fires when any row is inserted, deleted, or updated. |
|AspNet_SqlCacheUnRegisterTableStoredProcedure |Takes a registered table and removes the notification trigger so that notifications won't be generated. |
|AspNet_SqlCacheUpdateChangeIdStoredProcedure |Called by the notification trigger to update the AspNet_SqlCacheTablesForChangeNotification table, thereby indicating that the table has changed. |
|AspNet_SqlCacheQueryRegisteredTablesStoredProcedure |Extracts just the table names from the AspNet_SqlCacheTablesForChangeNotification table; used to get a quick look at all the registered tables. |
|AspNet_SqlCachePollingStoredProcedure |Gets the list of changes from the AspNet_SqlCacheTablesForChangeNotification table; used to perform the polling. |

After this, you need to enable notification support for each individual table. You can do this manually using AspNet_SqlCacheRegisterTableStoredProcedure: open Query Analyzer, select the database you've enabled for SQL cache notification (for example, Northwind), and run the following command:

exec AspNet_SqlCacheRegisterTableStoredProcedure 'TableName'

Or you can use aspnet_regsql, using the -et parameter to enable a table for SQL cache dependency notifications and the -t parameter to name the table. Here's an example that enables notifications for the Products table:

aspnet_regsql -et -E -d Northwind -t Products

Both options generate the notification trigger for the Products table, as follows:

CREATE TRIGGER dbo.[Products_AspNet_SqlCacheNotification_Trigger] ON [Products]

FOR INSERT, UPDATE, DELETE

AS

BEGIN

SET NOCOUNT ON

EXEC dbo.AspNet_SqlCacheUpdateChangeIdStoredProcedure N'Products'

END

So any record inserted, deleted, or updated in the Products table will update the changeId field in the AspNet_SqlCacheTablesForChangeNotification table.

How Notifications Work:

The AspNet_SqlCacheTablesForChangeNotification table contains a single record for every table you're monitoring. When you make a change in the table (such as inserting, deleting, or updating a record), the changeId column is incremented by 1 (see the AspNet_SqlCacheUpdateChangeIdStoredProcedure procedure). ASP.NET queries this table repeatedly and keeps track of the most recent changeId value for every table. When this value changes on a subsequent read, ASP.NET knows that the table has changed.

In this scenario, any change to the table invalidates every cached query for that table. In other words, if you use this query:

SELECT * FROM Products WHERE CategoryID=1

the caching still works in the same way. That means if any product record is touched, even a product that resides in another category (and therefore isn't one of the cached records), the notification is still sent and the cached item is considered invalid. Also keep in mind that it doesn't make sense to cache tables that change frequently.

Enable Polling:

To enable polling, you need to use the <sqlCacheDependency> element in the web.config file. Set the enabled attribute to true to turn it on, and set the pollTime attribute to the number of milliseconds between each poll. (The higher the poll time, the longer the potential delay before a change is detected.) You also need to supply the connection string information.

<system.web>
  <caching>
    <sqlCacheDependency enabled="true" pollTime="15000">
      <databases>
        <add name="Northwind" connectionStringName="NorthwindConnection" />
      </databases>
    </sqlCacheDependency>
  </caching>
</system.web>

Creating the Cache Dependency:

Now that we've seen how to set up a database to support SQL Server notifications, the only remaining detail is the code, which is quite straightforward. We can use our cache dependency with programmatic data caching, a data source control, and output caching.

For programmatic data caching, we need to create a new SqlCacheDependency and supply it to the Cache.Insert() method. In the SqlCacheDependency constructor, you supply two strings. The first is the name of the database you defined in the <add> element of the <sqlCacheDependency> section of the web.config file (e.g. Northwind). The second is the name of the linked table (e.g. Products).

Example:

private static void CacheProductsList(List<NWProductItem> products)

{

SqlCacheDependency sqlDependency = new SqlCacheDependency("Northwind", "Products");

HttpContext.Current.Cache.Insert("ProductsList", products, sqlDependency, DateTime.Now.AddDays(1), Cache.NoSlidingExpiration);

}

private static List<NWProductItem> GetCachedProductList()

{

return HttpContext.Current.Cache["ProductsList"] as List<NWProductItem>;

}

NWProductItem is a business class, and here we are trying to cache a List of NWProductItem instead of a DataSet or DataTable.

The following method is used by an ObjectDataSource control to retrieve the list of products:

public static List<NWProductItem> GetProductsList(int catId, string sortBy)

{

//Try to Get Products List from the Cache

List products = GetCachedProductList();

if (products == null)

{

//Products List not in the cache, so we need to query the Database by using a Data Layer

NWProductsDB db = new NWProductsDB(_connectionString);

DbDataReader reader = null;

products = new List<NWProductItem>(80);

if (catId > 0)

{

//Return Product List from the Data Layer

reader = db.GetProductsList(catId);

}

else

{

//Return Product List from the Data Layer

reader = db.GetProductsList();

}

//Create List of Products -List of NWProductItem-

products = BuildProductsList(reader);

reader.Close();

//Add entry to products list in the Cache

CacheProductsList(products);

}

products.Sort(new NWProductItemComparer(sortBy));

if (sortBy.Contains("DESC")) products.Reverse();

return products;

}

To perform the same trick with output caching, you simply set the SqlDependency attribute of the @ OutputCache directive to the database dependency name and the table name, separated by a colon:

<%@ OutputCache Duration="600" VaryByParam="none" SqlDependency="Northwind:Products" %>

The same technique works with the SqlDataSource and ObjectDataSource controls through their SqlCacheDependency property (e.g. SqlCacheDependency="Northwind:Products").

Important Note:

ObjectDataSource doesn't support built-in caching for custom types such as the one in our example; it only supports this feature for DataSets and DataTables.

To test this feature, download the attached demo. It contains an editable GridView. Set a breakpoint at the GetProductsList method, run in debug mode, update any record, and notice the changes. You can also edit the solution, remove the cache dependency, and note the difference after an update.

Also, you can remove the SqlDependency from the output cache in the OutputCaching.aspx page and notice that, whatever update you make to the data source, the page still retrieves the old version of the data.

Architecture

ASP.NET requires a host. On Windows Server™ 2003, the default host is the Internet Information Services (IIS) 6.0 worker process (W3wp.exe). When you use the ASP.NET Process Model, the host is the ASP.NET worker process (Aspnet_wp.exe).

When a request is received by ASP.NET, it is handled by the HttpRuntime object. The HttpRuntime is responsible for application creation and initialization, managing the request queue and thread pool, and dispatching incoming requests to the correct application. After the request is dispatched to the appropriate application, it is passed through a pipeline. This pipeline is a staged, event-based execution framework consisting of multiple HttpModule objects and a single HttpHandler object. This architecture is shown in Figure 6.1.


Figure 6.1: ASP.NET runtime infrastructure

HttpModule objects participate in the pipeline by handling predefined events that ASP.NET exposes. These events include BeginRequest, AuthenticateRequest, and EndRequest. The request flows through the pipeline of HttpModule objects and is then handled by a single HttpHandler. After the handler completes, the request flows back through the pipeline and the response is sent to the client.

Throughout the entire lifetime of a request, a context is exposed. The HttpContext object encapsulates information about individual requests and their associated responses.

Performance and Scalability Issues

The main issues that can adversely affect the performance and scalability of your application are summarized below. Subsequent sections in this chapter provide strategies and technical information to prevent or resolve each of these issues.

• Resource affinity. Resource affinity can prevent you from adding more servers, or resource affinity can reduce the benefits of adding more CPUs and memory. Resource affinity occurs when code needs a specific thread, CPU, component instance, or server.

• Excessive allocations. Applications that allocate memory excessively on a per-request basis consume memory and create additional work for garbage collection. The additional garbage collection work increases CPU utilization. These excessive allocations may be caused by temporary allocations. For example, the excessive allocations may be caused by excessive string concatenation that uses the += operator in a tight loop.

• Failure to share expensive resources. Failing to call the Dispose or Close method to release expensive resources, such as database connections, may lead to resource shortages. Closing or disposing resources permits the resources to be reused more efficiently.

• Blocking operations. The single thread that handles an ASP.NET request is blocked from servicing additional user requests while it waits for a downstream call to return. Calls to long-running stored procedures and remote objects may block a thread for a significant amount of time.

• Misusing threads. Creating threads for each request incurs thread initialization costs that can be avoided. Also, using single-threaded apartment (STA) COM objects incorrectly may cause multiple requests to queue up. Multiple requests in the queue slow performance and create scalability issues.

• Making late-bound calls. Late-bound calls require extra instructions at runtime to identify and load the code to be run. Whether the target code is managed or unmanaged, you should avoid these extra instructions.

• Misusing COM interop. COM interop is generally very efficient, although many factors affect its performance. These factors include the size and type of the parameters that you pass across the managed/unmanaged boundary and crossing apartment boundaries. Crossing apartment boundaries may require expensive thread switches.

• Large pages. Page size is affected by the number and the types of controls on the page. Page size is also affected by the data and images that you use to render the page. The more data you send over the network, the more bandwidth you consume. When you consume high levels of bandwidth, you are more likely to create a bottleneck.

• Failure to use data caching appropriately. Failure to cache static data, caching too much data so that the items get flushed out, caching user data instead of application-wide data, and caching infrequently used items may limit your system's performance and scalability.

• Failure to use output caching appropriately. If you do not use output caching or if you use it incorrectly, you can add avoidable strain to your Web server.

• Inefficient rendering. Interspersing HTML and server code, performing unnecessary initialization code on page postback, and late-bound data binding may all cause significant rendering overhead. This may decrease the perceived and true page performance.

Design Considerations

Building high-performance applications is significantly easier if you design with performance in mind. Make sure you develop a performance plan from the outset of your project. Never try to add performance as a post-build step. Also, use an iterative development process that incorporates constant measuring between iterations.

By following best practice design guidelines, you significantly increase your chances of creating a high-performance Web application. Consider the following design guidelines:

• Consider security and performance.

• Partition your application logically.

• Evaluate affinity.

• Reduce round trips.

• Avoid blocking on long-running tasks.

• Use caching.

• Avoid unnecessary exceptions.

Consider Security and Performance

Your choice of authentication scheme can affect the performance and scalability of your application. You need to consider the following issues:

• Identities. Consider the identities you are using and the way that you flow identity through your application. To access downstream resources, you can use the process identity or another specific service identity. Or, you can enable impersonation and flow the identity of the original caller. If you connect to Microsoft SQL Server™, you can also use SQL authentication. However, SQL authentication requires you to store credentials in the database connection string. Storing credentials in the database connection string is not recommended from a security perspective. When you connect to a shared resource, such as a database, by using a single identity, you benefit from connection pooling. Connection pooling significantly increases scalability. If you flow the identity of the original caller by using impersonation, you cannot benefit from efficient connection pooling, and you have to configure access control for multiple individual user accounts. For these reasons, it is best to use a single trusted identity to connect to downstream databases.

• Managing credentials. Consider the way that you manage credentials. You have to decide if your application stores and verifies credentials in a database, or if you want to use an authentication mechanism provided by the operating system where credentials are stored for you in the Active Directory® directory service.

You should also determine the number of concurrent users that your application can support and determine the number of users that your credential store (database or Active Directory) can handle. You should perform capacity planning for your application to determine if the system can handle the anticipated load.

• Protecting credentials. Your decision to encrypt and decrypt credentials when they are sent over the network costs additional processing cycles. If you use authentication schemes such as Forms authentication or SQL authentication, credentials flow in clear text and can be accessed by network eavesdroppers. In these cases, how important is it for you to protect them as they are passed across the network? Decide if you can choose authentication schemes that are provided by the operating system, such as NTLM or the Kerberos protocol, where credentials are not sent over the network to avoid encryption overhead.

• Cryptography. If your application only needs to ensure that information is not tampered with during transit, you can use keyed hashing. Encryption is not required in this case, and it is relatively expensive compared to hashing. If you need to hide the data that you send over the network, you require encryption and probably keyed hashing to ensure data validity. When both parties can share the keys, using symmetric encryption provides improved performance in comparison to asymmetric encryption. Although larger key sizes provide greater encryption strength, performance is slower relative to smaller key sizes. You must consider this type of performance and balance the larger key sizes against security tradeoffs at design time.

More Information

For more information, see "Performance Comparison: Security Design Choices" on MSDN at .

Partition Your Application Logically

Use layering to logically partition your application logic into presentation, business, and data access layers. This helps you create maintainable code, but it also permits you to monitor and optimize the performance of each layer separately. A clear logical separation also offers more choices for scaling your application. Try to reduce the amount of code in your code-behind files to improve maintenance and scalability.

Do not confuse logical partitioning with physical deployment. A logical separation enables you to decide whether to locate presentation and business logic on the same server and clone the logic across servers in a Web farm, or to decide to install the logic on servers that are physically separate. The key point to remember is that remote calls incur a latency cost, and that latency increases as the distance between the layers increases.

For example, in-process calls are the quickest calls, followed by cross-process calls on the same computer, followed by remote network calls. If possible, try to keep the logical partitions close to each other. For optimum performance you should place your business and data access logic in the Bin directory of your application on the Web server.

For more information about these and other deployment issues, see "Deployment Considerations" later in this chapter.

Evaluate Affinity

Affinity can improve performance. However, affinity may affect your ability to scale. Common coding practices that introduce resource affinity include the following:

• Using in-process session state. To avoid server affinity, maintain session state out of process in a SQL Server database or use the out-of-process state service running on a remote machine. Alternatively, design a stateless application, or store state on the client and pass it with each request.

• Using computer-specific encryption keys. Using computer-specific encryption keys to encrypt data in a database prevents your application from working in a Web farm because common encrypted data needs to be accessed by multiple Web servers. A better approach is to use computer-specific keys to encrypt a shared symmetric key. You use the shared symmetric key to store encrypted data in the database.
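To avoid the session state affinity described above, a web.config fragment along the following lines can be used; server names and connection strings are placeholders, and the two elements are alternatives (use one or the other):

```xml
<!-- Out-of-process session state using the ASP.NET state service: -->
<sessionState mode="StateServer"
              stateConnectionString="tcpip=StateServerName:42424" />

<!-- Or, SQL Server-backed session state: -->
<sessionState mode="SQLServer"
              sqlConnectionString="data source=SqlServerName;Integrated Security=SSPI" />
```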

More Information

For more information about how to encrypt and decrypt data in a shared database, without introducing affinity, see Chapter 14, "Building Secure Data Access," in Improving Web Application Security: Threats and Countermeasures on MSDN at .
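As a sketch of the shared-key approach described above, a computer-specific key (DPAPI, via the .NET 2.0 ProtectedData class) can protect a shared symmetric key that all Web servers use for the database; the storage step is omitted and all names are illustrative:

```csharp
using System.Security.Cryptography;

// Generate the shared symmetric key that every server in the farm will use.
byte[] sharedKey = new byte[32];
new RNGCryptoServiceProvider().GetBytes(sharedKey);

// Protect the shared key with this machine's key before storing it locally.
// Each server stores its own protected copy; the data in the database is
// encrypted with the shared key, so any server can decrypt it.
byte[] protectedKey = ProtectedData.Protect(
    sharedKey, null, DataProtectionScope.LocalMachine);
```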

Reduce Round Trips

Use the following techniques and features in ASP.NET to minimize the number of round trips between a Web server and a browser, and between a Web server and a downstream system:

• HttpResponse.IsClientConnected. Consider using the HttpResponse.IsClientConnected property to verify if the client is still connected before processing a request and performing expensive server-side operations. However, this call may need to go out of process on IIS 5.0 and can be very expensive. If you use it, measure whether it actually benefits your scenario.

• Caching. If your application is fetching, transforming, and rendering data that is static or nearly static, you can avoid redundant hits by using caching.

• Output buffering. Reduce roundtrips when possible by buffering your output. This approach batches work on the server and avoids chatty communication with the client. The downside is that the client does not see any rendering of the page until it is complete. You can use the Response.Flush method. This method sends output up to that point to the client. Note that clients that connect over slow networks where buffering is turned off, affect the response time of your server. The response time of your server is affected because your server needs to wait for acknowledgements from the client. The acknowledgements from the client occur after the client receives all the content from the server.

• Server.Transfer. Where possible, use the Server.Transfer method instead of the Response.Redirect method. Response.Redirect sends a response header to the client that causes the client to send a new request to the redirected server by using the new URL. Server.Transfer avoids this level of indirection by simply making a server-side call.

You cannot always just replace Response.Redirect calls with Server.Transfer calls because Server.Transfer uses a new handler during the handler phase of request processing. If you need authentication and authorization checks during redirection, use Response.Redirect instead of Server.Transfer because the two mechanisms are not equivalent. When you use Response.Redirect, ensure you use the overloaded method that accepts a Boolean second parameter, and pass a value of false to ensure an internal exception is not raised.

Also note that you can only use Server.Transfer to transfer control to pages in the same application. To transfer to pages in other applications, you must use Response.Redirect.
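A minimal sketch of the two calls discussed above, with placeholder page names:

```csharp
// Server-side transfer to a page in the same application; no extra
// round trip to the client is incurred.
Server.Transfer("Results.aspx");

// When a redirect is required, pass false as the second parameter so
// that Response.End is not called and no internal exception is raised.
Response.Redirect("Login.aspx", false);
```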

More Information

For more information, see Knowledge Base article 312629, "PRB: ThreadAbortException Occurs If You Use Response.End, Response.Redirect, or Server.Transfer," at .

Avoid Blocking on Long-Running Tasks

If you run long-running or blocking operations, consider using the following asynchronous mechanisms to free the Web server to process other incoming requests:

• Use asynchronous calls to invoke Web services or remote objects when there is an opportunity to perform additional parallel processing while the Web service call proceeds. Where possible, avoid synchronous (blocking) calls to Web services because outgoing Web service calls are made by using threads from the thread pool. Blocking calls reduce the number of available threads for processing other incoming requests.

For more information, see "Avoid Asynchronous Calls Unless You Have Additional Parallel Work" later in this chapter.

• Consider using the OneWay attribute on Web methods or remote object methods if you do not need a response. This "fire and forget" model allows the Web server to make the call and continue processing immediately. This choice may be an appropriate design choice for some scenarios.

• Queue work, and then poll for completion from the client. This permits the Web server to invoke code and then let the Web client poll the server to confirm that the work is complete.
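As a sketch of the "fire and forget" model mentioned above, a one-way Web method might look like the following; the method name and logging helper are hypothetical:

```csharp
using System.Web.Services;
using System.Web.Services.Protocols;

[WebMethod]
[SoapDocumentMethod(OneWay = true)]
public void LogActivity(string message)
{
    // The caller returns as soon as the request is received;
    // no SOAP response is sent back to the client.
    WriteToLog(message);  // hypothetical helper
}
```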

More Information

For more information about how to implement these mechanisms, see "Threading Guidelines" later in this chapter.

Use Caching

A well-designed caching strategy is probably the single most important performance-related design consideration. ASP.NET caching features include output caching, partial page caching, and the cache API. Design your application to take advantage of these features.

Caching can be used to reduce the cost of data access and rendering output. Knowing how your pages use or render data enables you to design efficient caching strategies. Caching is particularly useful when your Web application constantly relies on data from remote resources such as databases, Web services, remote application servers, and other remote resources. Applications that are database intensive may benefit from caching by reducing the load on the database and by increasing the throughput of the application. As a general rule, if caching is cheaper than the equivalent processing, you should use caching. Consider the following when you design for caching:

• Identify data or output that is expensive to create or retrieve. Caching data or output that is expensive to create or retrieve can reduce the costs of obtaining the data. Caching the data reduces the load on your database server.

• Evaluate the volatility. For caching to be effective, the data or output should be static or infrequently modified. Lists of countries, states, or zip codes are some simple examples of the type of data that you might want to cache. Data or output that changes frequently is usually less suited to caching but can be manageable, depending upon the need. Caching user data is typically only recommended when you use specialized caches, such as the session state store.

• Evaluate the frequency of use. Caching data or output that is frequently used can provide significant performance and scalability benefits. You can obtain performance and scalability benefits when you cache static or frequently modified data and output alike. For example, frequently used, expensive data that is modified on a periodic basis may still provide large performance and scalability improvements when managed correctly. If the data is used more often than it is updated, the data is a candidate for caching.

• Separate volatile data from nonvolatile data. Design user controls to encapsulate static content such as navigational aids or help systems, and keep them separate from more volatile data. This permits them to be cached. Caching this data decreases the load on your server.

• Choose the right caching mechanism. There are many different ways to cache data. Depending on your scenario, some are better than others. User-specific data is typically stored in the Session object. Static pages and some types of dynamic pages such as non-personalized pages that are served to large user sets can be cached by using the output cache and response caching. Static content in pages can be cached by using a combination of the output cache and user controls. The ASP.NET caching features provide a built-in mechanism to update the cache. Application state, session state, and other caching means do not provide a built-in mechanism to update the cache.
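The cache-aside pattern implied by the guidelines above can be sketched with the cache API; the cache key, expiration, and data-loading helper are illustrative:

```csharp
using System;
using System.Data;
using System.Web.Caching;

// Check the cache first; load and cache the data only on a miss.
DataSet states = (DataSet)Cache["StateList"];
if (states == null)
{
    states = LoadStateListFromDatabase();  // hypothetical, expensive call
    Cache.Insert("StateList", states, null,
                 DateTime.Now.AddHours(12),     // absolute expiration
                 Cache.NoSlidingExpiration);
}
```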

Avoid Unnecessary Exceptions

Exceptions add significant overhead to your application. Do not use exceptions to control logic flow, and design your code to avoid exceptions where possible. For example, validate user input, and check for known conditions that can cause exceptions. Also, design your code to fail early to avoid unnecessary processing.

If your application does not handle an exception, it propagates up the stack and is ultimately handled by the exception handler. When you design your exception handling strategy, consider the following:

• Design code to avoid exceptions. Validate user input and check for known conditions that can cause exceptions. Design code to avoid exceptions.

• Avoid using exceptions to control logic flow. Avoid using exception management to control regular application logic flow.

• Avoid relying on global handlers for all exceptions. Exceptions cause the runtime to manipulate and walk the stack. The further the runtime traverses the stack searching for an exception handler, the more expensive the exception is to process.

• Catch and handle exceptions close to where they occur. When possible, catch and handle exceptions close to where they occur. This avoids excessive and expensive stack traversal and manipulation.

• Do not catch exceptions you cannot handle. If your code cannot handle an exception, use a try/finally block to ensure that you close resources, regardless of whether an exception occurs. When you use a try/finally block, your resources are cleaned up in the finally block if an exception occurs, and the exception is permitted to propagate up to an appropriate handler.

• Fail early to avoid expensive work. Design your code to avoid expensive or long-running work if a dependent task fails.

• Log exception details for administrators. Implement an exception logging mechanism that captures detailed information about exceptions so that administrators and developers can identify and remedy any issues.

• Avoid showing too much exception detail to users. Avoid displaying detailed exception information to users, to help maintain security and to reduce the amount of data that is sent to the client.
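As a sketch of "design code to avoid exceptions", input can be validated up front instead of catching a FormatException; the control and helper names are hypothetical:

```csharp
// Validate the input rather than letting int.Parse throw.
string quantityText = txtQuantity.Text;
int quantity;
if (!int.TryParse(quantityText, out quantity))
{
    ShowValidationError("Please enter a whole number.");  // hypothetical helper
    return;  // fail early; skip the expensive work below
}
```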

Implementation Considerations

When you move from application design to application development, consider the technical details of your application. Key performance measures include response times, speed of throughput, and resource management.

You can improve response times by reducing page sizes, reducing your reliance on server controls, and using buffering to reduce chatty communication with the client. You can avoid unnecessary work by caching resources.

Throughput can be improved by making effective use of threads. Tune the thread pool to reduce connections, and to avoid blocking threads because blocking threads reduce the number of available worker threads.

Poor resource management can place excessive loads on server CPU and memory. You can improve resource utilization by effectively using pooled resources, by explicitly closing or disposing resources you open, and by using efficient string management.

When you follow best practice implementation guidelines, you increase the performance of your application by using well-engineered code and a well-configured application platform. The following sections describe performance considerations for features and scenarios.

Threading Explained

ASP.NET processes requests by using threads from the .NET thread pool. The thread pool maintains a pool of threads that have already incurred the thread initialization costs. Therefore, these threads are easy to reuse. The .NET thread pool is also self-tuning. It monitors CPU and other resource utilization, and it adds new threads or trims the thread pool size as needed. You should generally avoid creating threads manually to perform work. Instead, use threads from the thread pool. At the same time, it is important to ensure that your application does not perform lengthy blocking operations that could quickly lead to thread pool starvation and rejected HTTP requests.

Formula for Reducing Contention

The formula for reducing contention can give you a good empirical start for tuning the thread pool. Consider using the Microsoft product group-recommended settings that are shown in Table 6.1 if the following conditions are true:

• You have available CPU.

• Your application performs I/O bound operations such as calling a Web method or accessing the file system.

• The Applications/Requests In Application Queue performance counter indicates that you have queued requests.

Table 6.1: Recommended Threading Settings for Reducing Contention

|Configuration setting      |Default value (.NET Framework 1.1)|Recommended value|
|maxconnection              |2                                 |12 * #CPUs       |
|maxIoThreads               |20                                |100              |
|maxWorkerThreads           |20                                |100              |
|minFreeThreads             |8                                 |88 * #CPUs       |
|minLocalRequestFreeThreads |4                                 |76 * #CPUs       |

To address this issue, you need to configure the following items in the Machine.config file. Apply the recommended changes that are described in the following section, across the settings and not in isolation. For a detailed description of each of these settings, see "Thread Pool Attributes" in Chapter 17, "Tuning .NET Application Performance."

• Set maxconnection to 12 * # of CPUs. This setting controls the maximum number of outgoing HTTP connections that you can initiate from a client. In this case, ASP.NET is the client. Set maxconnection to 12 * # of CPUs.

• Set maxIoThreads to 100. This setting controls the maximum number of I/O threads in the .NET thread pool. This number is automatically multiplied by the number of available CPUs. Set maxIoThreads to 100.

• Set maxWorkerThreads to 100. This setting controls the maximum number of worker threads in the thread pool. This number is then automatically multiplied by the number of available CPUs. Set maxWorkerThreads to 100.

• Set minFreeThreads to 88 * # of CPUs. This setting is used by the worker process to queue all the incoming requests if the number of available threads in the thread pool falls below the value for this setting. This setting effectively limits the number of requests that can run concurrently to maxWorkerThreads – minFreeThreads. Set minFreeThreads to 88 * # of CPUs. This limits the number of concurrent requests to 12 per CPU (assuming maxWorkerThreads is 100).

• Set minLocalRequestFreeThreads to 76 * # of CPUs. This setting is used by the worker process to queue requests from localhost (where a Web application sends requests to a local Web service) if the number of available threads in the thread pool falls below this number. This setting is similar to minFreeThreads but it only applies to localhost requests from the local computer. Set minLocalRequestFreeThreads to 76 * # of CPUs.
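Taken together, the settings above can be sketched as the following Machine.config fragments (shown here for a single-CPU server; recalculate the per-CPU values for your hardware):

```xml
<system.net>
  <connectionManagement>
    <!-- maxconnection = 12 * #CPUs -->
    <add address="*" maxconnection="12" />
  </connectionManagement>
</system.net>

<system.web>
  <!-- Thread pool maximums; multiplied by the CPU count automatically. -->
  <processModel maxWorkerThreads="100" maxIoThreads="100" />
  <!-- minFreeThreads = 88 * #CPUs, minLocalRequestFreeThreads = 76 * #CPUs -->
  <httpRuntime minFreeThreads="88" minLocalRequestFreeThreads="76" />
</system.web>
```

With these values, the effective concurrency limit is 100 * #CPUs – 88 * #CPUs = 12 requests per CPU.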

Note   The recommendations that are provided in this section are not rules. They are a starting point. Test to determine the appropriate settings for your scenario. If you move your application to a new computer, ensure that you recalculate and reconfigure the settings based on the number of CPUs in the new computer.

If your ASPX Web page makes multiple calls to Web services on a per-request basis, apply the recommendations.

The recommendation to limit the runtime to 12 threads for handling incoming requests is most applicable for quick-running operations. The limit also reduces the number of context switches. If your application makes long-running calls, first consider the design alternatives presented in the "Avoid Blocking on Long-Running Tasks" section. If the alternative designs cannot be applied in your scenario, start with 100 maxWorkerThreads, and keep the defaults for minFreeThreads. This ensures that requests are not serialized in this particular scenario. Next, if you see high CPU utilization and context-switching when you test your application, test by reducing maxWorkerThreads or by increasing minFreeThreads.

The following occurs if the formula has worked:

• CPU utilization increases.

• Throughput increases according to the Applications\Requests/Sec performance counter.

• Requests in the application queue decrease according to the Applications/Requests In Application Queue performance counter.

If using the recommended settings does not improve your application performance, you may have a CPU-bound scenario. By adding more threads you increase thread context switching. For more information, see "ASP.NET Tuning" in Chapter 17, "Tuning .NET Application Performance."

More Information

For more information, see Knowledge Base article 821268, "PRB: Contention, Poor Performance, and Deadlocks When You Make Web Service Requests from ASP.NET Applications," at .

Threading Guidelines

This section discusses guidelines that you can use to help improve threading efficiency in ASP.NET. The guidelines include the following:

• Tune the thread pool by using the formula to reduce contention.

• Consider minIoThreads and minWorkerThreads for burst load.

• Do not create threads on a per-request basis.

• Avoid blocking threads.

• Avoid asynchronous calls unless you have additional parallel work.

Tune the Thread Pool by Using the Formula to Reduce Contention

If you have available CPU and if requests are queued, configure the thread pool. For more information about how to do this, see "Formula for Reducing Contention" in the preceding "Threading Explained" section. The recommendations in "Threading Explained" are a starting point.

When your application uses the common language runtime (CLR) thread pool, it is important to tune the thread pool correctly. Otherwise, you may experience contention issues, performance problems, or possible deadlocks. Your application may be using the CLR thread pool if the following conditions are true:

• Your application makes Web service calls.

• Your application uses the WebRequest or HttpWebRequest classes to make outgoing Web requests.

• Your application explicitly queues work to the thread pool by calling the QueueUserWorkItem method.

More Information

For more information, see Knowledge Base article 821268, "PRB: Contention, Poor Performance, and Deadlocks When You Make Web Service Requests from ASP.NET Applications," at .

Consider minIoThreads and minWorkerThreads for Burst Load

If your application experiences burst loads where there are prolonged periods of inactivity between the burst loads, the thread pool may not have enough time to reach the optimal level of threads. A burst load occurs when a large number of users connect to your application suddenly and at the same time. The minIoThreads and minWorkerThreads settings enable you to configure a minimum number of worker threads and I/O threads for load conditions.

At the time of this writing, you need a supported fix to configure these settings. For more information, see the following Knowledge Base articles:

• 810259, "FIX: SetMinThreads and GetMinThreads API Added to Common Language Runtime ThreadPool Class," at

• 827419, "PRB: Sudden Requirement for a Larger Number of Threads from the ThreadPool Class May Result in Slow Computer Response Time," at

Do Not Create Threads on a Per-Request Basis

Creating threads is an expensive operation that requires initialization of both managed and unmanaged resources. You should avoid manually creating threads on each client request for server-based applications such as ASP.NET applications and Web services.

Consider using asynchronous calls if you have work that is not CPU bound that can run in parallel with the call. For example, this might include disk I/O bound or network I/O bound operations such as reading or writing files, or making calls to another Web method.

You can use the infrastructure provided by the .NET Framework to perform asynchronous operations by calling the Begin and End method pair (for example, BeginRead and EndRead, where Read is the name of the corresponding synchronous method). If this asynchronous calling pattern is not an option, then consider using threads from the CLR thread pool. The following code fragment shows how you queue a method to run on a separate thread from the thread pool.

WaitCallback methodTarget = new WaitCallback(myClass.UpdateCache);
bool isQueued = ThreadPool.QueueUserWorkItem(methodTarget);

Avoid Blocking Threads

Any operation that you perform from an ASP.NET page that causes the current request thread to block means that one less worker thread from the thread pool is available to service other requests. Avoid blocking threads.

Avoid Asynchronous Calls Unless You Have Additional Parallel Work

Make asynchronous calls from your Web application only when your application has additional parallel work to perform while it waits for the completion of the asynchronous calls, and the work performed by the asynchronous call is not CPU bound. Internally, the asynchronous calls use a worker thread from the thread pool; in effect, you are using additional threads.

At the same time that you make asynchronous I/O calls, such as calling a Web method or performing file operations, the thread that makes the call is released so that it can perform additional work, such as making other asynchronous calls or performing other parallel tasks. You can then wait for completion of all of those tasks. Making several asynchronous calls that are not CPU bound and then letting them run simultaneously can improve throughput.
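The pattern above can be sketched as follows; the Web service proxy type (ReportService) and its generated Begin/End methods are hypothetical:

```csharp
// Start the I/O-bound call; the thread is free while the call proceeds.
ReportService proxy = new ReportService();
IAsyncResult ar = proxy.BeginGetSalesReport(null, null);

// Perform additional parallel work on this thread while the call runs.
RenderPageHeader();  // hypothetical parallel task

// Block only when the result is actually needed.
SalesReport report = proxy.EndGetSalesReport(ar);
```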

More Information

For more information about threading and asynchronous communication, see "ASP.NET Pipeline: Use Threads and Build Asynchronous Handlers in Your Server-Side Web Code" at .

Resource Management

Poor resource management from ASP.NET pages and controls is one of the primary causes of poor Web application performance. Poor resource management can place excessive loads on CPUs and can consume vast amounts of memory. When CPU or memory thresholds are exceeded, applications might be recycled or blocked until the load on the server is lower. For more information, see "Resource Management" in Chapter 3, "Design Guidelines for Application Performance." Use the following guidelines to help you manage your resources efficiently:

• Pool resources.

• Explicitly call Dispose or Close on resources you open.

• Do not cache or block on pooled resources.

• Know your application allocation pattern.

• Obtain resources late and release them early.

• Avoid per-request impersonation.

Pool Resources

ADO.NET provides built-in database connection pooling that is fully automatic and requires no specific coding. Make sure that you use the same connection string for every request to access the database.

Make sure you release pooled resources so that they can be returned to the pool as soon as possible. Do not cache pooled resources or make lengthy blocking calls while you own the pooled resource, because this means that other clients cannot use the resource in the meantime. Also, avoid holding objects across multiple requests.

Explicitly Call Dispose or Close on Resources You Open

If you use objects that implement the IDisposable interface, make sure you call the Dispose method of the object or the Close method if one is provided. Failing to call Close or Dispose prolongs the life of the object in memory long after the client stops using it. This defers the cleanup and can contribute to memory pressure. Database connections and files are examples of shared resources that should be explicitly closed. The finally clause of the try/finally block is a good place to ensure that the Close or Dispose method of the object is called. This technique is shown in the following Visual Basic® .NET code fragment.

Try
    conn.Open()
    …
Finally
    If Not (conn Is Nothing) Then
        conn.Close()
    End If
End Try

In Visual C#®, you can wrap resources that should be disposed in a using block. When the using block completes, Dispose is called on the object declared in the parentheses of the using statement. The following code fragment shows how you can wrap resources that should be disposed by using a using block.

SqlConnection conn = new SqlConnection(connString);
using (conn)
{
    conn.Open();
    . . .
} // Dispose is automatically called on the connection object conn here.

More Information

For more information, see "Finalize and Dispose Guidelines" in Chapter 5, "Improving Managed Code Performance." Also, see "Explicitly Close Connections" in Chapter 12, "Improving ADO.NET Performance."

Do Not Cache or Block on Pooled Resources

If your application uses resources that are pooled, release the resource back to the pool. Caching the pooled resources or making blocking calls from a pooled resource reduces the availability of the pooled resource for other users. Pooled resources include database connections, network connections, and Enterprise Services pooled objects.

Know Your Application Allocation Pattern

Poor memory allocation patterns may cause the garbage collector to spend most of its time collecting objects from Generation 2. Collecting objects from Generation 2 leads to poor application performance and high loads on the CPU.

Coding techniques that cause large numbers of temporary allocations during a short interval put pressure on the garbage collector. For example, when you perform a large number of string concatenation operations by using the += operator in a tight loop, or when you use String.Split for every request, you may put pressure on the garbage collector. All of these operations create hidden objects (temporary allocations). Use tools such as the CLR Profiler and System Monitor to better understand allocation patterns in your application.
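To illustrate the hidden allocations described above, the following sketch contrasts += concatenation in a tight loop with a StringBuilder; the element count and separator are arbitrary:

```csharp
using System;
using System.Text;

class ConcatDemo
{
    // Allocates a new string on every iteration; each intermediate
    // string immediately becomes garbage for the collector.
    public static string ConcatWithOperator(int count)
    {
        string s = "";
        for (int i = 0; i < count; i++)
        {
            s += i.ToString() + ";";
        }
        return s;
    }

    // Appends into a single growable buffer; far fewer temporary allocations.
    public static string ConcatWithStringBuilder(int count)
    {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < count; i++)
        {
            sb.Append(i);
            sb.Append(';');
        }
        return sb.ToString();
    }

    static void Main()
    {
        Console.WriteLine(ConcatWithOperator(3));       // 0;1;2;
        Console.WriteLine(ConcatWithStringBuilder(3));  // 0;1;2;
    }
}
```

Both methods produce the same string; only the allocation pattern differs, which is what shows up under the CLR Profiler.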

More Information

For more information, see "How To: Use CLR Profiler" in the "How To" section of this guide. Also, see "CLR and Managed Code" in Chapter 15, "Measuring .NET Application Performance."

For more information about the mechanics of garbage collection and generations, see "Garbage Collection Explained" in Chapter 5, "Improving Managed Code Performance."

Obtain Resources Late and Release Them Early

Open critical, limited, and shared resources just before you need them, and release them as soon as you can. Critical, limited, and shared resources include resources such as database connections, network connections, and transactions.

Avoid Per-Request Impersonation

Identify and, if necessary, authorize the caller at the Web server. Obtain access to system resources or application-wide resources by using the identity of the Web application process or by using a fixed service account. System resources are resources such as event logs. Application-wide resources are resources such as databases. Avoiding per-request impersonation minimizes security overhead and maximizes resource pooling.

Note   Impersonation on its own does not cause performance issues. However, impersonation often prevents efficient resource pooling. This is a common cause of performance and scalability problems.

Pages

The efficiency of your page and code-behind page logic plays a large part in determining the overall performance of your Web application. The following guidelines relate to the development of individual .aspx and .ascx Web page files.

• Trim your page size.

• Enable buffering.

• Use Page.IsPostBack to minimize redundant processing.

• Partition page content to improve caching efficiency and reduce rendering.

• Ensure pages are batch compiled.

• Ensure debug is set to false.

• Optimize expensive loops.

• Consider using Server.Transfer instead of Response.Redirect.

• Use client-side validation.

Trim Your Page Size

Processing large page sizes increases the load on the CPU, increases the consumption of network bandwidth, and increases the response times for clients. Avoid designing and developing large pages that accomplish multiple tasks, particularly where only a few tasks are normally executed for each request. Where possible, logically partition your pages.

To trim your page size, you can do one or all of the following:

• Use script includes for any static scripts in your page to enable the client to cache these scripts for subsequent requests. The following script element shows how to do this.
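The script element referred to in this bullet is not shown in this copy; a typical include looks like the following sketch (the src path is a placeholder):

```html
<script src="scripts/myscript.js" type="text/javascript"></script>
```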

• Remove characters such as tabs and spaces that create white space before you send a response to the client. Removing white spaces can dramatically reduce the size of your pages. The following sample table contains white spaces.

// with white space
<table>
  <tr>
    <td>hello</td>
    <td>world</td>
  </tr>
</table>

The following sample table does not contain white spaces.

// without white space
<table><tr><td>hello</td><td>world</td></tr></table>

Save these two tables in separate text files by using Notepad, and then view the size of each file. The second table saves several bytes simply by removing the white space. If you had a table with 1,000 rows, you could significantly reduce the page size, and therefore the response time, just by removing the white space. In intranet scenarios, removing white space may not represent a huge saving. However, in an Internet scenario that involves slow clients, removing white space can improve response times dramatically. You can also consider HTTP compression; however, HTTP compression affects CPU utilization.

You cannot always expect to design your pages in this way. Therefore, the most effective method for removing the white space is to use an Internet Server API (ISAPI) filter or an HttpModule object. An ISAPI filter is faster than an HttpModule; however, the ISAPI filter is more complex to develop and increases CPU utilization. You might also consider IIS compression. IIS compression can be added by using a metabase entry.

Additionally, you can trim page size in the following ways:

• Disable view state when you do not need it. For more information, see "View State" later in this chapter.

• Limit the use of graphics, and consider using compressed graphics.

• Consider using cascading style sheets to avoid sending the same formatting directives to the client repeatedly.

• Avoid long control names, especially ones that are repeated in a DataGrid or Repeater control. Control names are used to generate unique HTML ID names. A 10-character control name can easily turn into 30 to 40 characters when it is used inside nested controls that are repeated.

Note   When you use the ASP.NET process model, the worker process sends responses back to the client through IIS in 31-kilobyte (KB) chunks. This applies to .NET Framework 1.1, but it could change in future versions. The more 31-KB chunks that ASP.NET has to send through IIS, the slower your page runs. You can determine how many chunks ASP.NET requires for your page by browsing the page, viewing the source, and then saving the file to disk. To determine the number of chunks, divide the page size by 31.

More Information

For more information about IIS compression, see Knowledge Base article 322603, "HOW TO: Enable ASPX Compression in IIS."

Enable Buffering

Because buffering is enabled by default, ASP.NET batches work on the server and avoids chatty communication with the client. The disadvantage to this approach is that for a slow page, the client does not see any rendering of the page until it is complete. You can use Response.Flush to mitigate this situation because Response.Flush sends output up to that point to the client. Clients that connect over slow networks affect the response time of your server because your server has to wait for acknowledgements from the client to proceed. Because the headers are sent with the first flush, there is no chance to modify them later.

If buffering is turned off, you can enable buffering by using the following methods:

• Enable buffering programmatically in a page.

// Response.Buffer is available for backwards compatibility; do not use.
Response.BufferOutput = true;

• Enable buffering at the page level by using the @Page directive.

• Enable buffering at the application or computer level by using the <pages> element in the Web.config or Machine.config file.
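The directive and configuration element mentioned in the last two bullets are not shown in this copy; they typically look like the following sketches (attribute values assumed):

```aspx
<%@ Page Buffer="true" %>
```

```xml
<configuration>
  <system.web>
    <pages buffer="true" />
  </system.web>
</configuration>
```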

When you run your application by using the ASP.NET process model, it is even more important to have buffering enabled. The worker process first sends responses to IIS in the form of response buffers. After the ASP.NET ISAPI filter is running, IIS receives the response buffers. These response buffers are 31 KB in size. After IIS receives the response buffers, it then sends the actual response back to the client. With buffering disabled, instead of using the entire 31-KB buffer, ASP.NET can only send a few characters to the buffer. This causes extra CPU processing in both ASP.NET and IIS. This may also cause memory consumption in the IIS process to increase dramatically.

Use Page.IsPostBack to Minimize Redundant Processing

Use the Page.IsPostBack property to ensure that you only perform page initialization logic when a page is first loaded and not in response to client postbacks. The following code fragment shows how to use the Page.IsPostBack property.

if (Page.IsPostBack == false)
{
    // Initialization logic
}
else
{
    // Client post-back logic
}

Partition Page Content to Improve Caching Efficiency and Reduce Rendering

Partition the content in your page to increase caching potential. Partitioning your page content enables you to make different decisions about how you retrieve, display, and cache the content. You can use user controls to segregate static content, such as navigational items, menus, advertisements, copyrights, page headers, and page footers. You should also separate dynamic content and user-specific content for maximum flexibility when you want to cache content.

More Information

For more information, see "Partial Page or Fragment Caching" later in this chapter.

Ensure Pages Are Batch Compiled

As the number of assemblies that are loaded in a process grows, the virtual address space can become fragmented. When the virtual address space is fragmented, out-of-memory conditions are more likely to occur. To prevent a large number of assemblies from loading in a process, ASP.NET tries to compile all pages that are in the same directory into a single assembly. This occurs when the first request for a page in that directory occurs. Use the following techniques to reduce the number of assemblies that are not batch compiled:

• Do not mix multiple languages in the same directory. When multiple languages such as C# or Visual Basic .NET are used in pages in the same directory, ASP.NET compiles a separate assembly for each language.

• Ensure content updates do not cause additional assemblies to be loaded. For more information, see "Deployment Considerations" later in this chapter.

• Ensure that the debug attribute is set to false at the page level and in the Web.config file, as described in the following section.

Ensure Debug Is Set to False

When debug is set to true, the following occurs:

• Pages are not batch compiled.

• Pages do not time out. When a problem occurs, such as a problem with a Web service call, the Web server may start to queue requests and stop responding.

• Additional files are generated in the Temporary ASP.NET Files folder.

• The System.Diagnostics.DebuggableAttribute attribute is added to generated code. This causes the CLR to track extra information about generated code, and it also disables certain optimizations.

Before you run performance tests and before you move your application into production, be sure that debug is set to false in the Web.config file and at the page level. By default, debug is set to false at the page level. If you do need to set this attribute during development time, it is recommended that you set it at the Web.config file level, as shown in the following fragment.
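The configuration fragment referred to above does not appear in this copy; a typical sketch is:

```xml
<configuration>
  <system.web>
    <compilation debug="false" />
  </system.web>
</configuration>
```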

The following shows how to set debug to false at the page level.
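The page directive referred to above is not shown in this copy; a typical sketch is:

```aspx
<%@ Page Debug="false" %>
```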

Note   A common pitfall is to set this attribute at the page level during development and then forget to set it back when the application is moved to production.

Optimize Expensive Loops

Expensive loops in any application can cause performance problems. To reduce the overhead that is associated with code inside loops, you should follow these recommendations:

• Avoid repetitive field or property access.

• Optimize code inside the loop.

• Copy frequently called code into the loop.

• Replace recursion with looping.

• Use For instead of ForEach in performance-critical code paths.
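As a minimal sketch of the first and last recommendations (avoiding repetitive property access and preferring For over ForEach), consider the following; the gains depend on the JIT compiler and are often small, so measure before and after:

```csharp
using System;

class LoopDemo
{
    // Reads the Length property once instead of on every iteration,
    // and uses a plain for loop rather than foreach over the array.
    public static int SumOptimized(int[] values)
    {
        int sum = 0;
        int length = values.Length; // hoisted property access
        for (int i = 0; i < length; i++)
        {
            sum += values[i];
        }
        return sum;
    }

    static void Main()
    {
        Console.WriteLine(SumOptimized(new int[] { 1, 2, 3 })); // 6
    }
}
```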

More Information

For more information about the recommendations in this section, see "Iterating and Looping" in Chapter 5, "Improving Managed Code Performance."

Consider Using Server.Transfer Instead of Response.Redirect

Response.Redirect sends a redirect response to the client that makes the client send a new request to the server by using the new URL. Server.Transfer avoids this round trip by making a server-side call. When you use Server.Transfer, the URL in the browser does not change, and load test tools may incorrectly report the page size because different pages are rendered for the same URL.

The Server.Transfer, Response.Redirect, and Response.End methods all raise ThreadAbortException exceptions because they internally call Response.End, and the call to Response.End raises this exception. For Response.Redirect, consider using the overloaded method and passing false as the second parameter to suppress the internal call to Response.End.
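For example, the Response.Redirect overload that accepts a Boolean endResponse parameter can be called as follows (the target URL is a placeholder):

```csharp
// Passing false suppresses the internal call to Response.End,
// which avoids the ThreadAbortException.
Response.Redirect("newpage.aspx", false);
```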

More Information

For more information, see Knowledge Base article 312629, "PRB: ThreadAbortException Occurs If You Use Response.End, Response.Redirect, or Server.Transfer."

Use Client-Side Validation

Prevalidating data can help reduce the round trips that are required to process a user's request. In ASP.NET, you can use validation controls to implement client-side validation of user input.
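As an illustrative sketch (control names and messages are hypothetical), a validator control that performs client-side validation of a required field looks like this:

```aspx
<asp:TextBox id="txtName" runat="server" />
<asp:RequiredFieldValidator id="valName" runat="server"
    ControlToValidate="txtName"
    ErrorMessage="Name is required."
    EnableClientScript="true" />
```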

Note   Ensure that you also use server-side validation for security reasons.

More Information

For more information on validation controls, see the following:

• "Web Forms Validation" in Visual Basic and Visual C# Concepts on MSDN

• Knowledge Base article 316662, "HOW TO: Use Validation Controls from Visual Basic .NET"

Server Controls

You can use server controls to encapsulate and to reuse common functionality. Server controls provide a clean programming abstraction and are the recommended way to build applications. When server controls are used properly, they can improve output caching and code maintenance. The main areas you should review for performance optimizations are view state and control composition. Use the following guidelines when you develop server controls:

• Identify the use of view state in your server controls.

• Use server controls where appropriate.

• Avoid creating deep hierarchies of controls.

Identify the Use of View State in Your Server Controls

View state is serialized and deserialized on the server. To save CPU cycles, reduce the amount of view state that your application uses. Disable view state if you do not need it. Disable view state if you are doing at least one of the following:

• Displaying a read-only page where there is no user input

• Displaying a page that does not post back to the server

• Rebuilding server controls on each post back without checking the postback data
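View state can be disabled per control or per page; the following sketch shows both forms (control names are illustrative):

```aspx
<%-- Per control --%>
<asp:DataGrid id="grid" runat="server" EnableViewState="false" />

<%-- Per page --%>
<%@ Page EnableViewState="false" %>
```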

More Information

For more information about view state, see "View State" later in this chapter.

Use Server Controls Where Appropriate

The HTTP protocol is stateless; however, server controls provide a rich programming model that manages state between page requests by using view state. Server controls require a fixed amount of processing to establish the control and all of its child controls. This makes server controls relatively expensive compared to HTML controls or possibly static text. Scenarios where server controls are expensive include the following:

• Large payload over low bandwidth. The more controls that you have on a page, the higher the network payload is. Therefore, multiple controls increase the time to last byte (TTLB) and the time to first byte (TTFB) for the response that is sent to the client. When the bandwidth between client and server is limited, as is the case when a client uses a low-speed dial-up connection, pages that carry a large view state payload can significantly affect performance.

• View state overhead. View state is serialized and deserialized on the server. The CPU effort is proportional to the view state size. In addition to server controls that use view state, it is easy to programmatically add any object that can be serialized to the view state property. However, adding objects to the view state adds to the overhead. Other techniques, such as storing computed data or storing several copies of common data in view state, add unnecessary overhead.

• Composite controls or large number of controls. Pages that have composite controls such as DataGrid may increase the footprint of the view state. Pages that have a large number of server controls also may increase the footprint of the view state. Where possible, consider the alternatives that are presented later in this section.

When you do not need rich interaction, replace server controls with an inline representation of the user interface that you want to present. You might be able to replace a server control under the following conditions:

• You do not need to retain state across postbacks.

• The data that appears in the control is static. For example, a label is static data.

• You do not need programmatic access to the control on the server-side.

• The control is displaying read-only data.

• The control is not needed during postback processing.

Alternatives to server controls include simple rendering, HTML elements, inline Response.Write calls, and raw inline angle brackets (<% %> blocks). It is essential to balance your tradeoffs. Avoid over-optimization if the overhead is acceptable and if your application is within the limits of its performance objectives.

Avoid Creating Deep Hierarchies of Controls

Deeply nested hierarchies of controls compound the cost of creating a server control and its child controls. Deeply nested hierarchies create extra processing that could be avoided by using a different design that uses inline controls, or by using a flatter hierarchy of server controls. This is especially important when you use list controls such as Repeater, DataList, and DataGrid because they create additional child controls in the container.

For example, consider the following Repeater control.
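The Repeater markup itself is not shown in this copy. Based on the control hierarchy in Table 6.2 (RepeaterItem entries containing literal text around a Label), it would look something like this sketch (the bound property name is hypothetical):

```aspx
<asp:Repeater id="repeater" runat="server">
  <ItemTemplate>
    <asp:Label id="label" runat="server"
        Text='<%# DataBinder.Eval(Container.DataItem, "Name") %>' />
  </ItemTemplate>
</asp:Repeater>
```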

Assuming there are 50 items in the data source, if you enable tracing for the page that contains the Repeater control, you would see that the page actually contains more than 200 controls.

Table 6.2: Partial Repeater Control Hierarchy

|Control ID |Type |

|Repeater |System.Web.UI.WebControls.Repeater |

|repeater:_ctl0 |System.Web.UI.WebControls.RepeaterItem |

|repeater_ctl0:_ctl1 |System.Web.UI.LiteralControl |

|repeater_ctl0:_ctl0 |System.Web.UI.WebControls.Label |

|repeater_ctl0:_ctl2 |System.Web.UI.LiteralControl |

|repeater:_ctl49 |System.Web.UI.WebControls.RepeaterItem |

|repeater_ctl49:_ctl1 |System.Web.UI.LiteralControl |

|repeater_ctl49:_ctl0 |System.Web.UI.WebControls.Label |

|repeater_ctl49:_ctl2 |System.Web.UI.LiteralControl |

The list controls are designed to handle many different scenarios and may not be optimized for your scenario. In situations where performance is critical, you can choose from the following options:

• If you want to display data that is not very complex, you might render it yourself by calling Response.Write. For example, the following code fragment would produce the same output, as noted earlier in the section.

for (int i = 0; i < 50; i++)
{
    // Render each item directly instead of using a Repeater control.
    Response.Write("<span>" + dataSource[i] + "</span>");
}
