George's Top 10 ASP Performance Tips
This page contains my top 10 recommendations for optimizing the performance of Microsoft Internet Information Server (IIS) Active Server Pages. Most of these recommendations are oriented towards getting the maximum performance from ADO queries, since database interaction is typically the most time-sensitive part of an ASP page. While some of these tips appear in other places, most of them do not.
- Be careful with string concatenation.
- Ensure Connection Pooling Works
- Use Stored Procedures
- Use Query Logging To Identify Slow Queries
- Use Cascading Style Sheets to reduce output sizes.
- Use ADO Command Objects with Parameters for looping updates.
- Enable Response Buffering
- Do not store objects on your Session.
- Carefully check for contention issues.
- Stress test your application using an automated program.
- Additional Resources
1. Be careful with string concatenation.
Many developers concatenate large text blocks together to build HTML tables, then write the whole table in one command using a Response.Write method call. Depending on how you do it, this can actually hurt performance. For example, this code:

dim iOuter, iInner, cTable
cTable="<TABLE>"
for iOuter=1 to 100
    cTable=cTable & "<TR>"
    for iInner=1 to 10
        cTable=cTable & "<TD>" & iOuter & " - " & iInner & "</TD>"
    next
    cTable=cTable & "</TR>"
next
cTable=cTable & "</TABLE>"
can take twice as long to execute as this code:

dim iOuter, iInner, cTable, cRow
cTable="<TABLE>"
for iOuter=1 to 100
    cRow="<TR>"
    for iInner=1 to 10
        cRow=cRow & "<TD>" & iOuter & " - " & iInner & "</TD>"
    next
    cTable=cTable & cRow & "</TR>"
next
cTable=cTable & "</TABLE>"
The problem appears to be that each time a very large string variable is expanded by concatenation, the entire string must be copied to a newly allocated buffer, and this copying overhead becomes very pronounced as the string grows. Limiting the number of times you expand the large variable can yield a large performance gain.

2. Ensure Connection Pooling Works
Creating a database connection is a very expensive operation. Constantly connecting to and disconnecting from the database can substantially impair your application's performance. We have personally seen performance on ASP pages improve by a factor of four by ensuring that connection pooling is used.

By default, ADO connection pooling is enabled in IIS 4.0 and higher. However, there are a number of things that can cause connection pooling to not work, including:
- Adjusting Connection or Command timeout disables connection pooling. If your queries are timing out, you need to fix the queries, not increase the timeouts.
- Setting the APP portion of the data source name explicitly to the name of the calling page means the connection can only be pooled for that page. This can result in a very large number of concurrent database connections.
- Not explicitly closing record sets and freeing them when an implicit connection object is created. The following leaks connection handles:
Dim oRS
set oRS=Server.CreateObject("ADODB.RecordSet")
oRS.Open "Select * from MyTable", "DSN=MyConn;UID=MyUser;PWD=MyPwd"
do while oRS.EOF=false
    ' some code here
    oRS.MoveNext
loop

For more information on this, refer to Microsoft Knowledge Base Article Q191572 INFO: Connection Pool Management by ADO Objects Called From ASP.
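A version that returns the connection to the pool might look like this (a sketch based on the example above): explicitly close the record set and release it when you are finished with it.

```vbscript
Dim oRS
set oRS=Server.CreateObject("ADODB.RecordSet")
oRS.Open "Select * from MyTable", "DSN=MyConn;UID=MyUser;PWD=MyPwd"
do while oRS.EOF=false
    ' some code here
    oRS.MoveNext
loop
oRS.Close          ' releases the implicit connection back to the pool
set oRS=Nothing
```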
- Executing a command with a connection handle that is in use by a record set.
Given the code:

dim oConnection, oRS
set oConnection=Server.CreateObject("ADODB.Connection")
oConnection.Open cDataSource
set oRS=oConnection.Execute("select KeyField from MyTable where Condition='SomeValue'")
do while oRS.EOF=False
    oConnection.Execute "Update OtherTable set field='NewValue' where ForeignKey='" & oRS("KeyField") & "'"
    oRS.MoveNext
loop

Every time the oConnection.Execute method is called inside the loop, a new database connection is created and destroyed, because the original connection is still busy serving the open record set. This can have a very negative effect on performance.
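One way to avoid this (a sketch using the same hypothetical table and column names as above) is to collapse the per-row loop into a single set-based statement, so the connection is never shared with an open record set and only one Execute call is needed:

```vbscript
dim oConnection
set oConnection=Server.CreateObject("ADODB.Connection")
oConnection.Open cDataSource
' One set-based UPDATE replaces the per-row loop entirely,
' so no second connection is ever created.
oConnection.Execute "Update OtherTable set field='NewValue' " & _
    "where ForeignKey in (select KeyField from MyTable " & _
    "where Condition='SomeValue')"
oConnection.Close
set oConnection=Nothing
```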
Use SQL Trace or SQL Profiler to monitor how many connections are being created and destroyed.
3. Use Stored Procedures
If your code is executing a large number of statements, particularly within a loop, seriously consider using a stored procedure. While working on a recent project, we took a slow process (300 seconds) and rewrote it as a stored procedure. When we were finished, the process executed in under 10 seconds. Stored procedures have the following advantages:

- The query plan is pre-compiled and saved on the server.
- The overhead of sending each distinct command to the server for processing is eliminated.
- The overhead of sending the results of intermediate queries back to the caller is eliminated.
- Stored procedures allow performing multiple intermediate queries and returning the results as one recordset. This can simplify the SQL statements and make them easier to tune for performance.
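As a minimal sketch of calling a stored procedure from ASP (the procedure, table, and column names here are hypothetical):

```vbscript
' Assumes a stored procedure such as the following exists on the server:
'   CREATE PROCEDURE GetOrderTotals @CustomerID int AS
'     SELECT OrderID, SUM(Qty * Price) AS Total
'     FROM OrderDetails WHERE CustomerID = @CustomerID
'     GROUP BY OrderID
const adInteger = 3
const adParamInput = 1
const adCmdStoredProc = 4
dim oConn, oCmd, oRS
set oConn = Server.CreateObject("ADODB.Connection")
oConn.Open "DSN=MyDSN;UID=MyUser;PWD=MyPwd"
set oCmd = Server.CreateObject("ADODB.Command")
set oCmd.ActiveConnection = oConn
oCmd.CommandText = "GetOrderTotals"
oCmd.CommandType = adCmdStoredProc
oCmd.Parameters.Append oCmd.CreateParameter("@CustomerID", adInteger, adParamInput, , 42)
set oRS = oCmd.Execute()
do while oRS.EOF = false
    Response.Write oRS("OrderID") & ": " & oRS("Total") & "<BR>"
    oRS.MoveNext
loop
oRS.Close
set oRS = Nothing
oConn.Close
set oConn = Nothing
```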
4. Use Query Logging To Identify Slow Queries
ODBC has the built-in capability to log long-running queries to a file. Enable this feature and run your application, then examine the log file. If you see the same query over and over, it should be optimized. Remember that some queries will appear in the log file only because they were blocked by another slow-running query. Some of the ways that you can optimize queries are:

- Simplify the query and remove unnecessary joins if present.
- Where appropriate, add indexes to the tables.
- If your database supports them, use optimizer hints to force the query plan to a more efficient route.
- Use stored procedures with temporary result tables to simplify a complex query into smaller, more efficient queries.
A typical tuning workflow is:

- Identify slow queries.
- Run the query under a profiler, or a tool that displays the execution plan.
- Add indexes to tables if appropriate.
- Simplify the query where possible.
- Restructure the query and check the performance and execution plan.
- Use optimizer hints if supported, to get a better execution path.
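For example, on MS SQL Server a supporting index and a look at the execution plan might be done like this (table and column names are hypothetical):

```sql
-- A slow query filtered on CustomerID often benefits from a supporting index.
CREATE INDEX IX_Orders_CustomerID ON Orders (CustomerID)
GO
-- Display the estimated execution plan instead of running the query:
SET SHOWPLAN_TEXT ON
GO
SELECT OrderID FROM Orders WHERE CustomerID = 42
GO
SET SHOWPLAN_TEXT OFF
GO
```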
5. Use Cascading Style Sheets to reduce output sizes.
If you look at your generated HTML, particularly tables, and see masses of FONT, ALIGN, and COLOR attributes on each cell, then this technique can help you. Create a style for TR that contains all of the settings you want. If you need multiple styles based on conditions, create a class for each TR style you need.
On one page, implementing this tip reduced the generated page size from 500K bytes to 64K bytes. Needless to say, the people using the page saw a big improvement in performance.
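A minimal sketch of the idea (the class names are hypothetical): define the formatting once in a style sheet and reference it by class, instead of repeating FONT tags in every cell.

```html
<STYLE>
  TR.normal { font-family: Arial; font-size: 10pt; color: black; }
  TR.alert  { font-family: Arial; font-size: 10pt; color: red; }
</STYLE>
<TABLE>
  <TR class="normal"><TD>In stock</TD><TD>42</TD></TR>
  <TR class="alert"><TD>Backordered</TD><TD>0</TD></TR>
</TABLE>
```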
6. Use ADO Command Objects with Parameters for looping updates.
Building SQL update or insert statements through string concatenation is slow and painful. Such statements can also cause problems if the embedded text contains a single or double quote character. ADO Parameter objects can speed up and simplify your code. A final benefit is that you can prepare, or pre-compile, the statement. This saves the server from having to re-parse the statement and determine the query plan on each execution. The gain you will see from prepared statements varies with the complexity of the command; simple commands like the INSERT statement shown below will yield little performance improvement. Here is a sample:
const cConnStr="DSN=Pubs;UID=SA;PWD="
const adVarChar =200
const adChar =129
Const adParamInput = 1
dim i, oField, oCmd, oConn, cField1, cField2, tStart, tFinish
set oConn=Server.CreateObject("ADODB.Connection")
oConn.Open cConnStr
tStart=timer
set oCmd=Server.CreateObject("ADODB.Command")
set oCmd.ActiveConnection=oConn
oCmd.CommandText="create table #PrepTest (field1 int, field2 varchar(16), Field3 varchar(16))"
oCmd.Execute
oCmd.CommandText="insert into #PrepTest (Field1, Field2,Field3) values (1,?,?)"
oCmd.Parameters.Append oCmd.CreateParameter("Two",adVarChar,adParamInput,16,"")
oCmd.Parameters.Append oCmd.CreateParameter("Three",adVarChar,adParamInput,16,"")
oCmd.Prepared=true
for i=1 to 10
cField1="Field'1-" & i
cField2="Field2-" & i
oCmd("Two")=cField1
oCmd("Three")=cField2
oCmd.Execute()
next
tFinish=timer
oCmd.Parameters.Delete 1
oCmd.Parameters.Delete 0
oCmd.Prepared=false
oCmd.CommandText="select * from #PrepTest"
dim oRS
set oRS=oCmd.Execute()
do while oRS.EOF=false
for each oField in oRS.Fields
Response.Write oField.Name & "=[" & oField.Value & "] "
next
Response.Write("<BR>")
oRS.MoveNext
loop
oRS.Close
set oRS=Nothing
set oCmd=Nothing
oConn.Close
set oConn=Nothing
Response.Write tFinish-tStart
7. Enable Response Buffering
Response buffering can make a big difference in performance. If response buffering is not enabled, every Response.Write is sent immediately to the client. If response buffering is enabled, then these writes are grouped and sent to the client in one larger, more efficient transaction. According to the documentation, response buffering is enabled by default in IIS 5.0 UNLESS the installation is an upgrade, in which case it takes the IIS 4.0 default, which is to disable response buffering. To enable response buffering, just add the following line to the top of your ASP page, before any output is written:
Response.Buffer=True
If your page is very large and the processing very long, you may want to periodically flush the response buffer to the client. This ensures that the client steadily receives data and does not time out. If you are building large tables in a loop, you may want to add something like the following to your code:

dim iCounter
iCounter=0
do while oRS.eof=false
'
' Write output rows to client using Response.Write
'
iCounter=iCounter+1
if iCounter MOD 20 = 0 then
Response.Flush
end if
oRS.MoveNext
loop
8. Do not store objects on your Session.
The problem in a nutshell is this: if you store an apartment-threaded object on an IIS Session, then all further requests for that session must be routed through the thread that created the object. IIS 4.0 by default creates 20 threads per processor. If two users are sharing the same thread, even if the 19 other threads are free, the second user will block until the first user's request completes execution. If you have long queries or complex pages, this can have a big impact on the performance of your application.
For more information refer to Microsoft Knowledge Base Article Q243548 INFO: Design Guidelines for VB Components Under ASP.
9. Carefully check for contention issues.
If you have SQL statements that run fine most of the time but are inexplicably slow at other times, the cause may be blocking. Blocking occurs when one connection holds locks on a resource that another user needs. If you have long-running queries, these queries can block other queries from accessing the data. This also applies to explicit transactions: if you are using explicit transactions, keep them as short as possible. Remember that every query run inside a transaction results in locks held until the completion of the transaction. In other words, if you begin a transaction and then perform a query that locks a table (e.g., select count(*) from MyTable), then the lock is held until the transaction is committed or rolled back.
If you are using manual transactions, remember to set transactions back to automatic as soon as the transaction is complete. If you don't, you can accidentally lock resources.
If your database provides reporting on blocked queries, carefully check for connections that are blocking other connections.
Another way to handle blocking issues on the server is to use optimizer hints to disable locking in the long-running queries. For example, if you have a query that is long-running and cannot be improved, try using the (NOLOCK) hint (on MS SQL Server). This keeps your query from taking shared locks on the table while it is running, at the cost of possibly reading uncommitted data.
IMPORTANT SQL Server HOT TIP!

Consider the following query:

select Field1, Field2, Field3 into #Temp from MyTable where UglyExpression=true

Because the creation of the table #Temp must be atomic with the completion of the statement, you can actually end up locking the SYSOBJECTS table in TEMPDB for the duration of your command. This means that any other command that creates a temporary table will be blocked until your command completes. If the duration of the command is very small, this isn't a big problem; but if your query is very long, it can cause major contention issues. A better way to code this is:

create table #Temp (Field1 int, Field2 int, Field3 int)
insert into #Temp (Field1, Field2, Field3)
select Field1, Field2, Field3 from MyTable where UglyExpression=true
10. Stress test your application using an automated program.
Using either a commercial application or a custom-developed program, perform stress testing of your application. Ideally you should do this BEFORE you put it into production. One of the benefits of doing this is that you will have an automated way of testing your application to ensure that nothing is broken during maintenance.

Additional Resources
Improving ASP Performance, J.D. Meier, Microsoft Corporation
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnserv/html/server03272000.asp
25+ Tips to Improve Performance and Style, Len Cardian and George V. Reilly, Microsoft Corporation
http://msdn.microsoft.com/library/default.asp?URL=/library/en-us/dnasp/html/ASPtips.asp