1. Performance Improvement Steps
1.1. Architectural Changes
Below are the most important architectural points to verify. Check your application architecture against each of them.
1.1.1. SharePoint database and server should not be on the same system
Use an external database server when configuring SharePoint and your application, so that the load is distributed across servers. When you use an external database server, make sure no services or components other than the SQL Server components are running on that machine.
1.1.2. Managed Metadata should be configured on a different server
Similar to the previous point, configuring the Managed Metadata service on a different server reduces the load on your SharePoint server. However, choose this option only when the application has a large term store and uses many taxonomy fields; otherwise this approach will actually degrade performance.
When creating a term store, make sure there are no more than 7 sub-levels. As the number of sub-levels increases, the complexity of the term store increases and the metadata service takes more time to fetch data.
1.1.3. Custom Features
Create solutions scoped to the application (sandboxed solutions) instead of farm solutions. When you are creating several solutions for different purposes, to be installed at different sites and sub-sites, create one solution per individual piece of functionality. When you are developing features to be used across the entire application (all sites and sub-sites), separate them at a high level and install those features, instead of making one feature per individual function.
1.1.4. Connectivity between Systems
All servers should be closely connected on a single network, ideally with direct internal cabling, to reduce network latency. When the application is small and targets a limited number of users, the entire application can be hosted on a single server.
1.1.5. Search Configuration
Configure search on a separate machine and store the crawl data on a server other than the SharePoint server, so that a running crawl puts less load on the SharePoint server.
1.2. Application Configuration Changes
1.2.1. SharePoint Health analyzer
Verify the health analyzer rules and add any new rules you need. Configure rules to auto-resolve issues wherever possible. For issues that cannot be auto-resolved, investigate possible solutions and rectify them manually.
1.2.2. Enable Caching
Enable caching on the application. SharePoint provides several caching techniques; enable every cache that is applicable. When you enable caching, make sure the application server has plenty of RAM and free disk space; a reasonable baseline is 32 GB of RAM and 64 GB of free space on the C: drive.
The cache normally refreshes whenever the application pool restarts or a large amount of content is added to the application. Caching gives better performance overall, but the first hit after a cache refresh takes longer than usual.
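The first-hit penalty after a cache refresh is general cache behaviour, not SharePoint-specific. A minimal sketch in Python (`functools.lru_cache` standing in for SharePoint's caches, and the 0.2-second sleep standing in for the real rendering work):

```python
import functools
import time

@functools.lru_cache(maxsize=None)
def render_fragment(page_id):
    """Simulate an expensive render (e.g. querying lists and building HTML)."""
    time.sleep(0.2)  # stand-in for the real work
    return f"<div>content for page {page_id}</div>"

t0 = time.perf_counter()
render_fragment(1)                     # first hit after a cache refresh: slow
first_hit = time.perf_counter() - t0

t0 = time.perf_counter()
render_fragment(1)                     # served from the cache: fast
cached_hit = time.perf_counter() - t0

print(first_hit, cached_hit)
```

Every hit after the first returns from memory; only the request that repopulates the cache pays the full rendering cost.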
1.2.3. Customize SharePoint logs
Customize SharePoint logging to record only the most important data instead of writing everything, and clear old logs from the server. This reduces disk I/O.
1.2.4. Enable IIS Compression
Enable compression at the IIS level. When you are dealing with pages that carry a lot of content, set the dynamic compression level to 5-7 and the static compression level to 10. This option adds the most value when the application runs over the internet or across different intranet zones.
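The split between dynamic and static levels reflects a CPU-versus-size trade-off: static files are compressed once and cached, so the higher CPU cost of the top level is paid only once, while dynamic responses are compressed on every request. The trade-off can be sketched with any DEFLATE implementation; here is a Python `zlib` illustration (the payload and levels are made up for the demo):

```python
import zlib

# A repetitive HTML-like payload, similar to a content-heavy page.
page = b"<tr><td>item</td><td>description of the item</td></tr>\n" * 2000

low = zlib.compress(page, level=5)    # cheaper per request, slightly larger output
high = zlib.compress(page, level=9)   # more CPU, smallest output

print(len(page), len(low), len(high))
```

Paying the cost of the highest level makes sense for static files compressed once; a mid level keeps per-request CPU reasonable for dynamic pages.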
1.2.5. Resource throttling settings for Web Application
By enabling resource throttling, you can restrict the number of data objects that can be fetched or updated at a time. This is useful when your application has a very large number of pages or a large amount of content.
1.2.6. Target Audience
Disable the target audience service if your application is not using it; this removes one level of security checking. Make this change only when you are certain no part of the application uses the target audience feature.
1.3. Code Optimization
1.3.1. OOTB Web Parts
1.3.1.1. Extended Content Query Web Part
When you use the extended Content Query Web Part to display list data, follow these guidelines:
1. Avoid fetching data across sites and sub-sites; if you must, make sure the number of sites and sub-sites is small.
2. Use XSLT to present the data in the web part. XSLT is much faster and more lightweight than data grids.
3. If the data to display comes from several different sites and sub-sites, use a custom search web part instead of the extended Content Query Web Part.
1.3.1.2. Search Web Parts
When the application uses search web parts, make sure search is configured, the crawl is running, and the crawl data is stored on a server other than the SharePoint server. Search web parts perform much better than other web parts.
1.3.1.3. List View Web Part
When the application uses a List View Web Part, create specific views on the library or list and bind the web part to those views instead of showing everything. With a custom view you minimize both the data displayed in the web part and the amount of data fetched from the library or list.
1.3.2. Custom Web Parts
Custom web parts come in two kinds:
1.3.2.1. Extending OOTB Web Parts
An OOTB web part is extended to override a particular behavior or to add another level of criteria. When you are extending an OOTB web part:
1. Don't break the existing behavior of the OOTB web part.
2. When you need a custom property, add it in elements.xml and use that property in the code.
3. When reading data from a list or library, make sure you read only the required data, with an explicit column list, filter criteria, and ordering where required.
4. Use XSLT to present the data wherever required, instead of grid controls.
1.3.2.2. New Custom Web Parts
Follow these steps while developing new custom web parts:
1. Design the complete component first.
2. Separate the business logic and data access layers from the presentation layer.
3. Optimize queries, and preferably use stored procedures.
4. Run profilers to check execution times.
5. Use XSLT instead of grids.
6. Disable view state if you are not handling any user actions.
1.3.3. Asynchronous Web Parts
Asynchronous web parts reduce request time. This approach does not add much raw performance, but the end user can see partial data on the page while the remaining web parts load. To execute web parts asynchronously, all web parts must be built on an asynchronous framework; search web parts provide asynchronous execution by default. However, when many web parts run asynchronously on a page, the load on the server grows, and in the long run this degrades performance. So before making a web part asynchronous, make sure that:
1. The web part takes a long time to fetch its data.
2. The end user can wait for the data to load.
3. There are not too many such pages, or too many web parts per page.
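A minimal sketch of the idea, using Python's asyncio purely as an analogy (SharePoint web parts would use the ASP.NET asynchronous framework instead): all parts start at once, so the page finishes in roughly the time of the slowest part rather than the sum of all parts.

```python
import asyncio

async def load_web_part(name, delay):
    """Simulate a web part fetching its data."""
    await asyncio.sleep(delay)
    return f"{name} rendered"

async def render_page():
    # All three parts run concurrently; total time is close to
    # the slowest part (0.10 s), not the sum (0.17 s).
    return await asyncio.gather(
        load_web_part("news", 0.05),
        load_web_part("search", 0.10),
        load_web_part("tasks", 0.02),
    )

results = asyncio.run(render_page())
print(results)
```

The cost shows up on the server: each in-flight part holds resources, which is why many asynchronous parts per page eventually hurt throughput.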
1.3.4. Optimizing Code
Developers can use all the standard .NET optimization techniques, and can run profilers on the code to verify the application's performance. The most common points to validate while coding are:
1. Declare variables only where required.
2. Use specific, strongly typed variables.
3. Dispose of objects when you are done with them.
4. Reduce type casting wherever possible.
5. Minimize loops and recursive calls.
6. Don't call the garbage collector explicitly.
7. Don't throw exceptions unnecessarily; re-thrown exceptions are much costlier in terms of memory consumption and execution time.
8. Minimize exception-handling blocks; use them only where you cannot handle or prevent the exception.
9. Keep classes small (number of methods, logic, variables, etc.) and follow a suitable component design.
1.4. Database Optimization
1.4.1. Writing Stored Procedures
Write stored procedures instead of scattering individual queries. If you have a block of code that deals entirely with the database, move that block into a stored procedure instead of writing it in .NET code.
1.4.2. Optimizing Queries
When you write any query, specify explicit column names and a WHERE clause. Always run the query and inspect its execution plan, checking both the time taken to process the request and the number of bytes transferred. When judging the best-optimized query, consider both parameters: a query with a short execution time will not necessarily perform well in the long run, and neither will a query that merely transfers few bytes; both parameters should be within an acceptable range. If you are not sure about your query, reach out to your DB specialist.
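The effect of an explicit column list plus a WHERE clause can be sketched with any SQL engine. This illustration uses Python's sqlite3 with a made-up table (the document's context is SQL Server, where the same principle applies):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER, title TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO docs VALUES (?, ?, ?)",
    [(i, f"title {i}", "x" * 1000) for i in range(500)],
)

# Unselective: every column of every row crosses the wire,
# including 500 one-kilobyte bodies nobody asked for.
all_rows = conn.execute("SELECT * FROM docs").fetchall()

# Selective: explicit column list plus a WHERE clause.
few_rows = conn.execute(
    "SELECT id, title FROM docs WHERE id < 10"
).fetchall()

print(len(all_rows), len(few_rows))
```

The second query returns ten narrow rows instead of five hundred wide ones: less time on the server and fewer bytes transferred, the two parameters the text says to keep in range.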
1.4.3. Minimizing number of Queries
This can be done in two ways:
1. When the same query is called from different places for the same purpose, fetch the value once, store it in a global variable, and reuse it across the application.
2. When a query is called repeatedly with changes only in the WHERE clause, use named parameters. The reason I suggest named parameters is that whenever a query reaches the database, it is compiled and then executed; when the same query is called again, the compiled plan is reused and there is no recompilation. If the query text changes with different literal values in the WHERE clause, SQL treats each variant as a different query and compiles it every time. With named parameters, the query is compiled once with the parameter names, and different values can be passed to the same compiled plan.
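The compile-once, bind-different-values pattern described above can be sketched with any parameterized database API. A minimal illustration using Python's sqlite3 and a hypothetical orders table (SQL Server applies the same plan-reuse principle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "open"), (2, "closed"), (3, "open")])

# One parameterized statement; only the bound value changes between
# calls, so the engine can reuse the prepared plan instead of
# recompiling a new literal query each time.
QUERY = "SELECT id FROM orders WHERE status = :status"

open_ids = [r[0] for r in conn.execute(QUERY, {"status": "open"})]
closed_ids = [r[0] for r in conn.execute(QUERY, {"status": "closed"})]

print(open_ids, closed_ids)
```

Contrast this with building `"... WHERE status = 'open'"` and `"... WHERE status = 'closed'"` as separate strings, which the engine treats as two distinct queries to compile.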
Performance is one of the most important concerns for any application. Most applications run into performance issues only after release to customers and end users: when you run the application as a single user, you see no problems, but when the same application runs for days, weeks, months, and years, end users start to notice the difference.
Working on performance improvement for SharePoint is a genuinely painful task. SharePoint has internal performance costs of its own, so we first have to use every option SharePoint provides, and then look at the pages we created as part of the application that are killing performance.
This document covers tools for identifying pain points, analyzing those pain points, and suggesting the most suitable solutions.
It is based on the experience I have gained from the projects I have worked on so far. I am not a SharePoint architect or expert, so most of the solutions I propose are from a web-application point of view.
2. Identifying Performance Bottlenecks
To work on performance improvements for any application, the first and foremost activity is to identify the problematic areas and measure the application's performance before making any changes.
2.1. Measuring performance of the application
2.1.1. Enable Developer Dashboard.
SharePoint provides the developer dashboard to measure the application's performance. It reports the complete execution time for a request; the times shown on the developer dashboard are the request-processing times at the server.
If there are several web parts on the page, the developer dashboard shows the execution time of each web part. So to understand the complete page execution, enable the developer dashboard and inspect the results.
Below are the commands to enable the developer dashboard:
stsadm -o setproperty -pn developer-dashboard -pv on
(Get-SPFarm).PerformanceMonitor.DeveloperDashboardLevel = "On"
2.1.2. Using HTTP Watch or Fiddler
Install one of these tools on your machine and start it before hitting a SharePoint page. These tools record the response time, the response download size, the number of internal requests, and the time taken to process each request.
With them you can see the total time taken to download content to the client browser, i.e. what the client actually experiences for the request. They also show the response object size and all internal download requests.
2.1.3. IIS Logs
Every request processed by IIS is logged with its complete details, including execution time. Verifying each request's execution time with HTTP Watch, Fiddler, or the developer dashboard is not practical at scale; to identify the problematic pages in your application, analyze the IIS logs. By analyzing a day's, a week's, or a month's logs you can list all pages that take a long time to execute, and then test those pages with the developer dashboard, Fiddler, or HTTP Watch for more detail.
Use the LogParser tool to analyze IIS logs. With LogParser, the query below retrieves the requested data in CSV format, which is the most convenient format for reviewing all requests:
logparser "select date,time,time-taken,cs-uri-stem from 'D:\Logs\test1.log' where time-taken > 5000 order by time-taken" -e:10 -o:csv -i:W3C >> "D:\Logs\test1.csv"
The above query writes all requests that took more than 5 seconds (5000 ms) into test1.csv.
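If LogParser is not available, the same filtering can be sketched in a few lines of Python; the log lines and field order below are simplified stand-ins for a real IIS W3C log:

```python
import csv
import io

# A few W3C-format lines like those in an IIS log (time-taken is in ms).
LOG = """#Fields: date time cs-uri-stem time-taken
2015-03-01 10:00:01 /Pages/Home.aspx 12040
2015-03-01 10:00:02 /Pages/Fast.aspx 310
2015-03-01 10:00:05 /Pages/Reports.aspx 7150
"""

slow = []
for line in LOG.splitlines():
    if line.startswith("#") or not line.strip():
        continue  # skip the header directives and blank lines
    date, time_, uri, taken = line.split()
    if int(taken) > 5000:                 # same 5-second threshold as above
        slow.append((date, time_, uri, int(taken)))

# Write the result as CSV, mirroring LogParser's -o:csv output.
out = io.StringIO()
csv.writer(out).writerows(slow)
print(out.getvalue())
```

A real log has more fields per line (status, method, client IP, and so on), so the `split()` indexing would need to follow the actual `#Fields:` directive in your log.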
2.1.4. .NET Profilers
Run memory and execution profilers on your custom code. To get more detail, build your code in debug mode and run the profilers against it. Look at the time taken by each block of code and see which block takes the most. Memory profilers report memory leaks and the number and sizes of the objects used. The best profiler I have used is ANTS Profiler.
2.1.5. SQL Profilers
The SQL profiler reports the queries that are running or have run on the SQL server, capturing the execution time along with the execution plan for each query.
To collect the data, run SQL Server Profiler on the server where the SharePoint SQL Server instance is configured, before users hit the application. Start and stop the recording for a particular span of time; the preferred way to collect the right data is to do this after first collecting data from the IIS logs.
2.2. Identifying bottlenecks
2.2.1. Analyzing IIS Logs
To identify the most problematic pain points in the application, the first and foremost thing to do is analyze the IIS logs:
1. Capture IIS logs for a period of time.
2. Run LogParser to collect the pages taking more than the acceptable execution time (say 5 or 10 seconds).
3. From those, collect the pages taking the most time.
4. Along with execution time, collect the pages whose response size is more than 500 KB.
The information collected through this process points to the pages where performance work should start.
List all the pages that take most of the execution time and prioritize them by execution time and number of hits.
2.2.2. Analyzing fiddler or HTTP watch traces
After collecting data from the IIS logs, open those pages in a browser while running Fiddler or HTTP Watch to see the user response time, the response objects, the number of internal requests, and the types of data being downloaded. Then look at the following to find the actual problem:
1. Look at each request made internally when the user requests a page and calculate the time taken for each individual request.
2. Verify the number of image, CSS, and JS files downloaded and the total time taken for these requests.
3. Note the actual time taken for the page request itself.
If the page itself takes a long time to process, analyze it with the developer dashboard. Also calculate the total time taken by the other internal requests, to minimize the total response time for the end user.
List all the pages that take the longest to load on the client machine.
2.2.3. Analyzing Developer Dash Board
Open the pages that take the longest to respond and enable the developer dashboard; it appears below the actual page content. It shows the time taken to execute the page and each web part. By looking at each web part's execution time, you can point out exactly which web part consumes most of the execution time. If there is no web part, look at what the page contains and how it displays content to the end user.
List all the web parts that consume most of the execution time.
2.2.4. Analyzing SQL Profilers
Before browsing a page that takes a long time to respond, start the SQL profiler on your database. After browsing the page, stop the profiler and check the number of queries executed for the request, the time taken by each, and the amount of data fetched by each query.
Then look for queries that:
1. Take a long time to execute
2. Have no WHERE clause
3. Fetch huge amounts of data
4. Execute the largest number of times
5. Originate from your custom code