SQL Server Monitoring with PowerShell and the SQL Server Platform: Analysis and Reporting (Part 3 of 3)

By: Fikrat Azizov   |   Updated: 2015-06-03   |   Comments (5)   |   Related: 1 | 2 | 3 | > Monitoring


Problem

Collecting performance data for SQL Server is critical to ensure your systems are running properly. There are several ways and tools you can use to collect and analyze this data, but in this tip series I demonstrate a home-grown approach that uses several features of SQL Server to build your own performance warehouse. Across the series we have shown how to build a centralized, integrated and flexible data collection and analysis system. In Part 1 I demonstrated how to use PowerShell scripts to collect performance metrics from multiple servers into a central repository database, and in Part 2 we built ETL packages to process this data and upload it into the data warehouse. In this tip, let's check out how to build a cube and analyze the data.

Solution

Overview

In this tip I'll demonstrate how to build an OLAP cube based on the PerfDWH database, create jobs to process the cube and use the PivotTable and PivotChart features of Excel to analyze the performance data. This solution was built on SSAS 2008 and can easily be upgraded to higher versions. You can download the deployment package from here.

SQL Server Performance Analysis Cube Deployment

We'll use XMLA scripts to deploy the cube to an Analysis Services instance. Alternatively, it could be deployed from the SSAS project included in the deployment package. The deployment steps are as follows:

1. Start SSMS, connect to the Analysis Services instance which will host this cube:

[Screenshot: Connect to Analysis Services dialog]

2. Click on the Analysis Services instance name, open and execute the script PerformanceCountersAnalysis_Script.xmla to create the PerformanceCountersAnalysis cube:

[Screenshot: executing the XMLA script to deploy the cube]

3. In Object Explorer, expand the 'Roles' node under 'PerformanceCountersAnalysis' and double-click 'Users':

[Screenshot: the Users role in Object Explorer]

4. Open the 'Membership' tab, remove any existing orphaned SIDs and add the users or groups that need full access to this cube. Please note that the account which will be used for processing this cube will also need to be added to this role. If you want to grant limited access to this cube for some users, you can create another custom role with specific permissions and add those users to that role.

[Screenshot: role membership tab]

5. Expand the 'Data Sources' node in Object Explorer and double-click 'PerfAnalysisDS' to configure the data source.

[Screenshot: the PerfAnalysisDS data source]

6. Select the Provider, server name and database name pointing to your PerfDWH database as shown below:

[Screenshot: cube connection manager]

7. Test the connection and click 'OK' to confirm your selections.

8. To test cube processing, right-click the SSAS database name (PerformanceCountersAnalysis) and select 'Process':

[Screenshot: the Process menu option]

9. On the 'Process Database - PerformanceCountersAnalysis' window, click OK to start processing:

[Screenshot: the Process Database dialog]

10. Make sure the cube processing finishes successfully and close the dialog window:

[Screenshot: processing completed successfully]

11. Open the CreateSecurityObjectsSSAS.sql script, specify the domain name, username and password for the credential which will be used by the cube processing jobs, then execute the script to create the credential and proxy accounts for SQL Server Agent. This account will need to be a member of the db_datareader role in the PerfDWH database to be able to read from the data warehouse. It will also need to be added to the cube role with full permissions (see step #4) to be able to process the cube.
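
For reference, here's a minimal T-SQL sketch of the kind of security objects this script creates. The credential, proxy and account names below are placeholders, not the names used by the packaged script:

-- A sketch only: all names and the password are placeholders.
USE master;
CREATE CREDENTIAL PerfCubeProcessingCredential
    WITH IDENTITY = N'YOURDOMAIN\PerfCubeUser',  -- domain account that will process the cube
         SECRET = N'StrongPasswordHere';

USE msdb;
-- SQL Server Agent proxy bound to the credential
EXEC dbo.sp_add_proxy
    @proxy_name = N'PerfCubeProcessingProxy',
    @credential_name = N'PerfCubeProcessingCredential',
    @enabled = 1;

-- Allow the proxy to run 'SQL Server Analysis Services Command' job steps
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'PerfCubeProcessingProxy',
    @subsystem_name = N'ANALYSISCOMMAND';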

12. Open the CreatePerfCubeProcessJobs.sql script, specify the processing frequency value for the local variable @ProcessFrequency_Min and execute it to create the daily incremental cube processing job. The script will also create a weekly full cube processing job.
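
If you're curious about the moving parts, here's a hedged sketch of what such a processing job looks like in T-SQL. The job, schedule and proxy names, the SSAS instance name and the 60-minute frequency are illustrative, and the packaged script's exact processing type may differ:

DECLARE @ProcessFrequency_Min int = 60;  -- illustrative default

EXEC msdb.dbo.sp_add_job @job_name = N'PerfCube - Incremental Process';

-- An Analysis Services command step running an XMLA Process command under the proxy
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'PerfCube - Incremental Process',
    @step_name = N'Process cube',
    @subsystem = N'ANALYSISCOMMAND',
    @server = N'YourSSASInstance',              -- SSAS instance hosting the cube
    @proxy_name = N'PerfCubeProcessingProxy',   -- proxy from the previous step
    @command = N'<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object><DatabaseID>PerformanceCountersAnalysis</DatabaseID></Object>
  <Type>ProcessUpdate</Type>
</Process>';

-- Run every @ProcessFrequency_Min minutes, every day
EXEC msdb.dbo.sp_add_schedule
    @schedule_name = N'PerfCube - every N minutes',
    @freq_type = 4,                    -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,             -- unit: minutes
    @freq_subday_interval = @ProcessFrequency_Min;

EXEC msdb.dbo.sp_attach_schedule
    @job_name = N'PerfCube - Incremental Process',
    @schedule_name = N'PerfCube - every N minutes';

EXEC msdb.dbo.sp_add_jobserver @job_name = N'PerfCube - Incremental Process';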

Cube internals

The PerformanceCountersAnalysis cube contains a single fact table connected to four dimensions in a star schema:

[Screenshot: the cube's data source view]

Here are short descriptions of the dimensions:

  • SQL Instances – Dimension for the SQL Server instances.
  • Performance Counters – Dimension for the performance counters and counter instances.
  • Servers - Server names dimension.
  • Collection Times - Time dimension.

I've built a time hierarchy allowing you to slice the cube by year, month, day, hour and minute:

[Screenshot: the time hierarchy]

Another hierarchy has been created on the 'Performance Counters' dimension, allowing you to analyze the cube by object name, counter name and instance name:

[Screenshot: the performance counter hierarchy]

Custom cube calculations

As I mentioned in Part 1 and Part 2, the PerfCollections/PerfCounter_Groups table contains 'Threshold' and 'Critical threshold' fields, and for the metrics with non-empty values in these fields each counter value is classified as 'normal', 'exceeding threshold' or 'exceeding critical threshold'. For simplicity, we'll call it 'bad' performance when a counter value falls between a metric's threshold and critical threshold, and 'critical' performance when a counter value exceeds the metric's critical threshold.
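
In relational terms the classification logic looks something like the sketch below; the fact table and column names here are assumptions for illustration, since the actual classification is done inside the cube:

-- Illustrative only: dbo.FactPerfCounters and the column names are assumptions.
SELECT f.CounterValue,
       CASE
           WHEN f.CounterValue >= g.CriticalThreshold THEN 'critical'
           WHEN f.CounterValue >= g.Threshold THEN 'bad'
           ELSE 'normal'
       END AS PerformanceState
FROM dbo.FactPerfCounters AS f
JOIN dbo.PerfCounter_Groups AS g
    ON g.CounterGroupID = f.CounterGroupID
WHERE g.Threshold IS NOT NULL            -- only metrics with configured
  AND g.CriticalThreshold IS NOT NULL;   -- thresholds are classified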

Using MDX calculations I've created some additional measures which can be useful for analyzing performance data over extended time periods. Please see the list below (a relational T-SQL sketch of two of them follows the list):

  • Average (by time period)
  • Average for previous time period
  • Average increase (change from last time period)
  • Rolling average (for last 3 months)
  • Increase from last 3 months (change rate compared to rolling average)
  • Beyond threshold duration rate (percentage of 'bad' performance data within total time duration)
  • Beyond critical duration rate (percentage of 'critical' performance data within total time duration)
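
Here's the promised T-SQL sketch of two of these measures, computed against the warehouse instead of the cube. It assumes hypothetical fact/dimension names and key columns, and the windowed average requires SQL Server 2012 or later:

-- 'Rolling average' and 'Increase from last 3 months' approximated in T-SQL.
-- dbo.FactPerfCounters, dbo.DimCollectionTimes and the key names are assumptions.
WITH MonthlyAvg AS (
    SELECT DATEFROMPARTS(YEAR(t.CollectionDateTime), MONTH(t.CollectionDateTime), 1) AS MonthStart,
           AVG(f.CounterValue) AS AvgValue
    FROM dbo.FactPerfCounters AS f
    JOIN dbo.DimCollectionTimes AS t ON t.CollectionTimeKey = f.CollectionTimeKey
    GROUP BY DATEFROMPARTS(YEAR(t.CollectionDateTime), MONTH(t.CollectionDateTime), 1)
)
SELECT MonthStart,
       AvgValue,
       -- average over the preceding 3 months
       AVG(AvgValue) OVER (ORDER BY MonthStart
                           ROWS BETWEEN 3 PRECEDING AND 1 PRECEDING) AS RollingAvg3M,
       -- change rate compared to that rolling average
       AvgValue / NULLIF(AVG(AvgValue) OVER (ORDER BY MonthStart
                           ROWS BETWEEN 3 PRECEDING AND 1 PRECEDING), 0) - 1 AS ChangeRate3M
FROM MonthlyAvg
ORDER BY MonthStart;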

Most of the cube measures have conditional formatting on the fore and back colors, depending on the relative percentage of bad or critical performance during the analysis period, as described below:

  • Fore color would be red, when critical performance lasted more than 20% of overall time period (month, day or hour)
  • Fore color would be orange, when bad performance lasted more than 20% of overall time period (month, day or hour)
  • Back color would be aqua, when critical performance lasted more than 70% of overall time period (month, day or hour)
  • Back color would be light aqua, when bad performance lasted more than 70% of overall time period (month, day or hour)

Here are the color expressions used:

For the fore color:

CASE WHEN [Beyond critical duration rate]>=0.20
 THEN 255 /*Red*/
WHEN [Beyond threshold duration rate]>=0.20
 THEN 33023 /*Orange*/
ELSE 0 /*Black*/
END

For the back color:

CASE WHEN [Beyond critical duration rate]>=0.70
 THEN 16776960 /*Aqua*/
WHEN [Beyond threshold duration rate]>=0.70
 THEN 16777175 /*Light Aqua*/
ELSE 16777215 /*White*/
END
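
These integers are OLE color values, i.e. a single number encoded as B*65536 + G*256 + R. A quick sanity check of the values used above:

-- SSAS color expressions take OLE color integers: B*65536 + G*256 + R
SELECT 0 * 65536 + 0 * 256 + 255     AS Red,        -- 255
       0 * 65536 + 128 * 256 + 255   AS Orange,     -- 33023
       255 * 65536 + 255 * 256 + 0   AS Aqua,       -- 16776960
       255 * 65536 + 255 * 256 + 215 AS LightAqua,  -- 16777175
       255 * 65536 + 255 * 256 + 255 AS White;      -- 16777215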

Configuring reports

The deployment package includes an Excel template file for analysis. Here are the configuration steps:

1. Copy the spreadsheet to a computer with direct access to the PerformanceCountersAnalysis cube and open it.

2. Go to the Data tab and click 'Connections':

[Screenshot: the Data tab in Excel]

3. You'll see two connection objects in the 'Workbook Connections' window: the PerformanceCountersAnalysis cube and PerfCollectionErrors:

[Screenshot: the Workbook Connections window]

4. For each connection, click 'Properties', then open the 'Definition' tab and enter the correct SSAS or SQL Server instance name accordingly:

[Screenshot: connection properties]
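
For reference, the cube connection string on the 'Definition' tab should look roughly like this, where the Data Source value is a placeholder for your SSAS instance name:

Provider=MSOLAP;Integrated Security=SSPI;Data Source=YourSSASInstance;Initial Catalog=PerformanceCountersAnalysis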

Future development

I've included a few report templates in this spreadsheet, using the PivotTable and PivotChart features. You can easily change the configuration (filtering, rows, columns, measures, etc.) of the existing sheets, as well as add new sheets to analyze the performance data from different angles. You may also find the articles I've included in the 'Next Steps' section below helpful for getting familiar with Power Pivot/Power Chart features.

Another useful enhancement would be adding alert functionality. Alerts could be sent during performance data collection or the ETL stage, when each counter is validated against pre-configured thresholds. You could also benefit from advanced BI features by publishing these reports to SharePoint, allowing team members to see near real-time performance reports.
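
As a starting point, a threshold check like the one sketched below could be appended as a final ETL step. The dbo.LatestCounterValues view, the mail profile and the column names are all hypothetical, and Database Mail must already be configured:

DECLARE @body nvarchar(max);

-- Build one line per counter currently above its critical threshold
SELECT @body = STUFF((
    SELECT ', ' + c.ServerName + ' / ' + c.CounterName
           + ' = ' + CONVERT(varchar(32), c.CounterValue)
    FROM dbo.LatestCounterValues AS c             -- hypothetical view over the last collection
    JOIN dbo.PerfCounter_Groups AS g ON g.CounterGroupID = c.CounterGroupID
    WHERE c.CounterValue >= g.CriticalThreshold
    FOR XML PATH('')), 1, 2, '');

IF @body IS NOT NULL
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'DBA Mail',              -- existing Database Mail profile
        @recipients = N'dba-team@example.com',
        @subject = N'SQL Server performance: critical thresholds exceeded',
        @body = @body;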

A few notes regarding this reporting system's performance: the cube includes a single partition and a basic aggregation built on top of it. I've used this solution with two months' worth of data from 5-6 SQL Servers and it performed well; however, with a greater number of servers and a longer analysis period, users may experience slowness. This could be improved by creating new cube partitions based on monthly or yearly ranges, as well as by creating custom aggregations.

Next Steps





About the author
Fikrat Azizov has been working with SQL Server since 2002 and has earned two MCSE certifications. He's currently working as a Solutions Architect at Slalom Canada.

This author pledges the content of this article is based on professional experience and not AI generated.




Comments For This Article




Tuesday, December 13, 2016 - 7:28:27 PM - Fikrat

 Hi Rehan,

Your errors were likely caused by SSAS's inability to connect to PerfDWH. So make sure that PerfDWH is deployed (see Part 2), that your cube's data source has the correct connection string (steps 5 and 6 from Part 3) and that the account under which you've connected to the cube has access to PerfDWH.

I believe that should resolve your problems.

Fikrat 

 


Tuesday, December 13, 2016 - 8:54:40 AM - Rehan

 Hey,

Got lots of red crosses in step number 8. Can you suggest how to get out of these errors?

 

 

   

     

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process>
      <Object>
        <DatabaseID>PerformanceCountersAnalysis</DatabaseID>
      </Object>
      <Type>ProcessFull</Type>
      <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
  </Parallel>
</Batch>

   

 

 

Processing Database 'PerformanceCountersAnalysis' failed.
Start time: 12/13/2016 7:19:16 PM; End time: 12/13/2016 7:19:17 PM; Duration: 0:00:00

Processing Dimension 'Collection Times' failed.
Start time: 12/13/2016 7:19:16 PM; End time: 12/13/2016 7:19:17 PM; Duration: 0:00:00

Processing Dimension Attribute '(All)' completed.
Start time: 12/13/2016 7:19:16 PM; End time: 12/13/2016 7:19:16 PM; Duration: 0:00:00

Processing Dimension Attribute 'Collection Date Time' failed.
Start time: 12/13/2016 7:19:17 PM; End time: 12/13/2016 7:19:17 PM; Duration: 0:00:00
Error Messages 1: OLE DB error: OLE DB or ODBC error: Invalid object name 'dbo.DimCollectionTimes'.; 42S02.
SQL queries 1:
SELECT DISTINCT [dbo_DimCollectionTimes].[CollectionDateTime] AS [dbo_DimCollectionTimesCollectionDateTime0_0]
FROM [dbo].[DimCollectionTimes] AS [dbo_DimCollectionTimes]

Processing Dimension Attribute 'Year' failed.
Start time: 12/13/2016 7:19:17 PM; End time: 12/13/2016 7:19:17 PM; Duration: 0:00:00
Error Messages 1: OLE DB error: OLE DB or ODBC error: Invalid object name 'dbo.DimCollectionTimes'.; 42S02.
SQL queries 1:
SELECT DISTINCT [dbo_DimCollectionTimes].[dbo_DimCollectionTimesYear0_0] AS [dbo_DimCollectionTimesYear0_0]
FROM (
    SELECT Year(CollectionDateTime) AS [dbo_DimCollectionTimesYear0_0]
    FROM [dbo].[DimCollectionTimes]
) AS [dbo_DimCollectionTimes]

Processing Dimension 'Performance Counters' completed.
Start time: 12/13/2016 7:19:17 PM; End time: 12/13/2016 7:19:17 PM; Duration: 0:00:00

Errors and Warnings from Response:
Internal error: The operation terminated unsuccessfully.
OLE DB error: OLE DB or ODBC error: Invalid object name 'dbo.DimCollectionTimes'.; 42S02.
Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Collection Times', Name of 'Collection Times' was being processed.
Errors in the OLAP storage engine: An error occurred while the 'Collection Date Time' attribute of the 'Collection Times' dimension from the 'PerformanceCountersAnalysis' database was being processed.
Server: The current operation was cancelled because another operation in the transaction failed.
OLE DB error: OLE DB or ODBC error: Invalid object name 'dbo.DimCollectionTimes'.; 42S02.
Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Collection Times', Name of 'Collection Times' was being processed.
Errors in the OLAP storage engine: An error occurred while the 'Year' attribute of the 'Collection Times' dimension from the 'PerformanceCountersAnalysis' database was being processed.

 

 


Thursday, June 4, 2015 - 9:10:05 PM - david

I found out the issue. I don't have SqlServerProviderSnapin100 installed. I can't find it to load; I guess the 64-bit version doesn't have it.


Thursday, June 4, 2015 - 4:11:48 PM - Fikrat Azizov

The PS collection scripts are scheduled to start every 3 minutes by default and each execution should last 1-2 minutes, depending on the number of configured metrics. If it's not collecting data, the reason could be account permission issues. Please make sure the proxy account you've used to run this job has the necessary SQL permissions on the repository server, the monitored SQL Server instances and their host machines (the tip includes the required permissions). If it still fails, you may be able to find failure events in the monitored servers' Windows/SQL logs and fix them appropriately.


Thursday, June 4, 2015 - 11:55:34 AM - David

I followed all the steps and got it working. The only issue I have is the data gathering job that runs the PS code. I am testing it on one server and the job runs forever. I have had to stop it each time. The last run was 2 hours and 12 minutes before I stopped it. It doesn't seem to be gathering data during that time. Is there a setting I missed? I can't enable the job till I know how long it will run.














