How to import Perfmon logs into a SQL Database to create Excel Pivot Charts

This post has been republished via RSS; it originally appeared at: TestingSpot Blog articles.

First published on MSDN on Nov 28, 2017

Authored by Edwin Hernandez


We came across a scenario where a customer needed to run a performance test of an application, but due to security policies the customer could not provide server-level access to the servers where the application is hosted. Without that access, a Visual Studio Load Test cannot collect performance metrics from the target servers. Usually, in such a scenario, the customer has someone who does have access run a Perfmon data collection while the performance test is running, and then sends us the .blg file(s) for analysis.


The analysis can be done directly in PERFMON by loading the .blg file, dragging the various counters onto the chart, and doing things like comparisons and looking for threshold violations. However, if what you need is to create nice-looking charts, overlay counters, pie charts, etc. for a report, then you probably need something better than PERFMON charts.


In the following steps, we will show you how to load the PERFMON .blg files into a SQL database and then pull data from that database into an Excel pivot chart. We have based this information on a 2012 blog post by Prashant Kumar, although we have updated it by using PowerShell for better maintainability and by building the pivot charts in Excel 2016:




  • Prerequisite: Create a PERFMON log file (.blg) by running a trace on a server. To do this you need to create a Perfmon Data Collector Set; there are a few templates available by default, or you can create or look up a specific template. Once created, you need to run the data collection for a given amount of time (e.g. 30 min while a load test is running). Here is some additional information from TechNet.

  • Optionally: Once you have one or several .blg files, you can join them into a single .blg file for easier handling, or you can choose to keep them separate. For this you can use relog.exe. Note that you can run relog.exe from the Command Prompt, or from a PowerShell script as follows, which makes it easier to edit the attributes and local paths:




  • Here is the code snippet:


##########################################
#                                        #
#   COMBINE BLG FILES INTO SINGLE BLG    #
#                                        #
##########################################
$blgFile1 = "C:\Logs\DataCollection1.blg"
$blgFile2 = "C:\Logs\DataCollection2.blg"

$combinedFile = "C:\Logs\DataCollection_Combined.blg"
$AllArgs = @($blgFile1, $blgFile2, '-f', 'bin', '-o', $combinedFile)

& 'relog.exe' $AllArgs
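If you have many .blg files in a folder, you can build the relog argument list programmatically instead of naming each file by hand. Here is a minimal sketch of that logic in Python; the folder path and output name are placeholders, and relog.exe itself of course only exists on Windows:

```python
from pathlib import Path

def build_relog_args(log_dir, combined_file):
    """Build the argument list for relog.exe: every .blg file in
    log_dir, followed by '-f bin -o <output file>'."""
    blg_files = sorted(str(p) for p in Path(log_dir).glob("*.blg"))
    return blg_files + ["-f", "bin", "-o", combined_file]

# Hypothetical usage (Windows only):
#   import subprocess
#   args = build_relog_args(r"C:\Logs", r"C:\Logs\DataCollection_Combined.blg")
#   subprocess.run(["relog.exe", *args], check=True)
```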



  • Step 1: Create a new database in SQL Server (e.g. PerfmonDB). For this example I'm using SQL Server 2016; I have also successfully loaded Perfmon data into a SQL Server 2012 database.

    • Note: SQL Server Express editions do not work, since you would need to create the ODBC connection through the SQL Native Client (to LocalDB), and that causes problems with relog.exe because it won't recognize the DSN source.



  • Step 2: Create a SQL Server ODBC System DSN named Perfmon_DSN that points to your SQL Server instance and database.

    • To create the ODBC System DSN connection:

      • Run odbcad32.exe, go to the System DSN tab and click on 'Add' to add a new Source

      • Select SQL Server (not native client) and type the connection configuration details (e.g. Name: Perfmon_DSN, Description: Perfmon, Server: <yourSQLServer>)

      • Preferably choose NT Authentication, change the default database to PerfmonDB, and keep all other defaults







  • Step 3: Use relog.exe to load the combined (or single) .blg file into the SQL database through your DSN connection.

    • To do this you can run relog.exe from PowerShell as shown in the code below.






  • Here is the code snippet:



##########################################
#                                        #
#  IMPORT BLG FILES INTO SQL DATABASE    #
#                                        #
##########################################
$sourceBlg = "C:\Logs\DataCollection_Combined.blg"
$sqlDSNConnection = "SQL:Perfmon_DSN!logfile"

$AllArgs = @($sourceBlg, '-f', 'SQL', '-o', $sqlDSNConnection)
& 'relog.exe' $AllArgs


  • You can validate that the script worked by checking your PerfmonDB database. Tables CounterData, CounterDetails and DisplayToId should have been created.
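As an additional sanity check, you can count how many samples landed per machine and object directly in SQL; a sketch, assuming the default schema that relog creates:

```sql
-- Imported samples per machine/object (relog default schema)
SELECT det.MachineName, det.ObjectName, COUNT(*) AS Samples
FROM CounterData dat
JOIN CounterDetails det ON dat.CounterID = det.CounterID
GROUP BY det.MachineName, det.ObjectName;
```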



  • Step 4: Now that the data has been loaded into the SQL database, we can start pulling it into Excel for reporting purposes. Let's do the connection/query first:

    • You will need Excel (I'm using 2016 for this example).

    • First you need to create a connection. In a new Spreadsheet, go to Data > Get External Data > From Other Sources > From SQL Server.

      • Type your SQLServer name and log on credentials.

      • Select your database PerfmonDB; there is no need to specify a default table, just click Next and then Finish.

      • Cancel the select table dialog.



    • Now that you have a connection, you need a proper SQL query that pulls only the information that you need.

      • In Excel, go to Data > Connections, select your connection and click Properties.

      • Go to the Definition tab and in Command Type, select SQL

      • Under Command Text, paste the SQL query below and click OK. In the query, the interval is set to 1 minute; you can tweak that to your needs (e.g. every hour). Also, this query pulls all objects imported from the Perfmon logs; if that is too much information, you can always filter in the WHERE clause by adding, for example: AND objectname = 'Processor'.







SELECT counterdetails.machinename,
       counterdetails.objectname,
       counterdetails.countername,
       counterdetails.countertype,
       counterdetails.instancename,
       Datediff(minute, st.starttime,
                Cast(LEFT(counterdata.counterdatetime,
                          Len(counterdata.counterdatetime) - 5) AS DATETIME)) AS interval,
       Avg(counterdata.countervalue) AS counterValues
FROM   counterdata
       INNER JOIN counterdetails
               ON counterdata.counterid = counterdetails.counterid,
       (SELECT counterdetails.machinename,
               Min(Cast(LEFT(counterdata.counterdatetime,
                             Len(counterdata.counterdatetime) - 5) AS DATETIME)) AS startTime
        FROM   counterdata
               INNER JOIN counterdetails
                       ON counterdata.counterid = counterdetails.counterid
        GROUP  BY machinename) AS st
WHERE  st.machinename = counterdetails.machinename
GROUP  BY counterdetails.machinename,
          counterdetails.objectname,
          counterdetails.countername,
          counterdetails.countertype,
          counterdetails.instancename,
          Datediff(minute, st.starttime,
                   Cast(LEFT(counterdata.counterdatetime,
                             Len(counterdata.counterdatetime) - 5) AS DATETIME)),
          counterdata.countervalue
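To make the interval logic concrete: the query trims trailing characters from CounterDateTime (Perfmon stores it as a fixed-width string that cannot be cast to DATETIME directly, hence the LEFT(..., LEN - 5)), computes the minutes elapsed since each machine's first sample, and averages the counter values per interval. Here is a small Python sketch of the same grouping, on made-up sample data; the floor division approximates DATEDIFF(minute, ...) when the start time falls on a minute boundary:

```python
from datetime import datetime
from collections import defaultdict

def average_per_interval(samples):
    """samples: list of (timestamp_string, counter_value).
    Returns {minutes since first sample: average counter value}."""
    times = [datetime.strptime(t[:19], "%Y-%m-%d %H:%M:%S") for t, _ in samples]
    start = min(times)
    buckets = defaultdict(list)
    for t, (_, value) in zip(times, samples):
        interval = int((t - start).total_seconds() // 60)  # minute bucket
        buckets[interval].append(value)
    return {i: sum(v) / len(v) for i, v in sorted(buckets.items())}

# Hypothetical '% Processor Time' samples, 30 s apart:
data = [("2017-11-28 10:00:00.000", 10.0),
        ("2017-11-28 10:00:30.000", 20.0),
        ("2017-11-28 10:01:00.000", 40.0),
        ("2017-11-28 10:01:30.000", 60.0)]
# average_per_interval(data) -> {0: 15.0, 1: 50.0}
```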


  • Step 5: Let's use the query results to build a pivot table and charts

    • First go to Insert > Charts > PivotChart > PivotChart & PivotTable

    • In the Create PivotTable dialog, select "Use an external data source" and "New Worksheet". Click OK.

    • That should show you an empty table, an empty chart and the PivotTable Fields selector. In the selector, place the interval field in the Rows area, the counterValues field in the Values area, and all other fields in the Columns area.

      • Note: The counterValues field needs to be summarized by average. To do this, right-click the counterValues field, open Value Field Settings, and select Average from the list.









Now you should have all the information you need for the charts; however, there may be too many objects, counters, or machines for the chart to make sense. In the table, use the Column Labels filters to choose only the objects that you need. Note that you can change which field to filter on using the Select field dropdown:



A much more rewarding way to do this is by adding slicers. To do this, in Excel go to PivotTable Tools > Analyze > Filter > Insert Slicer, then select the counter(s) that you need for easier filtering. It should look something like this:




  • Now you can play around with selecting one or more combinations of Machine/Object/Instance and Counter. You can multi-select with the button at the top of each slicer.

    • Keep in mind that not all Counters have an Instance.



  • Using the slicers and some formatting on the chart, you should easily be able to come up with a chart similar to the one below:






  • Additional step:


You may have noticed that in the example above we merged two .blg Perfmon log files into one using relog.exe. Those two files came from two different test runs on two different servers. The end result shows the test runs side by side, since they have different time scales (the first test starts at 0 and ends at 35, the second starts at 900 and ends at 935).



This kind of chart is what we were going for in this particular report. If you had a single .blg file, you are probably all set, since by this point you have a single chart. However, if you had more than one .blg file, you may want to overlay the data instead of showing it side by side as above.



In this case we can update the query to compute each sample's interval relative to its own log's start time, or simply pull back all the results and add the interval after importing into Excel.

You could use a query like this:

SELECT CounterDetails.MachineName,
       CounterDetails.ObjectName,
       CounterDetails.CounterName,
       CounterDetails.CounterType,
       CounterDetails.InstanceName,
       CounterData.CounterDateTime,
       CounterData.CounterValue,
       DisplayToID.GUID,
       DisplayToID.DisplayString,
       DisplayToID.LogStartTime
FROM   CounterData
       JOIN CounterDetails ON CounterData.CounterID = CounterDetails.CounterID
       JOIN DisplayToID ON CounterData.GUID = DisplayToID.GUID
WHERE  ObjectName IN ('PhysicalDisk')
       -- ObjectName IN ('process')
       -- AND CounterName = 'Available MBytes'
       AND DisplayString LIKE '%_R2%'
       -- AND (InstanceName = 'sqlservr'
       --      OR InstanceName LIKE 'EXCEL%');

Then you would just need to add a calculated column to the Excel table, CounterDateTime minus LogStartTime, to get the intervals, and the results would then overlap on the chart.
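The calculated column amounts to subtracting each run's LogStartTime from its own sample timestamps, so both runs start at interval 0 and overlay on the chart. A small Python sketch of that arithmetic, with made-up timestamps:

```python
from datetime import datetime

def elapsed_minutes(counter_datetime, log_start_time):
    """Interval for the overlay chart: minutes from this log's start
    time to the sample (CounterDateTime minus LogStartTime)."""
    fmt = "%Y-%m-%d %H:%M:%S"
    t = datetime.strptime(counter_datetime[:19], fmt)    # trim sub-second digits
    start = datetime.strptime(log_start_time[:19], fmt)
    return (t - start).total_seconds() / 60.0

# Two runs that started 15 hours apart both map onto the same interval axis:
run1 = elapsed_minutes("2017-11-28 10:05:00.000", "2017-11-28 10:00:00.000")
run2 = elapsed_minutes("2017-11-29 01:05:00.000", "2017-11-29 01:00:00.000")
# run1 == run2 == 5.0
```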
