Tuesday, September 24, 2019

Power BI Desktop Dashboard for IIS Log Analysis - C# for IIS log parsing instead of PowerShell

Introduction

The last post in this blog was about using Power BI to analyze IIS logs. It used Log Parser and PowerShell to prepare the W3C IIS .log files as CSV. That conversion is needed because Power BI doesn't understand the W3C log file format produced by IIS. Though the approach works for small to medium loads, it takes more time to prepare larger files. Let's see another way to prepare IIS logs as CSV for log analysis.

Limits of Log Parser

One of the limits is that Log Parser cannot output a default value when a column is not available in the source log file. On the other side, Power BI needs all the columns present in the CSV to load the data. This mismatch happens frequently because the default IIS setting doesn't include the cs-bytes and sc-bytes columns, and Power BI needs them to analyze the payload.

The solution was to load the CSV produced by Log Parser into PowerShell, add the missing columns, and write the result back to a CSV file. Out of the box, loading a CSV and adding columns in PowerShell is time-consuming. We can tune the PowerShell script to optimize the CSV handling, but there will still be two tools: Log Parser and PowerShell.
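The column-filling step itself is simple. Below is a minimal sketch of the idea in Python (for illustration only; the actual post uses PowerShell, and the column names simply mirror the IIS defaults discussed above): load the CSV, append any missing cs-bytes / sc-bytes columns with a default of 0, and write it back out.

```python
import csv
import io

# Columns Power BI expects but IIS may not log by default
# (per the post: cs-bytes and sc-bytes are the usual gaps).
REQUIRED = ["cs-bytes", "sc-bytes"]

def add_missing_columns(csv_text, default="0"):
    """Return the CSV text with REQUIRED columns appended (default value) when absent."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    fieldnames = list(rows[0].keys()) if rows else []
    for col in REQUIRED:
        if col not in fieldnames:
            fieldnames.append(col)
            for row in rows:
                row[col] = default
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

sample = "date,cs-uri-stem,sc-status\n2019-09-24,/home,200\n"
print(add_missing_columns(sample))
```

The in-memory round trip is what makes the PowerShell version slow on large files; a streaming row-by-row rewrite avoids holding the whole file in memory.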

Enter C#

What if we use C# to convert the IIS logs to the desired CSV with default values? It is not advisable to rewrite the IIS log parsing logic in C# and maintain it ourselves. It is better to use a library from NuGet. Fortunately, there is one called IISLogParser. We can use it to parse the logs, and a CSV library such as ServiceStack to write the output as CSV.

It is easy to set up such a utility in C#. The source code can be found at the GitHub location below.


Please note that this targets .NET Core 2.1. Ideally, there should be no issues in adding the full .NET Framework as a target as well, since those libraries support multi-targeting. In case of issues, add an issue in the GitHub repo, or better, a PR. Comments on this post are also welcome.

References

https://stackoverflow.com/questions/32120528/parse-iis-log-file-is-there-an-alternative-to-logparser

Tuesday, September 17, 2019

Power BI Desktop Dashboard for IIS Log Analysis

Introduction

As developers, we have all encountered situations where we need to analyze IIS web server logs. During development, the file is small and easy to analyze in Notepad or Excel. But when it grows to GBs on production servers, we use other tools. One popular tool to query IIS logs is Log Parser, a free command-line tool from Microsoft. There are graphical applications around it that can even generate charts. One such free tool is Log Parser Studio, also from Microsoft.

Once we move up in our careers and have to deal with managers, product stakeholders, or even CXOs to show what the IIS logs say, we need more visuals, or a dashboard reflecting the IIS logs. Though we can create visuals using Log Parser Studio, it is tedious to create reports and charts one by one.

One easier solution is to get into Azure and use Application Insights, which has built-in dashboards that can be customized easily. There are other services similar to Application Insights, such as Dynatrace, Retrace, etc. But the problem is that not all of us get the luxury of buying the software and services we want; sometimes our organization or clients may not be willing to pay. So let's look for other free solutions.

Enter Power BI

Power BI Desktop is a free visualization tool from Microsoft. It can easily produce great-looking visuals with data insights.
Fortunately, this is not the first post on Power BI for IIS W3C log analysis. There is already one out there. We are using it as a base and adding some more features.

https://blogs.msdn.microsoft.com/kennethteo/2018/04/27/analyzing-w3ciis-log-with-powerbi/

Prerequisites

Tools

  • LogParser
  • Power BI Desktop - the free version is enough

Experience

  • Basics of PowerShell
  • LogParser queries

Data preparation

The post by Kenneth uses .bat commands to trigger LogParser to convert the .log files from IIS to CSV. Those queries have hardcoded column names. The problem with that is we cannot just add more columns from the IIS logs. For example, cs-bytes and sc-bytes are columns that are not enabled in IIS by default. Ideally, the solution would be to use 'select *' in the LogParser query, but Power BI doesn't seem to support (more research needed) mapping a flexible number of CSV columns onto fixed columns.
Hence this post uses PowerShell to invoke LogParser: it first tries to convert .log to .csv including cs-bytes and sc-bytes. If those columns are not available, it retries without them, which matches the default IIS setting, and 0 / a default value is used instead.
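The fallback logic can also be sketched directly: a W3C log announces its columns in a #Fields: directive, so a converter can read that directive and fill 0 for any expected column the file doesn't carry. A minimal sketch of that idea (in Python for illustration, not the actual PowerShell/LogParser pipeline; the EXPECTED list is a shortened stand-in for the template's real column set):

```python
import csv
import io

# Columns the Power BI template expects; fill "0" when the log lacks them.
# (Shortened illustrative list; cs-bytes/sc-bytes are the usual gaps.)
EXPECTED = ["date", "time", "cs-uri-stem", "sc-status", "cs-bytes", "sc-bytes"]

def w3c_to_csv(log_text):
    """Convert W3C extended log text to CSV with EXPECTED columns, defaulting to 0."""
    fields = []
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(EXPECTED)
    for line in log_text.splitlines():
        if line.startswith("#Fields:"):
            # The log itself tells us which columns are present.
            fields = line[len("#Fields:"):].split()
        elif line.startswith("#") or not line.strip():
            continue  # skip other directives (#Software, #Date, ...) and blanks
        else:
            row = dict(zip(fields, line.split()))
            writer.writerow([row.get(col, "0") for col in EXPECTED])
    return out.getvalue()

log = ("#Fields: date time cs-uri-stem sc-status\n"
       "2019-09-17 10:00:01 /home 200\n")
print(w3c_to_csv(log))
```

Because the output header is fixed, Power BI always sees the same columns regardless of what the server was configured to log.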

PowerShell can be downloaded from the below location.

How to run PowerShell

  • Use the IIS to CSV.ps1
  • Change $inputPath to the folder where the .log files are located
  • Run the .ps1 file
  • The output .csv files will be generated in the same folder location

The Power BI template provided by Kenneth in his post has to be updated to reflect this change. Let's see those changes.

Separation of concerns in Power BI dashboards

The Power BI template used by Kenneth has only a few reports and visuals; it obviously leans towards proving a concept. Let's add more visuals, such as a dedicated page for errors, and errors by server IP to identify faulty servers.

Sample

Step-by-step instructions for adding more visuals are not necessary, as the PBI template itself can be shared and anyone can load their own data into it. A sample Power BI file with more visuals has been uploaded to the location below.

https://github.com/joymon/software-performance/tree/master/web/iis/log-analysis/Power%20BI/PBI-Template

Just prepare the CSV from the .log files and update the data source in the PBI file to refresh the visuals.

Feel free to download and try it out. Any suggestions are welcome on GitHub as enhancements.

References

https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/6752494-web-servers-on-premise-for-iis-log-data
https://community.powerbi.com/t5/Desktop/IIS-log-files-convert-queries-to-PowerBI/td-p/166160
https://ittechnews.net/microsoft/official-microsoft-news/analyzing-w3c-iis-log-with-powerbi/

Tuesday, September 10, 2019

PowerShell to run same SQL to multiple databases in multi tenant system

Background

Multi-tenant systems are complex by nature, and not everybody works on them, even in the enterprise. In those systems, the boundary between tenants has to be clearly defined, mainly because of the noisy neighbour problem: no one wants to be disturbed by other tenants in the same system. Some systems separate all the tiers, starting from the web server itself. Some separate only the database and share the web and processing tiers. There is no single rule for how to separate tenants.

If we are maintaining a multi-tenant system where every tenant has its own database with an identical structure, a frequent question is how to apply structural / schema changes quickly. Below is one easy method to apply a hotfix to all of them.

PowerShell script

This script iterates through all the databases in the SQL instance and executes the SQL in the specified file against each one. That file may contain an ALTER TABLE, SP changes, etc.

# NOTE: It is advised to test on a dev server before higher environments
# ----------------------------------------------

$Inst = "<Your sql instance with port number if applicable>"

$filepath = "<file path to sqlfile>"

# $databases grabs the list of databases. Adjust the where condition to filter.
$databases = Invoke-Sqlcmd -ServerInstance $Inst -Database "master" -Query "select name from sys.databases where database_id > 4 and name <> 'TEST'"

foreach ($database in $databases)
{
    $DBname = $database.name
    # Outputs one log file per database
    $outfile = $filepath + "." + $DBname + ".log"
    # Execute the script against this database
    Invoke-Sqlcmd -ServerInstance $Inst -Database $DBname -InputFile $filepath | Out-File -FilePath $outfile

    # Echo the database name to show progress
    $DBname
}

Note 

This assumes there is only one SQL instance in the system and takes care of the databases inside it. But in reality, huge multi-tenant systems will have multiple SQL instances.
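Extending to multiple instances is mostly an outer loop over an instance list. As a sketch of that extension (in Python for illustration; instance and database names are hypothetical placeholders, and a real run would execute these sqlcmd commands or wrap the same double loop around Invoke-Sqlcmd):

```python
# Sketch: extend the single-instance script to many instances by building
# one sqlcmd invocation per (instance, database) pair.
def build_commands(instances, databases_per_instance, sql_file):
    """Return the sqlcmd command lines to run sql_file everywhere."""
    commands = []
    for inst in instances:
        for db in databases_per_instance[inst]:
            # One output log per instance/database pair, like the script above.
            out_log = f"{sql_file}.{inst}.{db}.log"
            commands.append(
                f'sqlcmd -S "{inst}" -d "{db}" -i "{sql_file}" -o "{out_log}"'
            )
    return commands

cmds = build_commands(
    ["tenant-sql-01", "tenant-sql-02"],
    {"tenant-sql-01": ["TenantA", "TenantB"], "tenant-sql-02": ["TenantC"]},
    "hotfix.sql",
)
for c in cmds:
    print(c)
```

In practice the per-instance database lists would come from querying sys.databases on each instance, exactly as the PowerShell script does for one instance.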

Tuesday, September 3, 2019

Azure @ Enterprise - Querying entire Application Insights columns for something

Problem

We are about to debug somebody else's application. The original developers are not around, and nobody can give details about how the logging is implemented, except to say that Application Insights is integrated. We are in trouble: we have an error / informational message from the application but don't know where to start. Yes, this is a common situation in the enterprise world.
In such conditions, it is a waste of time to query the Application Insights item types one by one. It is like searching for a needle in a haystack. Technically, it means searching all the sources and all the columns inside them.

If we come from a SQL background, there is a tendency to think in SQL and then convert to App Insights' query language, currently called KQL (Kusto Query Language). Since SQL doesn't have a feature to search all the tables & columns of a database with one query, we never suspect there will be an option in KQL to do so.

Solution - Search entire AppInsight

But there is an easier way in KQL. Below is a query which searches the entire Application Insights instance, across all tables and all columns.

union *
| where * contains "exception"

Hope the query is self-explanatory. It is simply the magic of *.
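As a related note, KQL also has a dedicated search operator that scans every table and column in scope. It reads even shorter, though its matching is term-based rather than a plain substring contains, so results can differ slightly from the union query above:

```
search "exception"
```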

Happy debugging.