Tuesday, August 25, 2015

Azure – Application Insights – Configuration & Usage


I still remember the good old days when we had a huge instrumentation database into which developers would log all the data – URLs, user navigation paths, query parameters, exceptions, browser information, region information, etc. – to help in debugging issues and generating the required reports. Application Insights offers all these capabilities and much more, with simple configuration and ready-to-use detailed reports.

This post is divided into two parts –
a.      Installation & configuration
b.      Usage of the reports 

Installation & configuration

Prerequisites
·        Visual Studio 2013 or a later version
·        Subscription to Microsoft Azure
Visual Studio Application Insights has Free, Standard & Premium pricing tiers. 

             1.      Log in to the Azure portal and create a new Application Insights resource by selecting ASP.NET Web Application



2.      Make a note of the Instrumentation key
            
            Note: Intentionally erased the details 

These first two steps create a resource, which is now ready for any application to be configured for monitoring.

3.      Install the SDK 

a.      Right-click on your Project and install the Application Insights NuGet package




b.      Search for Microsoft.ApplicationInsights.Web and install


               
Note: If the “Application Insights for Visual Studio” extension is installed, then on right-clicking the project you will see an option “Add Application Insights”, which is another way of configuring Application Insights.


c.      The ApplicationInsights.config file will be added by the NuGet installation. Edit this file and add the below tag just before the closing tag at the end of the file.

<InstrumentationKey>Your instrumentation key here</InstrumentationKey>

 Note: With the above configuration in ApplicationInsights.config, telemetry will be written from all the available environments. If the requirement is to use a different Application Insights resource for each environment, then the best solution is to configure the Instrumentation Key in Cloud.config and add the below statement in Global.asax.cs to fetch the key based on the environment.

Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration.Active.InstrumentationKey = CloudConfigurationManager.GetSetting("InstrumentKeyName");
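As a sketch of the per-environment setting this code reads (the setting name InstrumentKeyName matches the code above; the value and surrounding file layout are placeholders – in a cloud service .cscfg the setting sits under the role element):

```xml
<!-- ServiceConfiguration.Cloud.cscfg (sketch; one file per environment) -->
<ConfigurationSettings>
  <Setting name="InstrumentKeyName" value="your-environment-instrumentation-key" />
</ConfigurationSettings>
```

With one .cscfg per environment, each deployment picks up its own key without any code change.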
At this point we are done with the installation & configuration. Running the project (debug mode/release mode/published code) will start writing the logs.

 Usage of the reports

          1.      Log in to the Azure portal and browse to Application Insights.
 



2.      Click through the applications and then through the available reports to see more details.



Various reports are already created based on the Application Insights data log –
·        Failures

This reports all the failures, with exception details, that happened in the last 24 hours, along with the % of each exception against the total number of exceptions logged.

One great thing about this report –

It drills down from the URL to the method name to the operating system and even the browser version for a generated exception. This is like a whole universe of data points for debugging the issue. This detailed information has helped our team more than once to find the root cause sooner.

·        Performance

Gives performance numbers for each URL, which will help in identifying the low-performing pages.

·        Browser

Gives details about the browsers & browser versions used to access the application.

This post will not get into the details of each report. I request the readers to explore the other reports & options.
Add Client side monitoring (Web browsers)

               The SDK installed from NuGet sends the monitoring data from the server-side code. To add monitoring on the client side for web browsers:
a.      Download the JavaScript snippet from the Azure portal and put it in every page or in the Master Page.

The code snippet will have the Instrumentation Key. 
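For orientation, the portal-generated snippet has roughly the shape below (a sketch only – always paste the exact snippet from your own resource, since the loader code and the key are specific to it):

```html
<script type="text/javascript">
  // Sketch of the Application Insights browser snippet (2015-era shape).
  // The real snippet contains a small loader that downloads ai.js from a CDN.
  var appInsights = window.appInsights || function (config) {
      /* portal-generated loader stub goes here */
      return { config: config };
  }({ instrumentationKey: "Your instrumentation key here" });
  window.appInsights = appInsights;
  if (appInsights.trackPageView) { appInsights.trackPageView(); }
</script>
```

Placing it in the Master Page means every page that inherits from it reports page views automatically.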
Please leave your feedback and queries in the comments section. Thank you!

Monday, August 10, 2015

Azure – Best practices, Learnings & Performance tips


Azure in many ways is simple to code and work with, but sometimes it gets tricky. Remember when you were stuck with some problem, searching all over the internet for a solution, and wished there was a consolidated list where you could quickly read all the standards, learnings from other people, and tips, and find answers to queries ranging from best practices to performance improvements?

This document tries to put all of this under one umbrella for work related to Azure table storage, skipping the basics like connections, coding, etc., and it assumes that the reader has a good understanding of Azure table storage.

Best practices

a.      Create a table storage helper class for all the operations that can be performed on Table storage (Insert/Update/Delete/Get/etc.). All the classes in the Data Access Layer should call these helper methods to perform any/all operations on Azure tables. This will help in maintaining the code.
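A minimal sketch of such a helper (assuming the classic Microsoft.WindowsAzure.Storage SDK of that era; the class name TableStorageHelper is illustrative):

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Illustrative sketch only - centralizes all table operations in one place
// so the Data Access Layer never talks to CloudTable directly.
public class TableStorageHelper
{
    private readonly CloudTable table;

    public TableStorageHelper(CloudTable table)
    {
        this.table = table;
    }

    public void InsertOrMerge<T>(T entity) where T : ITableEntity
    {
        table.Execute(TableOperation.InsertOrMerge(entity));
    }

    public T Get<T>(string partitionKey, string rowKey) where T : class, ITableEntity
    {
        var result = table.Execute(TableOperation.Retrieve<T>(partitionKey, rowKey));
        return result.Result as T; // null when the entity does not exist
    }

    public void Delete<T>(T entity) where T : ITableEntity
    {
        table.Execute(TableOperation.Delete(entity));
    }
}
```

Validation, retry policies, and logging can then be added in one place instead of in every caller.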

b.      Data comparisons in Azure are case sensitive; make sure that you address this in all the applicable places. The best/standard way of doing this is to use .ToLowerInvariant(), ideally both when inserting and when reading entities from table storage.
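A small sketch of this normalization (entity and property names are illustrative):

```csharp
// Store keys lower-cased...
var entity = new CustomerDataEntity
{
    PartitionKey = region.ToLowerInvariant(),
    RowKey = customerEmail.ToLowerInvariant()
};

// ...and lower-case the same way when querying, so "John@Contoso.com"
// still matches the stored "john@contoso.com".
var filter = TableQuery.GenerateFilterCondition(
    "RowKey", QueryComparisons.Equal, customerEmail.ToLowerInvariant());
```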

Learnings

a.      Azure table storage Inserts/Updates, i.e. .Insert() or .InsertOrMerge(), can sometimes return Bad Request without giving additional details. This exception being very generic, developers search the internet for solutions with little to no success. Below are the most common mistakes developers make that are responsible for these Bad Requests.

·     DateTime

The most common source of Bad Request. Never send DateTime.MinValue to an Azure table; it is not within the supported DateTime range (values must be on or after January 1, 1601 UTC).
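A simple guard before writing (the 1601-01-01 minimum is table storage's documented lower bound; the clamp-to-minimum policy is just one choice):

```csharp
// Azure table storage rejects DateTime values before 1601-01-01 UTC;
// DateTime.MinValue (0001-01-01) therefore causes a Bad Request.
private static readonly DateTime TableStorageMinDate =
    new DateTime(1601, 1, 1, 0, 0, 0, DateTimeKind.Utc);

public static DateTime ToTableSafeDate(DateTime value)
{
    return value < TableStorageMinDate ? TableStorageMinDate : value;
}
```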

·     DataType

The second most common source. Make sure the data types of the values being sent to the Azure table match the Azure table storage data types.

b.      When using .ExecuteQuerySegmented() for scenarios like fetching more than 1,000 records or pagination, keep in mind that the table service API will return at most 1,000 records, or as many entities as it can retrieve within 5 seconds. If the query takes longer than this to execute, it may return an empty set of entities along with a continuation token.
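A typical loop that drains all segments (a sketch; CustomerDataEntity and customerDataTable are illustrative names):

```csharp
var query = new TableQuery<CustomerDataEntity>().Where(filter);
var results = new List<CustomerDataEntity>();
TableContinuationToken token = null;

do
{
    // A segment holds at most 1,000 entities - and may even be empty if the
    // 5-second window expired - while a non-null token says more data remains.
    var segment = customerDataTable.ExecuteQuerySegmented(query, token);
    results.AddRange(segment.Results);
    token = segment.ContinuationToken;
} while (token != null);
```

The key point: an empty Results collection does not mean the query is finished; only a null ContinuationToken does.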

Performance tips

a.      I am sure we all know that it is always faster & good practice to fetch data from table storage based on PartitionKey and RowKey, but –

-        never combine two or more PartitionKey filters in a single query; this results in a full table scan

Wrong way of fetching entities
TableQuery.CombineFilters(
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, value1),
    TableOperators.Or,
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, value2));

 In this scenario, spin up a task for each PartitionKey filter condition, wait for the tasks to complete, and then perform operations on top of the results. Please find below an example of how this can be achieved.

Correct way of fetching entities
var partitionKeyQuery1 = TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, value1);
var partitionKeyQuery2 = TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, value2);

var customerTaskA = Task.Run<IEnumerable<CustomerDataEntity>>(
    () => this.customerDataTable.ExecuteQuery(new TableQuery<CustomerDataEntity>().Where(partitionKeyQuery1)));
var customerTaskB = Task.Run<IEnumerable<CustomerDataEntity>>(
    () => this.customerDataTable.ExecuteQuery(new TableQuery<CustomerDataEntity>().Where(partitionKeyQuery2)));

// If one or more customers exist for the given conditions then process the data
if (customerTaskA.Result != null || customerTaskB.Result != null)
{
   . . .
}

Note: When .Result is accessed on a Task, it internally blocks and waits for the task to complete before returning the result.

b.      Try to keep the results from Azure table storage in an IEnumerable object for as long as you can in the code. If there is any filtering, sorting, etc. to be applied on the data, use LINQ and perform these operations on the IEnumerable object. When you are done with these operations, call the .ToList() method on the IEnumerable. The line of code where you call .ToList() is where the query actually executes. This is called Deferred Execution.

So the next time IEnumerable&lt;Object&gt;.ToList() is taking a lot of time or is a performance hit, don't look anywhere else but at the query that is getting executed.
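For example (property names like LastOrderDate are illustrative):

```csharp
// No query has executed yet - these lines only build up the pipeline.
IEnumerable<CustomerDataEntity> customers = customerDataTable.ExecuteQuery(query);
var recent = customers
    .Where(c => c.LastOrderDate > cutoff)
    .OrderByDescending(c => c.LastOrderDate);

// The table query (and the whole LINQ pipeline) actually runs here.
var recentList = recent.ToList();
```

If recentList takes seconds to materialize, the cost is in the table query and the filters above it, not in .ToList() itself.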
 This post will be updated periodically with new learnings. Thank you!

*********************************************************************************

Update - 8/21/2015

Learnings 
             
           ·     Unsupported characters
The below characters are not supported in PartitionKey and RowKey values –
i.                 The forward slash (/) character
ii.                The backslash (\) character
iii.               The number sign (#) character
iv.               The question mark (?) character
v.                Control characters (\t, \n, \r, etc.)
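One possible guard before building keys (a sketch; whether to drop or replace unsupported characters is an application decision – dropping is shown here):

```csharp
using System;
using System.Text;

private static readonly char[] DisallowedKeyChars = { '/', '\\', '#', '?' };

public static string ToSafeKey(string key)
{
    var sb = new StringBuilder(key.Length);
    foreach (var ch in key)
    {
        // Skip the documented unsupported characters and all control characters.
        if (char.IsControl(ch) || Array.IndexOf(DisallowedKeyChars, ch) >= 0)
            continue;
        sb.Append(ch);
    }
    return sb.ToString();
}
```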