How to modify the filter levels without restarting?

Topics: Logging Application Block
Jan 2, 2013 at 6:48 PM
Edited Jan 2, 2013 at 6:49 PM

We are evaluating System.Diagnostics (plus the Essential.Diagnostics extensions for database logging) versus Enterprise Library. The target is a (multi-layered) Azure cloud service. I can distill our requirements as follows:

  1. We would like to compile ALL our logging messages into production builds but enable only the "Error" and "Critical" levels by default.
  2. Enable and escalate the logging levels only when a problem arises in production.
  3. Our entire product consists of multiple flavors of web and worker roles (each targeting a major area), and each flavor has multiple instances.
  4. To avoid affecting production availability, and to prevent the issue being investigated from "disappearing", none of the roles should be recycled just to enable/escalate logging.

How can we achieve the above with EntLib 5.0? If we can't, is live escalation planned for 6.0? BTW, if someone also knows how to do this with plain System.Diagnostics, we're all ears :)

Jan 4, 2013 at 1:45 AM

Enterprise Library Logging Application Block will monitor for changes and reload the configuration when a change is detected. If you combine this with a FileConfigurationSource, you can have the configuration reload from a file outside of web.config/app.config.
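
For example, a minimal sketch of the web.config/app.config entry for a FileConfigurationSource (the external file name is illustrative):

<enterpriseLibrary.ConfigurationSource selectedSource="File Configuration Source">
  <sources>
    <add name="File Configuration Source"
         type="Microsoft.Practices.EnterpriseLibrary.Common.Configuration.FileConfigurationSource, Microsoft.Practices.EnterpriseLibrary.Common, Version=5.0.505.0, Culture=neutral, PublicKeyToken=null"
         filePath="entlib.config" />
  </sources>
</enterpriseLibrary.ConfigurationSource>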

Note that this is not Azure specific so you would still need to modify config in each role instance.

Randy Levy
Enterprise Library support engineer
Support How-to 

Jan 4, 2013 at 3:59 AM

Thanks Randy. For your FileConfigurationSource suggestion, would it be possible to point to blob storage (or tables/SQL)? As you may know, modifying each instance's file through RDP isn't very robust because:

  1. requires RDP'ing into each instance => modifying the file => tedious maintenance (and requires enabling the RDP service + password)
  2. any role recycle (due to updates/unhealthy status) reverts to the deployment configuration (although that would also disturb our debug session)

I guess there are ways around this (write custom code to poll the blob => read it => write a local .xml configuration, roughly as sketched below) but that seems like infrastructure code. Do you have any plans to make diagnostics in EntLib 6.0 more "cloud friendly"? I would imagine our use case above is fairly generic for any Azure customer wanting some form of dynamic logging escalation (without writing their own diagnostics infrastructure!)
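
Roughly what I have in mind (a rough sketch only, assuming the 1.x Microsoft.WindowsAzure.StorageClient API; the poller class and the container/blob names are hypothetical):

using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class BlobConfigPoller
{
    private string lastETag;

    // Poll the blob; if its ETag changed, pull it down over the local config file
    public void PollOnce(string localConfigPath)
    {
        var blob = CloudStorageAccount.Parse("UseDevelopmentStorage=true")
                                      .CreateCloudBlobClient()
                                      .GetContainerReference("entlib-container")
                                      .GetBlobReference("EntLib.Config");

        blob.FetchAttributes();
        if (blob.Properties.ETag != lastETag)
        {
            File.WriteAllText(localConfigPath, blob.DownloadText());
            lastETag = blob.Properties.ETag;
        }
    }
}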


Jan 4, 2013 at 1:33 PM

I actually should have mentioned that there is a BlobConfigurationSource that lets you store your configuration in an Azure blob, which supports change notifications. As you say, this is a better approach if you intend to modify configuration on the fly. There is also a SqlConfigurationSource that works with Azure.

Another good resource is p&p's white paper on using EntLib in Windows Azure.

Randy Levy
Enterprise Library support engineer
Support How-to 

Jan 8, 2013 at 3:41 AM

Hey Randy, 

Thanks for the white paper link. It helped me, but now my mental map of EntLib 5.0 is like an entangled, interdependent spaghetti soup :( ... I don't claim that the code or design of EntLib is actually like that; it's merely my understanding of EntLib 5.0 (i.e. possible user error!). I'm really hoping EntLib 6.0 is far less verbose in configuration and simpler in its inter-dependencies.

Back to the issue: let's say I'm going with the BlobConfigurationSource. Is there a HOWTO or any documentation on using it? Let's assume the user can configure logging (via the EntLib Config tool) but has heard of "BlobConfigurationSource" for the very first time. MSDN discusses the class methods but says little about how to plug it into a real Azure project and start using it (i.e. the documentation isn't very productive).



Jan 9, 2013 at 5:55 AM
Edited Jan 11, 2013 at 12:01 AM

There is a learning curve involved, especially when putting it all together with Azure.

I'm going to assume that you are familiar with Windows Azure. You will definitely need to interact with Azure Storage, so I would recommend getting Azure Storage Explorer (if you don't already have it).

In terms of configuration, the BlobConfigurationSource is conceptually no different from the FileConfigurationSource, which you can read about in Using a Non-default Configuration Store. At runtime Enterprise Library will look to a location other than web.config/app.config to retrieve the configuration information. For FileConfigurationSource it's a file on disk; for BlobConfigurationSource it's a blob in Azure Storage.

The first step is to get the Enterprise Library 5.0 - Windows Azure Configuration Extensions as well as the Enterprise Library 5.0 - Logging Application Block and add them to your Web or Worker Role project(s).
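
If you use NuGet, the equivalent packages can be installed from the Package Manager Console (package IDs from memory, so verify them in the gallery):

Install-Package EnterpriseLibrary.Logging
Install-Package EnterpriseLibrary.WindowsAzure.Configuration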

The next step is to point the local configuration (e.g. web.config or app.config) to Blob Storage:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <section name="enterpriseLibrary.ConfigurationSource" type="Microsoft.Practices.EnterpriseLibrary.Common.Configuration.ConfigurationSourceSection, Microsoft.Practices.EnterpriseLibrary.Common, Version=5.0.505.0, Culture=neutral, PublicKeyToken=null" requirePermission="true" />
  </configSections>
  <enterpriseLibrary.ConfigurationSource selectedSource="Blob Configuration Source">
    <sources>
      <add name="Blob Configuration Source" type="Microsoft.Practices.EnterpriseLibrary.WindowsAzure.Configuration.BlobConfigurationSource, Microsoft.Practices.EnterpriseLibrary.WindowsAzure.Configuration, Version=5.0.1118.0, Culture=neutral, PublicKeyToken=null"
        blobContainerName="entlib-container" blobName="EntLib.Config" storageAccount="UseDevelopmentStorage=true" refreshInterval="00:00:59" />
    </sources>
  </enterpriseLibrary.ConfigurationSource>
</configuration>

You can also create this configuration using the ConfigConsole extension along with the NuGet packages.

Here we specify the blob container, blob name, account information, and refresh interval. So in this case we need to create a blob container called "entlib-container" and, inside that container, a blob called EntLib.Config. It is this blob that will contain the actual configuration.

To create the actual configuration, you can use the configuration tool to produce a configuration suitable for Windows Azure:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <section name="loggingConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.LoggingSettings, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.505.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" requirePermission="true" />
  </configSections>
  <loggingConfiguration name="" tracingEnabled="true" defaultCategory="General">
    <listeners>
      <add listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.SystemDiagnosticsTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.505.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
        source="Enterprise Library Logging" type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="MyListener" />
    </listeners>
    <categorySources>
      <add switchValue="Error" name="General">
        <listeners><add name="MyListener" /></listeners>
      </add>
    </categorySources>
    <specialSources>
      <allEvents switchValue="All" name="All Events" />
      <notProcessed switchValue="All" name="Unprocessed Category" />
      <errors switchValue="All" name="Logging Errors &amp; Warnings">
        <listeners><add name="MyListener" /></listeners>
      </errors>
    </specialSources>
  </loggingConfiguration>
</configuration>

This should then be uploaded (or pasted) into the blob.
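
If you'd rather script the container creation and upload than use Azure Storage Explorer, here is a minimal sketch (again assuming the 1.x Microsoft.WindowsAzure.StorageClient API; the local file name is illustrative):

using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

static void UploadEntLibConfig()
{
    var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
    var container = account.CreateCloudBlobClient().GetContainerReference("entlib-container");
    container.CreateIfNotExist();   // note: no trailing "s" in the 1.x API

    // Upload the logging configuration created above as the EntLib.Config blob
    container.GetBlobReference("EntLib.Config")
             .UploadText(File.ReadAllText("EntLib.Config"));
}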

The log information written with this configuration will be moved by Azure to the WADLogsTable. Actually, that's not true yet, since we haven't set that up. So in the role's OnStart method you can start the DiagnosticMonitor:

public override bool OnStart()
{
    // Get the default diagnostics configuration
    DiagnosticMonitorConfiguration diagObj = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Set the service to transfer all logs to the storage account every minute
    diagObj.Logs.ScheduledTransferLogLevelFilter = LogLevel.Undefined;
    diagObj.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

    // Start the Diagnostic Monitor with the storage account configured in your project settings
    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagObj);

    return base.OnStart();
}

Now logging should be working.  For example:

Logger.Write("Get Logging in Windows Azure with Enterprise Library!",
    "General", 1, 0, System.Diagnostics.TraceEventType.Information);

Logger.Write("Some error with Windows Azure!",
    "General", 1, 0, System.Diagnostics.TraceEventType.Error);            

Since the logging configuration stored in the blob is set to log only Error severity and above, the two Logger.Write() calls will result in only one entry appearing in the WADLogsTable.

Now you can go to the storage blob and upload a new configuration or modify the existing one. Every refreshInterval (59 seconds in the above configuration) the BlobConfigurationSource checks the ETag value of the blob, and if it differs from the last value seen, the BlobConfigurationSource refreshes the Logging Block with the new configuration.

So, if we change switchValue="Error" to switchValue="Information" in the logging configuration, the two log statements above will result in 2 entries being written, without recycling the application.
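
Concretely, the only edit needed in the blob's configuration is the switch value on the General category:

<add switchValue="Information" name="General">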

Also, Get logging in Windows Azure with Enterprise Library is a good article.

Randy Levy
Enterprise Library support engineer
Support How-to