
logging on Azure

Tags:

log4net

azure

How can we log on Azure with the granularity & control equivalent to log4net? We use log4net in the web apps we run on IIS and that works very well for us. Is that the best option on Azure too?

We absolutely prefer log files (as opposed to database entries) but if there's something that works better in Azure, I'm open to improvements. The way Trace writes to a table in Azure is horrible - we definitely don't want that.

The reason I prefer log files is it's super easy to see what happened in sequence which is what I need 99% of the time.

This is for an Azure web app that will have multiple instances. It's fine if the logs are distinct to each instance.

thanks - dave

asked Sep 17 '14 by David Thielen

2 Answers

This is pretty straightforward. I use the following log4net configuration to dump a log file in the web application root folder (easily changed to a sub-folder):

<log4net>
  <root>
    <level value="DEBUG" />
    <appender-ref ref="LogFileAppender" />
  </root>
  <appender name="LogFileAppender" type="log4net.Appender.RollingFileAppender" >
    <param name="File" value="my_web.log" />
    <param name="AppendToFile" value="true" />
    <rollingStyle value="Size" />
    <maxSizeRollBackups value="10" />
    <maximumFileSize value="10MB" />
    <staticLogFileName value="true" />
    <layout type="log4net.Layout.PatternLayout">
      <param name="ConversionPattern" value="%date{yyyy-dd-MM HH:mm:ss.fff} [%thread] %-5level %logger.%method [%property{NDC}] - %message%newline" />
    </layout>
  </appender>
</log4net>
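
One thing left implicit here is that log4net still has to be told to read this section at startup, e.g. via log4net.Config.XmlConfigurator.Configure() in Application_Start or the [assembly: log4net.Config.XmlConfigurator(Watch = true)] attribute. A minimal sketch of the matching Web.config registration, assuming the <log4net> block above lives in Web.config rather than a separate file:

<configuration>
  <configSections>
    <!-- standard log4net section handler; required before the <log4net> element is usable -->
    <section name="log4net"
             type="log4net.Config.Log4NetConfigurationSectionHandler, log4net" />
  </configSections>

  <!-- the <log4net> configuration shown above goes here -->

</configuration>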

I then inspect the log file when needed directly from Visual Studio's Server Explorer (double-clicking the file downloads it):

[Screenshot: Azure Website via Server Explorer]

answered Sep 18 '22 by Brendan Green


I think you should be careful when storing the log file locally in Azure, as it is not guaranteed to stick around. The VM hosting the website can be reimaged and the logs will be lost.

A better solution is to use Azure diagnostics combined with log4net (this would work the same for other logging mechanisms such as NLog). The process is summarized here:

  1. Set up local storage as a place on the role instance (virtual machine) where log files are written.

  2. Add an element to the diagnostics.wadcfg file to instruct Azure diagnostics to create and use a container in blob storage.

  3. Add an element within it to instruct Azure diagnostics to monitor the logging folder within the LogStorage local resource location.

This way the locally stored logs will be copied to the blob storage.
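
To make those steps concrete, here is a rough sketch of the two pieces of configuration involved. This assumes a classic cloud-service web role and the old diagnostics.wadcfg schema; the LogStorage resource name, the wad-log4net container name, and the quotas are illustrative, not values from the answer:

<!-- ServiceDefinition.csdef: step 1, a local storage resource the role can write logs to -->
<WebRole name="MyWebRole" vmsize="Small">
  <LocalResources>
    <LocalStorage name="LogStorage" sizeInMB="2048" cleanOnRoleRecycle="false" />
  </LocalResources>
</WebRole>

<!-- diagnostics.wadcfg: steps 2 and 3, ship that folder to a blob container -->
<DiagnosticMonitorConfiguration
    xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration"
    configurationChangePollInterval="PT1M" overallQuotaInMB="4096">
  <Directories bufferQuotaInMB="1024" scheduledTransferPeriod="PT1M">
    <DataSources>
      <DirectoryConfiguration container="wad-log4net" directoryQuotaInMB="128">
        <LocalResource name="LogStorage" relativePath="." />
      </DirectoryConfiguration>
    </DataSources>
  </Directories>
</DiagnosticMonitorConfiguration>

The log4net File value then needs to point into that local resource, which the role can resolve at startup with RoleEnvironment.GetLocalResource("LogStorage").RootPath.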

Full story here: http://justazure.com/microsoft-azure-diagnostics-part-1-introduction/

answered Sep 19 '22 by Stefan Iancu