 

How to log a complex synchronization process?

I'm developing a complicated distributed service that performs an iterative synchronization process. Every 10 seconds it synchronizes business entities across different information systems. One iteration consists of a bunch of third-party service calls to retrieve the current state of business objects (count of customers, goods, details of a certain customer or product, etc.), queries to the local DB, and then computing the differences between the two and smoothing out, i.e. synchronizing, those differences.

There are different types of iterations: fast ones (only changes in the set of objects) and slow ones (a full review of the data). Fast iterations run every 10 seconds and slow ones once a day.

So, how can I log these processes using NLog? I'm using SQLite for storing data, but I'm stuck on the DB design for the logs.

So I want to log the flow of every iteration:

1. Request the current state of objects from the third-party service
2. Query the local database for the current state of objects
3. Get the list of differences
4. Invoke the external service to commit the missing data
5. Update the local database with the missing data

But there are so many kinds of info to log that I can't just put it all into one TEXT field.

At the moment I'm using such structure for logs:

CREATE TABLE [Log] (
  [id] INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, 
  [ts] TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP, 
  [iteration_id] varchar, 
  [request_response_pair] varchar, 
  [type] VARCHAR NOT NULL, 
  [level] TEXT NOT NULL, 
  [server_id] VARCHAR, 
  [server_alias] VARCHAR, 
  [description] TEXT, 
  [error] TEXT);

So every service request and response goes into description, and request_response_pair is a key that links each response to its request.
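With that schema, one way to reconstruct the flow of a single iteration in chronological order would be a query like this (the iteration_id value here is hypothetical):

```sql
-- Reconstruct the full flow of one iteration, oldest entry first;
-- 'abc123' stands in for a real iteration_id value.
SELECT ts, type, level, server_alias, description, error
FROM Log
WHERE iteration_id = 'abc123'
ORDER BY ts, id;
```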

Here is my NLog config:

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" internalLogFile="D:\nlog.txt" internalLogLevel="Trace">
  <targets>
    <target name="Database" xsi:type="Database" keepConnection="false"
            useTransactions="false"
            dbProvider="System.Data.SQLite.SQLiteConnection, System.Data.SQLite, Version=1.0.82.0, Culture=neutral, PublicKeyToken=db937bc2d44ff139"
            connectionString="Data Source=${basedir}\SyncLog.db;Version=3;"
            commandText="INSERT into Log(iteration_id, request_response_pair, type, level, server_id, server_alias, description, error) values(@Iteration_id, @Request_response_pair, @Type, @Loglevel, @server_id, @server_alias, @Description, @Error)">
      <parameter name="@Type" layout="${message}"/>
      <parameter name="@Loglevel" layout="${level:uppercase=true}"/>
      <parameter name="@Request_response_pair" layout="${event-context:item=request_response_pair}"/>
      <parameter name="@Iteration_id" layout="${event-context:item=iteration_id}"/>
      <parameter name="@server_id" layout="${event-context:item=server_id}"/>
      <parameter name="@server_alias" layout="${event-context:item=server_alias}"/>
      <parameter name="@Description" layout="${event-context:item=description}"/>
      <parameter name="@Error" layout="${event-context:item=error}"/>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="Database" />
  </rules>
</nlog>

Here is how I log:

namespace NLog
{
    public static class LoggerExtensions
    {
        public static void InfoEx(this Logger l, string message, Dictionary<string, object> contextParams)
        {
            LogEventInfo eventInfo = new LogEventInfo(LogLevel.Info, "", message);
            foreach (KeyValuePair<string, object> kvp in contextParams)
            {
                eventInfo.Properties.Add(kvp.Key, kvp.Value);
            }

            l.Log(eventInfo);
        }

        public static void InfoEx(this Logger l, string message, string server_id, string server_alias, Dictionary<string, object> contextParams = null)
        {
            Dictionary<string, object> p = new Dictionary<string, object>();
            p.Add("server_id", server_id);
            p.Add("server_alias", server_alias);
            if (contextParams != null)
            {
                foreach (KeyValuePair<string, object> kvp in contextParams)
                {
                    p.Add(kvp.Key, kvp.Value);
                }
            }

            l.InfoEx(message, p);
        }
    }
}
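A call site for these extensions might look like the following sketch; the variable names (iterationId, pairId, requestBody) are illustrative, not from the question:

```csharp
// Hypothetical call site: log one step of an iteration with its context.
Logger logger = LogManager.GetCurrentClassLogger();
logger.InfoEx("request", "srv-01", "main-crm", new Dictionary<string, object>
{
    { "iteration_id", iterationId },        // e.g. a GUID generated per iteration
    { "request_response_pair", pairId },    // links this request to its response
    { "description", requestBody }
});
```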

I know about logging levels, but I need all of these verbose logs, so I log them as Info. I can't find any tutorial on how to log such complicated, structured processes, only plain, dumb log messages.

kseen asked May 30 '13 09:05

1 Answer

I am assuming you are talking about "logs" in the typical sense of "log things so we have something to look at if we need to inspect our workflow (errors/performance)". I assume that you do NOT mean logs as in "we need accounting information, and the log is part of our domain data and is included in the workflow."

And from what I gathered from your posting, you are worried about the backend storage format of the logs, so that you can later process them and use them for said diagnostics?


Then I'd recommend that you make the logging code independent of the domain specifics.

Question: How will the logs you create be processed? Do you really need to access them all over the place, so that you need the database to provide you a structured view? Does it matter at all how fast you can filter your logs? Or will they end up in one big log-analyzer application anyway, one that is run only every second week when something bad has happened?

In my opinion, the biggest reasons you want to avoid any domain specifics in the log are that "logs should work if things go wrong" and "logs should work after things changed".

Logs should work if things go wrong

If you have columns in your log table for domain-specific values like "Request_response_pair", and there is no pair, then writing the log entry itself might fail (e.g. if it's an indexed field). Of course, you can make sure to have no non-null columns and no restrictions in your DB design, but take a step back and ask: why do you want that structure in your log database anyway? Logs are meant to work as reliably as possible, so any kind of template you press them into may restrict the use cases or may prevent you from logging crucial information.

Logs should work after things changed

Especially if you need logs for detecting and fixing bugs or for improving performance, you will regularly compare logs from "before the change" to logs from "after the change". If you need to change the structure of your log database because you changed your domain data, this is going to hurt you when you need to compare the logs.

True, if you make a data structure change, you will probably still need to update some tools like the log analyzer, but there is usually a large part of the logging/analyzing code that is completely agnostic to the actual structure of the domain.


Many systems (including complex ones) can live with "just log one simple string" and later write tools to take the string apart again, if they need to filter or process the logs.

Other systems write logs as simple string key/value pairs. The log function itself is not domain-specific but just accepts a string dictionary and writes it out (or, even simpler, a params string[] that should have an even number of elements, where every second element is used as a key, if you aren't scared by that proposition :-D).
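A minimal sketch of such a domain-agnostic helper, using the params-array variant described above (names are illustrative, not from the answer):

```csharp
// Domain-agnostic logging helper: accepts alternating key/value pairs
// and stores them as event properties without knowing what they mean.
public static class FlatLoggerExtensions
{
    public static void LogPairs(this Logger l, LogLevel level, string message,
                                params string[] keyValues)
    {
        if (keyValues.Length % 2 != 0)
            throw new ArgumentException("Expected an even number of key/value parameters.");

        var eventInfo = new LogEventInfo(level, l.Name, message);
        for (int i = 0; i < keyValues.Length; i += 2)
            eventInfo.Properties[keyValues[i]] = keyValues[i + 1];

        l.Log(eventInfo);
    }
}
```

Usage would then look like `logger.LogPairs(LogLevel.Info, "request", "iteration_id", id, "server_id", srv);`, and the NLog target maps each property to a column exactly as in the question's config.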

Of course, you will probably end up writing another tooling layer on top of the base log functions, one that knows about the domain-specific data structures, composes the string dictionary, and passes it on. You certainly don't want to copy the composing code all over the place. But make the base functions available everywhere you might want to log something. It's really helpful if you run into "strange" situations (an exception handler, say) where some of the structured information is missing.
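That tooling layer could be as simple as a wrapper that knows one domain object and flattens it into the generic dictionary before delegating to the base function. Here SyncStep is a hypothetical domain type, and InfoEx is the generic extension from the question:

```csharp
// Hypothetical domain-aware wrapper: composes the flat property
// dictionary once, then delegates to the generic InfoEx overload.
public static class SyncLoggerExtensions
{
    public static void LogSyncStep(this Logger l, SyncStep step)
    {
        l.InfoEx(step.Name, new Dictionary<string, object>
        {
            { "iteration_id", step.IterationId },
            { "server_id", step.ServerId },
            { "server_alias", step.ServerAlias },
            { "description", step.Description }
        });
    }
}
```

If the SyncStep type later changes, only this wrapper changes; the base logging function, the NLog config, and the log table stay stable, which is exactly the decoupling the answer argues for.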

Imi answered Nov 07 '22 14:11