We are developing a database tool and we'd like to write a log file in a format that is extensible and easy to import into a database table. We all feel that filtering this information with SQL is a good idea, since the log will be a long file and plain-text search may not be good enough. Could you give me some suggestions? Any experience would be useful too! Thanks in advance.
JSON (JavaScript Object Notation) is a highly readable data-interchange format that has established itself as a de facto standard for structured logging. It is compact and lightweight, and simple for both humans and machines to read and write.
A prerequisite for good logging is a standard structure for your log file that is consistent across all log files. Each log line should represent one single event and contain at least the timestamp, the hostname, the service, and the logger name.
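As a minimal sketch of that structure, here is one way to emit one JSON object per line ("JSON Lines"), carrying the fields suggested above. The function name and extra fields are illustrative, not a standard:

```python
import json
import socket
from datetime import datetime, timezone

def log_event(service, logger, level, message, **extra):
    """Serialize a single log event as one JSON line."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "hostname": socket.gethostname(),
        "service": service,
        "logger": logger,
        "level": level,
        "message": message,
    }
    # Extensible by design: extra columns are just extra keys.
    event.update(extra)
    return json.dumps(event)

line = log_event("db-tool", "import.worker", "INFO",
                 "row batch committed", rows=500)
print(line)
```

Because every line is a self-contained object, new fields can be added later without breaking existing consumers, and most databases can ingest this format directly.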
The first thing I would say is that your file format ought to be human readable. My reasons are given here: Why should I use a human readable file format.
Beyond that, it is hard to give a definitive answer to such a vague question. However, here are some of the issues you should consider:
When you can answer all these questions, you'll probably know the answer yourself. If not, make your question more specific with these questions answered and it will be easier for someone to help you.
Personally, I've always been grateful when log data has been written as CSV. It is flexible enough to expand (add extra columns, change the length of a field), is quick to read and write into a database, a spreadsheet, and hundreds of other tools, and can be coded in seconds. However, it does have a number of disadvantages: it is verbose, it is easy to get escaping wrong, it is untyped, and it is easy to break if you rearrange columns.
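To illustrate the SQL-filtering workflow the question asks about, here is a hypothetical sketch that loads a CSV log into an in-memory SQLite table and queries it. The column names and sample rows are made up for illustration:

```python
import csv
import io
import sqlite3

# Stand-in for a real log file; in practice you would open() the file.
csv_log = io.StringIO(
    "timestamp,level,message\n"
    "2024-01-01T10:00:00Z,INFO,started\n"
    "2024-01-01T10:00:05Z,ERROR,connection lost\n"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (timestamp TEXT, level TEXT, message TEXT)")

# DictReader yields one dict per row, which maps onto named placeholders.
reader = csv.DictReader(csv_log)
conn.executemany(
    "INSERT INTO log VALUES (:timestamp, :level, :message)",
    reader,
)

# Now the log can be filtered with ordinary SQL.
errors = conn.execute(
    "SELECT message FROM log WHERE level = 'ERROR'"
).fetchall()
print(errors)  # [('connection lost',)]
```

Note that SQLite stores everything here as TEXT, which reflects the "untyped" disadvantage mentioned above; timestamps and numbers need explicit casting in queries.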