Please let me know your suggestions on how to design logging, and how to test it, in the following scenario.
I have an API that can be called by multiple threads. A single call to this API by one thread generates about 50 KB of log output.
Does any design pattern exist for logging in a multi-threaded environment, i.e. one log file shared by all threads vs. one dedicated log file per thread?
Also, how should this feature be tested (and should it be tested at all)?
Thanks.
If you're talking about logging of transactional activity, where all of the data in the 50 KB log relates to a particular transaction performed by the thread, there may be a case for a log per thread; otherwise, disentangling the interleaved output can become a significant problem. A second solution for this case is a single log file per transaction, plus a 'global' log that merely records on one line that a transaction was initiated, and perhaps another entry with the final result, if applicable. A third solution is to tag each log entry so that you can determine which transaction each line belongs to, and then build post-processing tools that filter the log to view particular transactions.
The second solution (a file per transaction) can become a problem if transactions are very frequent, as some file system operations (notably, listing a folder over the network) slow down as the number of files in a directory grows. The third solution can work well (you can add features to the filtering/viewing tool over time), but it does mean developing and maintaining another tool.
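The third solution (tagging entries) can be sketched roughly as follows. This is a minimal illustration, not a library API: `TaggedLogger` and `TXN_ID` are hypothetical names, and it assumes each thread handles one transaction, so a `ThreadLocal` can carry the transaction id that prefixes every line. With this prefix, a shared log file can later be split per transaction with a simple filter such as `grep`.

```java
import java.util.concurrent.atomic.AtomicLong;

// Illustrative sketch: tag every log line with a transaction id so a
// single shared file can be filtered per transaction afterwards.
public class TaggedLogger {
    private static final AtomicLong NEXT_ID = new AtomicLong(1);

    // Assumption: one transaction per thread, so the id can live in a
    // ThreadLocal. Each thread gets a fresh id on first use.
    private static final ThreadLocal<Long> TXN_ID =
            ThreadLocal.withInitial(NEXT_ID::getAndIncrement);

    // Format one line; a real implementation would append this to the
    // shared log file rather than just returning the string.
    public static String format(String message) {
        return String.format("[txn-%d] [%s] %s",
                TXN_ID.get(), Thread.currentThread().getName(), message);
    }
}
```

A filtering tool then only needs to match on the `[txn-N]` prefix, which keeps the post-processing side trivial.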
Wrap your logging instance in a thread-safe singleton, but don't use double-checked locking! It also probably makes sense to use an existing logging library such as log4net or Enterprise Library 5.
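One way to get a thread-safe singleton without double-checked locking is the initialization-on-demand holder idiom, where the language runtime guarantees that the nested holder class is initialized safely exactly once on first use. This is only a sketch under that assumption; `LogService` is an illustrative name, not part of any logging library, and a real logger would write to a file rather than standard output.

```java
import java.io.PrintStream;

// Thread-safe singleton logger using the initialization-on-demand
// holder idiom; no explicit locking is needed for construction.
public class LogService {
    private final PrintStream out;

    private LogService() {
        this.out = System.out; // in practice, a stream onto the log file
    }

    private static class Holder {
        // Initialized once, safely, when getInstance() is first called.
        static final LogService INSTANCE = new LogService();
    }

    public static LogService getInstance() {
        return Holder.INSTANCE;
    }

    // synchronized so lines from concurrent threads do not interleave
    public synchronized void log(String message) {
        out.println(message);
    }
}
```

All threads then share one instance via `LogService.getInstance().log(...)`, and the `synchronized` write keeps each line atomic in the shared file.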