I am using the Serilog RollingFile sink, but it stores all of a day's data in a single file. My application writes about 1 GB of logs per day, so I want to roll the log file by both date and size.
How can I configure the RollingFile sink to roll files based on date and size?
The Serilog.Sinks.RollingFile package is now deprecated in favor of Serilog.Sinks.File (see the intro of the project's GitHub README). Serilog.Sinks.File has been upgraded to support file rolling. You can use the following Serilog configuration to enable rolling by both time and size:
"Serilog": {
  "Using": ["Serilog.Sinks.File"],
  "MinimumLevel": "Debug",
  "WriteTo": [
    {
      "Name": "File",
      "Args": {
        "path": "logs/log.txt",
        "rollingInterval": "Day",
        "rollOnFileSizeLimit": true,
        "fileSizeLimitBytes": "512",
        "retainedFileCountLimit": 3,
        "formatter": "Serilog.Formatting.Json.JsonFormatter, Serilog"
      }
    }
  ]
}
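If you configure the logger in code rather than in appsettings.json, the equivalent setup with Serilog.Sinks.File looks roughly like this (a sketch; the 512-byte limit only mirrors the JSON above for demonstration, and in practice you would pick a much larger value):

```csharp
using Serilog;
using Serilog.Formatting.Json;

var log = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .WriteTo.File(
        new JsonFormatter(),                  // same formatter as the JSON config
        "logs/log.txt",
        rollingInterval: RollingInterval.Day, // start a new file each day
        rollOnFileSizeLimit: true,            // also roll when the size limit is hit
        fileSizeLimitBytes: 512,              // tiny limit, for demonstration only
        retainedFileCountLimit: 3)            // keep at most the 3 newest files
    .CreateLogger();

log.Information("Hello, rolling file!");
```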
You will then get files named with the date plus a sequence number, e.g. log20240101.txt, log20240101_001.txt, log20240101_002.txt, and so on.
From the documentation:
To avoid bringing down apps with runaway disk usage the rolling file sink limits file size to 1GB by default. The limit can be changed or removed using the fileSizeLimitBytes parameter.
.WriteTo.RollingFile("log-{Date}.txt", fileSizeLimitBytes: null)
The example shows removing the limit by setting it to null. In your case, set it to an appropriate size.
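With the deprecated RollingFile sink, setting an explicit limit instead of null looks roughly like this (the 100 MB figure is illustrative, not from the original answer; note that this sink caps the file rather than rolling to a new one when the limit is reached):

```csharp
using Serilog;

// Roll daily via the {Date} placeholder, and cap each day's file
// at roughly 100 MB (an assumed example value).
var log = new LoggerConfiguration()
    .WriteTo.RollingFile("log-{Date}.txt", fileSizeLimitBytes: 100_000_000)
    .CreateLogger();
```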
UPDATE
Yes, based on your comment I looked at the source code, and it looks like the RollingFileSink's lowest unit of measure is a day, so having more than one rollover on the same day does not seem to be supported. However (I didn't look closely), the OpenFile method in RollingFileSink.cs does something with sequence numbers. You might want to take a peek and see what that code is doing.
I believe you're looking for this alternative implementation of the RollingFile sink:
Serilog Rolling File Sink (alternative)
This is a rolling file sink that allows you to specify roll-over behaviour based on file size: https://github.com/BedeGaming/sinks-rollingfile