Google Cloud Ops Agent

I am having an issue where Google Cloud Ops Agent logging gathers a lot of data and fills up my entire Debian server hard drive in about 3 weeks, due to the ever-increasing size of the log file.

I do not want to increase the size of my server hard drive.

Does anyone know how to configure Google Cloud Ops Agent so that it only retains log data for the previous 7 days?

EDIT: The Google Cloud Ops Agent log file is stored in the directory below:

/var/log/google-cloud-ops-agent/subagents/logging-module.log
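(For reference: I couldn't find a retention setting for this particular log in the agent's own configuration, so one common workaround is a logrotate rule. A sketch is below; the log path is the one above, while the file name, 7-day rotation, and size cap are my own assumptions, not agent defaults.)

```
# /etc/logrotate.d/ops-agent-logging (hypothetical file name)
/var/log/google-cloud-ops-agent/subagents/logging-module.log {
    # rotate daily and keep 7 rotations, i.e. roughly 7 days of logs
    daily
    rotate 7
    # rotate early if the file grows past 100M (the size cap is an assumption)
    maxsize 100M
    compress
    missingok
    notifempty
    # truncate in place so the logging subagent keeps its open file handle
    copytruncate
}
```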
asked Oct 22 '25 by mister_cool_beans


1 Answer

I faced the same issue recently with agent version 2.11.0. It's not just an enormous log file; there's also ridiculous CPU usage (check it out in htop). If you open the log file you'll see it spamming errors about buffer chunks. Apparently the chunks got corrupted, so the agent can't read them and ship them off, hence the high I/O and CPU usage.

The solution is to stop the service:

sudo service google-cloud-ops-agent stop

Then clear all buffer chunks:

sudo rm -rf /var/lib/google-cloud-ops-agent/fluent-bit/buffers/

Then delete the log file if you want:

sudo rm -f /var/log/google-cloud-ops-agent/subagents/logging-module.log

Then start the agent:

sudo service google-cloud-ops-agent start

This helped me out.

By the way, this issue is described here, and it seems that Google "fixed" it as of 2.7.0-1. Whatever they mean by that, since we still faced it...
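Note that the steps above clear the file once but don't stop it from growing again, so they don't fully answer the original 7-day retention question. As a blunt stopgap, a small cleanup helper run daily from cron could work. This is only a sketch: `prune_logs` is my own hypothetical helper, not part of the agent, and the example path is the one from the question.

```shell
#!/bin/sh
# Hypothetical helper (not part of the Ops Agent): delete *.log files
# older than a given number of days under a directory.
prune_logs() {
    dir="$1"
    days="${2:-7}"
    # find -mtime +N matches files last modified more than N 24-hour periods ago
    find "$dir" -type f -name '*.log*' -mtime "+$days" -print -delete
}

# Example (path assumed from the question; run as root, e.g. daily from cron):
# prune_logs /var/log/google-cloud-ops-agent/subagents 7
```

Because the active log file is written constantly, this mostly removes old rotated copies; combine it with rotation if you want the live file capped too.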

answered Oct 25 '25 by insanie


