Cloudflare R2: get usage metrics through the API

I can't find how to get usage metrics for my R2 bucket through Cloudflare's API.

I'd like to get the bucket size and the number of class A and class B operations.

Zoltar1999 asked Oct 24 '25 14:10

1 Answer

You can use their https://api.cloudflare.com/client/v4/graphql endpoint for both of those queries, which is what their frontend does anyway. I'm writing this in Node.js, but the same requests can be made with cURL if needed.

I mainly followed their docs.

Here is a JavaScript file which does both requests. If you want to filter to a certain bucket, add the bucketName prop to the filter (see the snippet after the listing). However, keep in mind that operation requests for billing purposes are account-wide, not per bucket.

// main.js
const date = new Date()
date.setDate(1) // set to first of the month

const res = await fetch('https://api.cloudflare.com/client/v4/graphql', {
  method: 'POST',
  headers: {
    // these are the Global API Key auth headers; if CLOUDFLARE_API_TOKEN holds a
    // scoped API token instead, use an `Authorization: Bearer ${token}` header here
    'X-AUTH-EMAIL': process.env.EMAIL,
    'X-AUTH-KEY': process.env.CLOUDFLARE_API_TOKEN,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    query: `{
      viewer {
        accounts(filter: { accountTag: "${process.env.CLOUDFLARE_ACCOUNT}" }) {
          r2OperationsAdaptiveGroups(
            filter: { datetime_geq: "${date.toISOString()}" }
            limit: 9999
          ) {
            dimensions {
              actionType
            }
            sum {
              requests
            }
          }
        }
      }
    }`
  })
})

const body = await res.json()
const classA = ["ListBuckets", "PutBucket", "ListObjects", "PutObject", "CopyObject", "CompleteMultipartUpload", "CreateMultipartUpload", "ListMultipartUploads", "UploadPart", "UploadPartCopy", "ListParts", "PutBucketEncryption", "PutBucketCors", "PutBucketLifecycleConfiguration"]
const classB = ["HeadBucket", "HeadObject", "GetObject", "UsageSummary", "GetBucketEncryption", "GetBucketLocation", "GetBucketCors", "GetBucketLifecycleConfiguration"]
let [classATotal, classBTotal] = [0, 0]
body.data.viewer.accounts[0].r2OperationsAdaptiveGroups.forEach(item => {
  if (classA.includes(item.dimensions.actionType)) {
    classATotal += item.sum.requests
  } else if (classB.includes(item.dimensions.actionType)) {
    classBTotal += item.sum.requests
  }
})

// percentage of the R2 free tier used (1M class A and 10M class B operations per month)
const aUsage = Math.round((classATotal / 1_000_000) * 100)
const bUsage = Math.round((classBTotal / 10_000_000) * 100)

const storage = await fetch('https://api.cloudflare.com/client/v4/graphql', {
  method: 'POST',
  headers: {
    'X-AUTH-EMAIL': process.env.EMAIL,
    'X-AUTH-KEY': process.env.CLOUDFLARE_API_TOKEN,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    query: `{
      viewer {
        accounts(filter: { accountTag: "${process.env.CLOUDFLARE_ACCOUNT}" }) {
          r2StorageAdaptiveGroups(
            limit: 9999
            filter: { datetime_geq: "${date.toISOString()}" }
          ) {
            max {
              payloadSize
            }
          }
        }
      }
    }`
  })
})


const bodyStorage = await storage.json()
const bytes = bodyStorage.data.viewer.accounts[0].r2StorageAdaptiveGroups[0].max.payloadSize
const megabytes = Math.round(bytes / 1_000_000)
// percentage of the R2 free tier used (10 GB of storage)
const bytesUsage = Math.round((bytes / 10_000_000_000) * 100)

console.log(`R2 classA=${aUsage}% (${classATotal} req) classB=${bUsage}% (${classBTotal} req) storage=${bytesUsage}% (${megabytes} MB)`)

if (aUsage > 50 || bUsage > 50 || bytesUsage > 50) {
  console.log("over 50% usage")
}
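As mentioned above, the operations query can be scoped to a single bucket by adding bucketName to the same filter object. A sketch of that filter, assuming a bucket called "my-bucket" (remember the billed operation counts are still account-wide):

// inside the query template literal above, e.g.:
r2OperationsAdaptiveGroups(
  filter: { datetime_geq: "${date.toISOString()}", bucketName: "my-bucket" }
  limit: 9999
) { ... }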

This assumes you have a .env or .dev.vars file with these three values filled out:

CLOUDFLARE_ACCOUNT=
CLOUDFLARE_API_TOKEN=
EMAIL=

Generate an API token in the Cloudflare dashboard (My Profile → API Tokens), or use your Global API Key.

Your account ID can be found on the right-hand side of your account's overview page in the Cloudflare dashboard.
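Alternatively, the account ID can be pulled from the REST API with a token that has account read access. A minimal sketch, not part of the original script:

// list-accounts.js — print the IDs of the accounts visible to the token
const res = await fetch('https://api.cloudflare.com/client/v4/accounts', {
  headers: { Authorization: `Bearer ${process.env.CLOUDFLARE_API_TOKEN}` },
})
const { result } = await res.json()
result.forEach(account => console.log(account.id, account.name))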

Then, assuming you are running Node v20.6+ (which can read .env files), it is just node --env-file .env main.js. Since the script uses top-level await, the file must be an ES module (rename it to main.mjs or set "type": "module" in package.json).

Worker email

I wanted to automate this in a Cloudflare Worker. I found that there is good integration with MailChannels, so emails can be sent from an authenticated domain. There are a lot of steps to doing that and it veers off topic, so I'll just link it here; this is the intro paragraph:

This Worker sends weekly emails with a percent report on all class A, class B, and storage size metrics for R2. Uses CF Worker and GitHub Actions for automation.
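For orientation, here is a minimal sketch of what such a Worker's scheduled handler could look like. The cron trigger, the buildUsageReport helper (wrapping the two GraphQL queries above), and the REPORT_TO / REPORT_FROM variables are illustrative assumptions, not the linked project's actual code:

// worker.js — runs on a cron trigger configured in wrangler.toml, e.g. crons = ["0 9 * * MON"]
export default {
  async scheduled(event, env, ctx) {
    // hypothetical helper that runs the two GraphQL queries above and formats the percentages
    const report = await buildUsageReport(env)

    // MailChannels send endpoint commonly used from Workers; payload kept minimal
    await fetch('https://api.mailchannels.net/tx/v1/send', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        personalizations: [{ to: [{ email: env.REPORT_TO }] }],
        from: { email: env.REPORT_FROM },
        subject: 'Weekly R2 usage report',
        content: [{ type: 'text/plain', value: report }],
      }),
    })
  },
}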

Coda Bool answered Oct 28 '25 06:10