CloudWatch Logs stream to Lambda (Python)

I have created a subscription filter on a CloudWatch log group and made it stream to my Lambda function, but I am getting an error in the Lambda function.

Code:

import boto3
import binascii
import json
import base64
import zlib

def stream_gzip_decompress(stream):
    dec = zlib.decompressobj(32 + zlib.MAX_WBITS)  # offset 32 to skip the header
    foo=''
    for chunk in stream:
        rv = dec.decompress(chunk)
        if rv:
            foo += rv
    return foo

def lambda_handler(event, context):
    # Decode and decompress the AWS Log stream to extract json object
    stream=json.dumps(event['awslogs']['data'])
    f = base64.b64decode(stream)
    payload=json.loads(stream_gzip_decompress(f.decode(f)))
    print(payload)

Error response:

{
  "errorMessage": "decode() argument 1 must be str, not bytes",
  "errorType": "TypeError",
  "stackTrace": [
    [
      "/var/task/lambda_function.py",
      34,
      "lambda_handler",
      "payload=json.loads(stream_gzip_decompress(f.decode(f)))"
    ]
  ]
}

Any help or clue would be greatly appreciated! If you have an alternative solution, please suggest it. My requirement is to process logs from CloudWatch using Lambda.

Thanks in advance!

Asked May 11 '18 by Abdul Salam




2 Answers

In case anyone else is looking for help with this topic:

I took a slightly different approach, but I did see an 'awslogs' key in the event.

Here is a sample that worked for me on a Python 3.6 Lambda, with a CloudWatch Logs trigger set up to invoke it:

import gzip
import json
import base64


def lambda_handler(event, context):
    print(f'Logging Event: {event}')
    print(f"Awslog: {event['awslogs']}")
    cw_data = event['awslogs']['data']
    print(f'data: {cw_data}')
    print(f'type: {type(cw_data)}')
    compressed_payload = base64.b64decode(cw_data)
    uncompressed_payload = gzip.decompress(compressed_payload)
    payload = json.loads(uncompressed_payload)

    log_events = payload['logEvents']
    for log_event in log_events:
        print(f'LogEvent: {log_event}')
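
For reference, the JSON produced by gzip.decompress above is the standard CloudWatch Logs subscription payload: alongside the logEvents list it carries metadata such as messageType, owner, logGroup, logStream, and subscriptionFilters, and each log event has an id, a timestamp (epoch milliseconds), and a message. A small sketch of pulling those fields out (the helper name is just illustrative):

def print_log_messages(payload):
    # Metadata added by the subscription: log group and log stream names
    print(f"Log group: {payload['logGroup']}, stream: {payload['logStream']}")
    for log_event in payload['logEvents']:
        # Each event carries an id, a timestamp in epoch milliseconds, and the raw message
        print(f"{log_event['timestamp']}: {log_event['message']}")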
Answered Oct 18 '22 by P. Ryan


Below is the outline I normally follow when processing CloudWatch Logs being sent to AWS Lambda.

import gzip
import json
from StringIO import StringIO  # Python 2 module

def lambda_handler(event, context):
    cw_data = str(event['awslogs']['data'])
    # Base64-decode, then gunzip the subscription payload (Python 2 idioms)
    cw_logs = gzip.GzipFile(fileobj=StringIO(cw_data.decode('base64', 'strict'))).read()
    payload = json.loads(cw_logs)
    for log_event in payload['logEvents']:
        # Process each log event here
        print(log_event)

I see that you are treating the data sent to AWS Lambda as a JSON object. You first want to base64-decode and then unzip the data; after decoding and decompressing, you should have the JSON object with the log information.
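
Note that the StringIO / 'base64' codec approach above only works on Python 2. On a Python 3 runtime (which the error in the question suggests), the same base64-decode-then-gunzip sequence can be written with base64.b64decode and gzip.decompress, much like the first answer. A minimal sketch:

import base64
import gzip
import json

def lambda_handler(event, context):
    # Base64-decode, gunzip, then parse the subscription payload (Python 3)
    compressed = base64.b64decode(event['awslogs']['data'])
    payload = json.loads(gzip.decompress(compressed))
    for log_event in payload['logEvents']:
        # Process each log event here
        print(log_event['message'])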

Answered Oct 18 '22 by quasar