We can easily move data between different AWS services, e.g. Kinesis to DynamoDB, or AWS IoT to Redshift.
But what is the best strategy to save streaming data to, say, MongoDB (which does not have an AWS PaaS offering; Atlas exists, but it has no native integrations with other AWS services)?
I can see some third-party solutions, but what is the best strategy to implement this on AWS itself? Is executing a Lambda function for each insert (with batching) the only option?
I am assuming that you are using Kinesis Firehose. If that's the case, what you can do is:
Configure Firehose to buffer and deliver to S3 (e.g. every 5 minutes).
Firehose will create a new object in S3 on each delivery.
Trigger a Lambda function when the new object lands in S3.
Have the Lambda function read the new file and write its records to MongoDB.
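The steps above could be sketched as a Lambda handler along these lines. This is a minimal sketch, not a production implementation: the connection string, database name, and collection name are hypothetical placeholders, and it assumes Firehose is delivering newline-delimited JSON to S3.

```python
import json
import urllib.parse

# Hypothetical names throughout: connection URI, database "mydb", collection "events".
MONGO_URI = "mongodb://user:pass@example-host:27017/mydb"

def objects_from_event(event):
    """Extract (bucket, key) pairs from the S3 put event that triggered the function."""
    return [
        (
            rec["s3"]["bucket"]["name"],
            urllib.parse.unquote_plus(rec["s3"]["object"]["key"]),
        )
        for rec in event.get("Records", [])
    ]

def parse_ndjson(body):
    """Firehose concatenates records into one object; assuming newline-delimited JSON."""
    return [json.loads(line) for line in body.splitlines() if line.strip()]

def handler(event, context):
    # boto3 ships with the Lambda runtime; pymongo must be bundled
    # into the deployment package, so import both lazily here.
    import boto3
    from pymongo import MongoClient

    s3 = boto3.client("s3")
    collection = MongoClient(MONGO_URI)["mydb"]["events"]
    for bucket, key in objects_from_event(event):
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        docs = parse_ndjson(body)
        if docs:
            # One bulk insert per delivered file, rather than one call per record.
            collection.insert_many(docs)
```

Because each Firehose delivery becomes one S3 object, you get one Lambda invocation and one `insert_many` per batch, instead of a write per record.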
If you are using Kinesis Data Streams (not Firehose), you can simply write a Kinesis consumer that reads data from the stream and writes it directly to MongoDB.
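A bare-bones consumer along those lines might look like the following. It is a sketch under stated assumptions: the stream name, connection URI, and database/collection names are hypothetical, it reads only the first shard, and it assumes each record's payload is a JSON document. A production consumer would normally use the Kinesis Client Library (KCL) to handle multiple shards, checkpointing, and resharding.

```python
import json
import time

def decode_records(records):
    """Turn raw Kinesis records (the Data field is bytes) into Python dicts."""
    return [json.loads(rec["Data"]) for rec in records]

def run_consumer(stream_name="my-stream"):
    # Imported here so the decoding helper above has no AWS dependencies.
    import boto3
    from pymongo import MongoClient

    kinesis = boto3.client("kinesis")
    collection = MongoClient("mongodb://example-host:27017")["mydb"]["events"]

    # Simplification: read only the first shard of the stream.
    shard_id = kinesis.describe_stream(StreamName=stream_name)[
        "StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="LATEST",
    )["ShardIterator"]

    while True:
        resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
        docs = decode_records(resp["Records"])
        if docs:
            collection.insert_many(docs)
        iterator = resp["NextShardIterator"]
        time.sleep(1)  # stay under the per-shard read limits
```

Alternatively, a Kinesis stream can trigger Lambda directly with batches of records, which removes the need to run this polling loop yourself.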
FYI, there is also DocumentDB, which has a MongoDB-compatible API; you can use it as an AWS-hosted MongoDB.