I am implementing a Java 8 package for an AWS Lambda function that is triggered from S3, reads some file data, and puts it into another S3 bucket.
This is my current Handler:
public class Handler implements RequestHandler<S3Event, Void> {

    private static final String TARGET_BUCKET = "some-bucket";

    private AmazonS3Client s3Client = new AmazonS3Client(new DefaultAWSCredentialsProviderChain());
    private Runner runner = new Runner(s3Client, TARGET_BUCKET);

    @Override
    public Void handleRequest(S3Event s3Event, Context context) {
        runner.run(s3Event, context);
        return null;
    }
}
I have moved the business logic to my Runner class so that I can properly test it (following the AWS Lambda best-practices white paper). However, I am struggling to see how I can pass a fake S3Event to test my run function.
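For context, my Runner looks roughly like this (a simplified sketch; the real business logic is more involved, and the copyObject call here just stands in for it):

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.event.S3EventNotification.S3EventNotificationRecord;

public class Runner {

    private final AmazonS3 s3Client;
    private final String targetBucket;

    public Runner(AmazonS3 s3Client, String targetBucket) {
        this.s3Client = s3Client;
        this.targetBucket = targetBucket;
    }

    public void run(S3Event s3Event, Context context) {
        // For each record in the event, read the source location and write to the target bucket
        for (S3EventNotificationRecord record : s3Event.getRecords()) {
            String sourceBucket = record.getS3().getBucket().getName();
            String key = record.getS3().getObject().getKey();
            s3Client.copyObject(sourceBucket, key, targetBucket, key);
        }
    }
}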
My test currently is:
@Test
public void putsDataIntoS3() throws IOException {
    runner.run(new ObjectMapper().readValue(loadJsonFromFile("s3-event.json"), S3Event.class), context);
    assertTrue(true);
}
Where loadJsonFromFile gets the resource with the filename I am passing, converts it into an input stream, and then into a String.
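It is essentially this helper (a slightly simplified sketch; the exact stream handling isn't the issue here):

private String loadJsonFromFile(String filename) throws IOException {
    // Read the classpath resource into a String
    try (InputStream is = getClass().getClassLoader().getResourceAsStream(filename);
         ByteArrayOutputStream out = new ByteArrayOutputStream()) {
        byte[] buffer = new byte[4096];
        int read;
        while ((read = is.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        return out.toString(StandardCharsets.UTF_8.name());
    }
}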
However, that results in an error:
com.fasterxml.jackson.databind.JsonMappingException: No suitable constructor found for type [simple type, class com.amazonaws.services.lambda.runtime.events.S3Event]: can not instantiate from JSON object (missing default constructor or creator, or perhaps need to add/enable type information?)
So my question is: how can I properly test my run function by passing fake S3Event JSON?
These are the AWS-related dependencies I am using:

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>2.2.2</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.11.455</version>
</dependency>
I've also seen the S3Event.parseJson function, and I can use it like this:
@Test
public void putsFilteredDataIntoS3() throws IOException {
    runner.run(new S3Event(S3Event.parseJson(loadJsonFromFile("s3-event.json")).getRecords()), context);
    assertTrue(true);
}
But is doing all this a best practice?
This is how a typical Lambda function is written and tested when using the AWS Toolkit for Eclipse. For more advanced use cases, you can start from the S3 Event and DynamoDB Event sample projects provided with AWS Lambda.
You can use the Mockito library and mock the S3Event.
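For example, something like this (a rough sketch; what you need to stub depends on which fields your run method actually reads from the event and its records):

@Test
public void putsDataIntoS3() {
    AmazonS3Client s3Client = mock(AmazonS3Client.class);
    Runner runner = new Runner(s3Client, "some-bucket");

    // Stub only the parts of the event that run() reads
    S3EventNotificationRecord record = mock(S3EventNotificationRecord.class);
    S3Event s3Event = mock(S3Event.class);
    when(s3Event.getRecords()).thenReturn(Collections.singletonList(record));

    runner.run(s3Event, mock(Context.class));

    // verify(s3Client).putObject(...); or other assertions against the mocked client
}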
Another option is to create the S3Event from JSON:
S3EventNotification notification = S3EventNotification.parseJson(loadJsonFromFile("s3-event.json"));
S3Event event = new S3Event(notification.getRecords());
EDIT: A third option is to update your aws-lambda-java-events dependency to version 2.2.4, where a default constructor was added for S3Event, so you will be able to deserialize it like this:

objectMapper.readValue(loadJsonFromFile("s3-event.json"), S3Event.class)
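With that constructor available, the test from the question can then be written roughly as follows (a sketch, reusing the loadJsonFromFile helper):

@Test
public void putsDataIntoS3() throws IOException {
    // With aws-lambda-java-events >= 2.2.4, Jackson can instantiate S3Event directly
    S3Event s3Event = new ObjectMapper().readValue(loadJsonFromFile("s3-event.json"), S3Event.class);

    runner.run(s3Event, context);
}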