Thursday, January 28, 2016

AWS Lambda S3 integration

AWS recently added a new compute service called Lambda. The Lambda compute service can process data from S3, DynamoDB, SQS, etc. without explicitly provisioning the required compute. The Lambda service can listen to S3 and process a file as it is put into an S3 bucket.

For this you need to create an S3 bucket.

Screen Shot 2016-01-28 at 7.58.48 PM

Name the bucket as per your choice.

Screen Shot 2016-01-28 at 7.59.10 PM

Log in to the AWS console.

Screen Shot 2016-01-28 at 7.55.51 PM

Choose Lambda. If you are accessing it for the first time, you will see the page below.

Screen Shot 2016-01-28 at 8.15.40 PM

Click Get Started.

If you already have Lambda functions in your account, the screen will look like the one below. Click Create a Lambda Function.

Screen Shot 2016-01-28 at 7.56.33 PM



Creating the Lambda function is a 2-step process. You can skip the blueprint for this tutorial.

Screen Shot 2016-01-28 at 7.57.22 PM

Name your Lambda function and provide a short description. Choose Java 8 as the runtime.

Screen Shot 2016-01-28 at 7.58.17 PM

Now we need the Java code in the form of a jar file. Let's create the Java file using a Maven project.

Create a Maven project using the command below:

[code language="bash"]
mvn archetype:generate -DarchetypeArtifactId=maven-archetype-quickstart -DgroupId=com.jbksoft.lambda -DartifactId=lambda-s3-demo -DinteractiveMode=false -DarchetypeVersion=1.1
[/code]

Pull in the versions below of aws-lambda-java-core, aws-lambda-java-events and aws-java-sdk-s3 as Maven dependencies. Update pom.xml with the dependencies and the maven-shade-plugin build section below.
[code language="xml"]
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-lambda-java-core</artifactId>
  <version>1.0.0</version>
</dependency>
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-lambda-java-events</artifactId>
  <version>1.0.0</version>
</dependency>
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-java-sdk-s3</artifactId>
  <version>1.10.36</version>
</dependency>

<build>
  <finalName>lambdas3demo</finalName>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
      </configuration>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
[/code]

Create a Java file S3LambdaDemo.java. This handler returns the bucket name concatenated with the file name, e.g. sourcebucket/HappyFace.jpg.

[code language="java"]
package com.jbksoft.lambda;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.event.S3EventNotification;

public class S3LambdaDemo implements RequestHandler<S3Event, String> {

    public String handleRequest(S3Event s3Event, Context context) {
        // An S3 event can carry multiple records; this demo only looks at the first.
        S3EventNotification.S3EventNotificationRecord record = s3Event.getRecords().get(0);
        String srcBucket = record.getS3().getBucket().getName();
        // S3 event keys are URL-encoded; '+' stands for a space.
        String srcKey = record.getS3().getObject().getKey().replace('+', ' ');
        return srcBucket + "/" + srcKey;
    }
}
[/code]
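The replace('+', ' ') call above only handles spaces; keys containing other special characters arrive percent-encoded in the event. A fuller decode could look like the sketch below (the class and method names are illustrative, not part of the SDK):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

public class KeyDecoder {

    // S3 event notifications URL-encode object keys: spaces arrive as '+'
    // and other special characters as %XX escapes.
    public static String decodeKey(String rawKey) {
        try {
            return URLDecoder.decode(rawKey.replace('+', ' '), "UTF-8");
        } catch (UnsupportedEncodingException e) {
            // UTF-8 is always supported, so this cannot actually happen.
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // prints "photos/Happy Face (1).jpg"
        System.out.println(decodeKey("photos/Happy+Face+%281%29.jpg"));
    }
}
```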

Run on the terminal:

[code language="bash"]
mvn clean package
[/code]

Upload the jar file (target/lambdas3demo.jar).

Screen Shot 2016-01-28 at 9.38.35 PM

Provide the handler com.jbksoft.lambda.S3LambdaDemo and choose the role lambda-execution-s3-role (refer to the execution role setup for role creation).

Screen Shot 2016-01-28 at 9.40.03 PM

Click Create.

Screen Shot 2016-01-28 at 9.47.51 PM

Click Test and choose S3 Put from the drop-down.

Screen Shot 2016-01-28 at 9.48.52 PM

Click 'Save and test'. Lambda will provision the AWS resources and invoke the function with the test file HappyFace.jpg (line 14) and the source bucket 'sourcebucket' (line 19).

Screen Shot 2016-01-28 at 10.17.32 PM
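For reference, the S3 Put sample event is a JSON document of roughly this shape (abridged; only the fields the handler reads are shown):

```json
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": {
          "name": "sourcebucket"
        },
        "object": {
          "key": "HappyFace.jpg"
        }
      }
    }
  ]
}
```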

The output is the name of the bucket concatenated with the file name.

Similarly, the handler can process the content of the file on S3 as well.
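As a sketch of that idea: the handler would fetch the object with the SDK's getObject(bucket, key) and process its stream. The stream-processing half can be kept SDK-free so it is testable locally (the class and method names below are illustrative; in Lambda the stream would come from getObjectContent()):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class S3ContentCounter {

    // In the Lambda handler this stream would come from
    // s3Client.getObject(bucket, key).getObjectContent();
    // here it is any InputStream so the logic can run locally.
    public static long countLines(InputStream in) throws IOException {
        BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
        long lines = 0;
        while (reader.readLine() != null) {
            lines++;
        }
        reader.close();
        return lines;
    }

    public static void main(String[] args) throws IOException {
        InputStream sample = new ByteArrayInputStream("line1\nline2\nline3".getBytes("UTF-8"));
        // prints 3
        System.out.println(countLines(sample));
    }
}
```

Remember that the Lambda execution role must allow s3:GetObject on the bucket for the fetch to succeed.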

2 comments:

  1. Hello,

    If we update code, again we need to upload the 6MB jar file to server?. Its pretty tedious right?. any alternatives?. Can you please help

    Thanks

  2. You can put the jar file to s3 and reuse it from s3. But if you update then you will have to upload it again.. :(
