AWS recently added a new compute service called Lambda. Lambda can process data from S3, DynamoDB, SQS, etc. without explicitly provisioning the required compute. In this tutorial the Lambda function listens to an S3 bucket and processes each file as it is put into the bucket.
For this you first need to create an S3 bucket; name the bucket as per your choice.
Log in to the AWS console.
Choose Lambda. If you are accessing it for the first time you will see the page below.
Click Get Started.
If you already have Lambda functions in your account, the screen will look like the one below. Click Create a Lambda function.
Creating the Lambda function is a two-step process. You can skip the blueprint for this tutorial.
Name your Lambda function, provide a short description and choose Java 8 as the runtime.
Now we need the Java code packaged as a deployment jar. Let's create the Java handler in a Maven project.
Create a Maven project using the command below:
[code language="xml"]
mvn archetype:generate -DarchetypeArtifactId=maven-archetype-quickstart -DgroupId=com.jbksoft.lambda -DartifactId=lambda-s3-demo -DinteractiveMode=false -DarchetypeVersion=1.1
[/code]
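This generates the standard quickstart layout, roughly as shown below; the handler class created later in this tutorial goes under src/main/java/com/jbksoft/lambda/ (App.java and AppTest.java are archetype placeholders and can be removed).
[code language="text"]
lambda-s3-demo/
├── pom.xml
└── src/
    ├── main/java/com/jbksoft/lambda/App.java
    └── test/java/com/jbksoft/lambda/AppTest.java
[/code]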
Add the aws-lambda-java-core, aws-lambda-java-events and aws-java-sdk-s3 dependencies below to pom.xml. The maven-shade-plugin builds a self-contained (fat) jar that bundles these dependencies so Lambda can run it.
[code language="xml"]
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.10.36</version>
</dependency>

<build>
    <finalName>lambdas3demo</finalName>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <configuration>
                <createDependencyReducedPom>false</createDependencyReducedPom>
            </configuration>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
[/code]
Create a Java file S3LambdaDemo.java. This handler returns the bucket name concatenated with the object key, e.g. sourcebucket/HappyFace.jpg.
[code language="java"]
package com.jbksoft.lambda;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.event.S3EventNotification;
public class S3LambdaDemo implements RequestHandler&lt;S3Event, String&gt; {
public String handleRequest(S3Event s3Event, Context context) {
S3EventNotification.S3EventNotificationRecord record = s3Event.getRecords().get(0);
String srcBucket = record.getS3().getBucket().getName();
String srcKey = record.getS3().getObject().getKey().replace('+', ' ');
return srcBucket+"/"+srcKey;
}
}
[/code]
Run on the terminal:
mvn clean package
Upload the generated jar (target/lambdas3demo.jar; the name comes from the finalName set in pom.xml).
Provide the handler com.jbksoft.lambda.S3LambdaDemo and choose the role lambda-execution-s3-role (refer to the execution role setup for role creation).
Click Create.
Click Test and choose 'S3 Put' from the sample event template drop-down.
Click 'Save and test'. Lambda will provision the required resources and invoke the function with the sample event, which references the test file HappyFace.jpg and the source bucket 'sourcebucket'.
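For reference, the fields the handler reads from that sample event look roughly like this (abridged; the console template contains additional metadata such as region, event time and request parameters):
[code language="javascript"]
{
  "Records": [
    {
      "s3": {
        "bucket": { "name": "sourcebucket" },
        "object": { "key": "HappyFace.jpg" }
      }
    }
  ]
}
[/code]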
The output is the bucket name concatenated with the object key: sourcebucket/HappyFace.jpg.
Similarly, the handler can also read and process the content of the file on S3, not just its name; a sketch follows below.
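As an example, here is a minimal sketch of a handler variant that reads the object's content and counts its lines. The class name S3ContentLambdaDemo is made up for illustration, and it assumes the execution role also allows s3:GetObject on the bucket.
[code language="java"]
package com.jbksoft.lambda;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.event.S3EventNotification;
import com.amazonaws.services.s3.model.S3Object;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class S3ContentLambdaDemo implements RequestHandler<S3Event, String> {

    public String handleRequest(S3Event s3Event, Context context) {
        // Same bucket/key extraction as the handler above.
        S3EventNotification.S3EventNotificationRecord record = s3Event.getRecords().get(0);
        String srcBucket = record.getS3().getBucket().getName();
        String srcKey = record.getS3().getObject().getKey().replace('+', ' ');

        // The client picks up the Lambda execution role's credentials.
        AmazonS3 s3 = new AmazonS3Client();
        S3Object object = s3.getObject(srcBucket, srcKey);

        // Count the object's lines as a stand-in for real processing.
        int lines = 0;
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(object.getObjectContent()))) {
            while (reader.readLine() != null) {
                lines++;
            }
        } catch (IOException e) {
            throw new RuntimeException("Failed to read " + srcBucket + "/" + srcKey, e);
        }
        return srcBucket + "/" + srcKey + " has " + lines + " lines";
    }
}
[/code]
If you use this variant instead, change the handler field on the console accordingly (e.g. com.jbksoft.lambda.S3ContentLambdaDemo).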
Comment: Hello, if we update the code, do we need to upload the 6 MB jar file to the server again? It's pretty tedious, right? Any alternatives? Can you please help? Thanks.
Reply: You can put the jar file on S3 and have Lambda use it from there. But if you update the code, you will still have to upload it again.. :(