Friday, December 9, 2016

Install Maven on CentOS

Maven is an open-source (written in Java) build tool for Java development projects. It can automate tasks such as clean, compile, build, and deploy, and also handles dependency management.

To install Maven on CentOS, follow the steps below:

1. Download the Maven tarball

Download the tarball into the folder you want to extract it to, using the command below:

[code]

wget http://mirror.reverse.net/pub/apache/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz

[/code]

2. Extract the tarball

[code]

tar xvf apache-maven-3.3.9-bin.tar.gz

[/code]

3. Move Maven to the /usr/local directory (this step is optional).

[code]
sudo mv apache-maven-3.3.9 /usr/local/apache-maven

[/code]

4. Edit ~/.bashrc to set the environment variables

[code]

# Add below lines to .bashrc

export M2_HOME=/usr/local/apache-maven
export PATH=$M2_HOME/bin:$PATH

[/code]

5. Execute the below command to apply the environment changes

[code]

source ~/.bashrc

[/code]

6. Verify Maven is installed

[code]

$ mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T08:41:47-08:00)
Maven home: /usr/local/apache-maven
Java version: 1.8.0_72, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_72/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-327.el7.x86_64", arch: "amd64", family: "unix"

[/code]

Please let me know if you face any problems.

Happy Coding!!!!

Functional vs Object Oriented Programming



Many programming models exist today, such as Procedural Programming, Structured Programming, Event Driven Programming, Object Oriented Programming, Functional Programming, and many more. In this tutorial, we will focus on just two of these models.

Overview of Object Oriented Programming

In this model, there is a concept of an Object, which consists of a set of data and the operations on it.

An object is basically a representation of a real-world thing such as an aeroplane, car, chair, table, or a living person. An object can also represent a conceptual term such as Company, Bank, Account, or Customer.

Each object has attributes, called data, associated with it, and we can manipulate that data only through the methods/operations defined in the object. This concept is called data encapsulation.

An object can also inherit attributes and operations from another object. This is called inheritance. For example, a Horse is a 'kind of' Animal, so it can inherit the data and operations from Animal.

Many programming languages support the object-oriented model, such as Java, C++, PHP, Python, Scala, and many more.

Let's take an example in Scala, where we demonstrate a class called Account and a derived class SavingsAccount.

[code language="scala"]
package com.xxx

class Account(val accountId: String, var accountBalance: Double, val customerName: String, val accountType: String = "ACC") {

  def accId = accountId
  def accBal = accountBalance
  def custName = customerName
  def acctType = accountType

  // Updates the account balance
  def setAccBal(bal: Double): Unit = { accountBalance = bal }

  override def toString =
    accId + " " + (if (accBal < 0) "" else "+") + accBal + " " + custName
}
[/code]

[code language="scala"]

package com.xxx

class SavingsAccount(accountId: String, accountBalance: Double, customerName: String, var minimumBalance: Double, accountType: String = "SAV")
  extends Account(accountId, accountBalance, customerName, accountType) {

  def minBal = minimumBalance

  override def toString =
    accId + " " + (if (accBal < 0) "" else "+") + accBal + " " + custName + " " + minBal
}

[/code]

Overview of Functional Programming

In this model, computations are evaluated as mathematical functions. The output of a mathematical function f(x) depends only on the input data 'x' and not on any external state. For the same input 'x', the function must return the same output, no matter how many times it is executed.
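This determinism can be illustrated with a small sketch, shown here in Java for contrast (the class and function names are hypothetical, for illustration only):

```java
public class PureDemo {
    private static int counter = 0;

    // Pure: the result depends only on the input x.
    static int square(int x) {
        return x * x;
    }

    // Impure: the result also depends on external mutable state, so the
    // same input can give a different output on every call.
    static int squareWithCounter(int x) {
        counter++;
        return x * x + counter;
    }

    public static void main(String[] args) {
        System.out.println(square(4) == square(4));                       // true
        System.out.println(squareWithCounter(4) == squareWithCounter(4)); // false
    }
}
```

The pure function is referentially transparent: any call can be replaced by its value without changing the program's behavior.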

Such functions can take many forms: higher-order functions, lambda functions, pure functions, recursive functions, and more.

Most programming languages are hybrids of one or more models. Languages that support functional programming include Lisp, Scala, Clojure, and Standard ML.

Let's take some Scala examples that define mathematical functions.

Higher-order function: a function can be passed as a parameter to another function, as shown below.

[code language="scala"]

def cube(a: Int): Int = {
  a * a * a
}

// Sums f(1) + f(2) + ... + f(a)
def sum(f: Int => Int, a: Int): Int = {
  if (a <= 0) 0
  else f(a) + sum(f, a - 1)
}

// Now, execute the method as a sum of cubes: 1 + 8 + 27 = 36
sum(cube, 3)

[/code]

Recursive function

[code language="scala"]

// Recursive function
def fibonacci(a: Int): Int = {
  if (a <= 0) 0
  else if (a == 1) 1
  else fibonacci(a - 1) + fibonacci(a - 2)
}

// Now, execute the function to get a Fibonacci number
fibonacci(6)

[/code]

Difference between the two with an example

We now look at a computation problem that can be solved using both the object-oriented approach and the functional approach.

Suppose we have a list of accounts and we need to compute the bank's total deposits.

In the object-oriented approach, we loop through the list of accounts and accumulate the result, as shown below (using Scala):

[code language="scala"]

// Build the list of accounts and sum the balances with a for loop
var accList = List(new Account("Acc1", 100, "XXX1"), new SavingsAccount("Sav", 500, "XXX1", 250))

var sum: Double = 0
for (i <- 0 to accList.length - 1) {
  sum += accList(i).accBal
}

print(sum)

[/code]

In the functional programming approach, we can call the list method "foldLeft" to fold the balances into a single sum:

[code language="scala"]

print(accList.foldLeft(0.0) { (z, f) => z + f.accBal })

[/code]

Spark Development using SBT in IntelliJ

Apache Spark

Apache Spark is an open-source big data computation system. It is developed in the Scala programming language, which runs on the JVM (Java Virtual Machine). Spark's popularity is growing today due to its in-memory data storage and real-time processing capabilities. The system provides high-level APIs in Java, Scala, and Python, so we can run analytical queries against Spark using these APIs and get the desired insights. Spark can be deployed to a standalone cluster, Hadoop 2 (YARN), or Mesos.

SBT Overview

SBT stands for Simple Build Tool. A build tool helps automate tasks like compile, test, package, run, and deploy. Other build tools include Maven, Ant, Gradle, and Ivy. SBT is a build tool that focuses mainly on Scala projects.

Today, I am going to write a basic program using the Spark high-level API in Scala 2.10, with IntelliJ as the IDE for development.

Now, all set. Let's get our hands dirty with some actual coding.

Prerequisites (make sure your machine already has the components below installed):

  1. Install  Java JDK 7+.

  2. Install SBT.

  3. Unzip IntelliJ.

Working on Code

a. Creating project structure

There are different ways the project structure can be created; we could even use existing project templates to create it automatically. Today, we are going to create the project structure manually. In the commands below, we create the root directory/project name (scalaProjectDemo) and the folders src/main/scala and src/main/resources inside it:

[code language="text"]

mkdir scalaProjectDemo

cd scalaProjectDemo
mkdir project
mkdir -p src/main/scala
mkdir -p src/main/resources
touch project/build.properties
touch project/plugins.sbt
touch project/assembly.sbt
[/code]

b. Creating a build file

We will be creating the build file "build.sbt" in the root directory as shown below:

[code language="scala"]

import AssemblyKeys._

assemblySettings

name := "scalaProjectDemo"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.1.0"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.0.0-mr1-cdh4.2.0" % "provided"

resolvers ++= Seq(
"Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/",
"Akka Repository" at "http://repo.akka.io/releases/",
"Spray Repository" at "http://repo.spray.cc/")

[/code]



The project can be imported into IntelliJ at this point; please refer to the section "Importing the code into IntelliJ".

c. Creating a Scala file for testing

Next, we create a sample Scala file with just a single print statement, as shown below:

[code language="scala"]

package com.xxx

object HelloWorld {
  def main(args: Array[String]) {
    println("Hello World")
  }
}

[/code]

d. Run the code

Next, we will run the code and make sure it compiles successfully, as shown below:

[code language="java"]
$ cd scalaProjectDemo
$ sbt run
Getting org.scala-sbt sbt 0.13.6 ...

downloading http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt/0.13.6/jars/sbt.jar ...
[SUCCESSFUL ] org.scala-sbt#sbt;0.13.6!sbt.jar (1481ms)
downloading http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/main/0.13.6/jars/main.jar ...
[SUCCESSFUL ] org.scala-sbt#main;0.13.6!main.jar (3868ms)
downloading http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/compiler-interface/0.13.6/jars/compiler-interface-bin.jar ...
[SUCCESSFUL ] org.scala-sbt#compiler-interface;0.13.6!compiler-interface-bin.jar (1653ms)
......
[SUCCESSFUL ] org.scala-sbt#test-agent;0.13.6!test-agent.jar (1595ms)
downloading http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/apply-macro/0.13.6/jars/apply-macro.jar ...
[SUCCESSFUL ] org.scala-sbt#apply-macro;0.13.6!apply-macro.jar (1619ms)
:: retrieving :: org.scala-sbt#boot-app
confs: [default]
44 artifacts copied, 0 already retrieved (13750kB/320ms)
[info] Loading project definition from /home/xxx/dev/scalaProjectDemo/project
[info] Updating {file:/home/xxx/dev/scalaProjectDemo/project/}scalaprojectdemo-build...
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-14d4d23e25f354cd296c73bfff40554[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] downloading https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/com.eed3si9n/sbt-assembly/scala_2.10/sbt_0.13/0.11.2/jars/sbt-assembly.jar ...
[info] [SUCCESSFUL ] com.eed3si9n#sbt-assembly;0.11.2!sbt-assembly.jar (2136ms)
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/spark-core_2.10-1.1.0.jar ...
[info] [SUCCESSFUL ] org.slf4j#slf4j-log4j12;1.7.5!slf4j-log4j12.jar (78ms)
1.14.v20131031!jetty-webapp.jar (103ms)
............
[info] downloading https://repo1.maven.org/maven2/com/google/protobuf/protobuf-java/2.4.0a/protobuf-java-2.4.0a.jar ...
[info] [SUCCESSFUL ] com.google.protobuf#protobuf-java;2.4.0a!protobuf-java.jar (387ms)
[info] downloading https://repo1.maven.org/maven2/asm/asm/3.2/asm-3.2.jar ...
[info] [SUCCESSFUL ] asm#asm;3.2!asm.jar (195ms)
[info] Done updating.
[info] Compiling 1 Scala source to /home/pooja/dev/scalaProjectDemo/target/scala-2.10/classes...
[info] Running com.jbksoft.HelloWorld
Hello World
[success] Total time: 97 s, completed Dec 8, 2016 2:58:29 PM
[/code]

e. Importing the code into IntelliJ

Edit the file plugins.sbt

[code]

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.5.2")

[/code]

Edit the assembly.sbt

[code]

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

[/code]

Edit the build.properties

[code]sbt.version=0.13.6[/code]

Open IntelliJ.

Then from the menu choose File > Open.

The Open dialog appears; choose the project path and click OK.

Leave the defaults and click OK.

The project will take some time to load as it downloads the dependencies. Click OK.

Select New Window to open the project in a new window.

The project will be imported into IntelliJ. Expand src > main > scala.

You can now add more files, and run and debug the code in IntelliJ. The code will be compiled by the Scala compiler and then executed; the output is shown in the Run window.

Let me know if you face any issues.

Happy Coding

Tuesday, December 6, 2016

CentOS 6 Installation using pen drive

On my Dell Inspiron 5010 laptop, the Windows 7 operating system crashed and I was not able to restore it. It was also running too slow, so I decided to format the laptop and do a fresh install of CentOS 6.

Below are the steps for installation of CentOS:

1. Create a bootable pen drive using a Mac machine (using the dd command)

Prerequisite:
The pen drive should have ample space (350 MB for minimal boot media, 4.5 GB for full installation media). I did a minimal installation. The drive will be formatted before the ISO image is copied to it.

a. Download your preferred ISO image from https://www.centos.org/download/ (latest CentOS) or http://isoredirect.centos.org/centos/6/isos/x86_64/ (CentOS 6).

b. Figure out the device name of the USB stick.
First, list all the disks attached to the Mac machine:

[code language="text"]
diskutil list
[/code]

The one at the bottom is my USB drive (look at its size or name to identify the USB stick).

c. Detach the USB stick from the currently accessible filesystem.
The USB stick's data will be erased before the ISO image is written; if any of its files are in use, the dd command will keep waiting for the resource. Therefore, unmount the pen drive using the command below.

[code language="text"]
diskutil unmountDisk /dev/disk2
Unmount of all volumes on disk2 was successful
[/code]

d. Copy the ISO image to the USB stick.
Use the dd command to copy the ISO image to the USB stick. It may take some time, so don't terminate the session.

[code language="text"]
sudo dd if=CentOS-6.7-x86_64-minimal.iso of=/dev/disk2
[/code]

I copied the CentOS 6 minimal ISO image, and it took around 1 minute.

2. Make the laptop boot from the pen drive.
Go into the laptop's setup configuration (press F2), select the Boot tab, and change the first boot priority to USB Storage Device. Save the configuration and exit.

3. Installing from Pen drive to hard disk

Plug the bootable USB into the laptop and power on the machine. The screen will show the installation steps for CentOS.
a. On the "Welcome to CentOS" screen, choose "Install or upgrade an existing system". The library installation will start and take a few minutes.
b. Then choose the language (I chose English), the keyboard type (I chose U.S.), and the installation method "Hard Drive".
c. The Select Partition screen shows the partition on the disk drive holding the CentOS ISO image.
d. The "What type of devices will your installation involve" screen shows up. I chose "Basic Storage Device" and pressed Next.
e. Now the "Please name this computer. The hostname identifies the computer on a network." screen shows up. I left the hostname as localhost. The screen also has a "Configure Network" button; configure the network and press Next.
f. The next screen asks you to select the nearest city in your timezone. Select your city and press Next.
g. The next screen asks for the root password. Specify it and press Next.
h. The next screen asks "What type of installation would you like". I chose "Use All Space" (without selecting the "Encrypt System" or "Review and modify partitioning layout" checkboxes) and pressed Next.
i. A warning popup "Writing storage configuration to disk" shows up. Press the "Write changes to disk" button. This formats the hard drive.
j. The next screen shows the boot loader operating system list (you can add/delete devices). Just press Next.
k. The next screen shows the message "The default installation of CentOS is a minimum install. You can optionally select a different set of software now." I chose "Minimal" only and pressed Next.

Now the "CentOS installation starting" popup message will show up.

Lastly, "Congratulations, your CentOS installation is complete" screen shows up.

Conclusion

In this tutorial, I described the steps followed to install the CentOS 6 minimal operating system on a Dell machine.

Thursday, January 28, 2016

AWS Lambda S3 integration

AWS recently added a new compute service called Lambda. The Lambda compute service can process data from S3, DynamoDB, SQS, etc. without explicitly provisioning the required compute. A Lambda function can listen to S3 and process a file as soon as it is put into an S3 bucket.

For this, you need to create an S3 bucket.


Name the bucket as per your choice.


Log in the AWS console.


Choose Lambda. If you are accessing it for the first time, you will see a welcome page.


Click Get Started.

If you already have Lambda functions in your account, click Create a Lambda Function.




Creating the Lambda function is a 2-step process. You can skip the blueprint selection for this tutorial.


Name your Lambda function, provide a short description, and choose Java 8 as the runtime.


Now we need the Java code packaged as a jar. Let's create the Java project using Maven.

Create a Maven project using the command below:

[code language="text"]
mvn archetype:generate -DarchetypeArtifactId=maven-archetype-quickstart -DgroupId=com.jbksoft.lambda -DartifactId=lambda-s3-demo -DinteractiveMode=false -DarchetypeVersion=1.1
[/code]

Pull in the below versions of aws-lambda-java-core, aws-lambda-java-events, and aws-java-sdk-s3 as Maven dependencies. Update pom.xml with the dependencies below:
[code language="xml"]
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-lambda-java-core</artifactId>
  <version>1.0.0</version>
</dependency>
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-lambda-java-events</artifactId>
  <version>1.0.0</version>
</dependency>
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-java-sdk-s3</artifactId>
  <version>1.10.36</version>
</dependency>

<build>
  <finalName>lambdas3demo</finalName>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
      </configuration>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
[/code]

Create a Java file S3LambdaDemo.java. This handler returns the bucket name concatenated with the file name, e.g. sourcebucket/HappyFace.jpg.

[code language="java"]
package com.jbksoft.lambda;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.event.S3EventNotification;

public class S3LambdaDemo implements RequestHandler<S3Event, String> {
    public String handleRequest(S3Event s3Event, Context context) {
        S3EventNotification.S3EventNotificationRecord record = s3Event.getRecords().get(0);
        String srcBucket = record.getS3().getBucket().getName();
        String srcKey = record.getS3().getObject().getKey().replace('+', ' ');
        return srcBucket + "/" + srcKey;
    }
}
[/code]

Run on the terminal:

[code]
mvn clean package
[/code]

Upload the jar file.


Provide the handler com.jbksoft.lambda.S3LambdaDemo and choose the role lambda-execution-s3-role (refer to the execution role documentation for role creation).


Click create.


Click Test and choose S3 Put from the dropdown.


Click 'Save and test'. Lambda will provision the AWS service and trigger the function with the test file HappyFace.jpg and the source bucket 'sourcebucket' from the sample event.

The output is the bucket name concatenated with the file name.

Similarly, the handler can process the content of the file on S3 as well.
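As a sketch of that idea: once the handler has the object's stream (in a real handler you would obtain it via something like new AmazonS3Client().getObject(srcBucket, srcKey).getObjectContent() from the aws-java-sdk-s3 dependency already in the pom), processing the content reduces to draining an InputStream. The helper below is a self-contained, hypothetical illustration that stands in a ByteArrayInputStream for the S3 stream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class S3ContentDemo {
    // Drain an InputStream (such as S3Object.getObjectContent()) into a String.
    // For simplicity this decodes each buffer separately, which is fine for
    // ASCII content; multi-byte characters could split across buffers.
    static String readAll(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            sb.append(new String(buf, 0, n, StandardCharsets.US_ASCII));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the S3 object stream in this offline sketch
        InputStream fake = new ByteArrayInputStream("hello from s3".getBytes(StandardCharsets.US_ASCII));
        System.out.println(readAll(fake)); // prints "hello from s3"
    }
}
```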


Thursday, October 8, 2015

Send data between 2 JVMs using protocol buffer RPC

Protocol Buffers are a flexible, efficient, automated, interoperable solution for serialization. They make writing remote procedure calls (RPC) simpler by serializing/deserializing the request/response objects, but they do not implement any transport details themselves.
Now, protobuf-socket-rpc is a simple TCP/IP socket-based RPC implementation in Java and Python for protobuf RPC services.

In this blog, we will discuss implementing an RPC program:
1. Creating the request/response object using .proto
2. Creating the Service class using .proto
3. Implementing the Service method
4. Write the Server application
5. Write Client application

Prerequisites:
1. Java installed. If not, install Java.
2. Protocol Buffers 2.4.0 installed. If not, install it.
3. If this is not a Maven project, download protobuf-java-2.4.1.jar and protobuf-socket-rpc-2.0.jar.
If it is a Maven project, add the dependency below and download protobuf-socket-rpc-2.0.jar to your local machine in the directory ${project.basedir}/thirdparty/protobuf-socket-rpc-2.0.jar.
[code language="xml"]
<dependencies>
  <dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>2.4.1</version>
  </dependency>
  <dependency>
    <groupId>com.googlecode.protobuf.socketrpc</groupId>
    <artifactId>protobuf-socket-rpc</artifactId>
    <version>2.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/thirdparty/protobuf-socket-rpc-2.0.jar</systemPath>
  </dependency>
</dependencies>
[/code]
Note: the protobuf-java jar version should be the same as the output of protoc --version. Otherwise, the error java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses will pop up.

Now, we will be creating a simple greeting service.

Creating the request/response object using .proto.
We will create the objects used for remote communication, in a file named 'testData.proto'.

[code language="text"]
package protobufDemo; // this is the protobuf namespace, not Java's
option java_package = "com.xxx.protobufferrpc.protobufferrpcData";
option java_outer_classname = "GreetingProtos";

message HelloRequest {
  required string name = 1;
}

message HelloReply {
  required string message = 1;
}
[/code]

Afterwards, generate the classes automatically using the command below:

[code language="text"]
$ protoc --java_out=. testData.proto
[/code]

Creating the Service class using .proto
We will be creating the service class.
[code language="text"]
package protobufDemo;
import "testData.proto";
option java_package = "com.xxx.protobufferrpc.protobufferrpcData";
option java_outer_classname = "MyGreetingService";
option java_generic_services = true; // without this, protoc won't generate the stubs needed for rpc

// In the generated code, Greeting becomes an abstract class extending Service;
// our service implementation needs to extend it
service Greeting {
  rpc sayHello(HelloRequest) returns (HelloReply);
}
[/code]
Afterwards, generate the classes automatically using the command below:
[code language="text"]
$ protoc --java_out=. name_of_protoc.proto
[/code]

Implementing the Service method
Now, we will implement the rpc sayHello method defined above.
[code language="java"]
package com.xxx.protobufferrpc;

import com.google.protobuf.RpcCallback;
import com.google.protobuf.RpcController;
import com.xxx.protobufferrpc.protobufferrpcData.GreetingProtos;
import com.xxx.protobufferrpc.protobufferrpcData.MyGreetingService.Greeting;

public class MyGreetingServiceImpl extends Greeting {

    @Override
    public void sayHello(RpcController controller, GreetingProtos.HelloRequest request, RpcCallback<GreetingProtos.HelloReply> done) {
        GreetingProtos.HelloReply.Builder build = GreetingProtos.HelloReply.newBuilder();
        if (request.getName().equalsIgnoreCase("namenode")) {
            build.setMessage("This is message for namenode only");
        } else {
            build.setMessage("Please see person sending message");
        }
        done.run(build.build());
    }
}
[/code]

Write the Server application
The server application listens on port 4446.

[code language="java"]
package com.xxx.protobufferrpc;

import java.util.concurrent.Executors;
import com.googlecode.protobuf.socketrpc.RpcServer;
import com.googlecode.protobuf.socketrpc.ServerRpcConnectionFactory;
import com.googlecode.protobuf.socketrpc.SocketRpcConnectionFactories;

public class ServerCode {
    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) {
        ServerRpcConnectionFactory rpcConnectionFactory = SocketRpcConnectionFactories.createServerRpcConnectionFactory(4446);
        RpcServer server = new RpcServer(rpcConnectionFactory, Executors.newFixedThreadPool(5), true);
        server.registerService(new MyGreetingServiceImpl());
        server.run();
    }
}
[/code]

Write Client application
In the client code, we pass the IP address of the server machine and the port (4446) to call the remote procedure "sayHello".

[code language="java"]
package com.xxx.protobufferrpc;

import com.google.protobuf.RpcCallback;
import com.google.protobuf.RpcChannel;
import com.google.protobuf.RpcController;
import com.googlecode.protobuf.socketrpc.RpcChannels;
import com.googlecode.protobuf.socketrpc.RpcConnectionFactory;
import com.googlecode.protobuf.socketrpc.SocketRpcConnectionFactories;
import com.googlecode.protobuf.socketrpc.SocketRpcController;
import com.xxx.protobufferrpc.protobufferrpcData.GreetingProtos;
import com.xxx.protobufferrpc.protobufferrpcData.MyGreetingService;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ProtoClient {
    public static void main(String[] args) {
        // Create a thread pool
        ExecutorService threadPool = Executors.newFixedThreadPool(1);

        // Create the channel
        String host = args[0]; // IP address of the server machine, e.g. "192.16.42.22"
        int port = 4446;
        RpcConnectionFactory connectionFactory = SocketRpcConnectionFactories.createRpcConnectionFactory(host, port);
        RpcChannel channel = RpcChannels.newRpcChannel(connectionFactory, threadPool);

        // Call the service
        MyGreetingService.Greeting.Stub myService = MyGreetingService.Greeting.newStub(channel);
        RpcController controller = new SocketRpcController();

        GreetingProtos.HelloRequest.Builder cr = GreetingProtos.HelloRequest.newBuilder();
        cr.setName("Hello");
        myService.sayHello(controller, cr.build(),
            new RpcCallback<GreetingProtos.HelloReply>() {
                public void run(GreetingProtos.HelloReply myResponse) {
                    System.out.println("Received Response: " + myResponse);
                }
            });

        // Check success
        if (controller.failed()) {
            System.err.println(String.format("Rpc failed %s ", controller.errorText()));
        }
    }
}
[/code]

Now, run the server and client applications.

Hope you followed the discussion.

References:
Protocol Socket RPC
Google Protocol Buffer

Protocol Buffer Serializing in Java

Google Protocol Buffers is a platform-neutral, extensible tool for serializing structured data. It is interoperable and not specific to any language.

In Java, we have a number of ways to serialize an object; some are listed below:
1. Java built-in library: implement the marker interface java.io.Serializable and the read/write methods in the class. However, the serialized stream can then only be read/unmarshalled by a Java program.
2. Simple strings: you need to implement the code to convert to bytes and back yourself.
3. XML: XML DOM parsing can be memory intensive.
Protocol Buffers are a flexible, efficient, automated solution to these problems.

In this post, we will discuss:
1. Creating the .proto file (Structure Data)
2. Generating the automated classes
3. Writing Serialized Data
4. Reading Serialized Data

Prerequisites:
1. Java installed. If not, install it.
2. Protocol Buffers installed. If not, install it.
3. If this is not a Maven project, download protobuf-java-2.5.0.jar.
If it is a Maven project, add the dependency below.

[code language="xml"]
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>2.5.0</version>
</dependency>
[/code]
Note: the protobuf-java jar version should be the same as the output of protoc --version. Otherwise, the error java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses will pop up.

Creating the .proto file
Let's create a simple structured object PersonAccountsBook embedding the Person and AccountNumber messages, as shown below:

[code language="text"]
package protobufDemo; // this is the protobuf namespace, not Java's
option java_package = "com.xxx.protobuf.protobufferData";
option java_outer_classname = "PersonAccountsBookProtos";

message Person {
  required string name = 1;
  required int32 id = 2;
  optional string email = 3;

  enum AccountType {
    CHECKING = 0;
    SAVING = 1;
    VISA = 2;
  }

  message AccountNumber {
    required string number = 1;
    optional AccountType type = 2 [default = SAVING];
  }

  repeated AccountNumber accounts = 4;
}

message PersonAccountsBook {
  repeated Person person = 1;
}
[/code]

Now, the data type of a field can be a predefined type like bool, int32, float, double, or string, or a user-defined type like AccountType. The number specified with each field ("= 1", "= 2") is its unique tag. Tags 1-15 require one byte less to encode than tags higher than 15.
A field can be:
required - the field must have exactly one value, like name or id
optional - the field has 0 or 1 occurrences
repeated - the field has 0 to many occurrences

Generating the automated classes from .proto file

The command below generates the class PersonAccountsBookProtos in the package "com.xxx.protobuf.protobufferData". Note that the .proto file is at project_path/src/java/PersonAccountsBook.proto.

[code language="text"]
protoc --java_out=. PersonAccountsBook.proto
[/code]

Writing Serialized Structure Data
In this class, we need to provide the person account book file path as an argument; if the file does not exist, the program will create one. Note: a Builder class is used to set the field values, as shown below.

[code language="java"]
package com.xxx.protobuf;

import com.xxx.protobuf.protobufferData.PersonAccountsBookProtos.PersonAccountsBook;
import com.xxx.protobuf.protobufferData.PersonAccountsBookProtos.Person;

import java.io.*;

public class AddPerson {

    // This function fills in a Person message based on user input.
    static Person PromptForAddress(BufferedReader stdin, PrintStream stdout) throws IOException {
        Person.Builder person = Person.newBuilder();

        stdout.print("Enter person ID: ");
        person.setId(Integer.valueOf(stdin.readLine()));

        stdout.print("Enter name: ");
        person.setName(stdin.readLine());

        stdout.print("Enter email address (blank for none): ");
        String email = stdin.readLine();
        if (email.length() > 0) {
            person.setEmail(email);
        }

        while (true) {
            stdout.print("Enter an account number (or leave blank to finish): ");
            String number = stdin.readLine();
            if (number.length() == 0) {
                break;
            }

            Person.AccountNumber.Builder accountNumber =
                Person.AccountNumber.newBuilder().setNumber(number);

            stdout.print("Is this a saving, checking or visa account? ");
            String type = stdin.readLine();
            if (type.equals("saving")) {
                accountNumber.setType(Person.AccountType.SAVING);
            } else if (type.equals("visa")) {
                accountNumber.setType(Person.AccountType.VISA);
            } else if (type.equals("checking")) {
                accountNumber.setType(Person.AccountType.CHECKING);
            } else {
                stdout.println("Unknown account type. Using default.");
            }

            person.addAccounts(accountNumber);
        }

        return person.build();
    }

    // Main function: Reads the entire account book from a file,
    // adds one person based on user input, then writes it back out to the
    // same file.
    public static void main(String[] args) throws Exception {
        if (args.length != 1) {
            System.err.println("Usage: AddPerson ACCOUNT_BOOK_FILE");
            System.exit(-1);
        }

        PersonAccountsBook.Builder accountBook = PersonAccountsBook.newBuilder();

        // Read the existing account book.
        try {
            accountBook.mergeFrom(new FileInputStream(args[0]));
        } catch (FileNotFoundException e) {
            System.out.println(args[0] + ": File not found. Creating a new file.");
        }

        // Add a person.
        accountBook.addPerson(
            PromptForAddress(new BufferedReader(new InputStreamReader(System.in)), System.out));

        // Write the new account book back to disk.
        FileOutputStream output = new FileOutputStream(args[0]);
        accountBook.build().writeTo(output);
        output.close();
    }
}
[/code]

Reading Serialized Data
We need to provide the person account book file path as an argument to the Java application.

[code language="java"]
package com.xxx.protobuf;

import com.xxx.protobuf.protobufferData.PersonAccountsBookProtos.PersonAccountsBook;
import com.xxx.protobuf.protobufferData.PersonAccountsBookProtos.Person;

import java.io.FileInputStream;

public class ListPerson {
    // Iterates through all people in the account book and prints info about them.
    static void Print(PersonAccountsBook accountBook) {
        for (Person person : accountBook.getPersonList()) {
            System.out.println("Person ID: " + person.getId());
            System.out.println("  Name: " + person.getName());
            if (person.hasEmail()) {
                System.out.println("  E-mail address: " + person.getEmail());
            }

            for (Person.AccountNumber accountNumber : person.getAccountsList()) {
                switch (accountNumber.getType()) {
                    case VISA:
                        System.out.print("  Visa Account #: ");
                        break;
                    case SAVING:
                        System.out.print("  Saving Account #: ");
                        break;
                    case CHECKING:
                        System.out.print("  Checking Account #: ");
                        break;
                }
                System.out.println(accountNumber.getNumber());
            }
        }
    }

    // Main function: Reads the entire account book from a file and prints
    // all the information inside.
    public static void main(String[] args) throws Exception {
        if (args.length != 1) {
            System.err.println("Usage: ListPerson ACCOUNT_BOOK_FILE");
            System.exit(-1);
        }

        // Read the existing account book.
        PersonAccountsBook accountBook =
            PersonAccountsBook.parseFrom(new FileInputStream(args[0]));

        Print(accountBook);
    }
}
[/code]

Hope you followed the discussion and found it useful.

References:
Google protocol buffer tutorial
Java Serialization API