This article demonstrates how to use gRPC in a microservices architecture. The project shows off gRPC's features by applying them to a simple microservices application, giving you an overview of the technology and how to use it in a real-world project.

Getting Started

gRPC is an open source remote procedure call (RPC) system initially developed at Google. It uses HTTP/2 for transport, Protocol Buffers as the interface description language, and provides features such as authentication, bidirectional streaming and flow control, blocking or nonblocking bindings, and cancellation and timeouts. It generates cross-platform client and server bindings for many languages.

As in the previous article, you're going to use Docker to build the microservices. You should have a basic understanding of Docker and its common commands before getting started.

The following sequence diagram shows the final operation of the project:

The aims of this project are to set up a simple microservices application and to use gRPC for communication between its services.

Setup container communication

The first step in setting up the project environment is creating a Docker network, which allows the containers to communicate with each other. A bridge network is the easiest way to create your own Docker network. Open a terminal and type the following command:

$ docker network create -d bridge qNet

Setup Users module

In this section, you're going to create a Docker container for user data management. First, you need to generate the protobuf files by compiling the users.proto file. The content of users.proto is shown below.

syntax = "proto3";

message UserRequest {
  int32 userId = 1;
}

message UserResponse {
  string name = 1;
  string imageUrl = 2;
}

service Users {
  rpc RequestUser (UserRequest) returns (UserResponse) {}
}

Compiling users.proto with grpcio-tools produces two files: users_pb2.py and users_pb2_grpc.py.

The compile command is:

$ python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. users.proto
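
Before writing the server, you can optionally sanity-check the generated module. The short sketch below (the file name quick_check.py is just an illustration, not part of the project) builds a UserRequest, serializes it to the protobuf wire format, and parses it back:

# quick_check.py -- illustrative sanity check of the generated users_pb2 module
import users_pb2

request = users_pb2.UserRequest(userId = 1)             # build a request message
payload = request.SerializeToString()                   # encode to the protobuf wire format
decoded = users_pb2.UserRequest.FromString(payload)     # decode it back
print(decoded.userId)                                   # prints: 1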

Then you create the server using the generated gRPC Python files.

The server.py file:

from concurrent import futures
import time

import grpc

import users_pb2
import users_pb2_grpc
import json

_ONE_DAY_IN_SECONDS = 60 * 60 * 24

data = json.load(open('users.json'))

class UsersService(users_pb2_grpc.UsersServicer):
    def RequestUser(self, request, context):
        filterResult = list(filter(lambda user: user['id'] == request.userId, data))
        userDic = filterResult[0]
        return users_pb2.UserResponse(name = userDic['name'], imageUrl = userDic['imgUrl'])

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    users_pb2_grpc.add_UsersServicer_to_server(UsersService(), server)
    server.add_insecure_port('[::]:22222')
    server.start()
    try:
        while True:
            time.sleep(_ONE_DAY_IN_SECONDS)
    except KeyboardInterrupt:
        server.stop(0)

if __name__ == '__main__':
    serve()

In this project, the service file is very simple: it just reads data from a JSON file and passes it into a gRPC message.
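
The server expects a users.json file next to it. Its exact contents are up to you; the only requirement implied by the code above is a JSON array of objects with id, name, and imgUrl fields. A minimal sketch that writes such a file with made-up entries:

# make_users_json.py -- writes a sample users.json; the entries are illustrative
import json

sample_users = [
    {"id": 1, "name": "Alice", "imgUrl": "http://example.com/alice.jpg"},
    {"id": 2, "name": "Bob", "imgUrl": "http://example.com/bob.jpg"},
]

with open('users.json', 'w') as f:
    json.dump(sample_users, f, indent=2)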

Next, you need to create a Dockerfile to build the Docker container for the users module. Its content is:

FROM grpc/python:1.4
ADD . /code
WORKDIR /code
EXPOSE 22222
CMD ["python", "server.py"]

Run the following command to build the Docker image:

$ docker build -t users-module ./users

Then run the Docker container using this command:

$ docker run -d -it --name users-module --net qNet -v $(pwd)/users:/code users-module

Setup comments module

This module handles comment data management. As with the users module, you create a comments.proto file for it. Its content is:

syntax = "proto3";

message CommentsRequest {
  int32 contentId = 1;
}

message CommentsResponse {
  message Comment {
    string content = 1;
    string name = 2;
    string imageUrl = 3;
  }
  repeated Comment comments = 1;
}

service Comments {
  rpc RequestComments (CommentsRequest) returns (CommentsResponse) {}
}

Then, you generate comments_pb2.py and comments_pb2_grpc.py using this command:

$ python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. comments.proto
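
The CommentsResponse message nests a Comment message inside a repeated field. In the generated Python code, repeated message fields are populated through add(), which is the pattern the server below relies on. A minimal, self-contained illustration (the values are made up):

# repeated_field_demo.py -- illustrative only
import comments_pb2

response = comments_pb2.CommentsResponse()
comment = response.comments.add()         # append a new Comment to the repeated field
comment.content = "Nice place"
comment.name = "Alice"
comment.imageUrl = "http://example.com/alice.jpg"
print(len(response.comments))             # prints: 1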

This module needs to fetch data from the users module, so copy users_pb2.py and users_pb2_grpc.py from the previous part into the /comments directory.

Next, create the service file. The content of server.py is:

from concurrent import futures
import time

import grpc

import comments_pb2
import comments_pb2_grpc
import users_pb2
import users_pb2_grpc

import json

_ONE_DAY_IN_SECONDS = 60 * 60 * 24

data = json.load(open('comments.json'))

class CommentsService(comments_pb2_grpc.CommentsServicer):
    def RequestComments(self, request, context):
        contentId = request.contentId
        comments = list(filter(lambda comment: comment['contentId'] == contentId, data))
        response = comments_pb2.CommentsResponse()
        channel = grpc.insecure_channel('users-module:22222')
        stub = users_pb2_grpc.UsersStub(channel)
        for item in comments:
            comment = response.comments.add()
            comment.content = item['comment']
            responseUser = stub.RequestUser(users_pb2.UserRequest(userId = item['userId']))
            comment.name = responseUser.name
            comment.imageUrl = responseUser.imageUrl
        return response

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    comments_pb2_grpc.add_CommentsServicer_to_server(CommentsService(), server)
    server.add_insecure_port('[::]:22222')
    server.start()
    try:
        while True:
            time.sleep(_ONE_DAY_IN_SECONDS)
    except KeyboardInterrupt:
        server.stop(0)

if __name__ == '__main__':
    serve()

The comments service reads its data from the comments.json file and requests user information from the users module via gRPC.
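
If you want to exercise this service by hand before wiring up the locations module, a minimal test client along the following lines would do; it has to run inside a container attached to the qNet network (so the comments-module hostname resolves) and assumes the users module is already running:

# test_comments.py -- an illustrative client for the Comments service
import grpc
import comments_pb2
import comments_pb2_grpc

channel = grpc.insecure_channel('comments-module:22222')
stub = comments_pb2_grpc.CommentsStub(channel)
response = stub.RequestComments(comments_pb2.CommentsRequest(contentId = 1))
for comment in response.comments:
    print(comment.name, comment.content)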

Next, create a Dockerfile to build the Docker container for the comments module. Its content is:

FROM grpc/python:1.4
ADD . /code
WORKDIR /code
EXPOSE 22222
CMD ["python", "server.py"]

Run the following command to build the Docker image:

$ docker build -t comments-module ./comments

Then run the Docker container using this command:

$ docker run -d -it --name comments-module --net qNet -v $(pwd)/comments:/code comments-module

Setup Locations module

The locations module handles location data management. First, you need to define a proto file for this part. The content of locations.proto is:

syntax = "proto3";

import "google/api/annotations.proto";

message LocationsRequest {}

message Location {
  int32 id = 1;
  string title = 2;
  string subtitle = 3;
  float lat = 4;
  float long = 5;
}

message LocationsResponse {
  repeated Location locations = 1;
}

message LocationDetailRequest {
  int32 locationId = 1;
}

message LocationDetailResponse {
  message Comment {
    string content = 1;
    string name = 2;
    string imageUrl = 3;
  }
  Location location = 1;
  repeated Comment comment = 2;
}

service Locations {
  rpc RequestLocations (LocationsRequest) returns (LocationsResponse) {
    option (google.api.http).get = "/locations";
  }
  rpc RequestLocationDetail (LocationDetailRequest) returns (LocationDetailResponse) {
    option (google.api.http).get = "/locations/{locationId}";
  }
}

The difference in this proto file is the import "google/api/annotations.proto"; statement and the (google.api.http) options on the RPCs, which map each RPC to an HTTP method and path. You can find more useful information about them here. These options are used later in this article to generate the reverse-proxy code and the Swagger definition.

Then, you generate locations_pb2.py and locations_pb2_grpc.py using this command (note that google/api/annotations.proto and its dependencies must be available on protoc's include path, for example from a checkout of the googleapis protos):

$ python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. locations.proto

Copy comments_pb2.py and comments_pb2_grpc.py into the /locations directory so this module can communicate with the comments module.

Next, you need to create the service file. The content of server.py is:

from concurrent import futures
import time

import grpc

import locations_pb2
import locations_pb2_grpc
import comments_pb2
import comments_pb2_grpc

import json

_ONE_DAY_IN_SECONDS = 60 * 60 * 24
data = json.load(open('locations.json'))

class LocationsService(locations_pb2_grpc.LocationsServicer):
    def RequestLocations(self, request, context):
        response = locations_pb2.LocationsResponse()
        for item in data:
            location = response.locations.add()
            location.id = item['id']
            location.title = item['title']
            location.subtitle = item['subtitle']
            location.lat = item['lat']
            location.long = item['long']
        return response

    def RequestLocationDetail(self, request, context):
        contentId = request.locationId
        filterResult = list(filter(lambda location: location['id'] == contentId, data))
        locationDic = filterResult[0]
        channel = grpc.insecure_channel('comments-module:22222')
        stub = comments_pb2_grpc.CommentsStub(channel)
        response = stub.RequestComments(comments_pb2.CommentsRequest(contentId = contentId))
        detailResponse = locations_pb2.LocationDetailResponse(
            location = locations_pb2.Location(
                id = locationDic['id'],
                title = locationDic['title'],
                subtitle = locationDic['subtitle'],
                lat = locationDic['lat'],
                long = locationDic['long']))
        for item in response.comments:
            comment = detailResponse.comment.add()
            comment.content = item.content
            comment.name = item.name
            comment.imageUrl = item.imageUrl
        return detailResponse

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    locations_pb2_grpc.add_LocationsServicer_to_server(LocationsService(), server)
    server.add_insecure_port('[::]:22222')
    server.start()
    try:
        while True:
            time.sleep(_ONE_DAY_IN_SECONDS)
    except KeyboardInterrupt:
        server.stop(0)

if __name__ == '__main__':
    serve()

The locations service reads its data from the locations.json file and requests comment information from the comments module via gRPC.
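
As in the other modules, the data lives in a flat JSON file. The code above only requires that locations.json is a JSON array of objects with id, title, subtitle, lat, and long fields; the sketch below writes one such entry (the values mirror the sample output shown later in this article):

# make_locations_json.py -- writes a sample locations.json with one illustrative entry
import json

sample_locations = [
    {
        "id": 1,
        "title": "Test location 1",
        "subtitle": "Fun fun fun fun",
        "lat": 21.029274,
        "long": 105.849419,
    },
]

with open('locations.json', 'w') as f:
    json.dump(sample_locations, f, indent=2)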

Next, create a Dockerfile to build the Docker container for the locations module. Its content is:

FROM grpc/python:1.4
ADD . /code
WORKDIR /code
RUN pip install --upgrade gcloud
RUN pip install --upgrade google-api-python-client
EXPOSE 22222
CMD ["python", "server.py"]

Run the following command to build the Docker image:

$ docker build -t locations-module ./locations

Then run the Docker container using this command:

$ docker run -d -it --name locations-module --net qNet -v $(pwd)/locations:/code locations-module

Now the backend side of the application is finished. You will create the client side to test it in the next section.

Setup the client side

The role of the client is to send a request and display the result. First, copy locations_pb2.py and locations_pb2_grpc.py into the /locations-clients directory. Then create the client file; the content of client.py is:

from __future__ import print_function

import grpc
import locations_pb2
import locations_pb2_grpc
from pprint import pprint

def run():
    channel = grpc.insecure_channel('locations-module:22222')
    stub = locations_pb2_grpc.LocationsStub(channel)
    response = stub.RequestLocationDetail(locations_pb2.LocationDetailRequest(locationId = 1))
    pprint(response)

if __name__ == '__main__':
    run()

Now create the Dockerfile to build the Docker image. Its content is:

FROM grpc/python:1.4
ADD . /code
WORKDIR /code
RUN pip install --upgrade gcloud
RUN pip install --upgrade google-api-python-client

Run the following command to build the Docker image:

$ docker build -t location-client-module ./locations-clients

Then run the Docker container using this command:

$ docker run -d -it --name location-client-module --net qNet -v $(pwd)/locations-clients:/code location-client-module

The final step is testing. Run this command to get a command prompt inside the location-client-module container:

$ docker exec -it location-client-module bash

Run client.py inside the container; you should see the following output:

# python client.py
location {
  id: 1
  title: "Test location 1"
  subtitle: "Fun fun fun fun"
  lat: 21.029273986816406
  long: 105.84941864013672
}
comment {
  content: "This is bad"
  name: "Spider man"
  imageUrl: "http://www.memes.at/faces/spiderpman.jpg"
}
comment {
  content: "%&%&&((**))"
  name: "Troller"
  imageUrl: "http://www.memes.at/faces/troll_face.jpg"
}

At this point, you have completed the whole microservices application. The next section discusses a more advanced topic: generating a reverse proxy and a Swagger definition.

Set up API gateway (Experimental)

grpc-gateway is a plugin for protoc. It reads a gRPC service definition and generates a reverse-proxy server that translates a RESTful JSON API into gRPC. The server is generated according to the custom options in your gRPC definition.

You can find more information about it here.

The reverse-proxy container code is in the /grpc-gateway directory. You can build it by running this command:

$ docker build -t locations-gateway ./grpc-gateway

Run the container with this command:

$ docker run --name locations-gateway --net qNet -p 8080:80 locations-gateway --backend=locations-module:22222

Test the proxy container by running this command to get the JSON data for the location with id 3:

$ curl http://localhost:8080/locations/3
{"location":{"id":3,"title":"Test location 3","subtitle":"Fun fun fun fun","lat":21.026318,"long":105.85042},"comment":[{"content":"This is spam","name":"Quan","imageUrl":"http://www.memes.at/faces/wow.jpg"}]}

You can also convert the Swagger definition file into client code for other programming languages such as Swift, Objective-C, and more; this is the grpc-gateway feature I find most interesting. You can find the Swagger definition file at /grpc-gateway/src/gen/pb-go/locations.swagger.json and use Swagger Editor to do the conversion.

You can check out my iOS Swift client for this project.

Conclusions

In this article, a microservices application has been built using gRPC for inter-service communication. Along the way, you have reviewed the basic features of gRPC through a real example.

The final code for this project can be found here.