Last year, Google released its new RPC (remote procedure call) framework, gRPC. In the world of mobile and IoT, gRPC is a huge step up from Google's internal Stubby framework.

What is gRPC?

gRPC is a language-neutral, platform-neutral framework that lets you write applications in which independent services work with each other as if they were native. If you write an API in, say, Python, Java clients can work with it as if it were a local package (with local objects).

The following description from the official gRPC docs should clarify it even further.

As in many RPC systems, gRPC is based around the idea of defining a service, specifying the methods that can be called remotely with their parameters and return types. On the server side, the server implements this interface and runs a gRPC server to handle client calls. On the client side, the client has a stub that provides exactly the same methods as the server.

Source: http://www.grpc.io/docs/

Why is it awesome?

  • gRPC is supported in tons of languages. That means you can write your service in, say, Python, and get native support in 10 languages for free:
    • C++
    • Java
    • Go
    • Python
    • Ruby
    • Node.js
    • Android Java
    • C#
    • Objective-C
    • PHP
  • gRPC is built on the shiny new HTTP/2 standard, which offers a bunch of improvements over HTTP/1.1. My favorite HTTP/2 feature is bidirectional streaming.

  • gRPC uses Protocol Buffers (protobufs) to define the service and its messages. Protobuf is Google's format for serializing structured data, built for an impatient world (meaning it's fast and efficient); see the short sketch just after this list.

  • As I mentioned, gRPC allows bi-directional streaming out of the box. No more long polling or blocking HTTP calls. This is valuable for a lot of services (any real-time service, for example).
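
To make the protobuf point concrete, here is a minimal sketch (mine, not from the original repo) of what the generated message classes buy you: typed construction and a compact binary wire format. It assumes the parser_pb2 module we generate later in this post is importable.

# Sketch only: round-trip a message through protobuf's binary encoding.
# Assumes parser_pb2 (generated later in this post) is on the path.
import parser_pb2

req = parser_pb2.TweetRequest(text="hello @world")
data = req.SerializeToString()                    # compact binary bytes, not JSON
same = parser_pb2.TweetRequest.FromString(data)   # parse it back losslessly
assert same.text == req.text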

Why it could fail.

  • gRPC is Alpha Software™. That means it comes with no guarantees. It can (and will) break, the docs are not yet comprehensive, and support may be lacking. Expect tears and blood if you use it in production.

  • It has no browser support yet. (A pure-JS implementation of protobufs is in alpha, so this point will likely be moot in a few months.)

  • No word from other browser vendors on standardization (which is why Dart didn’t catch on).

What are we making?

I thought hard about what to make: something simple enough for most people to follow, but practical enough that you can actually use it in your own projects.

I use Twitter a lot and have worked on many projects using the Twitter API. Almost every project requires parsing tweet text to extract tagged users, hashtags, URLs, etc. I always use the twitter-text-python library for this. I thought it would be great to write a server wrapping this package in Python, and then generate stubs (native client libraries) in Python and Ruby.
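
If you have not used twitter-text-python before, here is a quick standalone sketch (with a made-up tweet) of what it extracts; this is exactly what our service will hand back over gRPC.

# Sketch: twitter-text-python on its own, before we wrap it in a service.
from ttp import ttp

p = ttp.Parser()
result = p.parse("@burnettedmond check out #gRPC at http://www.grpc.io/")
print(result.users)   # e.g. ['burnettedmond']
print(result.tags)    # e.g. ['gRPC']
print(result.urls)    # e.g. ['http://www.grpc.io/']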

All code is here: https://github.com/karan/grpc-practical-tutorial

What you need

  • protoc - install
  • grpc-python - pip install grpcio
  • grpc-ruby - gem install grpc

Installation of these should be easy.

parser.proto

The proto file is where we define our service and the messages it uses. For this particular project, this is the proto file we are using.

I have commented the file, so it should be pretty straightforward.

// We're using proto3 syntax
syntax = "proto3";

package twittertext;

// This is the service for our API
service TwitterText {
  // This is where we define the methods in this service

  // We have a method called `Parse` which takes
  // parameter called `TweetRequest` and returns
  // the message `ParsedResponse`
  rpc Parse(TweetRequest) returns (ParsedResponse) {}
}

// The request message has the tweet text to be parsed
message TweetRequest {
  // The field `text` is of type `string`
  string text = 1;
}

// The response message holds the parsed entities
message ParsedResponse {
  // `repeated` is used for a list
  repeated string users = 1;
  repeated string tags = 2;
  repeated string urls = 3;
}

Full proto3 syntax guide can be found here.

Generate gRPC code

Now comes the fun part. We are going to use protoc with the gRPC plugins to generate libraries for Python and Ruby.

# Python
protoc -I protos/ --python_out=. --grpc_out=. --plugin=protoc-gen-grpc=`which grpc_python_plugin` protos/parser.proto

# Ruby
protoc -I protos/ --ruby_out=lib --grpc_out=lib --plugin=protoc-gen-grpc=`which grpc_ruby_plugin` protos/parser.proto

What has happened is that, based on the proto file we defined earlier, gRPC has generated native libraries for us.

The first command will generate parser_pb2.py. The second will generate lib/parser.rb and lib/parser_services.rb. All three files are small and easy to understand.

A Python client can now simply import parser_pb2 and start using the service as if it were a native package. The same goes for Ruby.
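
To see what "native package" means in practice, here is an illustrative sketch of poking at the generated Python module in an interpreter. The names come straight from parser.proto; the session itself is mine, not from the repo.

# Sketch: the generated classes behave like ordinary Python objects.
import parser_pb2

req = parser_pb2.TweetRequest(text="hello @world #grpc")
print(req.text)             # plain attribute access
resp = parser_pb2.ParsedResponse(users=["world"], tags=["grpc"])
print(list(resp.tags))      # repeated fields behave like lists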

Make server.py

I decided to make my server in Python, but I could have used Ruby as well.

import time

from ttp import ttp

# Bring in the generated package for our service
import parser_pb2

_ONE_DAY_IN_SECONDS = 60 * 60 * 24

# This is the parser from the third-party package,
# NOT from gRPC
p = ttp.Parser()

class Parser(parser_pb2.BetaTwitterTextServicer):
    def Parse(self, request, context):
        print 'Received message: %s' % request
        result = p.parse(request.text)
        return parser_pb2.ParsedResponse(users=result.users,
                                         tags=result.tags,
                                         urls=result.urls)

def serve():
    server = parser_pb2.beta_create_TwitterText_server(Parser())
    server.add_insecure_port('[::]:50051')
    server.start()
    try:
        while True:
            time.sleep(_ONE_DAY_IN_SECONDS)
    except KeyboardInterrupt:
        server.stop(0)

if __name__ == '__main__':
    serve()

At this point, it is helpful to have parser_pb2.py open. In the Parser class, we subclass the generated parser_pb2.BetaTwitterTextServicer interface and implement its Parse method.

In Parse, we receive the request (a TweetRequest object), parse its text using the third-party package, and respond with a parser_pb2.ParsedResponse object (whose structure is defined in the proto file).

In serve(), we create our server, bind it to a port and start it. Simple. :)

To start the server, simply run python server.py.
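
Before wiring up a real client, a quick sanity check can be reassuring: the servicer is plain Python, so you can call Parse directly with no network involved. This is an assumed smoke test, not part of the original repo (our Parse ignores the context argument, so passing None is fine).

# Assumed smoke test: exercise the servicer without starting a server.
import parser_pb2
from server import Parser

req = parser_pb2.TweetRequest(text="@burnettedmond loves #IvoWertzel")
resp = Parser().Parse(req, None)   # our Parse never touches the context
print(list(resp.users))            # e.g. ['burnettedmond']
print(list(resp.tags))             # e.g. ['IvoWertzel']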

Write the clients

client.py

from grpc.beta import implementations

import parser_pb2

_TIMEOUT_SECONDS = 10

text = ("@burnettedmond, you now support #IvoWertzel's tweet "
        "parser! https://github.com/edburnett/")

def run():
    channel = implementations.insecure_channel('localhost', 50051)
    stub = parser_pb2.beta_create_TwitterText_stub(channel)
    response = stub.Parse(parser_pb2.TweetRequest(text=text), _TIMEOUT_SECONDS)
    print 'Parser client received: %s' % response
    print 'response.users=%s' % response.users
    print 'response.tags=%s' % response.tags
    print 'response.urls=%s' % response.urls

if __name__ == '__main__':
    run()

The generated code also contains a helpful method for creating a client stub. We connect it to the same port the server is listening on and call our Parse method. Notice how we build the request object (parser_pb2.TweetRequest(text=text)) - it must match the message defined in the proto file.
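
If you would rather hide the gRPC plumbing from the rest of your code, a small wrapper like this sketch works; the parse_tweet helper and its defaults are illustrative, not part of the generated code.

# Sketch: an assumed convenience wrapper around the generated stub.
from grpc.beta import implementations

import parser_pb2

_TIMEOUT_SECONDS = 10

def parse_tweet(text, host='localhost', port=50051):
    channel = implementations.insecure_channel(host, port)
    stub = parser_pb2.beta_create_TwitterText_stub(channel)
    return stub.Parse(parser_pb2.TweetRequest(text=text), _TIMEOUT_SECONDS)

if __name__ == '__main__':
    parsed = parse_tweet("Try #gRPC at http://www.grpc.io/")
    print(list(parsed.tags))   # hashtags extracted by the server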

You can run this client using python client.py and see this output:

response.users=[u'burnettedmond']
response.tags=[u'IvoWertzel']
response.urls=[u'https://github.com/edburnett/']

client.rb

this_dir = File.expand_path(File.dirname(__FILE__))
lib_dir = File.join(this_dir, 'lib')
$LOAD_PATH.unshift(lib_dir) unless $LOAD_PATH.include?(lib_dir)

require 'grpc'
require 'parser_services'

def main
  stub = Twittertext::TwitterText::Stub.new('localhost:50051', :this_channel_is_insecure)
  response = stub.parse(Twittertext::TweetRequest.new(text: "@burnettedmond, you now support #IvoWertzel's tweet parser! https://github.com/edburnett/"))
  puts "#{response.inspect}"
  puts "response.users=#{response.users}"
  puts "response.tags=#{response.tags}"
  puts "response.urls=#{response.urls}"
end

main

Similarly, we build the client for Ruby, construct the Twittertext::TwitterText::Stub, pass in a Twittertext::TweetRequest, and receive a Twittertext::ParsedResponse back.

To run this client, use ruby client.rb. You should expect the following output:

<Twittertext::ParsedResponse: users: ["burnettedmond"], tags: ["IvoWertzel"], urls: ["https://github.com/edburnett/"]>
response.users=["burnettedmond"]
response.tags=["IvoWertzel"]
response.urls=["https://github.com/edburnett/"]

Conclusion

Again, the full code is at https://github.com/karan/grpc-practical-tutorial.

You can keep building clients the same way for 10+ languages. Write once, use everywhere (almost). We haven't even touched the sweetest parts of gRPC, especially streaming, but this guide covers them well. I am just beginning to explore gRPC myself, but so far it seems promising. I can't wait to see what you make with it.
