Go API Design With Protocol Buffers and gRPC | by Yair Fernando | May, 2022

A step-by-step guide based on a social media app

When building APIs with gRPC, the API design is usually expressed in protocol buffers to define API endpoints, request messages, and response messages. This is typically done in proto files that can later be used to generate the Go code that makes the actual RPC calls, and also to generate a reverse proxy that translates gRPC into JSON APIs, and so on.

In this article, we’ll focus on the following:

  • building API designs using protocol buffers
  • creating endpoints
  • defining request messages
  • defining response messages
  • using enumerations
  • handling timestamps
  • returning empty responses

I won’t go into detail on what gRPC is or why you should use protocol buffers instead of JSON, but if you’re interested in learning more, here’s another article that explains it in more depth.

For this article, the API design will be based on a social media app.

Before jumping in, it’s important to mention that I’ll be using Go modules. So, to follow along with this series, you’ll need Go modules working correctly on your machine, since we’ll be testing the API in a Go application later.

First, let’s navigate to the $GOPATH with this command:

cd $GOPATH
Then create and navigate to the following path:

cd src/github.com/YourGithubHandle/sma

Creating this path now will be helpful later because we’ll eventually push this code to GitHub.

Once we’re in the sma folder, which stands for social media app, let’s initialize a Go module.

go mod init github.com/YourGithubHandle/sma

This command will create the go.mod file for us.

Then we’ll create the posts.proto file under a protos folder.

So, let’s create that folder and put the posts.proto file inside it.

protos/
  posts.proto
go.mod
go.sum

We’ll first define the create endpoint. This endpoint will receive a request message with some parameters, and it will return a response message notifying us that the post object was created.

Let’s see the definition below:
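The embedded code gist did not survive in this copy of the article. Based on the description that follows, the file might look roughly like this; field names beyond those mentioned in the text are illustrative, and the original gist was longer, so the line numbers cited in the prose refer to it rather than to this abbreviated sketch:

```proto
syntax = "proto3";

package sma;

option go_package = "github.com/YourGithubHandle/sma";

import "google/protobuf/timestamp.proto";

service Posts {
  // Create a post and return the created Post object.
  rpc CreatePost(CreatePostReq) returns (Post) {}
}

message CreatePostReq {
  string title = 1;
  string description = 2;
  int64 user_id = 3;
  // "repeated" marks this field as an array; it defaults to an empty list.
  repeated int64 media_ids = 4;
}

message Post {
  int64 id = 1;
  string title = 2;
  string description = 3;
  int64 user_id = 4;
  Status status = 5;
  google.protobuf.Timestamp created_at = 6;
  google.protobuf.Timestamp updated_at = 7;
}

enum Status {
  // The zero value is the default and must come first.
  DRAFT = 0;
  PUBLISHED = 1;
  SCHEDULED = 2;
}
```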

The first line of this proto file defines which proto syntax we’re using, in this case proto3. If this line is not specified, the protobuf compiler will assume proto2 syntax is being used. If you want to explore their differences, here is the proto3 and proto2 documentation.

On line 3, we have the package definition. This helps you avoid name clashes if you need to define the same message under a different package name.

Then we also have the option go_package. In Go’s case, this will be used as the package name. For this example it’s the same as the original package name, but if you need it to be different, you can use this option.

Then we have an import. This import allows us to properly serialize the created_at and updated_at timestamps. It is simply another proto file, timestamp.proto, that protobuf uses to handle timestamps. Just as we’re importing this external proto file, we’ll be using other external proto files for different purposes.

We can also use definitions from other proto files we own if needed, including ones that don’t come from Google. We’ll be covering that later.

On line 9, we have the service definition for Posts. By defining a service in a proto file, we can generate Go code when the protocol buffer compiler processes this file. It will generate server and client interfaces with all the code needed to make RPC calls.

Inside this service we define the endpoints we want to support. In this case, we have the create post endpoint, which takes the CreatePostReq and returns the created Post object.

Let’s focus on the CreatePostReq message first.

This message defines the parameters this endpoint will accept in the request.

The structure is: first the field’s type, then the field’s name, and then the field’s number.

The field’s type specifies the data type of the param: a string, integer, boolean, float, double, bytes, and so on.

The field’s name is just the name of the param itself.

The field’s number is the unique number used to identify each field in the binary format, and it should not change once your message type is already in use. One important thing to note here is that you should try to reserve the numbers from 1 to 15 for the most important and frequent fields in your message.

This is because field numbers from 1 to 15 take one byte to encode, while numbers from 16 to 2047 take two bytes. So, it’s perfectly fine to have 10 fields in a message that go from 1 to 10, and then two other fields with field numbers greater than 15, because you want to reserve the five remaining numbers for future fields.

The last thing to mention about the CreatePostReq message is that the final field has an extra word at the beginning of its definition. This is because this last param is an array, and the way to represent that in protocol buffers is with the repeated keyword at the start of the field definition. The default value of a repeated field is an empty list or array, depending on the programming language the message is translated to.

Let’s now check the definition of the Post message. The first four fields are self-explanatory, so let’s go to the status field. Here we’re using an enum (defined on line 50) as the field’s type. Using an enum ensures this field only accepts one of the options in the enum. If no value is passed, the enum value with the field number 0 will be the default.

A couple of things about enums: an enum has to have a zero value, since the zero value is considered the default. Also, for compatibility with proto2 syntax, the zero value must be the first element in the enum.

Finally, we have the created and updated fields in the Post message, and here the field’s type is google.protobuf.Timestamp, which is in turn another protobuf message with two fields: seconds and nanos.
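For reference, the message that google/protobuf/timestamp.proto provides is defined (in simplified form) as:

```proto
// From google/protobuf/timestamp.proto (comments abbreviated):
message Timestamp {
  // Seconds of UTC time since the Unix epoch.
  int64 seconds = 1;
  // Non-negative fractions of a second at nanosecond resolution.
  int32 nanos = 2;
}
```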

We’ll be adding more endpoints to this file, but for now, let’s go ahead and generate the Go code for this proto file.

First, let’s install protobuf with this command:

brew install protobuf

We’ll also need to install protoc-gen-go and protoc-gen-go-grpc, as shown below:

go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest

Then make sure you update your PATH:

export PATH="$PATH:$(go env GOPATH)/bin"

After that, make sure module mode is enabled:

export GO111MODULE=on

If you type protoc, you should see its usage output.

Now, let’s navigate to the protos folder and run the following command to generate the Go code. Before doing this, make sure you create the src/go/sma folder structure.

protoc --go_out=src/go/sma --go-grpc_out=src/go/sma posts.proto

This will generate the Go code with the Posts service and endpoint definitions.

protos/
  posts.proto
src/
  go/
    sma/
      posts.pb.go
      posts_grpc.pb.go
go.mod
go.sum

Great! We’ve generated the code for the Posts service containing the create endpoint.

To continue building the Posts service, we need to add the UPDATE, SHOW, LIST, and DELETE endpoints.

Here’s the definition for the update endpoint:

UpdatePost RPC endpoint
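The gist for this endpoint is also missing here. Based on the description below, a sketch might look like this; the nested message and field names are inferred from the prose, and the original file’s line numbering differs from this fragment:

```proto
import "google/protobuf/field_mask.proto";

// Added inside the Posts service:
rpc UpdatePost(UpdatePostReq) returns (Post) {}

message UpdatePostReq {
  // The attributes the endpoint permits to be updated.
  message Post {
    string title = 1;
    string description = 2;
    repeated int64 media_ids = 3;
    google.protobuf.Timestamp scheduled_at = 4;
  }

  google.protobuf.FieldMask update_mask = 1;
  int64 post_id = 2;
  Post post = 3;
}
```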

As usual, we define the RPC endpoint with a request message and a response message; that’s nothing new at this point. Let’s examine the UpdatePostReq message definition.

The first field of this message is update_mask, and its type is google.protobuf.FieldMask.

Field masks are used for two things: either to explicitly define the set of fields that should be returned by a get operation, or the fields that will be updated in the case of an update operation. It filters out the fields sent in a patch request and only allows the ones specified in the mask.

For a get operation, it takes the response and returns only the fields present in the mask. The remaining fields default to the default value of the data type they hold. For endpoints that return a collection of objects, the field mask applies to each object in the collection. In this case, the mask object is the last field in this message.

Later, we’ll see how this field is used when we write the annotations for this proto file. One thing to note here is that the REST verb for this endpoint will be PATCH, since we’re using a field mask; PUT is only used for full updates.

On line 8, we have the import of the proto file needed to use FieldMask.

UpdatePost holds the permitted attributes to be updated. This means the endpoint will allow the title, description, media_ids, and schedule_at attributes to be updated.

PostId is the id of the post to be updated.

All right, let’s move on to the show endpoint, shown below:

ShowPost rpc endpoint
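The missing gist for this endpoint can be reconstructed from the prose as a short fragment (the int64 type for post_id is an assumption):

```proto
// Added inside the Posts service:
rpc ShowPost(PostIdReq) returns (Post) {}

message PostIdReq {
  int64 post_id = 1;
}
```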

This is a fairly simple one. In the PostIdReq message, we have just one field, the post_id of the post we want to show. As a response, we return the Post message we declared before.

Here’s the code for the ListPosts endpoint definition:

ListPosts RPC endpoint
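Again, the gist is missing; a sketch consistent with the description below might be (the Filters values and the pagination message’s fields are illustrative):

```proto
// Added inside the Posts service:
rpc ListPosts(ListPostsReq) returns (ListPostsResp) {}

message ListPostsReq {
  int64 user_id = 1;
  // Optional search query to filter the collection.
  string query = 2;
  int32 page = 3;
  int32 per_page = 4;
  Filters filter = 5;
}

// Mirrors the posts' statuses; the zero value acts as "no filter".
enum Filters {
  ALL = 0;
  DRAFT = 1;
  PUBLISHED = 2;
  SCHEDULED = 3;
}

message ListPostsResp {
  repeated Post posts = 1;
  Pagination pagination = 2;
}

message Pagination {
  int32 page = 1;
  int32 per_page = 2;
  int32 total_entries = 3;
}
```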

For this endpoint, we’ll return the collection of posts associated with a given user, so the user id will be needed in the request message.

If we take a look at the ListPostsReq message, the first field is the actual user_id. The next one is a search query string in case we want to search through the collection. It also has page and per_page fields to specify which page to return and how many elements each page should have.

The last element is the filter, and here again we have an enum as the field’s type, which means we can only filter by the elements specified in the Filters enum, which maps to the posts’ statuses.

In ListPostsResp, we have only two fields. The first one represents the collection of posts that will be returned, and the second holds the pagination data object.

Finally, let’s see the definition of the delete endpoint:

DeletePost RPC endpoint
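The gist is missing here as well. The article only says the endpoint returns an empty response message; one common way to express that, assumed here, is the well-known google.protobuf.Empty type:

```proto
import "google/protobuf/empty.proto";

// Added inside the Posts service:
rpc DeletePost(PostIdReq) returns (google.protobuf.Empty) {}
```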

This endpoint is pretty simple; it takes the PostIdReq we previously defined and returns an empty response message.

Let’s also update the CreatePostResp message to add the scheduled_at field, since I added it in the UpdatePost endpoint.

CreatePostResp updated. Code below.

We’re done with the API for the Posts service.

Here’s the complete proto file:

Now, let’s generate the Go code for this service.

To avoid having to remember the last protoc command we ran to generate the Go code, let’s create a Makefile under the protos folder with the following code:
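The Makefile itself is missing from this copy. A minimal sketch consistent with the protoc command used earlier might look like this; the target and variable names are inferred from the `make proto proto_file=posts.proto` invocation that follows:

```make
proto:
	protoc -I . \
		--go_out=src/go/sma \
		--go-grpc_out=src/go/sma \
		$(proto_file)
```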


Now, let’s open the terminal in the protos folder and run the make command:

make proto proto_file=posts.proto

This will generate the Go code! Great, we have now defined the API endpoints for the Posts resource using protocol buffers and gRPC!

The folder structure looks like this:

protos/
  makefile
  posts.proto
src/
  go/
    sma/
      posts.pb.go
      posts_grpc.pb.go
go.mod
go.sum
readme.md
Folder structure

Let’s now see how to generate swagger and openapi documentation for the Posts API.

First, let’s define the service configuration file for the posts.proto file. This file contains the HTTP configurations and mappings from gRPC to REST API.

Create a new file inside the protos folder and call it posts_annotations.yml. The code follows:
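The YAML itself did not survive in this copy. Based on the description below, a sketch in the gRPC API Configuration format might look like this; the REST paths are illustrative, while the selectors, path parameters, and the "post" body for UpdatePost follow the prose:

```yaml
type: google.api.Service
config_version: 3

http:
  rules:
    - selector: sma.Posts.CreatePost
      post: "/v1/users/{user_id}/posts"
      body: "*"
    - selector: sma.Posts.UpdatePost
      patch: "/v1/posts/{post_id}"
      body: "post"
    - selector: sma.Posts.ShowPost
      get: "/v1/posts/{post_id}"
    - selector: sma.Posts.ListPosts
      get: "/v1/users/{user_id}/posts"
    - selector: sma.Posts.DeletePost
      delete: "/v1/posts/{post_id}"
```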

Here we have the HTTP rules for each endpoint from the posts.proto file.

The selector specifies the RPC endpoint, and then we specify the HTTP verb followed by the REST path we want to map the endpoint to. Here we also specify the parameters that need to be in the path; in this case, the user_id and the post_id, depending on the endpoint.

In the UpdatePost endpoint, we used a field mask, and we called the updatable object Post. That’s why the body has a string that says “post”: it refers to the Post message we have inside the UpdatePostReq message.

Run these commands to install protoc-gen-swagger and protoc-gen-openapiv2:

go install github.com/grpc-ecosystem/grpc-gateway/protoc-gen-swagger@latest
go install github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2@latest

Let’s create two new folders. Under src, create the swagger folder and the openapi folder.

Now, we need to extend the command in the Makefile to generate the API documentation with swagger and openapi. We’ll also include the command to generate the gateway Go code, so our API is ready to implement a reverse proxy.

The new command looks like this:
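The extended Makefile is also missing from this copy. A sketch of what it might contain follows; the exact flag spellings depend on your plugin versions (protoc-gen-swagger uses the v1 option syntax, while the gateway and openapiv2 plugins shown are from grpc-gateway v2), so treat this as a starting point rather than a verbatim reconstruction:

```make
proto:
	protoc -I . \
		--go_out=src/go/sma \
		--go-grpc_out=src/go/sma \
		--grpc-gateway_out=src/go/sma \
		--grpc-gateway_opt=grpc_api_configuration=$(annotation_file) \
		--openapiv2_out=src/openapi \
		--openapiv2_opt=grpc_api_configuration=$(annotation_file) \
		--swagger_out=grpc_api_configuration=$(annotation_file):src/swagger \
		$(proto_file)
```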

Let’s generate the code.

make proto annotation_file=posts_annotations.yml proto_file=posts.proto

This will create the swagger JSON files and also a new file, posts.pb.gw.go, which contains the code to implement a reverse proxy.

This is the new folder structure after running the make command:

protos/
  makefile
  posts.proto
  posts_annotations.yml
src/
  go/
    sma/
      posts.pb.go
      posts_grpc.pb.go
  openapi/
    posts.swagger.json
  swagger/
    posts.swagger.json
go.mod
go.sum
readme.md
Folder structure

Great. We have covered a lot, and our API design is ready to be used in a gRPC project. If you want to see how to implement the reverse proxy and consume this API, leave a comment!

Thanks for reading. Stay tuned.
