Integrate Protobuf with RabbitMQ in Node.js - Part III

Often, the task at hand involves building a system that enables communication between services using asynchronous methods. There are several ways to accomplish this, but for our purposes, we’ll be using a queue system.

In this third part of our series on building a communication system between microservices using Node.js, Protobuf, and RabbitMQ, we’ll focus on integrating all the components into a cohesive project.

Let’s begin by creating two files: producer.ts and consumer.ts.

// src/producer.ts
import * as dotenv from 'dotenv';
import { 
  RabbitMQClient,
  RabbitMQClientOptions 
} from './rabbitmq-client';

dotenv.config();

const opts: RabbitMQClientOptions = {
  host: process.env.RABBITMQ_HOST || 'localhost',
  port: process.env.RABBITMQ_PORT 
    ? parseInt(process.env.RABBITMQ_PORT) 
    : 5672,
  user: process.env.RABBITMQ_USER || 'user',
  password: process.env.RABBITMQ_PASSWORD || 'password',
};

export const producer = (): RabbitMQClient => 
  new RabbitMQClient(opts);

const main = async () => {
  const client = producer();
  await client.connect();
}

main().catch(console.error);

// src/consumer.ts
import * as dotenv from 'dotenv';
import { 
  RabbitMQClient,
  RabbitMQClientOptions 
} from './rabbitmq-client';

dotenv.config();

const opts: RabbitMQClientOptions = {
  host: process.env.RABBITMQ_HOST || 'localhost',
  port: process.env.RABBITMQ_PORT 
    ? parseInt(process.env.RABBITMQ_PORT) 
    : 5672,
  user: process.env.RABBITMQ_USER || 'user',
  password: process.env.RABBITMQ_PASSWORD || 'password',
};

export const consumer = (): RabbitMQClient => 
  new RabbitMQClient(opts);

const main = async () => {
  const client = consumer();
  await client.connect();
}

main().catch(console.error);
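
Both entry points build identical RabbitMQClientOptions from the environment. That duplication is fine for a demo, but if you prefer to keep things DRY, the options can live in a small shared module. A minimal sketch, using a hypothetical src/config.ts:

// src/config.ts: hypothetical shared module, not part of the original files.
import { RabbitMQClientOptions } from './rabbitmq-client';

// Reads the same environment variables as producer.ts and consumer.ts above.
export const rabbitMQOptions = (): RabbitMQClientOptions => ({
  host: process.env.RABBITMQ_HOST || 'localhost',
  port: process.env.RABBITMQ_PORT ? parseInt(process.env.RABBITMQ_PORT) : 5672,
  user: process.env.RABBITMQ_USER || 'user',
  password: process.env.RABBITMQ_PASSWORD || 'password',
});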

To run a simple application on our machine, the most convenient approach is a Docker Compose setup. Before setting that up, we'll use nodemon to streamline our development process (nodemon runs .ts files through ts-node, so make sure ts-node is installed as well). Let's start by installing it:

npm i -D nodemon

Additionally, let's add two more scripts to our package.json, responsible for running the consumer and the producer.

  "scripts": {
    ...
    "start:producer": 
      "nodemon -r dotenv/config src/producer.ts",
    "start:consumer": 
      "nodemon -r dotenv/config src/consumer.ts"
  },

Now we are ready to continue with our docker-compose.yml setup.

services:
  rabbitmq:
    container_name: rabbitmq
    image: rabbitmq:3.12.9-management
    ports:
      - "5672:5672"
      - "15672:15672"
    environment:
      - RABBITMQ_DEFAULT_USER=user
      - RABBITMQ_DEFAULT_PASS=password
    volumes:
      - rabbitmq:/var/lib/rabbitmq/mnesia
      - ./.erlang.cookie:/var/lib/rabbitmq/.erlang.cookie

volumes:
  rabbitmq:

Our configuration defines a container for RabbitMQ; our image of choice is rabbitmq:3.12.9-management, whose management plugin serves a web UI on the mapped port 15672. The configuration also sets RABBITMQ_DEFAULT_USER and RABBITMQ_DEFAULT_PASS, the credentials our clients will use to connect.
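
Since both entry scripts load dotenv, you can also run them directly on the host against this container. A .env file mirroring the compose defaults would look like this (the variable names are the ones our scripts read above):

RABBITMQ_HOST=localhost
RABBITMQ_PORT=5672
RABBITMQ_USER=user
RABBITMQ_PASSWORD=password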

Let's add to our producer the capability to actually produce a message that will then be published to the queue.

// src/producer.ts
// create() is protobuf-es' factory for building messages from a schema.
import { create } from '@bufbuild/protobuf';
import { 
  DeviceInformationMessageSchema 
} from '../gen';
...
const main = async () => {
  ...
  const info = {
    time: {
      seconds: BigInt(1630000000),
      nanos: 0,
    },
    mac: '00:00:00:00:00:00',
    name: 'device-name',
  };

  const message = create(
    DeviceInformationMessageSchema, 
    info
  );
  await client.publish(
    DeviceInformationMessageSchema, 
    message, 
    'device', 
    ''
  );
}
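
For context, the info object matches the DeviceInformationMessage schema generated into ../gen in the earlier parts of this series. Reconstructed from the fields used here (an assumption on my part; field numbers are illustrative), it would look roughly like this:

// device.proto: assumed shape, reconstructed from the fields used above.
syntax = "proto3";

import "google/protobuf/timestamp.proto";

message DeviceInformationMessage {
  google.protobuf.Timestamp time = 1;
  string mac = 2;
  string name = 3;
}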

We should also implement the consuming side, since it is the other half of the Pub/Sub pattern we are building with RabbitMQ.

// src/consumer.ts
// Message is protobuf-es' message type; GetMessage comes from amqplib's typings.
import { Message } from '@bufbuild/protobuf';
import { GetMessage } from 'amqplib';
...

// Invoked with the decoded protobuf message and, optionally, the raw AMQP message.
const handler = (
  message: Message, 
  originalMessage?: GetMessage
) => {
  console.log('Received message:', message);
};

const main = async () => {
  ...
  await client.consumeOne(
    'device',
    {
      DeviceInformationMessage: handler,
    },
    true,
  );
}

The only step missing before we can use and test our demo is to configure RabbitMQ. Head to the RabbitMQ management UI (http://localhost:15672, log in with the user/password credentials from the compose file), create an exchange named device and a queue named device, and bind them together with an empty routing key. If you haven't done so yet, start the RabbitMQ instance by executing:

docker compose up
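
If you'd rather script this step than click through the UI, the same topology can be declared with amqplib directly. Here is a minimal sketch (a hypothetical one-off script, assuming a direct exchange; adjust the type to whatever you chose):

// scripts/setup-rabbitmq.ts: a hypothetical helper, not part of the series' code.
import amqp from 'amqplib';

const main = async () => {
  const conn = await amqp.connect('amqp://user:password@localhost:5672');
  const channel = await conn.createChannel();

  // Same names and empty routing key as in the UI steps above.
  await channel.assertExchange('device', 'direct', { durable: true });
  await channel.assertQueue('device', { durable: true });
  await channel.bindQueue('device', 'device', '');

  await channel.close();
  await conn.close();
};

main().catch(console.error);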

We’ll extend our setup by defining two additional services: one for consuming and one for producing, both built with Docker for local development. To ensure both containers can access RabbitMQ, we’ll pass environment variables for the connection and establish RabbitMQ as a dependency for these services.

services:
  ...
  producer:
    container_name: producer
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/app
      # Anonymous volume so the bind mount doesn't hide the image's node_modules.
      - /app/node_modules
    depends_on:
      - rabbitmq
    environment:
      RABBITMQ_HOST: rabbitmq
      RABBITMQ_PORT: 5672
      RABBITMQ_USER: user
      RABBITMQ_PASSWORD: password
    entrypoint: ["npm", "run", "start:producer"]

  consumer:
    container_name: consumer
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/app
      # Anonymous volume so the bind mount doesn't hide the image's node_modules.
      - /app/node_modules
    depends_on:
      - rabbitmq
    environment:
      RABBITMQ_HOST: rabbitmq
      RABBITMQ_PORT: 5672
      RABBITMQ_USER: user
      RABBITMQ_PASSWORD: password
    entrypoint: ["npm", "run", "start:consumer"]

As you can see, both services share a build context with a Dockerfile; below you may find what was defined in it. Note that there is no CMD at the end, since each compose service supplies its own entrypoint.

FROM node:20 AS build

WORKDIR /app

COPY package*.json ./
RUN npm install
COPY . .

EXPOSE 3000
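
One small addition I'd suggest (my own, not shown in the original setup) is a .dockerignore, so that COPY . . doesn't drag the host's node_modules into the image:

node_modules
dist
.git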

The final step is to run the command that brings the two new services up (add --build to force an image rebuild after Dockerfile changes).

docker compose up

As you may have noticed, there are still some features missing, such as long-running consume processes (since we only implemented consumeOne) and the ability to publish multiple messages. These functionalities can be built upon the existing methods like consumeOne and publish, but I’ll leave that exploration to those curious enough to take this project further.

This concludes our journey of integrating Protobuf with RabbitMQ in Node.js through a simple project. It has been an incredible experience, filled with concepts, integrations, and a demonstration of the true power of combining these tools. Although our implementation focused on Node.js, the principles can easily be extended to other ecosystems as well.

Keep pushing forward and savor every step of your coding journey.