Further improvements

There are two key improvements you can make:

  • Integrating the message broker
  • Developing a distributed transaction

Let's discuss each one in brief.

Integrating the message broker

For now, we are using Node.js's EventEmitter class to emit and receive events. You can hook in a more advanced message broker such as RabbitMQ or Apache Kafka instead. All you need to do is push the changes to the message broker rather than emitting an event in the Polyglot class, as sketched below.
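
Here is a minimal producer sketch using the amqplib package. The queue name, connection URL, and the shape of the change object are assumptions for illustration, not part of the book's code; the point is simply that the place where you previously emitted an event now publishes a message instead.

// Hypothetical producer: push changes to RabbitMQ instead of emitting an event.
const amqp = require('amqplib');

const QUEUE = 'polyglot_changes'; // assumed queue name

async function publishChange(change) {
  const connection = await amqp.connect('amqp://localhost');
  const channel = await connection.createChannel();
  await channel.assertQueue(QUEUE, { durable: true });

  // Instead of this.emit('change', change), send the change to the broker.
  channel.sendToQueue(QUEUE, Buffer.from(JSON.stringify(change)), {
    persistent: true
  });

  await channel.close();
  await connection.close();
}

// Example: publish an insert performed by the Polyglot layer.
publishChange({ operation: 'insert', table: 'users', document: { name: 'Alice' } });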

At the consumer end, any program can read the message and act on it. This keeps the producer and the consumers loosely coupled; a minimal consumer sketch follows.
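
The following consumer sketch again uses amqplib and assumes the same queue name as the producer above. Any program that can reach the broker may subscribe, independently of the producer.

// Hypothetical consumer: read changes from the queue and process them.
const amqp = require('amqplib');

const QUEUE = 'polyglot_changes'; // must match the producer's queue name

async function consumeChanges() {
  const connection = await amqp.connect('amqp://localhost');
  const channel = await connection.createChannel();
  await channel.assertQueue(QUEUE, { durable: true });

  channel.consume(QUEUE, (msg) => {
    if (msg !== null) {
      const change = JSON.parse(msg.content.toString());
      console.log('Received change:', change);
      channel.ack(msg); // acknowledge so the broker can discard the message
    }
  });
}

consumeChanges();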

If you are going to use RabbitMQ, we have already covered the integration in Chapter 7, Extending RethinkDB.

Moving right along to the next improvement.

Developing a distributed transaction

For now, we are not using any transaction across the databases when performing CRUD operations. Since our databases can be SQL, NoSQL, key-value, graph-based, and so on, and not all of them provide transaction support, we need to build one ourselves.

It is still open to debate whether we should have transactions across such heterogeneous databases at all, but if you want one, I believe it can be built using batch operations.

Upon insertion, updating, or deletion of a document, run a batch operation that records the query, its status, and a fallback query (the query to execute in case of an error), and push this record into some sort of queue. When the last database has been updated, send the client a response that the operation is complete.

In case of an error, start the rollback process: execute the fallback query on each database that was already updated and return the error to the client. A sketch of this flow follows.
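
Here is a sketch of the batch-with-fallback idea described above, essentially a compensating-transaction flow. The per-database executor functions are hypothetical placeholders rather than real driver calls; only the forward/rollback logic is the point.

// Placeholder for a real driver call (insert, update, delete, or its fallback).
async function noop(document) { /* call the actual database driver here */ }

// Each step carries a forward query and a fallback (compensating) query.
const operations = [
  { db: 'mongodb',   query: noop, fallback: noop },
  { db: 'mysql',     query: noop, fallback: noop },
  { db: 'rethinkdb', query: noop, fallback: noop }
];

async function runDistributedBatch(document) {
  const completed = []; // queue of steps that have already succeeded

  try {
    for (const op of operations) {
      await op.query(document);   // run the forward query
      completed.push(op);         // remember it in case we need to roll back
    }
    // The last database has been updated: tell the client we are done.
    return { status: 'completed' };
  } catch (err) {
    // On error, execute the fallback query on every database already updated.
    for (const op of completed.reverse()) {
      await op.fallback(document);
    }
    return { status: 'failed', error: err.message };
  }
}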

You are welcome to refine this approach or replace it with a better one; it is simply the best solution I can think of.
