Creating a Real-Time App with Nuxt

In this chapter, we are going to venture a bit further with Nuxt and see how we can make it work with other frameworks to build real-time apps. We will continue using Koa as the backend API but "enhance" it with RethinkDB and Socket.IO. In other words, we will turn our backend API into a real-time API with these two awesome frameworks and tools and, at the same time, turn our frontend Nuxt app into a real-time Nuxt app with their help as well. You can develop these two real-time apps with the single-domain approach if you prefer. However, this book favors the cross-domain approach so that we don't mix up the frontend and backend dependencies and get confused over time. So, this will be another interesting and exciting chapter for you to learn from!

In this chapter, we will cover the following topics:

  • Introducing RethinkDB
  • Integrating RethinkDB with Koa
  • Introducing Socket.IO
  • Integrating Socket.IO with Nuxt

Let's get started!

Introducing RethinkDB

RethinkDB is an open-source JSON database for real-time apps. Whenever a change occurs in a database table whose real-time feed – changefeed – you have subscribed to, RethinkDB pushes the changed JSON data to your apps in real time. Although changefeeds lie at the heart of RethinkDB's real-time functionality, you can skip this feature if you want to and use RethinkDB just like MongoDB, to store and query your NoSQL databases.

Even though you can use Change Streams in MongoDB to access real-time data changes, they require some configuration to get started, whereas the real-time feeds in RethinkDB are ready to use by default and you can tap in right away without any configuration. Let's get started by installing the RethinkDB server on your system and see how you can use it in the next section.

Installing RethinkDB Server

At the time of writing this book, the current stable version of RethinkDB is 2.4.0 (Night Of The Living Dead), released on 19 December 2019. There are a few ways to install the RethinkDB server, depending on your platform (Ubuntu, macOS, and so on). You can check out the guide for your platform at https://rethinkdb.com/docs/install/. Note that Windows is not supported yet in 2.4.0. For more information about this issue, please visit https://rethinkdb.com/docs/install/windows.

In this book, we will install RethinkDB 2.4.0 on Ubuntu 20.04 LTS (Focal Fossa). It works the same if you are on Ubuntu 19.10 (Eoan Ermine), Ubuntu 19.04 (Disco Dingo), or the older versions of Ubuntu, such as 18.04 LTS (Bionic Beaver). Let's get started:

  1. Add the RethinkDB repository to your list of Ubuntu repositories, as follows:
$ source /etc/lsb-release && echo "deb https://download.rethinkdb.com/apt $DISTRIB_CODENAME main" | sudo tee /etc/apt/sources.list.d/rethinkdb.list
  2. Get the public key of RethinkDB using wget:
$ wget -qO- https://download.rethinkdb.com/apt/pubkey.gpg | sudo apt-key add -

You should get an OK message on your terminal for the preceding command line.

  3. Update your version of Ubuntu and install RethinkDB:
$ sudo apt update
$ sudo apt install rethinkdb
  4. Verify RethinkDB:
$ rethinkdb -v

You should get the following output on the terminal:

rethinkdb 2.4.0~0eoan (CLANG 9.0.0 (tags/RELEASE_900/final))

RethinkDB comes with an administrative UI for you to manage databases on a browser at localhost:8080. This can be very handy and useful during project development. If you ever want to uninstall RethinkDB and remove all its databases, you can do so with the following commands:

$ sudo apt purge rethinkdb
$ sudo rm -r /var/lib/rethinkdb

The administrative UI that comes with the installation is similar to the PHP Adminer that you used to manage the MySQL databases for the PHP API in the previous chapter. You can use the RethinkDB administrative UI to add databases and tables, either by using the graphical buttons on the UI or by using ReQL, the RethinkDB query language (in JavaScript). We'll explore the administrative UI and ReQL in the following section.

Introducing ReQL

ReQL is the query language of RethinkDB and is used for manipulating the JSON documents in RethinkDB databases. Queries are constructed by chaining RethinkDB's built-in functions and are executed on the server. These functions are embedded in the client drivers for various programming languages: JavaScript, Python, Ruby, and Java. You can check out the ReQL commands/functions at https://rethinkdb.com/api/javascript/ (the references for the other languages are linked from there).

We will be using JavaScript in this book. Let's use the Data Explorer in the administrative UI to perform some CRUD operations with the respective ReQL commands. You can navigate to the Data Explorer page, or point your browser to localhost:8080/#dataexplorer, and start playing with the queries, as shown here. The default top-level namespace in the Data Explorer is r, so the ReQL commands must be chained to this namespace.
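To get a feel for how these chainable commands compose into a single query, here is a toy JavaScript sketch of a chainable namespace like r. Note that this is purely an illustration written for this explanation; it is not the real RethinkDB driver, and the makeNamespace helper is our own invention:

```javascript
// A toy illustration (NOT the real RethinkDB driver) of how chainable
// commands like r.db(...).table(...) build up a single query.
function makeNamespace () {
  const build = (parts) => ({
    db: (name) => build(parts.concat(`db("${name}")`)),
    table: (name) => build(parts.concat(`table("${name}")`)),
    count: () => build(parts.concat('count()')),
    // toString shows the query the chain has built up so far
    toString: () => parts.join('.')
  })
  return build(['r'])
}

const r = makeNamespace()
const query = r.db('nuxtdb').table('user').count()
console.log(query.toString()) // r.db("nuxtdb").table("user").count()
```

Each call returns a new chainable object that remembers the commands so far, which is why ReQL queries read left to right like a sentence.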

However, we can rename this r namespace to anything we like when using the driver in our app, which we will do in the next section. For now, let's stick to the default namespace, r, for this exercise:

  1. Create a database:
r.dbCreate('nuxtdb')

Click the Run button. You should get a result similar to the following on the screen showing that one database has been created with the database name you chose and that an ID was generated by RethinkDB:

{
  "config_changes": [
    {
      "new_val": {
        "id": "353d11a4-adc8-4958-a4ae-a82c996dcb9f",
        "name": "nuxtdb"
      },
      "old_val": null
    }
  ],
  "dbs_created": 1
}
If you want to find out more information about the dbCreate ReQL command, please visit https://rethinkdb.com/api/javascript/db_create/.
  2. Create a table in an existing database; for example, create a user table in the nuxtdb database:
r.db('nuxtdb').tableCreate('user')

Click the Run button. You should get a result similar to the following on the screen showing that one table has been created with an ID generated by RethinkDB for you and other information about the table that you created:

{
  "config_changes": [{
    "new_val": {
      "db": "nuxtdb",
      "durability": "hard",
      "id": "259e0066-1ffe-4064-8b24-d1c82e515a4a",
      "indexes": [],
      "name": "user",
      "primary_key": "id",
      "shards": [{
        "nonvoting_replicas": [],
        "primary_replica": "lau_desktop_opw",
        "replicas": ["lau_desktop_opw"]
      }],
      "write_acks": "majority",
      "write_hook": null
    },
    "old_val": null
  }],
  "tables_created": 1
}
If you want to find out more information about the tableCreate ReQL command, please visit https://rethinkdb.com/api/javascript/table_create/.
  3. Insert new documents into the user table:
r.db('nuxtdb').table('user').insert([
  { name: "Jane Doe", slug: "jane" },
  { name: "John Doe", slug: "john" }
])

Click the Run button. You should get a result similar to the following on the screen, showing that two documents have been inserted with keys generated by RethinkDB for you:

{
  "deleted": 0,
  "errors": 0,
  "generated_keys": [
    "7f7d768d-0efd-447d-8605-2d460a381944",
    "a144001c-d47e-4e20-a570-a29968980d0f"
  ],
  "inserted": 2,
  "replaced": 0,
  "skipped": 0,
  "unchanged": 0
}
If you want to find out more information about the table and insert ReQL commands, please visit https://rethinkdb.com/api/javascript/table/ and https://rethinkdb.com/api/javascript/insert/, respectively.
  4. Retrieve documents from the user table:
r.db('nuxtdb').table('user')

Click the Run button. You should get a result similar to the following on the screen, showing two documents from the user table:

[{
  "id": "7f7d768d-0efd-447d-8605-2d460a381944",
  "name": "Jane Doe",
  "slug": "jane"
}, {
  "id": "a144001c-d47e-4e20-a570-a29968980d0f",
  "name": "John Doe",
  "slug": "john"
}]

You can chain the count method to the query if you want to count the total documents in a table, as follows:

r.db('nuxtdb').table('user').count()

You should get 2 for the user table after inserting the new documents.

If you want to find out more information about the count ReQL command, please visit https://rethinkdb.com/api/javascript/count/.
  5. Update the documents in the user table by filtering the table on the slug key:
r.db('nuxtdb').table('user')
  .filter(
    r.row("slug").eq("john")
  )
  .update({
    name: "John Wick"
  })

Click the Run button. You should get the following result on the screen, showing that one document has been replaced:

{
  "deleted": 0,
  "errors": 0,
  "inserted": 0,
  "replaced": 1,
  "skipped": 0,
  "unchanged": 0
}
If you want to find out more information about the filter and update ReQL commands, please visit https://rethinkdb.com/api/javascript/filter/ and https://rethinkdb.com/api/javascript/update/, respectively.

Also, if you want to find out more information about the row and eq ReQL commands, please visit https://rethinkdb.com/api/javascript/row/ and https://rethinkdb.com/api/javascript/eq/, respectively.
  6. Delete a document from the user table by filtering the table on the slug key:
r.db('nuxtdb').table('user')
  .filter(
    r.row("slug").eq("john")
  )
  .delete()

Click the Run button. You should get the following result on the screen, showing that one document has been deleted:

{
  "deleted": 1,
  "errors": 0,
  "inserted": 0,
  "replaced": 0,
  "skipped": 0,
  "unchanged": 0
}

If you want to delete all the documents in a table, then simply chain the delete method to the table without filtering, as follows:

r.db('nuxtdb').table('user').delete()
If you want to find out more information about the delete ReQL command, please visit https://rethinkdb.com/api/javascript/delete/.

Using ReQL commands is fun and easy, isn't it? You don't have to read through all of the ReQL commands and study each of them in great detail to be productive. You just need to know what you want to do and find the commands you need on the ReQL command reference/API page for a programming language that you already know. Next, you will find out how to add the RethinkDB client, or driver, to your app. Let's get to it!

Integrating RethinkDB with Koa

In this section, we will build a simple API, following the PHP APIs that we created in the previous chapter, to list, add, update, and delete users. In the previous API, we used PHP and MySQL, while in this chapter, we will use JavaScript and RethinkDB. We will still use Koa as the framework for our API. But this time, we will restructure the API directory so that its structure is consistent (as much as possible) with the directory structure you are already familiar with from the Nuxt app and the PHP API. So, let's get started!

Restructuring API directories

Remember the default directory structure that you get in your project when using Vue CLI, which you learned about in Chapter 11, Writing Route Middleware and Server Middleware? After installing a project with Vue CLI, if you take a look inside the project directory, you will see a barebones project structure in which you can find a /src/ directory to develop your components, pages, and routes, as follows:

├── package.json
├── babel.config.js
├── README.md
├── public
│   ├── index.html
│   └── favicon.ico
└── src
    ├── App.vue
    ├── main.js
    ├── router.js
    ├── components
    │   └── HelloWorld.vue
    └── assets
        └── logo.png

We have been using this kind of standard structure for the cross-domain apps since Chapter 12, Creating User Logins and API Authentication. For example, the following is the directory structure for Koa APIs, which you made previously:

backend
├── package.json
├── backpack.config.js
├── static
│   └── ...
└── src
    ├── index.js
    ├── ...
    ├── modules
    │   └── ...
    └── core
        └── ...

But this time, we will eliminate the /src/ directory from the APIs that we are going to make in this chapter. So, let's move everything in the /src/ directory up to the top level and reconfigure how we bootstrap the app, as follows:

  1. Create the following files and folders in the project's root directory:
backend
├── package.json
├── backpack.config.js
├── middlewares.js
├── routes.js
├── configs
│   ├── index.js
│   └── rethinkdb.js
├── core
│   └── ...
├── middlewares
│   └── ...
├── modules
│   └── ...
└── public
    └── index.js

Again, the directory structure here is merely a suggestion; you can design your directory structure as you wish so that it suits you the most. But let's take a glance at this suggested directory and study what these folders and files are used for:

  • The /configs/ directory is used to store the app's basic information and the RethinkDB database connection's details.
  • The /public/ directory is used to store the files for initiating the app.
  • The /modules/ directory is used to store the modules of the app, such as the 'user' module, which we will create in the upcoming sections.
  • The /core/ directory is used to store the common functions or classes that can be used throughout the app.
  • The middlewares.js file is the core location for importing middleware from the /middlewares/ and /node_modules/ directories.
  • The routes.js file is the core location for importing routes from the /modules directory.
  • The backpack.config.js file is used to customize the webpack configuration for our app.
  • The package.json file contains the scripts and dependencies of our app and is always located at the root level.
  2. Point the entry file to the index.js file in the /public/ directory:
// backpack.config.js
module.exports = {
  webpack: (config, options, webpack) => {
    config.entry.main = './public/index.js'
    return config
  }
}

Remember that the default entry file in Backpack is an index.js file in the /src/ directory. Since we have moved this index file to the /public/ directory, we must configure this entry point through the Backpack config file.

If you want to know more about the entry points in webpack, please visit https://webpack.js.org/concepts/entry-points/.
  3. Add aliases for the /configs, /core, /modules, and /middlewares paths to the resolve option in the webpack configuration before returning the config object in the Backpack config file:
// backpack.config.js
const path = require('path')

config.resolve = {
  alias: {
    Configs: path.resolve(__dirname, 'configs/'),
    Core: path.resolve(__dirname, 'core/'),
    Modules: path.resolve(__dirname, 'modules/'),
    Middlewares: path.resolve(__dirname, 'middlewares/')
  }
}

Using aliases to resolve the file path in our app is very useful and handy. Typically, we import files using the relative paths, like this:

import notFound from '../../Middlewares/notFound'

Instead of doing this, now, we can import files from anywhere with the alias that tucks away the relative path, thus making our code neater:

import notFound from 'Middlewares/notFound'
If you want to find out more about the alias and resolve options in webpack, please visit https://webpack.js.org/configuration/resolve/#resolvealias.

Once you have the preceding structure ready and the entry file sorted, you can start applying the CRUD operations with RethinkDB to this API. But first, you will need to install the RethinkDB JavaScript client into your project. So, let's get started!

Adding and using the RethinkDB JavaScript client

Depending on the programming knowledge you have, there are several official client drivers you can choose from: JavaScript, Ruby, Python, and Java. There are many community-supported drivers as well, for languages such as PHP, Perl, and R. You can check them out at https://rethinkdb.com/docs/install-drivers/.

In this book, we will be using the RethinkDB JavaScript client driver. We will guide you through the installation and how to use the CRUD operations using this driver in the following steps:

  1. Install the RethinkDB JavaScript client driver via npm:
$ npm i rethinkdb
  2. Create a rethinkdb.js file in the /configs/ directory to store the RethinkDB server connection details, as follows:
// configs/rethinkdb.js
export default {
  host: 'localhost',
  port: 28015,
  dbname: 'nuxtdb'
}
  3. Create a connection.js file in the /core/ directory for opening a RethinkDB server connection with the preceding connection details, as follows:
// core/database/rethinkdb/connection.js
import config from 'Configs/rethinkdb'
import rethink from 'rethinkdb'

const c = async () => {
  const connection = await rethink.connect({
    host: config.host,
    port: config.port,
    db: config.dbname
  })
  return connection
}
export default c
  4. Also, create an open connection middleware in an open.js file in the /middlewares/ directory and bind the connection to the Koa context, as another option for connecting to RethinkDB, as follows:
// middlewares/database/rdb/connection/open.js
import config from 'Configs/rethinkdb'
import rdb from 'rethinkdb'

export default async (ctx, next) => {
  ctx._rdbConn = await rdb.connect({
    host: config.host,
    port: config.port,
    db: config.dbname
  })
  await next()
}
It is a good practice, which we learned from PHP's PSR-4, to use the directory path to describe your middleware (or CRUD operations) so that you don't have to use a long filename. For example, if you were not using a descriptive directory path, you might have to name this middleware rdb-connection-open.js to describe what it does as clearly as possible. With a descriptive directory path, you can name the file something as simple as open.js.
  5. Create a close connection middleware in a close.js file in the /middlewares/ directory and bind it to the Koa context as the last middleware, as follows:
// middlewares/database/rdb/connection/close.js
export default async (ctx, next) => {
  ctx._rdbConn.close()
  await next()
}
  6. Import the open and close connection middleware in the root middlewares.js file and register them with the app, as follows:
// middlewares.js
import routes from './routes'
import rdbOpenConnection from 'Middlewares/database/rdb/connection/open'
import rdbCloseConnection from 'Middlewares/database/rdb/connection/close'

export default (app) => {
  //...
  app.use(rdbOpenConnection)
  app.use(routes.routes(), routes.allowedMethods())
  app.use(rdbCloseConnection)
}

Here, you can see that the open connection middleware is registered before all the module routes and that the close connection middleware is registered last so that they are called first and last, respectively.
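To see why this registration order matters, here is a minimal sketch of Koa-style middleware composition. This is a simplified stand-in for Koa's internal koa-compose, written for illustration only, showing that middleware registered first runs first on the way "down" the chain:

```javascript
// Simplified stand-in for Koa's middleware composition (koa-compose),
// to illustrate the execution order of the registered middleware.
function compose (middleware) {
  return function (ctx) {
    const dispatch = (i) => {
      if (i === middleware.length) return Promise.resolve()
      // each middleware receives a next() that runs the following one
      return Promise.resolve(middleware[i](ctx, () => dispatch(i + 1)))
    }
    return dispatch(0)
  }
}

const log = []
const open = async (ctx, next) => { log.push('open connection'); await next() }
const route = async (ctx, next) => { log.push('run query'); await next() }
const close = async (ctx, next) => { log.push('close connection'); await next() }

// same order as the app.use(...) calls in middlewares.js
compose([open, route, close])({}).then(() => {
  console.log(log) // [ 'open connection', 'run query', 'close connection' ]
})
```

Because every middleware in our API calls await next(), the connection is opened before any route runs a query and closed right after the route hands control onward.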

  7. In the upcoming steps, we will use the following template code with a Koa router and the RethinkDB client driver to perform the CRUD operations. For example, the following code shows how we apply the template code for fetching all the users from the user table in the user module:
// modules/user/_routes/index.js
import Router from 'koa-router'
import rdb from 'rethinkdb'

const router = new Router()
router.get('/', async (ctx, next) => {
  try {
    // perform verification on the incoming parameters...
    // perform a CRUD operation:
    let result = await rdb.table('user')
      .run(ctx._rdbConn)

    ctx.type = 'json'
    ctx.body = result
    await next()
  } catch (err) {
    ctx.throw(500, err)
  }
})
export default router

Let's go through this code and understand what it does. Here, you can see that we are using a custom top-level namespace, rdb, for the RethinkDB client driver, as opposed to the r namespace that you practiced with at localhost:8080. Also, when using the RethinkDB client driver in our app, we must always call the run method at the end of the ReQL commands, passing in the RethinkDB server connection, to construct the query and send it to the server for execution.

Furthermore, we must call the next method at the end of the code so that we can pass the execution of the app to the next piece of middleware, especially the close connection middleware, which is used to close the RethinkDB connection. We should perform checks on the incoming parameters and data from the client before performing any CRUD operations. Then, we should wrap our code in try-catch blocks to catch and throw any potential errors.

Note that in the upcoming steps, we will skip writing the parameter verification and the try-catch statement from the code to avoid lengthy and repetitive code lines and blocks, but you should have them included in your actual code.
  8. Create a create-user.js file in the /_routes/ folder in the user module with the following code for inserting new users into the user table in the database:
// modules/user/_routes/create-user.js
router.post('/user', async (ctx, next) => {
  // 'document' is built from the incoming request body
  // (we will sanitise it in the schema section later in this chapter)
  let result = await rdb.table('user')
    .insert(document, {returnChanges: true})
    .run(ctx._rdbConn)

  if (result.inserted !== 1) {
    ctx.throw(404, 'insert user failed')
  }

  ctx.type = 'json'
  ctx.body = result
  await next()
})

We throw an error if the insertion fails, passing the error message to the Koa throw method along with an HTTP error code, so that we can catch it with the try-catch blocks and display it on the frontend.

  9. Create a fetch-user.js file in the /_routes/ folder in the user module to fetch a specific user from the user table by using the slug key, as follows:
// modules/user/_routes/fetch-user.js
router.get('/:slug', async (ctx, next) => {
  const slug = ctx.params.slug
  const searchQuery = { slug: slug }
  let user = await rdb.table('user')
    .filter(searchQuery)
    .nth(0)
    .default(null)
    .run(ctx._rdbConn)

  if (!user) {
    ctx.throw(404, 'user not found')
  }

  ctx.type = 'json'
  ctx.body = user
  await next()
})

We added the nth command to the query to select the document at a given position; in our case, we just want the first document, so we pass the integer 0 to this method. We also added the default command so that null is returned if no matching user is found in the user table.

  10. Create an update-user.js file in the /_routes/ folder in the user module for updating an existing user in the user table by using the document ID, as follows:
// modules/user/_routes/update-user.js
router.put('/user', async (ctx, next) => {
  let body = ctx.request.body || {}
  let objectId = body.id

  let timestamp = Date.now()
  let updateQuery = {
    name: body.name,
    slug: body.slug,
    updatedAt: timestamp
  }

  let result = await rdb.table('user')
    .get(objectId)
    .update(updateQuery, {returnChanges: true})
    .run(ctx._rdbConn)

  if (result.replaced !== 1) {
    ctx.throw(404, 'update user failed')
  }

  ctx.type = 'json'
  ctx.body = result
  await next()
})

We added the get command to the query to fetch the specific document by its ID before running the update.

  11. Create a delete-user.js file in the /_routes/ folder in the user module for deleting an existing user from the user table by using the document ID, as follows:
// modules/user/_routes/delete-user.js
router.del('/user', async (ctx, next) => {
  let body = ctx.request.body || {}
  let objectId = body.id

  let result = await rdb.table('user')
    .get(objectId)
    .delete()
    .run(ctx._rdbConn)

  if (result.deleted !== 1) {
    ctx.throw(404, 'delete user failed')
  }

  ctx.type = 'json'
  ctx.body = result
  await next()
})
  12. Lastly, refactor the CRUD operation for listing all the users from the user table, which you created in step 7, by adding an orderBy command to the query in the index.js file, which is kept in the /_routes/ folder, as follows:
// modules/user/_routes/index.js
router.get('/', async (ctx, next) => {
  let cursor = await rdb.table('user')
    .orderBy(rdb.desc('createdAt'))
    .run(ctx._rdbConn)

  let users = await cursor.toArray()

  ctx.type = 'json'
  ctx.body = users
  await next()
})

We added the orderBy command to the query so that the documents are sorted by their creation date in descending order (latest first). Also, the documents returned by the RethinkDB database are always contained in a cursor object, so we must use the toArray command to iterate through the cursor and convert the result into an array.

If you want to find out more about the orderBy and toArray commands, please visit https://rethinkdb.com/api/javascript/order_by/ and https://rethinkdb.com/api/javascript/to_array/, respectively.

With that, you have successfully implemented the CRUD operations with RethinkDB in your API. Again, this is easy and fun, isn't it? But we can still improve the "quality" of the documents we store by enforcing a schema on the RethinkDB databases. We'll learn how to do this in the next section.

Enforcing schema in RethinkDB

Just like the BSON databases in MongoDB, the JSON databases in RethinkDB are schemaless. This means that no blueprint, formula, or integrity constraint is imposed on the databases. Having no organized rules about how a database is constructed can pose integrity issues: some documents in a table (or "collection" in MongoDB) may contain different and unwanted keys alongside the documents that have the correct keys. You may inject some keys by mistake or forget to inject the required keys and values. So, if you want to keep the data in your documents organized, it can be a good idea to enforce some sort of schema on your JSON or BSON databases. RethinkDB has no built-in feature for enforcing a schema, but we can create custom functions that impose a basic schema with the Node.js Lodash module. Let's explore how to do this:

  1. Install the Lodash module via npm:
$ npm i lodash
  2. Create a utils.js file in the /core/ directory and import lodash to create a function called sanitise, as follows:
// core/utils.js
import lodash from 'lodash'

function sanitise (options, schema) {
  let data = options || {}

  if (schema === undefined) {
    const err = new Error('Schema is required.')
    err.status = 400
    err.expose = true
    throw err
  }

  let keys = lodash.keys(schema)
  let defaults = lodash.defaults(data, schema)
  let picked = lodash.pick(defaults, keys)

  return picked
}
export { sanitise }

This function simply picks the keys that you set in the schema and ignores any additional keys that are not in it.

We are using the following methods from Lodash. For more information about each of them, please visit the following links:

  • https://lodash.com/docs/4.17.15#keys for the keys method
  • https://lodash.com/docs/4.17.15#defaults for the defaults method
  • https://lodash.com/docs/4.17.15#pick for the pick method
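To see what sanitise actually does, here is a dependency-free sketch of the same keys/defaults/pick pipeline in plain JavaScript. This is for illustration only; the real function uses Lodash as shown above:

```javascript
// Dependency-free sketch of sanitise, mimicking lodash's
// keys, defaults, and pick (for illustration only).
function sanitise (options, schema) {
  const data = options || {}
  const keys = Object.keys(schema)                  // lodash.keys
  const defaults = Object.assign({}, schema, data)  // lodash.defaults (roughly)
  const picked = {}
  for (const key of keys) {                         // lodash.pick
    picked[key] = defaults[key]
  }
  return picked
}

const schema = { slug: null, name: null, createdAt: null, updatedAt: null }
const options = { name: 'Mary Moe', slug: 'mary', password: '123123' }
console.log(sanitise(options, schema))
// { slug: 'mary', name: 'Mary Moe', createdAt: null, updatedAt: null }
```

Notice how the password key is dropped because it is not part of the schema, while the missing createdAt and updatedAt keys fall back to their schema defaults.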
  3. Create a user schema in the user module with only the keys that you want to accept, as follows:
// modules/user/schema.js
export default {
  slug: null,
  name: null,
  createdAt: null,
  updatedAt: null
}
  4. Import the sanitise method and the preceding schema into the route on which you want to enforce the schema; for example, in the create-user.js file:
// modules/user/_routes/create-user.js
import { sanitise } from 'Core/utils'
import schema from '../schema'

let body = ctx.request.body || {}
let timestamp = Date.now()
let options = {
  name: body.name,
  slug: body.slug,
  createdAt: timestamp,
  username: 'marymoe',
  password: '123123'
}

let document = sanitise(options, schema)
let result = await rdb.table('user')
  .insert(document, {returnChanges: true})
  .run(ctx._rdbConn)

In the preceding code, the example fields username and password won't be injected into the document in the user table because the data is sanitised before insertion.

You can see that this sanitise function only performs a simple validation. If you need more complicated and advanced data validation, you can use the Node.js joi module from the hapi web framework.

If you want to find out more about this module, please visit https://hapi.dev/module/joi/.

The next thing you must explore is the changefeeds in RethinkDB. This is the main purpose of this chapter – to show you how to make use of the real-time feature of RethinkDB to create real-time apps. So, let's explore and play with the changefeeds in RethinkDB!

Introducing changefeeds in RethinkDB

Before applying changefeeds in our app with the RethinkDB client driver, let's use the Data Explorer from the administrative UI again, at localhost:8080/#dataexplorer, to see the real-time feeds appear on the screen:

  1. Paste in the following ReQL query and click the Run button:
r.db('nuxtdb').table('user').changes()

You should see the following information on your browser screen:

Listening for events...
Waiting for more results
  2. Open another tab in your browser and point it to localhost:8080/#dataexplorer. Now, you have two Data Explorers. Drag one out of the browser tab so that you can place them side by side. Then, insert new documents into the user table from one of the Data Explorers:
r.db('nuxtdb').table('user').insert([
  { name: "Richard Roe", slug: "richard" },
  { name: "Marry Moe", slug: "marry" }
])

You should get the following result:

{
  "deleted": 0,
  "errors": 0,
  "generated_keys": [
    "f7305c97-2bc9-4694-81ec-c5acaed1e757",
    "5862e1fa-e51c-4878-a16b-cb8c1f1d91de"
  ],
  "inserted": 2,
  "replaced": 0,
  "skipped": 0,
  "unchanged": 0
}

At the same time, you should see the other Data Explorer displaying the following feeds instantaneously in real time:

{
  "new_val": {
    "id": "f7305c97-2bc9-4694-81ec-c5acaed1e757",
    "name": "Richard Roe",
    "slug": "richard"
  },
  "old_val": null
}

{
  "new_val": {
    "id": "5862e1fa-e51c-4878-a16b-cb8c1f1d91de",
    "name": "Marry Moe",
    "slug": "marry"
  },
  "old_val": null
}

Hooray! You have just made real-time feeds effortlessly with RethinkDB! Notice that you will always get these two keys, new_val and old_val, in each of the real-time feeds. They have the following implications:

  • If you get the data in new_val but it's null in old_val, that means the new document is injected into the database.
  • If you get the data in both new_val and old_val, that means the existing document is updated in the database.
  • If you get the data in old_val but it's null in new_val, that means the existing document is removed from the database.
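These three cases can be captured in a small helper function. Note that classifyChange is a hypothetical helper of our own, not part of RethinkDB, written purely to illustrate how the new_val/old_val pair maps to the kind of change:

```javascript
// Hypothetical helper (not part of RethinkDB) that maps a changefeed
// payload's new_val/old_val pair to the kind of change it represents.
function classifyChange (change) {
  if (change.new_val && !change.old_val) return 'created'
  if (change.new_val && change.old_val) return 'updated'
  if (!change.new_val && change.old_val) return 'deleted'
  return 'unknown'
}

console.log(classifyChange({ new_val: { name: 'Marry Moe' }, old_val: null }))          // created
console.log(classifyChange({ new_val: { name: 'John Wick' },
  old_val: { name: 'John Doe' } }))                                                     // updated
console.log(classifyChange({ new_val: null, old_val: { name: 'John Wick' } }))          // deleted
```

A helper like this could be reused when the Nuxt app consumes the feeds later in the chapter, instead of repeating the null checks at every call site.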

You will get to use these keys in the Nuxt app in the last section of this chapter, so don't worry too much about them for now. Instead, the next challenge is to implement changefeeds in the API and the Nuxt app. To do that, we will need another Node.js module: Socket.IO. So, let's explore how this module can help you achieve that.

Introducing Socket.IO

Just like HTTP, WebSocket is a communication protocol, but it provides full-duplex (bidirectional) communication between the client and the server. Unlike HTTP, the WebSocket connection always remains open for real-time data transfer. So, in WebSocket apps, the server can send data to the client without having the client initiate the request.

Also, unlike the HTTP schemes, which start with http:// or https:// (the latter for Hypertext Transfer Protocol Secure), the WebSocket protocol scheme starts with ws:// or wss:// (the latter for WebSocket Secure); for example:

ws://example.com:4000

Socket.IO is a JavaScript library that uses the WebSocket protocol and polling as the fallback option for creating real-time web apps. It supports any platform, browser, or device and handles all the degradation for the server and client to get the full-duplex communication in real time. Most browsers support the WebSocket protocol these days anyway, including Google Chrome, Microsoft Edge, Firefox, Safari, and Opera. But when using Socket.IO, we must use its client-side and server-side libraries together. The client-side library runs inside the browser, while the server-side library runs on your server-side Node.js app. So, let's get these two libraries working in our apps.

If you want to find out more about Socket.IO, please visit https://socket.io/.

Adding and using Socket.IO server and client

We will add the Socket.IO server to the API that we have been building in the last few sections, and then add the Socket.IO client to the Nuxt app eventually. But before adding it to the Nuxt app, we will add it to a simple HTML page so that we have a bird's-eye view of how the Socket.IO server and Socket.IO client work together. Let's learn how to do so:

  1. Install the Socket.IO server via npm:
$ npm i socket.io
  2. Create an index.js file in the /configs/ directory to store the server settings, if you haven't done so yet:
// configs/index.js
export default {
  server: {
    port: 4000
  }
}

From this simple setting, we will be serving our API at port 4000.

  3. Import socket.io and bind it to the Node.js HTTP server together with a new instance of Koa to create a new instance of Socket.IO, as follows:
// backend/koa/public/index.js
import http from 'http'
import Koa from 'koa'
import socket from 'socket.io'
import config from 'Configs'
import middlewares from '../middlewares'

const app = new Koa()
const host = process.env.HOST || '127.0.0.1'
const port = process.env.PORT || config.server.port
middlewares(app)

const server = http.createServer(app.callback())
const io = socket(server)

io.sockets.on('connection', socket => {
  console.log('a user connected: ' + socket.id)
  socket.on('disconnect', () => {
    console.log('user disconnected: ' + socket.id)
  })
})
server.listen(port, host)
After creating the new instance of Socket.IO, we can start listening to the Socket.IO connection event for the incoming socket in the callback. We log the incoming socket to the console with its ID, and we also log the incoming socket's disconnect event when it is disconnected. Lastly, notice that we now start and serve the app on localhost:4000 by using the native Node.js http module, as opposed to Koa's own listen shorthand, which we used previously:

app.listen(4000)
  4. Create a socket-client.html page and import the Socket.IO client via CDN. Create a new instance of it by passing localhost:4000 as the specific URL, as follows:
// frontend/html/socket-client.html
<script src="https://cdn.jsdelivr.net/npm/socket.io-client@2/dist/socket.io.js"></script>

<script>
  var socket = io('http://localhost:4000/')
</script>

Now, if you browse this HTML page on your browser, or when you refresh the page, you should see the console printing the log with the socket ID, as follows:

a user connected: abeGnarBnELo33vQAAAB

You should also see the console printing the log with the socket ID when you close the HTML page, as follows:

user disconnected: abeGnarBnELo33vQAAAB

That's all you need to do in order to connect the server and client sides of Socket.IO. This is extremely simple and easy, isn't it? But all we're doing here is connecting and disconnecting the server and client. We need more from them – we want to transmit data simultaneously. To do that, we just need to emit and receive events from and to each other, which we'll do in the upcoming steps.

If you want to use the local version of the Socket.IO client, you can point the script tag's URL source to /node_modules/socket.io-client/dist/socket.io.js.
  5. Create an emit event from the server by using the emit method from the Socket.IO server, as follows:
// backend/koa/public/index.js
io.sockets.on('connection', socket => {
  io.emit('emit.onserver', 'Hi client, what you up to?')
  console.log('Message to client: ' + socket.id)
})

Here, you can see that we emit a simple message through the custom event called emit.onserver and log the activity to the console. Note that io.emit broadcasts to all connected clients, whereas socket.emit would target only the socket that just connected, and that we can only emit once the connection is established. We can then listen to this custom event on the client side and log the message coming from the server, as follows:

// frontend/html/socket-client.html
socket.on('emit.onserver', function (message) {
  console.log('Message from server: ' + message)
})
  6. So, now, if you refresh the page again on your browser, you should see the console printing the log with the socket ID, as follows:
Message to client: abeGnarBnELo33vQAAAB // server side
Message from server: Hi client, what you up to? // client side
  7. Create an emit event from the client by using the emit method from the Socket.IO client, as follows:
// frontend/html/socket-client.html
<script
  src="https://code.jquery.com/jquery-3.4.1.slim.min.js"
  integrity="sha256-pasqAKBDmFT4eHoN2ndd6lN370kFiGUFyTiUHWhU7k8="
  crossorigin="anonymous"></script>

<button class="button-sent">Send</button>

<script>
  $('.button-sent').click(function (e) {
    e.preventDefault()

    var message = 'Hi server, how are you holding up?'
    socket.emit('emit.onclient', message)
    console.log('Message sent to server.')

    return false
  })
</script>

Here, you can see that, first, we include jQuery via CDN and create a <button> with a jQuery click event handler. Secondly, we emit the Socket.IO custom event called emit.onclient with a simple message when the button is clicked. Lastly, we log the activity to the console.

  8. After that, we can listen to the Socket.IO custom event on the server side and log the message coming from the client, as follows:
// backend/koa/public/index.js
socket.on('emit.onclient', (message) => {
  console.log('Message from client, ' + socket.id + ': ' + message)
})
  9. If you refresh the page again on your browser, you should see the console printing the log, along with the socket ID, as follows:
Message sent to server. // client side
Message from client, abeGnarBnELo33vQAAAB: Hi server, how are you holding up? // server side

You now know how to transmit data back and forth in real time with Socket.IO just by emitting custom events and listening to them. The next thing you should learn is how to integrate Socket.IO with the changefeeds in RethinkDB in order to transmit the real-time data from the database to the client. So, keep reading!

Integrating Socket.IO server and RethinkDB changefeeds

Remember that you fiddled with the RethinkDB changefeeds previously using the Data Explorer in the Administration UI at localhost:8080/#dataexplorer. To subscribe to a changefeed, you just chain the ReQL changes command to the query, as follows:

r.db('nuxtdb').table('user').changes()
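Every document the changefeed emits pairs the row's previous and current versions under the old_val and new_val keys. For an update, a feed document looks roughly like this (the id and field values here are illustrative, not real output):

```json
{
  "old_val": { "id": "362ae837-...", "name": "Jane", "slug": "jane" },
  "new_val": { "id": "362ae837-...", "name": "Jane Doe", "slug": "jane" }
}
```

For an insert, old_val is null; for a delete, new_val is null. We will rely on exactly this shape on the client side later in this chapter.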

The RethinkDB changefeeds contain real-time data that's emitted from the RethinkDB database to our API, which means we need to catch these feeds on the server-side with the Socket.IO server and emit them to the client. So, let's learn how to catch them by refactoring the API we have been developing throughout this chapter:

  1. Install the Socket.IO server via npm into your API:
$ npm i socket.io
  2. Create an asynchronous anonymous arrow function in a changefeeds.js file in the /core/ directory with the following code:
// core/database/rethinkdb/changefeeds.js
import rdb from 'rethinkdb'
import rdbConnection from './connection'

export default async (io, tableName, eventName) => {
  try {
    const connection = await rdbConnection()
    var cursor = await rdb.table(tableName)
      .changes()
      .run(connection)

    cursor.each(function (err, row) {
      if (err) {
        throw err
      }
      io.emit(eventName, row)
    })
  } catch (err) {
    console.error(err)
  }
}

In this function, we import rethinkdb as rdb and our RethinkDB database connection as rdbConnection, and then use the following items as the parameters of this function, in this order:

  • The instance of the Socket.IO server
  • The name of the RethinkDB table whose changefeed you want to subscribe to
  • The Socket.IO custom event name that you want to emit

The changefeed will return the documents in a cursor object as a callback, so we iterate through the cursor object and emit each row of the document with the custom event name.
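Because the exported function only relies on io having an emit method and the cursor having an each method, its flow can be seen with plain stubs, with no database or sockets involved (everything below is illustrative):

```javascript
// Fake io and cursor standing in for Socket.IO and a RethinkDB cursor.
const emitted = []
const io = { emit: (event, row) => emitted.push({ event, row }) }

const fakeCursor = {
  each (callback) {
    // An insert feed (old_val is null), then a delete feed (new_val is null).
    callback(null, { new_val: { id: 1, name: 'Jane' }, old_val: null })
    callback(null, { new_val: null, old_val: { id: 1, name: 'Jane' } })
  }
}

// The same iteration changefeeds.js performs:
fakeCursor.each(function (err, row) {
  if (err) {
    throw err
  }
  io.emit('user.changefeeds', row)
})

console.log(emitted.length) // 2
```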

  3. Import the changefeeds function as rdbChangeFeeds in the app root in the /public/ directory and integrate it with the rest of the existing code in the index.js file, as follows:
// public/index.js
import Koa from 'koa'
import socket from 'socket.io'
import http from 'http'
import config from 'Configs'
import middlewares from '../middlewares'
import rdbChangeFeeds from 'Core/database/rethinkdb/changefeeds'

const app = new Koa()
const host = process.env.HOST || '127.0.0.1'
const port = process.env.PORT || config.server.port
middlewares(app)

const server = http.createServer(app.callback())
const io = socket(server)
io.sockets.on('connection', socket => {
  //...
})

rdbChangeFeeds(io, 'user', 'user.changefeeds')
server.listen(port, host)

In the preceding code, the table we want to subscribe to is user and the emit event name we want to use is user.changefeeds. So, we pass them into the rdbChangeFeeds function together with the socket.io instance. That's all you need to do to integrate Socket.IO and RethinkDB, and it only has to be done once, globally.

Well done! You have managed to integrate Koa, RethinkDB, and Socket.IO on the server side and created a real-time API. But what about the client side, and how do we listen to the event being emitted from the API? We'll find out in the next section.

Integrating Socket.IO with Nuxt

The Nuxt app we are going to build is very similar to the one we had in the previous chapter, where we had a /users/ directory that contains the following CRUD pages in the /pages/ directory for adding, updating, listing, and deleting users:

users
├── index.vue
├── _slug.vue
├── add
│   └── index.vue
├── update
│   └── _slug.vue
└── delete
    └── _slug.vue

You can copy these files from the previous chapter. The only major difference in this app is the <script> block, where we will list users in real time by listening to the emit event from the Socket.IO server. To do that, we will need the Socket.IO client, which you learned about in the Adding and using Socket.IO server and client section with the simple HTML page. So, let's find out how to implement what we already know in the Nuxt app:

  1. Install the Socket.IO client via npm into your Nuxt project:
$ npm i socket.io-client
  2. Create the following variables for the app's protocol, hostname, and the cross-domain ports in the Nuxt config file so that we can reuse them later:
// nuxt.config.js
const protocol = 'http'
const host = process.env.NODE_ENV === 'production' ? 'a-cool-domain-name.com' : 'localhost'

const ports = {
  local: '8000',
  remote: '4000'
}

const remoteUrl = protocol + '://' + host + ':' + ports.remote + '/'

These variables are made for the following situations:

  • The host variable is used to take the value of a-cool-domain-name.com when the Nuxt app is in production; that is, when you run the app with npm run start. Otherwise, it just takes localhost as the default value.
  • The local key in the ports variable is used to set a server port for the Nuxt app, and it is set to 8000. Remember that the default port Nuxt serves an app on is 3000.
  • The remote key in the ports variable is used to tell the Nuxt app what server port the API is on, which is 4000.
  • The remoteUrl variable is used to concatenate the API with the preceding variables.
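As a quick sanity check, the same statements can be run on their own to confirm what remoteUrl resolves to (assuming NODE_ENV is unset, that is, development):

```javascript
const protocol = 'http'
const host = process.env.NODE_ENV === 'production' ? 'a-cool-domain-name.com' : 'localhost'
const ports = {
  local: '8000',
  remote: '4000'
}

// Concatenate exactly as nuxt.config.js does.
const remoteUrl = protocol + '://' + host + ':' + ports.remote + '/'
console.log(remoteUrl) // 'http://localhost:4000/' in development
```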
  3. Apply the preceding variables to the env and server options in the Nuxt config file, as follows:
// nuxt.config.js
export default {
  env: {
    remoteUrl
  },
  server: {
    port: ports.local,
    host: host
  }
}

So, with this configuration, we can access the remoteUrl variable again when serving the app via the following methods:

  • process.env.remoteUrl
  • context.env.remoteUrl

Also, in this configuration, we have changed the Nuxt app's default server port to 8000 in the server option. The default port is 3000, while the default host is localhost. But you may want to use a different port for some reason. That's why we looked at how to change them here.

If you want to find out more about the server configuration and other options such as timing and https, please visit https://nuxtjs.org/api/configuration-server.

If you want to find out more about the env configuration, please visit https://nuxtjs.org/api/configuration-env#the-env-property.
  4. Install the Nuxt Axios and Proxy modules via npm and configure them in the Nuxt config file, as follows:
$ npm i @nuxtjs/axios @nuxtjs/proxy
// nuxt.config.js
export default {
  modules: [
    '@nuxtjs/axios'
  ],

  axios: {
    proxy: true
  },

  proxy: {
    '/api/': {
      target: remoteUrl,
      pathRewrite: {'^/api/': ''}
    }
  }
}

Notice that we have reused the remoteUrl variable in the proxy option. So, every API request we make that starts with /api/ will be proxied to http://localhost:4000. But since the routes in our API don't contain /api/, we strip this prefix from the request URL with the pathRewrite option before the request is sent off to the API.
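Under the hood, the pathRewrite keys are treated as regular expressions applied to the request path; a standalone sketch of the same substitution (the rewrite helper is illustrative, not part of the module):

```javascript
// Mimic pathRewrite: {'^/api/': ''} as a plain string replacement.
const rewrite = path => path.replace(new RegExp('^/api/'), '')

console.log(rewrite('/api/users'))      // 'users'
console.log(rewrite('/api/users/jane')) // 'users/jane'
```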

  5. Create a plugin in the /plugins/ directory for abstracting the instance of the Socket.IO client so that we can reuse it anywhere:
// plugins/socket.io.js
import io from 'socket.io-client'

const remoteUrl = process.env.remoteUrl
const socket = io(remoteUrl)

export default socket

Notice that we have reused the remoteUrl variable via process.env.remoteUrl in the Socket.IO client instance. This means the Socket.IO client will call the Socket.IO server at localhost:4000.

  6. Import the socket.io client plugin into the <script> block and fetch the list of users with the @nuxtjs/axios module in the index file. This index file is kept in the /users/ directory, under /pages/:
// pages/users/index.vue
import socket from '~/plugins/socket.io'

export default {
  async asyncData ({ error, $axios }) {
    try {
      let { data } = await $axios.get('/api/users')
      return { users: data.data }
    } catch (err) {
      // Pass the failure on to Nuxt's error page.
      error({ statusCode: 500, message: err.message })
    }
  }
}
  7. After fetching and setting the users with the asyncData method, use the Socket.IO plugin to listen to the user.changefeeds event in the mounted method for any new real-time feeds from the server, as follows:
// pages/users/index.vue
export default {
  async asyncData ({ error, $axios }) {
    //...
  },
  mounted () {
    socket.on('user.changefeeds', data => {
      if (data.new_val === undefined && data.old_val === undefined) {
        return
      }
      //...
    })
  }
}

Here, you can see that we always check the incoming feed to make sure that new_val and old_val are defined. In other words, we want to ensure these two keys are present in the feed before proceeding to the following lines.

  8. After checking this, if we receive data in the new_val key but the old_val key is empty, this means a new user has been added on the server. When we get such a feed, we prepend the new user data to the top of the users array by using the JavaScript unshift function, as follows:
// pages/users/index.vue
mounted () {
  //...
  if (data.old_val === null && data.new_val !== null) {
    this.users.unshift(data.new_val)
  }
}

Then, if we receive data in the old_val key but the new_val key is empty, this means an existing user has been deleted from the server. So, to pop off an existing user from the array by its index (its position/location in the array), we can use the JavaScript splice function. But first, we must find the index of the user by its ID using the JavaScript map function, as follows:

// pages/users/index.vue
mounted () {
  //...
  if (data.new_val === null && data.old_val !== null) {
    var id = data.old_val.id
    var index = this.users.map(el => {
      return el.id
    }).indexOf(id)
    this.users.splice(index, 1)
  }
}
}

Lastly, if we receive data in both the new_val and old_val keys, this means an existing user has been updated. So, when a user has been updated, we must find the user's index in the array first and then replace it with the JavaScript splice function, as follows:

// pages/users/index.vue
mounted () {
  //...
  if (data.new_val !== null && data.old_val !== null) {
    var id = data.new_val.id
    var index = this.users.findIndex(item => item.id === id)
    this.users.splice(index, 1, data.new_val)
  }
}

Note that here we use the JavaScript findIndex function as an alternative to the map function.
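The three branches can be collapsed into a single pure helper, which also makes the new_val/old_val logic easy to unit test. This is a sketch of the logic only, not code from the book's repository:

```javascript
// Classify a changefeed document by which of new_val/old_val are present.
function classifyChange (row) {
  if (row.old_val === null && row.new_val !== null) return 'insert'
  if (row.new_val === null && row.old_val !== null) return 'delete'
  if (row.new_val !== null && row.old_val !== null) return 'update'
  return 'unknown'
}

console.log(classifyChange({ old_val: null, new_val: { id: 1 } }))          // 'insert'
console.log(classifyChange({ old_val: { id: 1 }, new_val: null }))          // 'delete'
console.log(classifyChange({ old_val: { id: 1 }, new_val: { id: 1 } }))     // 'update'
```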

If you want to find out more about the JavaScript standard built-in functions we have used here for manipulating arrays, namely unshift, splice, map, and findIndex, please see the MDN Web Docs.
  9. Add the following template to the <template> block to display the users, as follows:
// pages/users/index.vue
<div>
  <h1>Users</h1>
  <ul>
    <li v-for="user in users" v-bind:key="user.uuid">
      <nuxt-link :to="'/users/' + user.slug">
        {{ user.name }}
      </nuxt-link>
    </li>
  </ul>
  <nuxt-link to="/users/add">
    Add New
  </nuxt-link>
</div>

In this template, you can see that we simply loop over the user data we got from the asyncData method with v-for and bind each user's uuid as the key for the looped element. After that, any real-time feed that arrives in the mounted method will update the user data and the template reactively.

  10. Run the Nuxt app with npm run dev. You should see the following information on your terminal:
Listening on: http://localhost:8000/
  11. Open two tabs on your browser side by side, or two different browsers side by side, and point them to localhost:8000/users. Add a new user from one of the tabs (or browsers) at localhost:8000/users/add. You should see that the newly added user is shown on all the tabs (or browsers) instantly and concurrently, in real time, without you needing to refresh them.
You can find all the code and apps in this chapter in /chapter-17/frontend/ and /chapter-17/backend/ in this book's GitHub repository.

Well done – you have made it! We hope you found this application fun and easy and that it inspires you to venture further with what you've learned so far. Let's summarize what we have learned in this chapter.

Summary

In this chapter, you managed to install and use RethinkDB and Socket.IO to turn an ordinary backend API and frontend Nuxt app into real-time apps. You learned how to manipulate JSON data by creating, reading, updating, and deleting it on the server side with RethinkDB, first through the RethinkDB Administration UI and then with the RethinkDB client driver and Koa. Most importantly, you learned how to work with the real-time feeds in RethinkDB, known as changefeeds, through the RethinkDB Administration UI as well, and then integrated them with the Socket.IO server and Koa on the server side. Furthermore, you used the Socket.IO server to emit data with custom events and the Socket.IO client to listen to those events and catch the data in real time on the client side with the Nuxt app. Wasn't it a fun ride?

In the next chapter, we will take Nuxt further with third-party APIs, content management systems (CMS), and GraphQL. You will be introduced to the WordPress API, Keystone, and GraphQL. You will then learn how to create custom content types and custom routes to extend the WordPress API so that you can integrate it with Nuxt and stream remote images from the WordPress project. You will be developing a custom CMS using Keystone, installing and securing PostgreSQL for Keystone app development, as well as securing MongoDB, which you learned how to install in Chapter 9, Adding a Server-Side Database. Most importantly and excitingly, you will learn the differences between the REST API and the GraphQL API; build a GraphQL API with GraphQL.js, Express, and Apollo Server; understand the GraphQL schema and its resolvers; use the Keystone GraphQL API; and then integrate them with Nuxt. It will definitely be another fun ride, so buckle up and get ready!
