7

Creational Design Patterns

A design pattern is a reusable solution to a recurring problem. The term is really broad in its definition and can span multiple domains of an application. However, the term is often associated with a well-known set of object-oriented patterns that were popularized in the 90s by the book Design Patterns: Elements of Reusable Object-Oriented Software (Pearson Education), by the almost legendary Gang of Four (GoF): Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. We will often refer to this specific set of patterns as traditional design patterns or GoF design patterns.

Applying this set of object-oriented design patterns in JavaScript is not as linear and formal as it would be in a classical object-oriented language. As we know, JavaScript is object-oriented, prototype-based, and has dynamic typing. It also treats functions as first-class citizens and allows functional programming styles. These characteristics make JavaScript a very versatile language, which gives tremendous power to the developer, but at the same time causes fragmentation of programming styles, conventions, techniques, and ultimately the patterns of its ecosystem. With JavaScript, there are so many ways to achieve the same result that each developer has their own opinion on what's the best way to approach a problem. A clear demonstration of this phenomenon is the abundance of frameworks and opinionated libraries in the JavaScript ecosystem; probably no other language has ever seen so many, especially now that Node.js has given astonishing new possibilities to JavaScript and has created so many new scenarios.

In this context, the nature of JavaScript affects traditional design patterns too. There are so many ways in which traditional design patterns can be implemented in JavaScript that the traditional, strongly object-oriented implementation stops being relevant.

In some cases, the traditional implementation of these design patterns is not even possible because JavaScript, as we know, doesn't have real classes or abstract interfaces. What doesn't change, though, is the original idea at the base of each pattern, the problem it solves, and the concepts at the heart of the solution.

In this chapter and in the two that follow, we will see how some of the most important GoF design patterns apply to Node.js and its philosophy, thus rediscovering their importance from another perspective. Among these traditional patterns, we will also have a look at some "less traditional" design patterns born from within the JavaScript ecosystem itself.

In this chapter, in particular, we'll take a look at a class of design patterns called creational. As the name suggests, these patterns address problems related to the creation of objects. For example, the Factory pattern allows us to encapsulate the creation of an object within a function. The Revealing Constructor pattern allows us to expose private object properties and methods only during the object's creation, while the Builder pattern simplifies the creation of complex objects. Finally, the Singleton pattern and the Dependency Injection pattern help us with wiring the modules within our applications.

This chapter, as well as the following two, assume that you have some notion of how inheritance works in JavaScript. Please also be advised that we will often use generic and more intuitive diagrams to describe a pattern in place of standard UML. This is because many patterns can have an implementation based not only on classes but also on objects and even functions.

Factory

We'll begin our journey from one of the most common design patterns in Node.js: Factory. As you will see, the Factory pattern is very versatile and has more than just one purpose. Its main advantage is its ability to decouple the creation of an object from one particular implementation. This allows us, for example, to create an object whose class is determined at runtime. Factory also allows us to expose "a surface area" that is much smaller than that of a class; a class can be extended or manipulated, while a factory, being just a function, offers fewer options to the user, making it more robust and easier to understand. Finally, a factory can also be used to enforce encapsulation by leveraging closures.

Decoupling object creation and implementation

We already stressed the fact that, in JavaScript, the functional paradigm is often preferred to a purely object-oriented design for its simplicity, usability, and small surface area. This is especially true when creating new object instances. In fact, invoking a factory, instead of directly creating a new object from a class using the new operator or Object.create(), is much more convenient and flexible in several respects.

First and foremost, a factory allows us to separate the creation of an object from its implementation. Essentially, a factory wraps the creation of a new instance, giving us more flexibility and control in the way we do it. Inside the factory, we can choose to create a new instance of a class using the new operator, or leverage closures to dynamically build a stateful object literal, or even return a different object type based on a particular condition. The consumer of the factory is totally agnostic about how the creation of the instance is carried out. The truth is that, by using new, we are binding our code to one specific way of creating an object, while with a factory, we can have much more flexibility, almost for free. As a quick example, let's consider a simple factory that creates an Image object:

function createImage (name) {
  return new Image(name)
}
const image = createImage('photo.jpeg')

The createImage() factory might look totally unnecessary; why not instantiate the Image class by using the new operator directly? Why not write something like the following:

const image = new Image(name)

As we already mentioned, using new binds our code to one particular type of object, which in the preceding case is the Image object type. A factory, on the other hand, gives us much more flexibility. Imagine that we want to refactor the Image class, splitting it into smaller classes, one for each image format that we support.

If we exposed a factory as the only means to create new images, we could simply rewrite it as follows, without breaking any of the existing code:

function createImage (name) {
  if (name.match(/\.jpe?g$/)) {
    return new ImageJpeg(name)
  } else if (name.match(/\.gif$/)) {
    return new ImageGif(name)
  } else if (name.match(/\.png$/)) {
    return new ImagePng(name)
  } else {
    throw new Error('Unsupported format')
  }
}

Our factory also allows us to keep the classes hidden and prevents them from being extended or modified (remember the principle of small surface area?). In JavaScript, this can be achieved by exporting only the factory, while keeping the classes private.
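To make this concrete, here is a minimal sketch of such a module (the ImageJpeg and ImageGif classes are hypothetical stand-ins). In a real ES module, only createImage() would carry the export keyword, keeping the classes private:

```javascript
// image.js (sketch) — in a real module, only createImage() would be
// exported, so the classes below would stay private to this module
class ImageJpeg {
  constructor (name) { this.name = name }
}

class ImageGif {
  constructor (name) { this.name = name }
}

function createImage (name) {
  if (/\.jpe?g$/.test(name)) {
    return new ImageJpeg(name)
  }
  if (/\.gif$/.test(name)) {
    return new ImageGif(name)
  }
  throw new Error('Unsupported format')
}

const img = createImage('photo.jpeg')
console.log(img instanceof ImageJpeg) // true
```

Consumers can only obtain instances through createImage(), so they have no way to subclass or patch ImageJpeg and ImageGif directly.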

A mechanism to enforce encapsulation

A factory can also be used as an encapsulation mechanism, thanks to closures.

Encapsulation refers to controlling the access to some internal details of a component by preventing external code from manipulating them directly. The interaction with the component happens only through its public interface, isolating the external code from the changes in the implementation details of the component. Encapsulation is a fundamental principle of object-oriented design, together with inheritance, polymorphism, and abstraction.

In JavaScript, one of the main ways to enforce encapsulation is through function scopes and closures. A factory makes it straightforward to enforce private variables. Consider the following, for example:

function createPerson (name) {
  const privateProperties = {}
  const person = {
    setName (name) {
      if (!name) {
        throw new Error('A person must have a name')
      }
      privateProperties.name = name
    },
    getName () {
      return privateProperties.name
    }
  }
  person.setName(name)
  return person
}

In the preceding code, we leverage a closure to create two objects: a person object, which represents the public interface returned by the factory, and a group of privateProperties that are inaccessible from the outside and that can be manipulated only through the interface provided by the person object. For example, in the preceding code, we make sure that a person's name is never empty; this would not be possible to enforce if name was just a normal property of the person object.

Using closures is not the only technique that we have for enforcing encapsulation. In fact, other possible approaches are:

  • Using private class fields (the hash (#) prefix syntax), introduced in Node.js 12. More on this at nodejsdp.link/tc39-private-fields. This is the most modern approach, but at the time of writing, the feature is still experimental and has yet to be included in the official ECMAScript specification.
  • Using WeakMaps. More on this at nodejsdp.link/weakmaps-private.
  • Using symbols, as explained in the following article: nodejsdp.link/symbol-private.
  • Defining private variables in a constructor (as recommended by Douglas Crockford: nodejsdp.link/crockford-private). This is the legacy but also the best-known approach.
  • Using conventions, for example, prefixing the name of a property with an underscore "_". However, this does not technically prevent a member from being read or modified from the outside.
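As a sketch of the first alternative, here is how the earlier Person example might look with private class fields (this Person class is illustrative and assumes a Node.js version that supports the # syntax):

```javascript
// Hypothetical Person class: the #name field is private and can be
// accessed only from within the class body itself
class Person {
  #name

  constructor (name) {
    this.setName(name)
  }

  setName (name) {
    if (!name) {
      throw new Error('A person must have a name')
    }
    this.#name = name
  }

  getName () {
    return this.#name
  }
}

const person = new Person('Alice')
console.log(person.getName()) // prints 'Alice'
// Accessing person.#name from outside the class is a SyntaxError
```

The invariant (a person must always have a name) is enforced exactly as in the factory version, but here the encapsulation comes from the language feature rather than from a closure.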

Building a simple code profiler

Now, let's work on a complete example using a factory. Let's build a simple code profiler, an object with the following properties:

  • A start() method that triggers the start of a profiling session
  • An end() method to terminate the session and log its execution time to the console

Let's start by creating a file named profiler.js, which will have the following content:

class Profiler {
  constructor (label) {
    this.label = label
    this.lastTime = null
  }
  start () {
    this.lastTime = process.hrtime()
  }
  end () {
    const diff = process.hrtime(this.lastTime)
    console.log(`Timer "${this.label}" took ${diff[0]} seconds ` +
      `and ${diff[1]} nanoseconds.`)
  }
}

The Profiler class we just defined uses the default high-resolution timer of Node.js to save the current time when start() is invoked, and then calculates the elapsed time when end() is executed, printing the result to the console.

Now, if we are going to use such a profiler in a real-world application to calculate the execution time of different routines, we can easily imagine the huge amount of profiling information printed to the console, especially in a production environment. What we may want to do instead is redirect the profiling information to another source, for example, a dedicated log file, or alternatively, disable the profiler altogether if the application is running in production mode. It's clear that if we were to instantiate a Profiler object directly by using the new operator, we would need some extra logic in the client code or in the Profiler object itself in order to switch between the different behaviors.

Alternatively, we can use a factory to abstract the creation of the Profiler object so that, depending on whether the application runs in production or development mode, we can return a fully working Profiler instance or a mock object with the same interface but with empty methods. This is exactly what we are going to do in our profiler.js module. Instead of exporting the Profiler class, we will export only our factory. The following is its code:

const noopProfiler = {
  start () {},
  end () {}
}
export function createProfiler (label) {
  if (process.env.NODE_ENV === 'production') {
    return noopProfiler
  }
  return new Profiler(label)
}

The createProfiler() function is our factory, and its role is to abstract the creation of a Profiler object from its implementation. If the application is running in production mode, we return noopProfiler, which essentially doesn't do anything, effectively disabling any profiling. If the application is not running in production mode, then we create and return a new, fully functional Profiler instance.

Thanks to JavaScript's dynamic typing, we were able to return an object instantiated with the new operator in one circumstance and a simple object literal in the other (this is also known as duck typing, and you can read more about it at nodejsdp.link/duck-typing). This confirms how we can create objects in any way we like inside the factory function. We could also execute additional initialization steps or return a different type of object based on particular conditions, all of this while isolating the consumer of the object from all these details. We can easily understand the power of this simple pattern.

Now, let's play with our profiler factory a bit. Let's create an algorithm to calculate all the factors of a given number and use our profiler to record its running time:

// index.js
import { createProfiler } from './profiler.js'
function getAllFactors (intNumber) {
  const profiler = createProfiler(
    `Finding all factors of ${intNumber}`)
  profiler.start()
  const factors = []
  for (let factor = 2; factor <= intNumber; factor++) {
    while ((intNumber % factor) === 0) {
      factors.push(factor)
      intNumber = intNumber / factor
    }
  }
  profiler.end()
  return factors
}
const myNumber = process.argv[2]
const myFactors = getAllFactors(myNumber)
console.log(`Factors of ${myNumber} are: `, myFactors)

The profiler variable contains our Profiler object, whose implementation will be decided by the createProfiler() factory at runtime, based on the NODE_ENV environment variable.

For example, if we run the module in production mode, we will get no profiling information:

NODE_ENV=production node index.js 2201307499

While if we run the module in development mode, we will see the profiling information printed to the console:

node index.js 2201307499

The example that we just presented is just a simple application of the factory function pattern, but it clearly shows the advantages of separating an object's creation from its implementation.

In the wild

As we said, factories are very common in Node.js. We can find one example in the popular Knex (nodejsdp.link/knex) package. Knex is a SQL query builder that supports multiple databases. Its package exports just a function, which is a factory. The factory performs various checks, selects the right dialect object to use based on the database engine, and finally creates and returns the Knex object. Take a look at the code at nodejsdp.link/knex-factory.

Builder

Builder is a creational design pattern that simplifies the creation of complex objects by providing a fluent interface, which allows us to build the object step by step. This greatly improves the readability and the general developer experience when creating such complex objects.

The most apparent situation in which we could benefit from the Builder pattern is a class with a constructor that has a long list of arguments, or takes many complex parameters as input. Usually, these kinds of classes require so many parameters in advance because all of them are necessary to build an instance that is complete and in a consistent state, so it's necessary to take this into account when considering potential solutions.

So, let's see the general structure of the pattern. Imagine having a Boat class with a constructor such as the following:

class Boat {
  constructor (hasMotor, motorCount, motorBrand, motorModel,
               hasSails, sailsCount, sailsMaterial, sailsColor,
               hullColor, hasCabin) {
    // ...
  }
}

Invoking such a constructor would create some hard to read code, which is easily prone to errors (which argument is what?). Take the following code, for example:

const myBoat = new Boat(true, 2, 'Best Motor Co. ', 'OM123', true, 1,
                        'fabric', 'white', 'blue', false)

A first step to improve the design of this constructor is to aggregate all arguments in a single object literal, such as the following:

class Boat {
  constructor (allParameters) {
    // ...
  }
}
const myBoat = new Boat({
  hasMotor: true,
  motorCount: 2,
  motorBrand: 'Best Motor Co. ',
  motorModel: 'OM123',
  hasSails: true,
  sailsCount: 1,
  sailsMaterial: 'fabric',
  sailsColor: 'white',
  hullColor: 'blue',
  hasCabin: false
})

As we can note from the previous code, this new constructor is indeed much better than the original one as it allows us to clearly see which parameter receives each value. However, we can do even better than this. One drawback of using a single object literal to pass all inputs at once is that the only way to know what the actual inputs are is to look at the class documentation or, even worse, into the code of the class. In addition to that, there is no enforced protocol that guides the developers toward the creation of a coherent class. For example, if we specify hasMotor: true, then we are required to also specify a motorCount, a motorBrand, and a motorModel, but there is nothing in this interface that conveys this information to us.

The Builder pattern fixes even these last few flaws and provides a fluent interface that is simple to read, self-documenting, and that provides guidance toward the creation of a coherent object. Let's take a look at the BoatBuilder class, which implements the Builder pattern for the Boat class:

class BoatBuilder {
  withMotors (count, brand, model) {
    this.hasMotor = true
    this.motorCount = count
    this.motorBrand = brand
    this.motorModel = model
    return this
  }
  withSails (count, material, color) {
    this.hasSails = true
    this.sailsCount = count
    this.sailsMaterial = material
    this.sailsColor = color
    return this
  }
  hullColor (color) {
    this.hullColor = color
    return this
  }
  withCabin () {
    this.hasCabin = true
    return this
  }
  build() {
    return new Boat({
      hasMotor: this.hasMotor,
      motorCount: this.motorCount,
      motorBrand: this.motorBrand,
      motorModel: this.motorModel,
      hasSails: this.hasSails,
      sailsCount: this.sailsCount,
      sailsMaterial: this.sailsMaterial,
      sailsColor: this.sailsColor,
      hullColor: this.hullColor,
      hasCabin: this.hasCabin
    })
  }
}

To fully appreciate the positive impact that the Builder pattern has on the way we create our Boat objects, let's see an example of that:

const myBoat = new BoatBuilder()
  .withMotors(2, 'Best Motor Co. ', 'OM123')
  .withSails(1, 'fabric', 'white')
  .withCabin()
  .hullColor('blue')
  .build()

As we can see, the role of our BoatBuilder class is to collect all the parameters needed to create a Boat using some helper methods. We usually have a method for each parameter or set of related parameters, but there is no exact rule for that. It is down to the designer of the Builder class to decide the name and behavior of each method responsible for collecting the input parameters.

We can instead summarize some general rules for implementing the Builder pattern, as follows:

  • The main objective is to break down a complex constructor into multiple, more readable, and more manageable steps.
  • Try to create builder methods that can set multiple related parameters at once.
  • Deduce and implicitly set parameters based on the values received as input by a setter method and, in general, try to encapsulate as much of the parameter-setting logic as possible into the setter methods, so that the consumer of the builder interface is freed from doing so.
  • If necessary, it's possible to further manipulate the parameters (for example, type casting, normalization, or extra validation) before passing them to the constructor of the class being built, to further simplify the work left to the consumer of the builder class.

In JavaScript, the Builder pattern can also be applied to invoke functions, not just to build objects using their constructor. In fact, from a technical point of view, the two scenarios are almost identical. The major difference when dealing with functions is that instead of having a build() method, we would have an invoke() method that invokes the complex function with the parameters collected by the builder object and returns any eventual result to the caller.
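As a quick illustration, the following sketch shows a builder that invokes a function rather than a constructor. The sendRequest() function and RequestBuilder class are hypothetical, made up for this example:

```javascript
// A hypothetical complex function with many positional arguments
function sendRequest (method, url, headers, body) {
  // for demonstration purposes, just return a summary of the call
  return `${method} ${url} (${Object.keys(headers).length} headers)`
}

// A builder that collects the arguments and calls the function
// through invoke() instead of build()
class RequestBuilder {
  constructor () {
    this.headers = {}
  }

  setMethod (method) {
    this.method = method
    return this
  }

  setUrl (url) {
    this.url = url
    return this
  }

  addHeader (name, value) {
    this.headers[name] = value
    return this
  }

  setBody (body) {
    this.body = body
    return this
  }

  invoke () {
    // invokes the complex function with the collected parameters
    return sendRequest(this.method, this.url, this.headers, this.body)
  }
}

const result = new RequestBuilder()
  .setMethod('POST')
  .setUrl('https://example.com/api')
  .addHeader('accept', 'json')
  .setBody('{}')
  .invoke()
console.log(result) // prints 'POST https://example.com/api (1 headers)'
```

The fluent interface is identical to the object-building case; only the terminal method changes, returning the function's result instead of a new instance.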

Next, we will work on a more concrete example that makes use of the Builder pattern we've just learned.

Implementing a URL object builder

We want to implement a Url class that can hold all the components of a standard URL, validate them, and format them back into a string. This class is going to be intentionally simple and minimal, so for standard production use, we recommend the built-in URL class (nodejsdp.link/docs-url).

Now, let's implement our custom Url class in a file called url.js:

export class Url {
  constructor (protocol, username, password, hostname,
    port, pathname, search, hash) {
    this.protocol = protocol
    this.username = username
    this.password = password
    this.hostname = hostname
    this.port = port
    this.pathname = pathname
    this.search = search
    this.hash = hash
    this.validate()
  }
  validate () {
    if (!this.protocol || !this.hostname) {
      throw new Error('Must specify at least a ' +
        'protocol and a hostname')
    }
  }
  toString () {
    let url = ''
    url += `${this.protocol}://`
    if (this.username && this.password) {
      url += `${this.username}:${this.password}@`
    }
    url += this.hostname
    if (this.port) {
      url += `:${this.port}`
    }
    if (this.pathname) {
      url += this.pathname
    }
    if (this.search) {
      url += `?${this.search}`
    }
    if (this.hash) {
      url += `#${this.hash}`
    }
    return url
  }
}

A standard URL is made of several components, so to take them all in, the Url class' constructor is inevitably big. Invoking such a constructor can be a challenge, as we have to keep track of the argument position to know what component of the URL we are passing. Take a look at the following example to get an idea of this:

return new Url('https', null, null, 'example.com', null, null, null,
  null)

This is the perfect situation for applying the Builder pattern we just learned. Let's do that now. The plan is to create a UrlBuilder class, which has a setter method for each parameter (or set of related parameters) needed to instantiate the Url class. Finally, the builder is going to have a build() method to retrieve a new Url instance that's been created using all the parameters that have been set in the builder. So, let's implement the builder in a file called urlBuilder.js:

export class UrlBuilder {
  setProtocol (protocol) {
    this.protocol = protocol
    return this
  }
  setAuthentication (username, password) {
    this.username = username
    this.password = password
    return this
  }
  setHostname (hostname) {
    this.hostname = hostname
    return this
  }
  setPort (port) {
    this.port = port
    return this
  }
  setPathname (pathname) {
    this.pathname = pathname
    return this
  }
  setSearch (search) {
    this.search = search
    return this
  }
  setHash (hash) {
    this.hash = hash
    return this
  }
  build () {
    return new Url(this.protocol, this.username, this.password,
      this.hostname, this.port, this.pathname, this.search,
      this.hash)
  }
}

This should be pretty straightforward. Just note the way we coupled together the username and password parameters into a single setAuthentication() method. This clearly conveys the fact that if we want to specify any authentication information in the Url, we have to provide both username and password.

Now, we are ready to try our UrlBuilder and witness its benefits over using the Url class directly. We can do that in a file called index.js:

import { UrlBuilder } from './urlBuilder.js'
const url = new UrlBuilder()
  .setProtocol('https')
  .setAuthentication('user', 'pass')
  .setHostname('example.com')
  .build()
console.log(url.toString())

As we can see, the readability of the code has improved dramatically. Each setter method clearly gives us a hint of what parameter we are setting, and on top of that, they provide some guidance on how those parameters must be set (for example, username and password must be set together).

The Builder pattern can also be implemented directly into the target class. For example, we could have refactored the Url class by adding an empty constructor (and therefore no validation at the object's creation time) and the setter methods for the various components, rather than creating a separate UrlBuilder class. However, this approach has a major flaw. Using a builder that is separate from the target class has the advantage of always producing instances that are guaranteed to be in a consistent state. For example, every Url object returned by UrlBuilder.build() is guaranteed to be valid and in a consistent state; calling toString() on such objects will always return a valid URL. The same cannot be said if we implemented the Builder pattern on the Url class directly. In fact, in this case, if we invoke toString() while we are still setting the various URL components, its return value may not be valid. This can be mitigated by adding extra validations, but at the cost of adding more complexity.

In the wild

The Builder pattern is a quite common pattern in Node.js and JavaScript as it provides a very elegant solution to the problem of creating complex objects or invoking complex functions. One perfect example is creating new HTTP(S) client requests with the request() API from the http and https built-in modules. If we look at its documentation (available at nodejsdp.link/docs-http-request), we can immediately see it accepts a large amount of options, which is the usual sign that the Builder pattern can potentially provide a better interface. In fact, one of the most popular HTTP(S) request wrappers, superagent (nodejsdp.link/superagent), aims to simplify the creation of new requests by implementing the Builder pattern, thus providing a fluent interface to create new requests step by step. See the following code fragment for an example:

superagent
  .post('https://example.com/api/person')
  .send({ name: 'John Doe', role: 'user' })
  .set('accept', 'json')
  .then((response) => {
    // deal with the response
  })

From the previous code, we can note that this is an unusual builder; in fact, we don't have a build() or invoke() method (or any other method with a similar purpose), and have not used the new operator. What triggers the request instead is an invocation to the then() method. It's interesting to note that the superagent request object is not a promise but rather a custom thenable where the then() method triggers the execution of the request we have built with the builder object.

We already discussed thenables in Chapter 5, Asynchronous Control Flow Patterns with Promises and Async/Await.

If you take a look at the library's code, you will see the Builder pattern in action in the Request class (nodejsdp.link/superagent-src-builder).
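To clarify the mechanics, the following is a minimal sketch, not superagent's actual implementation, of a thenable request object whose then() method triggers the execution of the request (the FakeRequest class and its simulated response are made up for illustration):

```javascript
// Minimal thenable sketch: the "request" runs only when then()
// is invoked, not when the object is built
class FakeRequest {
  constructor (url) {
    this.url = url
    this.headers = {}
  }

  set (name, value) {
    this.headers[name] = value
    return this
  }

  // being "thenable" simply means exposing a then() method; here
  // it triggers the (simulated) execution of the request
  then (onFulfilled, onRejected) {
    const response = { url: this.url, headers: this.headers }
    return Promise.resolve(response).then(onFulfilled, onRejected)
  }
}

new FakeRequest('https://example.com')
  .set('accept', 'json')
  .then(response => {
    console.log(response.headers.accept) // prints 'json'
  })
```

Because then() returns a real promise, such a thenable can also be awaited or chained, which is what makes this builder variant feel so natural to use.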

This concludes our exploration of the Builder pattern. Next, we'll look at the Revealing Constructor pattern.

Revealing Constructor

The Revealing Constructor pattern is one of those patterns that you won't find in the "Gang of Four" book, since it originated directly from the JavaScript and the Node.js community. It solves a very tricky problem, which is: how can we "reveal" some private functionality of an object only at the moment of the object's creation? This is particularly useful when we want to allow an object's internals to be manipulated only during its creation phase. This allows for a few interesting scenarios, such as:

  • Creating objects that can be modified only at creation time
  • Creating objects whose custom behavior can be defined only at creation time
  • Creating objects that can be initialized only once at creation time

These are just a few possibilities enabled by the Revealing Constructor pattern. But to better understand all the possible use cases, let's see what the pattern is about by looking at the following code fragment:

//                    (1)               (2)          (3)
const object = new SomeClass(function executor(revealedMembers) {
  // manipulation code ...
})

As we can see from the previous code, the Revealing Constructor pattern is made of three fundamental elements: a constructor (1) that takes a function as input (the executor (2)), which is invoked at creation time and receives a subset of the object's internals as input (the revealed members (3)).

For the pattern to work, the revealed functionality must not otherwise be accessible by the users of the object once it is created. This can be achieved with one of the encapsulation techniques we mentioned in the previous section regarding the Factory pattern.

Domenic Denicola was the first to identify and name the pattern in one of his blog posts, which can be found at nodejsdp.link/domenic-revealing-constructor.

Now, let's look at a couple of examples to better understand how the Revealing Constructor pattern works.

Building an immutable buffer

Immutable objects and data structures have many excellent properties that make them ideal to use in countless situations in place of their mutable (or changeable) counterparts. Immutable refers to the property of an object by which its data or state becomes unmodifiable once it's been created.

With immutable objects, we don't need to create defensive copies before passing them around to other libraries or functions. We simply have a strong guarantee, by definition, that they won't be modified, even when they are passed to code that we don't know or control.

An immutable object can be modified only by creating a new copy, which makes the code more maintainable and easier to reason about, since it becomes much simpler to keep track of state changes.

Another common use case for immutable objects is efficient change detection. Since every change requires a copy, if we assume that every copy corresponds to a modification, then detecting a change is as simple as using the strict equality operator (triple equals, ===). This technique is used extensively in frontend programming to efficiently detect whether the UI needs refreshing.
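For example, the change detection technique just described can be sketched as follows (the state shape and the increment() function are made up for illustration):

```javascript
// Sketch of change detection with immutable updates: every change
// produces a new object, so a strict equality check reveals changes
const state = Object.freeze({ count: 0, label: 'clicks' })

function increment (currentState) {
  // never mutate: return a fresh frozen copy with the updated field
  return Object.freeze({ ...currentState, count: currentState.count + 1 })
}

const newState = increment(state)
console.log(newState === state)   // false: something changed
console.log(state.count)          // 0: the original is untouched
const sameState = state
console.log(sameState === state)  // true: no change detected
```

A UI framework can therefore decide whether to re-render a component with a single reference comparison, instead of deeply comparing two object trees.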

In this context, let's now create a simple immutable version of the Node.js Buffer component (nodejsdp.link/docs-buffer) using the Revealing Constructor pattern. The pattern allows us to manipulate an immutable buffer only at creation time.

Let's implement our immutable buffer in a new file called immutableBuffer.js, as follows:

const MODIFIER_NAMES = ['swap', 'write', 'fill']
export class ImmutableBuffer {
  constructor (size, executor) {
    const buffer = Buffer.alloc(size)                         // (1)
    const modifiers = {}                                      // (2)
    for (const prop in buffer) {                              // (3)
      if (typeof buffer[prop] !== 'function') {
        continue
      }
      if (MODIFIER_NAMES.some(m => prop.startsWith(m))) {     // (4)
        modifiers[prop] = buffer[prop].bind(buffer)
      } else {
        this[prop] = buffer[prop].bind(buffer)                // (5)
      }
    }
    executor(modifiers)                                       // (6)
  }
}

Let's now see how our new ImmutableBuffer class works:

  1. First, we allocate a new Node.js Buffer (buffer) of the size specified in the size constructor argument.
  2. Then, we create an object literal (modifiers) to hold all the methods that can mutate the buffer.
  3. After that, we iterate over all the properties (own and inherited) of our internal buffer, making sure to skip all those that are not functions.
  4. Next, we try to identify if the current prop is a method that allows us to modify the buffer. We do that by trying to match its name with one of the strings in the MODIFIER_NAMES array. If we have such a method, we bind it to the buffer instance, and then we add it to the modifiers object.
  5. If our method is not a modifier method, then we add it directly to the current instance (this).
  6. Finally, we invoke the executor function received as input in the constructor and pass the modifiers object as an argument, which will allow executor to mutate our internal buffer.

In practice, our ImmutableBuffer is acting as a proxy between its consumers and the internal buffer object. Some of the methods of the buffer instance are exposed directly through the ImmutableBuffer interface (mainly the read-only methods), while others are provided to the executor function (the modifier methods).

We will analyze the Proxy pattern in more detail in Chapter 8, Structural Design Patterns.

Please keep in mind that this is just a demonstration of the Revealing Constructor pattern, so the implementation of the immutable buffer is intentionally kept simple. For example, we are not exposing the size of the buffer or providing other means to initialize the buffer. We'll leave this to you as an exercise.

Now, let's write some code to demonstrate how to use our new ImmutableBuffer class. Let's create a new file, index.js, containing the following code:

import { ImmutableBuffer } from './immutableBuffer.js'
const hello = 'Hello!'
const immutable = new ImmutableBuffer(hello.length,
  ({ write }) => {                                         // (1)
    write(hello)
  })
console.log(String.fromCharCode(immutable.readInt8(0)))    // (2)
// the following line will throw
// "TypeError: immutable.write is not a function"
// immutable.write('Hello?')                               // (3)

The first thing we can note from the previous code is how the executor function uses the write() function (which is part of the modifier methods) to write a string into the buffer (1). In a similar way, the executor function could've used fill(), writeInt8(), swap16() or any other method exposed in the modifiers object.

The code we've just seen also demonstrates how the new ImmutableBuffer instance exposes only the methods that don't mutate the buffer, such as readInt8() (2), while it doesn't provide any method to change the content of the buffer (3).

In the wild

The Revealing Constructor pattern offers very strong guarantees and for this reason, it's mainly used in contexts where we need to provide foolproof encapsulation. A perfect application of the pattern would be in components used by hundreds of thousands of developers that have to provide unopinionated interfaces and strict encapsulation. However, we can also use the pattern in our projects to improve reliability and simplify code sharing with other people and teams (since it can make an object safer to use by third parties).

A popular application of the Revealing Constructor pattern is in the JavaScript Promise class. Some of you may have already noticed it. When we create a new Promise from scratch, its constructor accepts as input an executor function that will receive the resolve() and reject() functions used to mutate the internal state of the Promise. Let's provide a reminder of what this looks like:

return new Promise((resolve, reject) => {
  // ...
})

Once created, the Promise state cannot be altered by any other means. All we can do is receive its fulfilment value or rejection reason using the methods we already learned about in Chapter 5, Asynchronous Control Flow Patterns with Promises and Async/Await.

Singleton

Now, we are going to spend a few words on a pattern that is among the most used in object-oriented programming, which is the Singleton pattern. As we will see, Singleton is one of those patterns that has a trivial implementation in Node.js that's almost not worth discussing. However, there are a few caveats and limitations that every good Node.js developer must know.

The purpose of the Singleton pattern is to enforce the presence of only one instance of a class and centralize its access. There are a few reasons for using a single instance across all the components of an application:

  • For sharing stateful information
  • For optimizing resource usage
  • To synchronize access to a resource
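For comparison, this is roughly what a textbook, GoF-style implementation would look like in JavaScript (the Logger class here is just a hypothetical example, not part of our blogging application):

```javascript
// A classic GoF-style Singleton: the class itself guards its only instance.
class Logger {
  static #instance = null

  static getInstance () {
    if (Logger.#instance === null) {
      Logger.#instance = new Logger()
    }
    return Logger.#instance
  }

  log (msg) {
    console.log(msg)
  }
}

const loggerA = Logger.getInstance()
const loggerB = Logger.getInstance()
console.log(loggerA === loggerB) // true: always the same instance
```

As we are about to see, in Node.js we rarely need this ceremony, because the module system gives us a very similar behavior for free.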

As you can imagine, those are quite common scenarios. Take, for example, a typical Database class, which provides access to a database:

// 'Database.js'
export class Database {
  constructor (dbName, connectionDetails) {
    // ...
  }
  // ...
}

Typical implementations of such a class usually keep a pool of database connections, so it doesn't make sense to create a new Database instance for each request. Plus, a Database instance may store some stateful information, such as the list of pending transactions. So, our Database class meets two of the criteria justifying the Singleton pattern. Therefore, what we usually want is to configure and instantiate one single Database instance at the start of our application and let every component use that single shared Database instance.

A lot of people new to Node.js get confused about how to implement the Singleton pattern correctly; however, the answer is easier than what we might think. Simply exporting an instance from a module is already enough to obtain something very similar to the Singleton pattern. Consider, for example, the following lines of code:

// file 'dbInstance.js'
import { Database } from './Database.js'
export const dbInstance = new Database('my-app-db', {
    url: 'localhost:5432',
    username: 'user',
    password: 'password'
})

By simply exporting a new instance of our Database class, we can already assume that within the current package (which can easily be the entire code of our application), we are going to have only one instance of the Database class: the one exported by the dbInstance module. This is possible because, as we know from Chapter 2, The Module System, Node.js will cache the module, making sure not to execute its code at every import.

For example, we can easily obtain a shared instance of the dbInstance module, which we defined earlier, with the following line of code:

import { dbInstance } from './dbInstance.js'

But there is a caveat. The module is cached using its full path as the lookup key, so it is only guaranteed to be a singleton within the current package. In fact, each package may have its own set of private dependencies inside its node_modules directory, which might result in multiple instances of the same package and therefore of the same module, resulting in our singleton not really being unique anymore! This is, of course, a rare scenario, but it's important to understand what its consequences are.

Consider, for example, the case in which the Database.js and dbInstance.js files that we saw earlier are wrapped into a package named mydb. The following lines of code would be in its package.json file:

{
  "name": "mydb",
  "version": "2.0.0",
  "type": "module",
  "main": "dbInstance.js"
}

Next, consider two packages (package-a and package-b), both of which have a single file called index.js containing the following code:

import { dbInstance } from 'mydb'
export function getDbInstance () {
  return dbInstance
}

Both package-a and package-b have a dependency on the mydb package. However, package-a depends on version 1.0.0 of the mydb package, while package-b depends on version 2.0.0 of the same package (which, for our example, has an identical implementation, but just a different version specified in its package.json file).

Given the structure we just described, we would end up with the following package dependency tree:

app/
`-- node_modules
    |-- package-a
    |   `-- node_modules
    |       `-- mydb
    `-- package-b
        `-- node_modules
            `-- mydb

We end up with a directory structure like the one here because package-a and package-b require two different, incompatible versions of the mydb module (for example, 1.0.0 versus 2.0.0). In this case, a typical package manager such as npm or yarn would not "hoist" the dependency to the top node_modules directory; instead, it will install a private copy of the package for each dependent in order to resolve the version incompatibility.

With the directory structure we just saw, both package-a and package-b have a dependency on the mydb package; in turn, the app package, which is our root package, depends on both package-a and package-b.

The scenario we just described will break the assumption about the uniqueness of the database instance. In fact, consider the following file (index.js) located in the root folder of the app package:

import { getDbInstance as getDbFromA } from 'package-a'
import { getDbInstance as getDbFromB } from 'package-b'
const isSame = getDbFromA() === getDbFromB()
console.log('Is the db instance in package-a the same ' +
  `as package-b? ${isSame ? 'YES' : 'NO'}`)

If you run the previous file, you will notice that the answer to Is the db instance in package-a the same as package-b? is NO. In fact, package-a and package-b will actually load two different instances of the dbInstance object because the mydb module will resolve to a different directory, depending on the package it is required from. This clearly breaks the assumptions of the Singleton pattern.

If instead, both package-a and package-b required two versions of the mydb package compatible with each other, for example, ^2.0.1 and ^2.0.7, then the package manager would install the mydb package into the top-level node_modules directory (a practice known as dependency hoisting), effectively sharing the same instance with package-a, package-b, and the root package.
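For comparison, with compatible version ranges the hoisted dependency tree would look like this:

```
app/
`-- node_modules
    |-- mydb
    |-- package-a
    `-- package-b
```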

At this point, we can easily say that the Singleton pattern, as described in the literature, does not exist in Node.js, unless we use a real global variable to store it, such as the following:

global.dbInstance = new Database('my-app-db', {/*...*/})

This guarantees that the instance is the only one shared across the entire application and not just the same package. However, please consider that most of the time, we don't really need a pure singleton. In fact, we usually create and import singletons within the main package of an application or, at worst, in a subcomponent of the application that has been modularized into a dependency.

If you are creating a package that is going to be used by third parties, try to keep it stateless to avoid the issues we've discussed in this section.

Throughout this book, for simplicity, we will use the term singleton to describe a class instance or a stateful object exported by a module, even if this doesn't represent a real singleton in the strict definition of the term.

Next, we are going to see the two main approaches for dealing with dependencies between modules, one based on the Singleton pattern and the other leveraging the Dependency Injection pattern.

Wiring modules

Every application is the result of the aggregation of several components and, as the application grows, the way we connect these components becomes a win or lose factor for the maintainability and success of the project.

When a component, A, needs component B to fulfill a given functionality, we say that "A is dependent on B" or, conversely, that "B is a dependency of A." To appreciate this concept, let's present an example.

Let's say we want to write an API for a blogging system that uses a database to store its data. We can have a generic module implementing a database connection (db.js) and a blog module that exposes the main functionality to create and retrieve blog posts from the database (blog.js).

The following figure illustrates the relationship between the database module and the blog module:


Figure 7.1: Dependency graph between the blog module and the database module

In this section, we are going to see how we can model this dependency using two different approaches, one based on the Singleton pattern and the other using the Dependency Injection pattern.

Singleton dependencies

The simplest way to wire two modules together is by leveraging Node.js' module system. Stateful dependencies wired in this way are de facto singletons, as we discussed in the previous section.

To see how this works in practice, we are going to implement the simple blogging application that we described earlier using a singleton instance for the database connection. Let's see a possible implementation of this approach (the file db.js):

import { dirname, join } from 'path'
import { fileURLToPath } from 'url'
import sqlite3 from 'sqlite3'
const __dirname = dirname(fileURLToPath(import.meta.url))
export const db = new sqlite3.Database(
  join(__dirname, 'data.sqlite'))

In the previous code, we are using SQLite (nodejsdp.link/sqlite) as a database to store our posts. To interact with SQLite, we are using the module sqlite3 (nodejsdp.link/sqlite3) from npm. SQLite is a database system that keeps all the data in a single local file. In our database module, we decided to use a file called data.sqlite saved in the same folder as the module.

The preceding code creates a new instance of the database pointing to our data file and exports the database connection object as a singleton with the name db.

Now, let's see how we can implement the blog.js module:

import { promisify } from 'util'
import { db } from './db.js'
const dbRun = promisify(db.run.bind(db))
const dbAll = promisify(db.all.bind(db))
export class Blog {
  initialize () {
    const initQuery = `CREATE TABLE IF NOT EXISTS posts (
      id TEXT PRIMARY KEY,
      title TEXT NOT NULL,
      content TEXT,
      created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );`
    return dbRun(initQuery)
  }
  createPost (id, title, content, createdAt) {
    return dbRun('INSERT INTO posts VALUES (?, ?, ?, ?)',
      id, title, content, createdAt)
  }
  getAllPosts () {
    return dbAll('SELECT * FROM posts ORDER BY created_at DESC')
  }
}

The blog.js module exports a class called Blog containing three methods:

  • initialize(): Creates the posts table if it doesn't exist. The table will be used to store the blog post's data.
  • createPost(): Takes all the necessary parameters needed to create a post. It will execute an INSERT statement to add the new post to the database.
  • getAllPosts(): Retrieves all the posts available in the database and returns them as an array.

Now, let's create a module to try out the functionality of the blog module we just created (the file index.js):

import { Blog } from './blog.js'
async function main () {
  const blog = new Blog()
  await blog.initialize()
  const posts = await blog.getAllPosts()
  if (posts.length === 0) {
    console.log('No post available. Run `node import-posts.js`' +
      ' to load some sample posts')
  }
  for (const post of posts) {
    console.log(post.title)
    console.log('-'.repeat(post.title.length))
    console.log(`Published on ${new Date(post.created_at)
      .toISOString()}`)
    console.log(post.content)
  }
}
main().catch(console.error)

The preceding module is very simple. We retrieve the array with all the posts using blog.getAllPosts() and then we loop over it and display the data for every single post, giving it a bit of formatting.

You can use the provided import-posts.js module to load some sample posts into the database before running index.js. You can find import-posts.js in the code repository of this book, along with the rest of the files for this example.

As a fun exercise, you could try to modify the index.js module to generate HTML files; one for the blog index and then a dedicated file for each blog post. This way, you would build your own minimalistic static website generator!

As we can see from the preceding code, we can implement a very simple command-line blog management system by leveraging the Singleton pattern to pass the db instance around. Most of the time, this is how we manage stateful dependencies in our application; however, there are situations in which this may not be enough.

Using a singleton, as we have done in the previous example, is certainly the simplest, most immediate, and most readable way to pass stateful dependencies around. However, what happens if we want to mock our database during our tests? What can we do if we want to let the user of the blogging CLI or the blogging API select another database backend, instead of the standard SQLite backend that we provide by default? For these use cases, a singleton can be an obstacle to implementing a properly structured solution.

We could introduce if statements in our db.js module to pick different implementations based on some environment condition or some configuration. Alternatively, we could fiddle with the Node.js module system to intercept the import of the database file and replace it with something else. But, as you can imagine, these solutions are far from elegant.

In the next section, we will learn about another strategy for wiring modules, which can be the ideal solution to some of the issues we discussed here.

Dependency Injection

The Node.js module system and the Singleton pattern can serve as great tools for organizing and wiring together the components of an application. However, these do not always guarantee success. If, on the one hand, they are simple to use and very practical, then on the other, they might introduce a tighter coupling between components.

In the previous example, we can see that the blog.js module is tightly coupled with the db.js module. In fact, our blog.js module cannot work without the db.js module by design, nor can it use a different database module if necessary. We can easily fix this tight coupling between the two modules by leveraging the Dependency Injection pattern.

Dependency Injection (DI) is a very simple pattern in which the dependencies of a component are provided as input by an external entity, often referred to as the injector.

The injector initializes the different components and ties their dependencies together. It can be a simple initialization script or a more sophisticated global container that maps all the dependencies and centralizes the wiring of all the modules of the system. The main advantage of this approach is improved decoupling, especially for modules depending on stateful instances (for example, a database connection). Using DI, each dependency, instead of being hardcoded into the module, is received from the outside. This means that the dependent module can be configured to use any compatible dependency, and therefore the module itself can be reused in different contexts with minimal effort.

The following diagram illustrates this idea:


Figure 7.2: Dependency injection schematic

In Figure 7.2, we can see that a generic service expects a dependency with a predetermined interface. It's the responsibility of the injector to retrieve or create an actual concrete instance that implements such an interface and passes it (or "injects it") into the service. In other words, the injector has the goal of providing an instance that fulfills the dependency for the service.

To demonstrate this pattern in practice, let's refactor the simple blogging system that we built in the previous section by using DI to wire its modules. Let's start by refactoring the blog.js module:

import { promisify } from 'util'
export class Blog {
  constructor (db) {
    this.db = db
    this.dbRun = promisify(db.run.bind(db))
    this.dbAll = promisify(db.all.bind(db))
  }
  initialize () {
    const initQuery = `CREATE TABLE IF NOT EXISTS posts (
      id TEXT PRIMARY KEY,
      title TEXT NOT NULL,
      content TEXT,
      created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );`
    return this.dbRun(initQuery)
  }
  createPost (id, title, content, createdAt) {
    return this.dbRun('INSERT INTO posts VALUES (?, ?, ?, ?)',
      id, title, content, createdAt)
  }
  getAllPosts () {
    return this.dbAll(
      'SELECT * FROM posts ORDER BY created_at DESC')
  }
}

If you compare the new version with the previous one, they are almost identical. There are only two small but important differences:

  • We are not importing the database module anymore
  • The Blog class constructor takes db as an argument

The new constructor argument db is the expected dependency that needs to be provided at runtime by the client component of the Blog class. This client component is going to be the injector of the dependency. Since JavaScript doesn't have any way to represent abstract interfaces, the provided dependency is expected to implement the db.run() and db.all() methods. This is called duck typing, as mentioned earlier in this book.
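To see duck typing at work, here is a hypothetical in-memory stand-in (useful, for instance, in unit tests) that satisfies the same contract expected by the Blog class — callback-style run() and all() methods — without touching SQLite at all:

```javascript
import { promisify } from 'util'

// A hypothetical in-memory fake: it only has to "quack" like the sqlite3
// database object, exposing callback-style run() and all() methods.
function createFakeDb () {
  const rows = []
  return {
    run (query, ...args) {
      const cb = args.pop() // the callback is always the last argument
      if (query.startsWith('INSERT')) {
        rows.push(args)
      }
      process.nextTick(cb, null)
    },
    all (query, ...args) {
      const cb = args.pop()
      process.nextTick(cb, null, [...rows])
    }
  }
}

const fakeDb = createFakeDb()
const dbRun = promisify(fakeDb.run.bind(fakeDb))
const dbAll = promisify(fakeDb.all.bind(fakeDb))
await dbRun('INSERT INTO posts VALUES (?, ?)', 'id-1', 'Hello')
console.log(await dbAll('SELECT * FROM posts')) // prints the single stored row
```

An object like this can be passed directly to the Blog constructor in place of a real database connection, which is exactly the kind of flexibility that DI gives us.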

Let's now rewrite our db.js module. The goal here is to get rid of the Singleton pattern and to come up with an implementation that is more reusable and configurable:

import sqlite3 from 'sqlite3'
export function createDb (dbFile) {
  return new sqlite3.Database(dbFile)
}

This new implementation of the db module provides a factory function called createDb(), which allows us to create new instances of the database at runtime. It also allows us to pass the path to the database file at creation time so that we can create independent instances that can write to different files if we have to.

At this point, we have almost all the building blocks in place, we are only missing the injector. We will give an example of the injector by reimplementing the index.js module:

import { dirname, join } from 'path'
import { fileURLToPath } from 'url'
import { Blog } from './blog.js'
import { createDb } from './db.js'
const __dirname = dirname(fileURLToPath(import.meta.url))
async function main () {
  const db = createDb(join(__dirname, 'data.sqlite'))
  const blog = new Blog(db)
  await blog.initialize()
  const posts = await blog.getAllPosts()
  if (posts.length === 0) {
    console.log('No post available. Run `node import-posts.js`' +
      ' to load some sample posts')
  }
  for (const post of posts) {
    console.log(post.title)
    console.log('-'.repeat(post.title.length))
    console.log(`Published on ${new Date(post.created_at)
      .toISOString()}`)
    console.log(post.content)
  }
}
main().catch(console.error)

This code is also quite similar to the previous implementation, except for two important changes (highlighted in the preceding code):

  1. We create the database dependency (db) using the factory function createDb().
  2. We explicitly "inject" the database instance when we instantiate the Blog class.

In this implementation of our blogging system, the blog.js module is totally decoupled from the actual database implementation, making it more composable and easy to test in isolation.

We saw how to inject dependencies as constructor arguments (constructor injection), but dependencies can also be passed when invoking a function or method (function injection) or injected explicitly by assigning the relevant properties of an object (property injection).
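A quick sketch of the latter two variants, using hypothetical names (the stub db object stands in for any compatible dependency):

```javascript
// A stub dependency used by both examples below.
const stubDb = {
  all (query) {
    return [`rows for: ${query}`]
  }
}

// Function injection: the dependency is passed at invocation time.
function getAllPosts (db) {
  return db.all('SELECT * FROM posts')
}
console.log(getAllPosts(stubDb))

// Property injection: the dependency is assigned after construction.
class Blog {
  getAllPosts () {
    return this.db.all('SELECT * FROM posts')
  }
}
const blog = new Blog()
blog.db = stubDb // the injector assigns the property explicitly
console.log(blog.getAllPosts())
```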

Unfortunately, the advantages in terms of decoupling and reusability offered by the Dependency Injection pattern come with a price to pay. In general, the inability to resolve a dependency at coding time makes it more difficult to understand the relationship between the various components of a system. This is especially true in large applications where we might have a significant amount of services with a complex dependency graph.

Also, if we look at the way we instantiated our database dependency in our preceding example script, we can see that we had to make sure that the database instance was created before we could invoke any function from our Blog instance. This means that, when used in its raw form, Dependency Injection forces us to build the dependency graph of the entire application by hand, making sure that we do it in the right order. This can become unmanageable when the number of modules to wire becomes too high.

Another pattern, called Inversion of Control, allows us to shift the responsibility of wiring the modules of an application to a third-party entity. This entity can be a service locator (a simple component used to retrieve a dependency, for example, serviceLocator.get('db')) or a dependency injection container (a system that injects the dependencies into a component based on some metadata specified in the code itself or in a configuration file). You can find more about these components on Martin Fowler's blog at nodejsdp.link/ioc-containers. Even though these techniques derail a bit from the Node.js way of doing things, some of them have recently gained some popularity. Check out inversify (nodejsdp.link/inversify) and awilix (nodejsdp.link/awilix) to find out more.
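As a taste of the service locator idea, here is a minimal, hypothetical sketch (real containers such as inversify and awilix are far more sophisticated):

```javascript
// A minimal service locator: components are registered as factories and
// lazily instantiated (only once) on first retrieval.
class ServiceLocator {
  #factories = new Map()
  #instances = new Map()

  register (name, factory) {
    this.#factories.set(name, factory)
  }

  get (name) {
    if (!this.#instances.has(name)) {
      const factory = this.#factories.get(name)
      if (!factory) {
        throw new Error(`Unknown dependency: ${name}`)
      }
      // the factory receives the locator itself, so it can
      // resolve its own dependencies in turn
      this.#instances.set(name, factory(this))
    }
    return this.#instances.get(name)
  }
}

const locator = new ServiceLocator()
locator.register('config', () => ({ dbFile: 'data.sqlite' }))
locator.register('db', (loc) => ({ file: loc.get('config').dbFile }))
console.log(locator.get('db').file) // 'data.sqlite'
```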

Summary

In this chapter, you were gently introduced to a set of traditional design patterns concerning the creation of objects. Some of those patterns are so basic, and yet essential at the same time, that you have probably already used them in one way or another.

Patterns such as Factory and Singleton are, for example, two of the most ubiquitous in object-oriented programming in general. However, in JavaScript, their implementation and significance are very different from what was thought up by the Gang of Four book. For example, Factory becomes a very versatile pattern that works in perfect harmony with the hybrid nature of the JavaScript language, that is, half object-oriented and half functional. On the other hand, Singleton becomes so trivial to implement that it's almost a non-pattern, but it carries a set of caveats that you should have learned to take into account.

Among the patterns you've learned in this chapter, the Builder pattern may seem the one that has retained most of its traditional object-oriented form. However, we've shown you that it can also be used to invoke complex functions and not just to build objects.

The Revealing Constructor pattern, on the other hand, deserves a category of its own. Born from necessities arising from the JavaScript language itself, it provides an elegant solution to the problem of "revealing" certain private object properties at construction time only. It provides strong guarantees in a language that is relaxed by nature.

Finally, you learned about the two main techniques for wiring components together: Singleton and Dependency Injection. We've seen how the first is the simplest and most practical approach, while the second is more powerful but also potentially more complex to implement.

As we already mentioned, this was just the first of a series of three chapters entirely dedicated to traditional design patterns. In these chapters, we will try to teach the right balance between creativity and rigor. We want to show not only that there are patterns that can be reused to improve our code, but also that their implementation is not the most important detail; in fact, it can vary a lot, or even overlap with other patterns. What really matters, however, is the blueprint, the guidelines, and the idea at the base of each pattern. This is the real reusable piece of information that we can exploit to design better Node.js applications in a fun way.

In the next chapter, you will learn about another category of traditional design patterns, called structural patterns. As the name suggests, these patterns are aimed at improving the way we combine objects together to build more complex, yet flexible and reusable structures.

Exercises

  • 7.1 Console color factory: Create a class called ColorConsole that has just one empty method called log(). Then, create three subclasses: RedConsole, BlueConsole, and GreenConsole. The log() method of every ColorConsole subclass will accept a string as input and will print that string to the console using the color that gives the name to the class. Then, create a factory function that takes color as input, such as 'red', and returns the related ColorConsole subclass. Finally, write a small command-line script to try the new console color factory. You can use this Stack Overflow answer as a reference for using colors in the console: nodejsdp.link/console-colors.
  • 7.2 Request builder: Create your own Builder class around the built-in http.request() function. The builder must be able to provide at least basic facilities to specify the HTTP method, the URL, the query component of the URL, the header parameters, and the eventual body data to be sent. To send the request, provide an invoke() method that returns a Promise for the invocation. You can find the docs for http.request() at the following URL: nodejsdp.link/docs-http-request.
  • 7.3 A tamper-free queue: Create a Queue class that has only one publicly accessible method called dequeue(). Such a method returns a Promise that resolves with a new element extracted from an internal queue data structure. If the queue is empty, then the Promise will resolve when a new item is added. The Queue class must also have a revealing constructor that provides a function called enqueue() to the executor that pushes a new element to the end of the internal queue. The enqueue() function can be invoked asynchronously and it must also take care of "unblocking" any eventual Promise returned by the dequeue() method. To try out the Queue class, you could build a small HTTP server into the executor function. Such a server would receive messages or tasks from a client and would push them into the queue. A loop would then consume all those messages using the dequeue() method.