Chapter 2. Setting Up Grunt

In this chapter, we will go over the steps required to get Grunt up and running. We begin with an introduction into Node.js and npm, as they are the key technologies used to build the foundations upon which Grunt stands. We review Node.js modules and how they relate to Grunt, then cover the basics of a Grunt environment, including package.json and Gruntfile.js files. Once we are set up, we'll move onto configuring Grunt. We will look into the various methods and strategies that make Grunt best convey our build.

Installation

In this section, we cover how to install and use Grunt's key components, Node.js and npm. We will review a brief introduction into each, as well as their core concepts. Subsequently, we will cover the simple installation of Grunt itself.

Node.js

Although this book primarily focuses on Grunt, we will also dip our toes into the world of Node.js (http://gswg.io#node) fairly regularly. Given Grunt is written as a Node.js module and Grunt tasks and plugins are also Node.js modules, it is important that we understand the basics of Node.js and its package manager, npm (http://gswg.io#npm).

Ryan Dahl started the Node.js project in early 2009 out of frustration with the state of web servers in the industry at the time. Web servers written in Ruby (Mongrel and then Thin) were popular due to the Ruby on Rails framework, yet Ryan realized that writing a really fast web server in Ruby just wasn't possible. Ruby's inefficiency was caused by the language's blocking nature, which meant – in the context of a web server – that it could not effectively use the hardware available to it. A program is said to be blocking when it causes the CPU to sit idle while it waits on a given Input/Output (I/O) task, such as reading from the hard drive or making a network request to a web server.

Blocking is inherent in many programming languages. JavaScript, and hence Node.js, avoids the blocking problem through its evented execution model, which allows JavaScript programs to execute code asynchronously. That is, I/O tasks within JavaScript programs can be written so that they don't block, and therefore achieve a high degree of efficiency.
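
As a brief illustration, the following sketch contrasts a blocking read with a non-blocking one in Node.js; the data.txt filename is hypothetical:

//Code sketch: blocking versus non-blocking I/O
var fs = require('fs');

//Blocking: the program waits until the whole file has been read
var data = fs.readFileSync('data.txt', 'utf8');
console.log(data.length);

//Non-blocking: the read is started and the callback runs once it completes,
//leaving the program free to execute other code in the meantime
fs.readFile('data.txt', 'utf8', function (err, contents) {
  if (err) throw err;
  console.log(contents.length);
});
console.log('this line runs before the file has been read');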

The following table, from Ryan Dahl's original Node.js presentation (http://gswg.io#node-presentation) in late 2009, lists the main types of I/O operations and, based on average access times, the number of CPU cycles that could have been executed during each operation:

I/O operation      CPU cycles
L1 cache           3 cycles
L2 cache           14 cycles
RAM                250 cycles
Disk               41,000,000 cycles
Network            240,000,000 cycles

Based on this table, by blocking the CPU on any disk or network access, we are introducing large inefficiencies into our programs; so using Node.js is a huge step forward when building any application dealing with system I/O – which is most applications today.

On a general note, learning how the language works, where it excels and where it doesn't, and why JavaScript isn't the "toy" language that many have previously labeled it, will be of great value when traversing the JavaScript landscape. For a list of useful JavaScript resources, see Chapter 5, Advanced Grunt.

To install Node.js, first we visit the Node.js download page: http://gswg.io#node-download. Once there, you should see the following table of download options:

(Screenshot: the Node.js download page, showing the table of download options)

At the time of writing, the newest Node.js version is 0.10.22. This will most likely change, but fear not! The download page always contains the latest stable release of Node.js.

On Windows and Mac, the installers are the simplest way of installing Node.js. However, some may prefer using an operating system package manager, as package managers generally provide a more uniform method to install, uninstall, and, most importantly, upgrade. For instance, if you are on a Mac, using Homebrew to install Node.js is also very simple, and it provides the added benefit of easy upgrades as new versions are released, with the command: brew upgrade node. To read more on installing Node.js via a package manager, see http://gswg.io#node-with-package-manager. This page contains installation guides for Mac, Windows, Ubuntu, and various other Linux distributions. We'll learn more about Homebrew in Chapter 5, Advanced Grunt, in the Development tools section.
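
For example, assuming Homebrew is already set up on a Mac, the following commands install Node.js and later upgrade it:

$ brew install node
$ brew upgrade node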

Now that we have installed Node.js, which comes bundled with npm, we should have access to the node and npm executables, as they now reside in our system's PATH.

The following commands print the version of each executable to the console; the node version should match the release we just downloaded and installed. At the time of writing, my output looks like this:

$ node --version
v0.10.22
$ npm --version
1.3.14

This confirms that we have set up Node.js correctly, and we are now ready to use it!

Modules

Before we look at npm, we first need to understand the basics of the Node.js module system. The Node.js module system is an implementation of the CommonJS specification. CommonJS describes a simple syntax for JavaScript programs to require (or import) other JavaScript programs into their context. This feature, missing from JavaScript itself, greatly assists with creating modular systems by simplifying the process of separating concerns. In Node.js, all JavaScript files can be seen as individual modules, so beyond this point we'll use the terms file and module interchangeably. We may have also heard the term package used in place of module, which can be confusing. Rest assured, however, we'll cover packages in the next section on npm.

The CommonJS 1.1.1 specification can be found at http://gswg.io#commonjs. This specification describes the use of the following variables:

  • module – an object representing the module itself. The module object contains the exports object. In the case of Node.js, it also contains meta-information, such as id, parent, and children.
  • exports – a plain JavaScript object, which may be augmented to expose functionality to other modules. The exports object is returned as the result of a call to require.
  • require – a function used to import modules, returning the corresponding exports object.

In the case of Node.js, modules can be imported by filename using relative paths or absolute paths. When using npm (which stores modules in the node_modules directory), modules can also be imported by module name, which we'll see more on in the next subsection. In the case of a web browser, another implementation of CommonJS might require modules by URL.
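
To illustrate, the following sketch shows these three ways of importing a module in Node.js; the file and module names here are hypothetical:

//Code sketch: the different ways require can locate a module
var helper    = require('./lib/helper');              //relative path
var helperAbs = require('/home/user/app/lib/helper'); //absolute path
var grunt     = require('grunt');                     //by name, from a node_modules folder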

The CommonJS specification also contains the following sample code, slightly modified for the purpose of clarity:

//Code example 01-modules
//program.js
var inc = require('./increment').increment;
var a = 1;
console.log(inc(a));

//increment.js
var add = require('./math').add;
exports.increment = function(b) {
    return add(b, 1);
}; 

//math.js
exports.add = function(c, d) {
    return c + d;
};

In this example, we'll use program.js as our entry point or "main" file. Since we know require returns the exports object of the desired file, it's quite easy to see what it does. Starting at program.js, we can see that it calls require('./increment'). When this require function is called, it synchronously executes the increment.js file. The increment.js module, in turn, calls require('./math'). The math.js file augments its exports object with an add function.

Once the math.js file completes execution, require returns the math.js module's exports object, thereby allowing increment.js to use the add function. Subsequently, increment.js will complete its execution and return its exports object to program.js. Finally, program.js uses its new inc function to increment the variable a from 1 to 2. Now, when we run program.js with Node.js, we should see the following result:

$ node program.js
2

The important takeaway from this example is the separation of concerns provided by this modularity. Notice that the program.js module has no notion of the math.js module, yet it is doing most of the work. In computer science, the idea of abstracting functionality is not a new one; and with Node.js implementing CommonJS, it has provided a simple way for users to write modular programs in JavaScript. We could place this functionality in a single file, but if we were to extend math.js to include every common math function, its size and complexity would quickly grow. By splitting modules into submodules, we are separating the concerns of our program, transforming it from a single large complex program into multiple small and simple programs. The idea of many small programs working together is one of the foundations of Node.js. This helps prevent large monolithic libraries, such as jQuery v1.x.x, from making their way into Node.js; libraries of that size would instead be split up into smaller modules, allowing users to use only what they require. The official documentation of the Node.js module system can be found at http://gswg.io#node-modules.

npm

As previously noted, npm is the Node.js package manager. Since the release of Node.js version 0.6.3, npm comes prepackaged with each Node.js distribution. npm provides the means to publish Node.js packages into the npm repository under a unique name. Subsequently, such a package may be installed by anyone who knows this unique name. This is the essence of npm – sharing and retrieving code from a public repository. "What is a package?" we may ask. On the npm Frequently Asked Questions (FAQ) page (http://gswg.io#npm-what-is-a-package), we see the following extract:

"What is a package?

A package is:

a) a folder containing a program described by a package.json file

b) a gzipped tarball containing (a)

c) a url that resolves to (b)"

Points d) through g) have been removed for brevity; however, each definition ultimately resolves to a). So, ultimately, a package is any folder containing a valid package.json file. This is the only requirement for a folder to qualify as an npm package.
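
As an illustration, the following is a minimal sketch of a package.json file; the name, version, and description values are hypothetical:

//package.json
{
  "name": "my-package",
  "version": "0.1.0",
  "description": "An example package descriptor"
}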

In the previous section, we learned about modules in relation to the CommonJS specification. With the introduction of npm, there was a need to extend the CommonJS definition. The following description of a module is outlined on the npm FAQ page (http://gswg.io#npm-what-is-a-module):

"What is a module?

A module is anything that can be loaded with require() in a Node.js program. The following things are all examples of things that can be loaded as modules:

A folder with a package.json file containing a main field.

A folder with an index.js file in it.

A JavaScript file."

So, as well as being a single JavaScript file, a module can be any folder with an index.js file in it, or any folder with a package.json file containing a main field (essentially, the main field allows us to rename index.js, as shown in the sketch below). Notice that these two new definitions approximately coincide with the definition of a package: a package is a folder that has a package.json file, and a module can be a folder with a package.json file or an index.js file. So, in order for someone to use your package in their program, it must be loaded with the require function, which by definition means your package must also be a module. This is why Node.js programs are commonly referred to as "node modules" rather than "node packages"; "module" is the more fitting term in most scenarios.
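
To illustrate the main field, consider the following hypothetical layout, in which a folder named my-module points require to a file other than index.js:

//my-module/package.json
{
  "name": "my-module",
  "version": "0.1.0",
  "main": "./lib/entry.js"
}

//app.js, located alongside the my-module folder
var myModule = require('./my-module'); //loads my-module/lib/entry.js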

In the early years, soon after Node.js v0.1.8 was released, the platform had only the CommonJS-based module system outlined in the previous section; there was no sanctioned way to find and publish modules. Isaac Schlueter saw this gap and set out to fill it, starting the npm project in September 2009. In early 2010, Ryan asked Isaac to join him at Joyent to work on npm and Node.js full-time. In January 2012, Ryan stepped down as the "gatekeeper" of the Node.js project and handed over the reins to Isaac.

A fun fact: many people believe they are setting the record straight when they quote "npm is not an acronym for the Node Package Manager" from the npm FAQ. However, Isaac was being humorous on the day he wrote it, and the statement is not actually true. As some might say, he was "trolling".

npm has many features, though for the purposes of this book, we'll cover the two most relevant workflows: finding modules and installing modules.

Finding modules

The search feature of npm is fairly straightforward; we type the command npm search followed by the search terms we wish to match. If we were to enter: npm search grunt concat, npm would return all packages which match both grunt and concat. A term is considered a match if it's contained anywhere in the title, description, tags, or dependencies of the package descriptor, that is, the package.json file. So, before we use Google to find modules, it's best to try npm search first, as npm will search through metadata that does not appear on the npm website and is hence not indexed by Google. Let's say we wanted to find a Grunt plugin that makes use of the Unix rsync tool. We might try npm search gruntplugin rsync. In this case we've included gruntplugin, which according to the Grunt team, is a recommended tag for all Grunt plugins to use. We have also included rsync, to narrow the search down to only those Grunt plugins matching rsync. This command currently yields:

$ npm search gruntplugin rsync

NAME               DESCRIPTION 
grunt-rsync        A Grunt task for accessing the file copying
                   and syncing capabilities of the 
grunt-rsync-2      Copy files to a (remote) machine with rsync.
                   supports maps with target:source 

Once we've found a potentially useful package, we can view its package information with npm info <name>, so we use npm info grunt-rsync in this case. However, in most cases, we just want to know how to use it. So, if the package has a public Git repository and also adheres to open source best practice, it should have a README file documenting its usage. We can open this repository page with the npm repo <name> command. Now that we've read about the package and we've decided that it may be what we're searching for, it is time to install it.
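
For example, to inspect and then open the repository page of the grunt-rsync package found above, we could run:

$ npm info grunt-rsync
$ npm repo grunt-rsync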

Installing modules

The npm install command has one purpose: to download modules from the npm repository. When installing a module, we can either install it locally or globally. We would choose to install a module locally if we're to use it in another module or application, and we'd choose a global install if we wanted to use the module as a command-line tool.

When we installed Node.js, a folder for npm "binaries" was created and added to our system's PATH. This allows npm to globally install modules by placing a symbolic link in this directory, which points to the file specified in the package.json file's bin field. We say "binaries" in quotes here, as the term binary file generally means some kind of compiled machine code; in this case, however, an npm binary is simply a JavaScript file. For example, if we wanted to install the express module globally, we would use the command: npm install -g express.
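
For instance, a package exposing a command-line tool might declare a bin field such as the following; the package and file names here are hypothetical:

//package.json
{
  "name": "my-tool",
  "version": "0.1.0",
  "bin": {
    "my-tool": "./bin/my-tool.js"
  }
}

Installing this package globally would place a my-tool symbolic link in the npm binaries folder, pointing at bin/my-tool.js.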

In the context of Grunt, we'll mainly be using npm install to utilize plugins locally inside a specific Grunt environment. Let's say we are developing a jQuery plugin and we wish to minify our source code. We can achieve this with the grunt-contrib-uglify plugin. In order to use this plugin in our Gruntfile.js file, we must first install it locally with the command: npm install grunt-contrib-uglify. This will place the newly downloaded module inside the current package's node_modules folder. To determine the current package, npm will traverse up the directory tree from the current working directory, looking for a module descriptor – package.json. If a package.json file is found, its containing folder will be used as the package root directory; if it is not found, npm will assume there is no package yet and use the current directory as the package root directory. Once a package root directory has been determined, a node_modules folder will be created (if one doesn't already exist) and then, finally, the module we're installing will be placed in there. To help solidify this, consider the following directory structure:

//Code example 02-npm-install-directory
└── project
    ├── a
    │   └── b
    │       └── c
    │           └── important.js
    └── package.json

If we run npm install grunt-contrib-uglify from the c directory, the project directory will be used as the package root directory, as it contains package.json.

$ cd project/a/b/c 
$ npm install grunt-contrib-uglify

Once complete, the preceding command will result in the following directory structure:

└── project
    ├── a
    │   └── b
    │       └── c
    │           └── important.js
    ├── node_modules
    │   ├── grunt-contrib-uglify
    │   │   └── ...
    │   └── ...
    └── package.json

However, if we removed package.json before running npm install, the same command would instead result in the following directory structure:

└── project
    └── a
        └── b
            └── c
                ├── important.js
                └── node_modules
                    ├── grunt-contrib-uglify
                    │   └── ...
                    └── ...

This pattern of calculating where to place the node_modules directory is compatible with the pattern that the require function uses to find modules. When we wish to use a newly installed module, we call the require function with the module's name (instead of a filename). The require function will look for the node_modules folder in the current directory and if it's not there, it will check the parent directory. It will keep searching up the directory tree until it finds a node_modules folder or until it reaches the root of the drive. Therefore, we can always require a module from where it was installed, even if it was actually placed many folders up the directory tree.
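
For example, given the directory structure above, a require call made from deep inside the project still finds the module installed at the package root:

//project/a/b/c/important.js
//there is no node_modules folder in c, b, or a, so the lookup
//continues up the tree and succeeds at project/node_modules
var uglify = require('grunt-contrib-uglify');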

Now that we've installed grunt-contrib-uglify, we can load this module's Grunt tasks using grunt.loadNpmTasks("grunt-contrib-uglify") within our Gruntfile.js file. The loadNpmTasks function searches for our node_modules folder in a similar way to the require function. Once found, it looks inside for the desired module. Lastly, it loads all of the files in the module's tasks directory. This is how a single module (a Grunt plugin) can provide multiple tasks.
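
As a preview, a minimal Gruntfile.js using this plugin might look like the following sketch; the source and destination paths are hypothetical:

//Gruntfile.js
module.exports = function(grunt) {
  grunt.initConfig({
    uglify: {
      build: {
        files: {
          'dist/jquery.plugin.min.js': ['src/jquery.plugin.js']
        }
      }
    }
  });

  //load the uglify task from the locally installed plugin
  grunt.loadNpmTasks('grunt-contrib-uglify');

  //running "grunt" with no arguments will now minify our source
  grunt.registerTask('default', ['uglify']);
};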

Grunt

Finally, we can install Grunt! The Grunt Command-line interface (CLI) is published as a separate module for one important reason: to allow us to work on one machine, on multiple projects with various backward-incompatible versions of Grunt, without concern. We can do this because the grunt-cli module searches for an instance of Grunt (the grunt module) within the current directory or its parent directories (again, similar to the require function).

This means we can pull a legacy Grunt project (v0.3.x) and run grunt on the command line (which is actually the grunt-cli module). Then, navigate to a different Grunt project (v0.4.x) and run grunt again; both will run seamlessly. With this in mind, we should be able to see why we install grunt-cli globally and grunt locally.

First, we'll install grunt-cli with the following command:

$ npm install -g grunt-cli

Tip

It should be noted that on Mac and Linux, we might receive a permissions error when installing modules globally. To remedy this, we can prepend sudo, for example, sudo npm install -g grunt-cli. However, modules are able to execute arbitrary code on installation; therefore, using sudo may be considered unsafe. To prevent this, it's best to reinstall Node.js without using sudo. For more information on this topic, please see this GitHub Gist (http://gswg.io#npm-no-sudo) by Isaac Schlueter.

Next, we'll find the project in which we wish to use Grunt and we'll use the following command:

$ cd my-project/
$ npm install grunt

Note, however, that when it comes time to set up this particular project again, we would prefer not to have to remember every module we used manually. One solution to this problem is to save our node_modules folder along with our project. This might be okay in some cases; however, npm was built to house and serve modules. In the next section, we'll see a better solution using npm, our package.json file, and the dependencies field.
