Suppose we want to read all of the files in the recipes directory, in alphabetical order, and then concatenate their contents into a single string and display it. We could do this quite easily using synchronous methods.
Asyncjs/synchronous.js

var fs = require('fs');
process.chdir('recipes'); // change the working directory

var concatenation = '';

fs.readdirSync('.')
  .filter(function(filename) {
    // ignore directories
    return fs.statSync(filename).isFile();
  })
  .forEach(function(filename) {
    // add contents to our output
    concatenation += fs.readFileSync(filename, 'utf8');
  });

console.log(concatenation);
(Be aware that the forEach iterator isn’t available in older JavaScript environments, such as IE6. You can fix this with a library like Kris Kowal’s es5-shim.[41] We’ll learn how to serve this library to just the browsers that need it in Chapter 6, Async Script Loading.)
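(Roughly, such a shim works like the following simplified sketch; es5-shim's real implementation handles more edge cases than this, so treat it as illustrative only:)

if (!Array.prototype.forEach) {
  Array.prototype.forEach = function(callback, thisArg) {
    for (var i = 0; i < this.length; i++) {
      if (i in this) { // skip holes in sparse arrays
        callback.call(thisArg, this[i], i, this);
      }
    }
  };
}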
But all this blocking is terribly inefficient, particularly if our application could be doing something else simultaneously. The problem is that we can’t just naïvely replace
concatenation += fs.readFileSync(filename, 'utf8');
with its async analog
fs.readFile(filename, 'utf8', function(err, contents) {
  if (err) throw err;
  concatenation += contents;
});
because there’s no guarantee that the readFile callbacks would fire in the order in which the readFile calls were made. readFile just tells the OS to start reading a file. Most likely, shorter files will be read more quickly than longer files. As a result, the order in which the recipes are added to concatenation would be unpredictable. Plus, we’d somehow have to make our console.log line run only after all of the callbacks have fired.
To use multiple async tasks and get a predictable result, we’ll need to do some planning.
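For example, one common pattern (a minimal sketch, not the only solution, and assuming the directory contains at least one file) is to kick off all of the reads at once, store each result at the index of its filename, and count down until every callback has fired:

var fs = require('fs');
process.chdir('recipes');

var filenames = fs.readdirSync('.').filter(function(filename) {
  return fs.statSync(filename).isFile(); // ignore directories
});

var contentsByIndex = [];
var remaining = filenames.length;

filenames.forEach(function(filename, index) {
  fs.readFile(filename, 'utf8', function(err, contents) {
    if (err) throw err;
    contentsByIndex[index] = contents; // preserve alphabetical order
    remaining--;
    if (remaining === 0) {
      // every callback has fired; now it's safe to assemble the result
      console.log(contentsByIndex.join(''));
    }
  });
});

The files are still read in whatever order the OS finishes them, but because each result lands at its filename's original index, the final string comes out in alphabetical order.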