Let’s try to solve this problem without bringing in any utility functions. The simplest approach that I can think of is to run each readFile from the callback of the previous one, while keeping track of the number of callbacks that have fired so far in order to eventually show the output. Here’s my implementation:
Asyncjs/seriesByHand.js

var fs = require('fs');
process.chdir('recipes'); // change the working directory
var concatenation = '';

fs.readdir('.', function(err, filenames) {
  if (err) throw err;

  function readFileAt(i) {
    if (i === filenames.length) {
      // all files read, display the output
      return console.log(concatenation);
    }
    var filename = filenames[i];
    fs.stat(filename, function(err, stats) {
      if (err) throw err;
      // skip anything that isn't a regular file
      if (!stats.isFile()) return readFileAt(i + 1);

      fs.readFile(filename, 'utf8', function(err, text) {
        if (err) throw err;
        concatenation += text;
        readFileAt(i + 1);
      });
    });
  }
  readFileAt(0);
});
This is, as you may have noticed, a lot more code than the synchronous version. When we used the synchronous filter and forEach methods, the same task took about half as many lines and read much more clearly. Wouldn’t it be nice if we could just drop in async equivalents of those wonderful iteration methods? With Async.js, we can do just that!
We want to replace the filter and forEach methods we used for synchronous iteration with async analogs. Async.js gives us two options.
async.filter and async.forEach, which process the given array in parallel
async.filterSeries and async.forEachSeries, which process the given array sequentially
Running our async operations in parallel would be faster, so why would we want to use a series method? There are two reasons.
The aforementioned problem of unpredictable ordering. We might get around this by storing our results in an array and then joining it, but that’s an extra step.
There’s a limit on the number of files that Node (or any application process) can have open simultaneously. If we hit that limit, the OS gives us an error (typically EMFILE, “too many open files”). If we read the files sequentially, we don’t have to deal with this limitation.
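The ordering problem is easy to see with plain timers. In this minimal simulation (not Async.js itself), three parallel "tasks" take different amounts of time, so their results arrive in completion order rather than array order:

```javascript
// Three parallel tasks whose durations differ: task 0 is slowest,
// task 1 is fastest. We record the order in which they complete.
var delays = [30, 10, 20];
var results = [];
var pending = delays.length;

delays.forEach(function(delay, i) {
  setTimeout(function() {
    results.push(i); // record completion order, not array order
    if (--pending === 0) {
      console.log(results); // [1, 2, 0] -- completion order, not [0, 1, 2]
    }
  }, delay);
});
```

The workaround mentioned above is to assign by index (`results[i] = value`) instead of pushing, then join once everything has finished; that restores array order at the cost of the extra bookkeeping step.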
So, we’ll stick to async.forEachSeries for now. Here’s a straightforward adaptation of our synchronous code to use Async.js’s collection methods:
Asyncjs/forEachSeries.js

var async = require('async');
var fs = require('fs');
process.chdir('recipes'); // change the working directory

var concatenation = '';

var dirContents = fs.readdirSync('.');

async.filter(dirContents, isFilename, function(filenames) {
  async.forEachSeries(filenames, readAndConcat, onComplete);
});

function isFilename(filename, callback) {
  fs.stat(filename, function(err, stats) {
    if (err) throw err;
    callback(stats.isFile());
  });
}

function readAndConcat(filename, callback) {
  fs.readFile(filename, 'utf8', function(err, fileContents) {
    if (err) return callback(err);
    concatenation += fileContents;
    callback();
  });
}

function onComplete(err) {
  if (err) throw err;
  console.log(concatenation);
}
Now our code splits up nicely into two parts: the overall task (in the form of the async.filter and async.forEachSeries calls) and the implementation details (in the form of two iterator functions and one final callback).
filter and forEach aren’t the only Async.js utilities corresponding to standard functional iteration methods. There are also the following:
reject/rejectSeries, the inverse of filter
map/mapSeries, for 1:1 transformations
reduce/reduceRight, for transforming a value at each step
detect/detectSeries, for finding a value matching a filter
sortBy, for generating a sorted copy
some, for testing whether at least one value matches the given criterion
every, for testing whether all values match the given criterion
These methods are the core of Async.js, allowing you to perform common iterations with minimal boilerplate. Before we move on to more advanced methods, let’s take a look at the way these methods deal with errors.
In our original async code, we had three throws. In the Async.js version, we have two, yet all errors will still be thrown. How does Async.js do it? And why can’t we have just one throw?
Simply put, Async.js follows Node conventions. This means that every I/O callback has the form (err, results...)—with the exception of callbacks where the result is a boolean. Boolean callbacks just have the form (result), which is why our isFilename iterator from the previous code example needs to handle errors on its own.
Asyncjs/forEachSeries.js

function isFilename(filename, callback) {
  fs.stat(filename, function(err, stats) {
    if (err) throw err;
    callback(stats.isFile());
  });
}
Blame Node’s fs.exists for setting this precedent. That means that the iterators for the Async.js collection methods (filter, reject, detect, some, every, and their series equivalents) can’t report errors.
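A simplified sketch of how a boolean-callback filter consumes its iterator makes the limitation concrete. This is our illustration, not Async.js’s actual source: with only one callback argument, there is simply no slot for an err.

```javascript
// Simplified sketch of a boolean-callback filter (not Async.js's code).
var filtered;

function booleanFilter(arr, iterator, onComplete) {
  var results = [];
  function step(i) {
    if (i === arr.length) return onComplete(results);
    iterator(arr[i], function(include) { // one argument: the boolean verdict
      if (include) results.push(arr[i]); // an Error here would just look truthy
      step(i + 1);
    });
  }
  step(0);
}

// Keep the even numbers.
booleanFilter([1, 2, 3, 4], function(n, callback) {
  callback(n % 2 === 0);
}, function(results) {
  filtered = results;
  console.log(filtered); // [2, 4]
});
```

If the iterator passed an Error object instead of a boolean, this filter would happily treat it as "keep this item"; that is why the iterator has to handle errors itself.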
With all non-boolean Async.js iterators, passing a value other than null or undefined as the first argument to the iterator’s callback will immediately invoke the completion callback with that error. That’s why readAndConcat can do without throw.
Asyncjs/forEachSeries.js

function readAndConcat(filename, callback) {
  fs.readFile(filename, 'utf8', function(err, fileContents) {
    if (err) return callback(err);
    concatenation += fileContents;
    callback();
  });
}
So, if callback(err) does get called from readAndConcat, that err will be passed to onComplete. Async.js guarantees that onComplete will be called only once, either the first time an error occurs or after all operations have finished successfully.
Asyncjs/forEachSeries.js

function onComplete(err) {
  if (err) throw err;
  console.log(concatenation);
}
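That early-exit behavior is easy to see in a hand-rolled series loop. This is a simplified sketch of the contract, not Async.js’s implementation:

```javascript
// Minimal sketch of a series iterator with Node-style error handling
// (not Async.js's real code -- just the same contract).
var visited = [];
var finalErr;

function forEachSeries(arr, iterator, onComplete) {
  function step(i) {
    if (i === arr.length) return onComplete(null); // success: no error
    iterator(arr[i], function(err) {
      if (err) return onComplete(err); // first error stops the iteration
      step(i + 1);
    });
  }
  step(0);
}

// The third task reports an error, so the fourth never runs.
forEachSeries([1, 2, 3, 4], function(n, callback) {
  visited.push(n);
  callback(n === 3 ? new Error('boom') : null);
}, function(err) {
  finalErr = err;
  console.log(visited); // [1, 2, 3] -- the fourth task never ran
});
```

The completion callback fires exactly once: either with the first error, or with null after the last task succeeds.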
Node’s error conventions may not be ideal for Async.js’s collection methods. But for all of Async.js’s other methods, following these conventions allows errors to flow neatly from individual tasks to the completion callback. We’ll see more examples of this in the next section.