Most active web applications and services have a lot of data flowing through them. That data comes in the form of text, JSON strings, binary buffers, and data streams. For that reason, Node.js has many mechanisms built in to support handling the data I/O from system to system. It is important to understand the mechanisms that Node.js provides to implement effective and efficient web applications and services.
This chapter focuses on manipulating JSON data, managing binary data buffers, implementing readable and writable streams, and compressing and decompressing data. You learn how to leverage the Node.js functionality to work with different I/O requirements.
One of the most common data types that you work with when implementing Node.js web applications and services is JSON (JavaScript Object Notation). JSON is a lightweight method to convert JavaScript objects into a string form and then back again. This provides an easy method when you need to serialize data objects when passing them from client to server, process to process, stream to stream, or when storing them in a database.
There are several reasons to use JSON to serialize your JavaScript objects over XML, including the following:
JSON is much more efficient and takes up fewer characters.
Serializing and deserializing JSON is faster than XML because its syntax is simpler.
JSON is easier to read from a developer’s perspective because it is similar to JavaScript syntax.
The only reasons you might want to use XML over JSON are for complex objects or if you have XML/XSLT transforms already in place.
A JSON string represents the JavaScript object in string form. The string syntax is similar to code, making it easy to understand. You can use the JSON.parse(string) method to convert a properly formatted JSON string into a JavaScript object. For example, the following code snippet defines accountStr as a formatted JSON string and converts it to a JavaScript object using JSON.parse(). Then member properties can be accessed via dot notation:
var accountStr = '{"name":"Jedi", "members":["Yoda","Obi Wan"], "number":34512, "location": "A galaxy far, far away"}';
var accountObj = JSON.parse(accountStr);
console.log(accountObj.name);
console.log(accountObj.members);
The preceding code outputs the following:
Jedi
[ 'Yoda', 'Obi Wan' ]
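Keep in mind that JSON.parse() throws a SyntaxError when given malformed input, so parses of untrusted strings are usually guarded. A minimal sketch (the safeParse() helper here is illustrative, not part of Node.js):

```javascript
// Hypothetical helper: JSON.parse() throws on malformed JSON,
// so wrap parses of untrusted input in try/catch.
function safeParse(str) {
  try {
    return JSON.parse(str);
  } catch (err) {
    return null;                       // signal failure instead of crashing
  }
}

var goodStr = '{"name":"Jedi", "number":34512}';
var badStr  = '{"name":"Jedi",}';      // trailing comma is invalid JSON
console.log(safeParse(goodStr).number);   // 34512
console.log(safeParse(badStr));           // null
```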
Node also allows you to convert a JavaScript object into a properly formatted JSON string. The string form can then be stored in a file or database, sent across an HTTP connection, or written to a stream/buffer. Use the JSON.stringify(object) method to convert a JavaScript object into a JSON string. For example, the following code defines a JavaScript object that includes string, numeric, and array properties. Using JSON.stringify(), it is converted to a JSON string:
var accountObj = {
  name: "Baggins",
  number: 10645,
  members: ["Frodo, Bilbo"],
  location: "Shire"
};
var accountStr = JSON.stringify(accountObj);
console.log(accountStr);
The preceding code outputs the following:
{"name":"Baggins","number":10645,"members":["Frodo, Bilbo"],"location":"Shire"}
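JSON.stringify() also accepts optional replacer and space arguments. The following sketch uses the space argument to produce an indented string, which can be handy when writing human-readable files or logs:

```javascript
var accountObj = { name: "Baggins", number: 10645, location: "Shire" };
// The third argument indents the output by 2 spaces per nesting level.
var pretty = JSON.stringify(accountObj, null, 2);
console.log(pretty);
// Parsing the indented string recovers an equivalent object.
console.log(JSON.parse(pretty).name);   // Baggins
```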
Buffer Module to Buffer Data

While JavaScript is Unicode friendly, it is not good at managing binary data. However, binary data is useful when implementing some web applications and services. For example:
Transferring compressed files
Sending serialized binary data
Buffered data is made up of a series of octets; multi-byte values within a buffer can be stored in big endian or little endian order. Binary representations typically take up considerably less space than the equivalent textual data. Therefore, Node.js provides the Buffer module, which gives you the functionality to create, read, write, and manipulate binary data in a buffer structure. The Buffer module is global, so you do not need to use the require() statement to access it.

Buffered data is stored in a structure similar to that of an array, but it is stored outside the normal V8 heap in raw memory allocations. Therefore a Buffer cannot be resized.
When converting buffers to and from strings, you need to specify the explicit encoding method to be used. Table 5.1 lists the various encoding methods supported.
Table 5.1 Methods of encoding between strings and binary buffers
Method | Description
utf8 | Multi-byte encoded Unicode characters used as the standard in most documents and webpages.
utf16le | Little endian encoded Unicode characters of 2 or 4 bytes.
ucs2 | Same as utf16le.
base64 | Base64 string encoding.
hex | Encodes each byte as two hexadecimal characters.
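As a quick sketch of the encodings in Table 5.1, the same bytes can be rendered as hex or base64 strings and decoded back again (the sample string here is just illustrative):

```javascript
var buf = new Buffer("Hello", 'utf8');
console.log(buf.toString('hex'));      // 48656c6c6f
console.log(buf.toString('base64'));   // SGVsbG8=
// Decoding with the same encoding recovers the original text.
var fromHex = new Buffer('48656c6c6f', 'hex');
console.log(fromHex.toString('utf8')); // Hello
```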
Big Endian and Little Endian

Binary data in buffers is stored as a series of octets, each a sequence of eight 0s and 1s that can be a hexadecimal value of 0x00 to 0xFF. It can be read as a single byte or as a word containing multiple bytes. Endianness defines the ordering of bytes within a multi-byte word. Big endian stores the most significant byte first, and little endian stores the least significant byte first. For example, the bytes 0x0A 0x0B 0x0C 0x0D of a 32-bit word would be stored in the buffer as [0x0A, 0x0B, 0x0C, 0x0D] in big endian but as [0x0D, 0x0C, 0x0B, 0x0A] in little endian.
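The multi-byte read and write methods on Buffer objects make the byte order explicit. This sketch stores the same 32-bit value both ways:

```javascript
var buf = new Buffer(4);
buf.writeUInt32BE(0x0A0B0C0D, 0);   // most significant byte first
console.log(buf);                    // <Buffer 0a 0b 0c 0d>
buf.writeUInt32LE(0x0A0B0C0D, 0);   // least significant byte first
console.log(buf);                    // <Buffer 0d 0c 0b 0a>
```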
Buffer objects are actually raw memory allocations; therefore, their size must be determined when they are created. The three methods for creating Buffer objects using the new keyword are

new Buffer(sizeInBytes)
new Buffer(octetArray)
new Buffer(string, [encoding])

For example, the following lines of code define buffers using a byte size, an octet array, and a UTF8 string:

var buf256 = new Buffer(256);
var bufOctets = new Buffer([0x6f, 0x63, 0x74, 0x65, 0x74, 0x73]);
var bufUTF8 = new Buffer("Some UTF8 Text \u00b6 \u30c6 \u20ac", 'utf8');
You cannot extend the size of a Buffer object after it has been created, but you can write data to any location in the buffer. Table 5.2 describes the three methods you can use when writing to buffers.
Table 5.2 Methods of writing to Buffer objects

Method | Description
write(string, [offset], [length], [encoding]) | Writes string to the buffer starting at the offset index, writing at most length bytes using the specified encoding (utf8 by default).
buffer[index] = value | Replaces the data at index with the octet value specified.
fill(value, [offset], [end]) | Writes the value to every byte in the buffer starting at the offset index and ending at the end index.
There is also a wide range of methods for writing specific data types to Buffer objects, such as writeInt8() and writeUInt16BE(); see the Node.js Buffer documentation for the full list.
To better illustrate writing to buffers, Listing 5.1 defines a buffer, fills it with zeros, writes some text at the beginning using the write() method at line 3, and then alters part of the existing buffer using write(string, offset, length) at line 5. Then at line 7 it adds a + to the end by directly setting the value of an index, as shown in Listing 5.1 Output. Notice that the buf256.write("more text", 9, 9) statement writes to the middle of the buffer and buf256[18] = 43 changes a single byte.
Listing 5.1 buffer_write.js: Various ways to write to a Buffer object

1 buf256 = new Buffer(256);
2 buf256.fill(0);
3 buf256.write("add some text");
4 console.log(buf256.toString());
5 buf256.write("more text", 9, 9);
6 console.log(buf256.toString());
7 buf256[18] = 43;
8 console.log(buf256.toString());

Listing 5.1 Output buffer_write.js: Writing data to a Buffer object

C:\books\node\ch05>node buffer_write.js
add some text
add some more text
add some more text+
There are several methods for reading from buffers. The simplest is to use the toString() method to convert all or part of a buffer to a string. However, you can also access specific indexes in the buffer directly or use the typed read methods. Node.js additionally provides a StringDecoder object that has a write(buffer) method that decodes and returns buffered data using the specified encoding. Table 5.3 describes these methods for reading Buffer objects.
Table 5.3 Methods of reading from Buffer objects

Method | Description
toString([encoding], [start], [end]) | Returns a string containing the decoded characters specified by encoding from the start index to the end index of the buffer.
stringDecoder.write(buffer) | Returns a decoded string version of the buffer.
buffer[index] | Returns the octet value in the buffer at the specified index.

There is also a wide range of methods for reading specific data types from Buffer objects, such as readInt8() and readUInt16LE(); see the Node.js Buffer documentation for the full list.
To illustrate reading from buffers, Listing 5.2 defines a buffer with UTF8-encoded characters and then uses toString() without parameters to read the whole buffer, followed by a call with the encoding, start, and end parameters to read part of the buffer. Then in lines 4 and 5 it creates a StringDecoder with UTF8 encoding and uses it to write the contents of the buffer out to the console. Next, a direct access method is used to get the value of the octet at index 18. Listing 5.2 Output shows the output of the code.
Listing 5.2 buffer_read.js: Various ways to read from a Buffer object

1 bufUTF8 = new Buffer("Some UTF8 Text \u00b6 \u30c6 \u20ac", 'utf8');
2 console.log(bufUTF8.toString());
3 console.log(bufUTF8.toString('utf8', 5, 9));
4 var StringDecoder = require('string_decoder').StringDecoder;
5 var decoder = new StringDecoder('utf8');
6 console.log(decoder.write(bufUTF8));
7 console.log(bufUTF8[18].toString(16));
8 console.log(bufUTF8.slice(18, 22).toString('hex'));

Listing 5.2 Output buffer_read.js: Reading data from a Buffer object

C:\books\node\ch05>node buffer_read.js
Some UTF8 Text ¶ テ €
UTF8
Some UTF8 Text ¶ テ €
e3
e3838620
A common task when dealing with buffers is determining the length, especially when you create a buffer dynamically from a string. The length of a buffer can be determined by calling .length on the Buffer object. To determine the byte length that a string takes up in a buffer, however, you cannot use the .length property of the string. Instead you need to use Buffer.byteLength(string, [encoding]). Note that there is a difference between the string length and the byte length of a buffer. To illustrate this, consider the following statements:

"UTF8 text \u00b6".length;                     //evaluates to 11
Buffer.byteLength("UTF8 text \u00b6", 'utf8'); //evaluates to 12
Buffer("UTF8 text \u00b6").length;             //evaluates to 12
Notice that the same string evaluates to 11 characters, but because it contains a double-byte character, the byteLength is 12. Also note that Buffer("UTF8 text \u00b6").length evaluates to 12 as well, because .length on a buffer returns the byte length.
An important part of working with buffers is the ability to copy data from one buffer into another. Node.js provides the copy(targetBuffer, [targetStart], [sourceStart], [sourceEnd]) method on Buffer objects. The targetBuffer parameter is another Buffer object, and targetStart, sourceStart, and sourceEnd are indexes inside the source and target buffers.
Note
To copy string data from one buffer to the next, make sure that both buffers use the same encoding or you may get unexpected results when decoding the resulting buffer.
You can also copy data from one buffer to the other by indexing them directly, for example:

destinationBuffer[index] = sourceBuffer[index];
Listing 5.3 illustrates three examples of copying data from one buffer to another. The first method in lines 4–8 copies the full buffer. The next method in lines 10–14 copies only the middle 5 bytes of a buffer. The third example iterates through the source buffer and only copies every other byte in the buffer. The results are shown in Listing 5.3 Output.
Listing 5.3 buffer_copy.js: Various ways to copy data from one Buffer object to another

01 var alphabet = new Buffer('abcdefghijklmnopqrstuvwxyz');
02 console.log(alphabet.toString());
03 // copy full buffer
04 var blank = new Buffer(26);
05 blank.fill();
06 console.log("Blank: " + blank.toString());
07 alphabet.copy(blank);
08 console.log("Blank: " + blank.toString());
09 // copy part of buffer
10 var dashes = new Buffer(26);
11 dashes.fill('-');
12 console.log("Dashes: " + dashes.toString());
13 alphabet.copy(dashes, 10, 10, 15);
14 console.log("Dashes: " + dashes.toString());
15 // copy to and from direct indexes of buffers
16 var dots = new Buffer('-------------------------');
17 dots.fill('.');
18 console.log("dots: " + dots.toString());
19 for (var i=0; i < dots.length; i++){
20   if (i % 2) { dots[i] = alphabet[i]; }
21 }
22 console.log("dots: " + dots.toString());

Listing 5.3 Output buffer_copy.js: Copying data from one Buffer object to another

C:\books\node\ch05>node buffer_copy.js
abcdefghijklmnopqrstuvwxyz
Blank:
Blank: abcdefghijklmnopqrstuvwxyz
Dashes: --------------------------
Dashes: ----------klmno-----------
dots: .........................
dots: .b.d.f.h.j.l.n.p.r.t.v.x.
Another important aspect of working with buffers is the ability to divide them into slices. A slice is a section of a buffer between a starting index and an ending index. Slicing a buffer allows you to manipulate a specific chunk.
Slices are created using the slice([start], [end]) method, which returns a Buffer object that points to the start index of the original buffer and has a length of end – start. Keep in mind that a slice is different from a copy: if you edit a copy, the original does not change, but if you edit a slice, the original does change.
Listing 5.4 illustrates using slices. Note that when the slice is altered in lines 5 and 6, it also alters the original buffer, as shown in Listing 5.4 Output.
Listing 5.4 buffer_slice.js: Creating and manipulating slices of a Buffer object

1 var numbers = new Buffer("123456789");
2 console.log(numbers.toString());
3 var slice = numbers.slice(3, 6);
4 console.log(slice.toString());
5 slice[0] = '#'.charCodeAt(0);
6 slice[slice.length-1] = '#'.charCodeAt(0);
7 console.log(slice.toString());
8 console.log(numbers.toString());

Listing 5.4 Output buffer_slice.js: Slicing and modifying a Buffer object

C:\books\node\ch05>node buffer_slice.js
123456789
456
#5#
123#5#789
You can also concatenate two or more Buffer objects together to form a new buffer. The concat(list, [totalLength]) method accepts an array of Buffer objects as the first parameter, and the optional totalLength defines the maximum number of bytes in the result. The Buffer objects are concatenated in the order they appear in the list, and a new Buffer object is returned containing their contents up to totalLength bytes.
If you do not provide a totalLength parameter, concat() calculates it for you. However, doing so requires iterating through the list, so providing a totalLength value is faster.
Listing 5.5 concatenates a base Buffer with one buffer and then another, as shown in Listing 5.5 Output.
Listing 5.5 buffer_concat.js: Concatenating Buffer objects

1 var af = new Buffer("African Swallow?");
2 var eu = new Buffer("European Swallow?");
3 var question = new Buffer("Air Speed Velocity of an ");
4 console.log(Buffer.concat([question, af]).toString());
5 console.log(Buffer.concat([question, eu]).toString());

Listing 5.5 Output buffer_concat.js: Concatenating Buffer objects

C:\books\node\ch05>node buffer_concat.js
Air Speed Velocity of an African Swallow?
Air Speed Velocity of an European Swallow?
An important module in Node.js is the stream module. Data streams are memory structures that are readable, writable, or both. Streams are used all over in Node.js, for example, when accessing files, reading data from HTTP requests, and in several other areas. This section covers using the Stream module to create streams as well as read and write data from them.
The purpose of streams is to provide a common mechanism to transfer data from one location to another. They also expose events, such as when data is available to be read, when an error occurs, and so on. You can then register listeners to handle the data when it becomes available in a stream or is ready to be written to.
Some common uses for streams are HTTP data and files. You can open a file as a readable stream or access the data from an HTTP request as a readable stream and read bytes out as needed. Additionally, you can create your own custom streams. The following sections describe the process of creating and using readable, writable, duplex, and transform streams.
Readable Streams

Readable streams provide a mechanism to easily read data coming into your application from another source. Some common examples of readable streams are
HTTP responses on the client
HTTP requests on the server
fs read streams
zlib streams
crypto streams
TCP sockets
Child process stdout and stderr
process.stdin
Readable streams provide the read([size]) method to read data, where size specifies the number of bytes to read from the stream. read() can return a String, Buffer, or null. Readable streams also expose the following events:
readable: Emitted when a chunk of data can be read from the stream.
data: Similar to readable except that when data event handlers are attached, the stream is put into flowing mode, and the data handler is called continuously until all data has been drained.
end: Emitted by the stream when no more data will be provided.
close: Emitted when the underlying resource, such as a file, has been closed.
error: Emitted when an error occurs receiving data.
Readable stream objects also provide a number of functions that allow you to read and manipulate them. Table 5.4 lists the methods available on a Readable stream object.

Table 5.4 Methods available on Readable stream objects
Method | Description
read([size]) | Reads data from the stream. The data can be a String, Buffer, or null, where null means there is no more data left.
setEncoding(encoding) | Sets the encoding to use when returning String values from read() requests.
pause() | Pauses data events from being emitted by the object.
resume() | Resumes data events being emitted by the object.
pipe(destination, [options]) | Pipes the output of this stream into a Writable stream specified by destination.
unpipe([destination]) | Disconnects this object from the Writable destination.
To implement your own custom Readable stream object, you need to first inherit the functionality for Readable streams. The simplest way to do that is to use the util module's inherits() method:

var util = require('util');
util.inherits(MyReadableStream, stream.Readable);

Then, in your object's constructor, call:

stream.Readable.call(this, opt);
You also need to implement a _read() method that calls push() to output the data from the Readable object. The push() call should push either a String, Buffer, or null.
Listing 5.6 illustrates the basics of implementing and reading from a Readable stream. Notice that the Answers() class inherits from Readable and then implements the Answers.prototype._read() function to handle pushing data out. Also notice that on line 18, a direct read() call reads the first item from the stream, and then the rest of the items are read by the data event handler defined on lines 19–21. Listing 5.6 Output shows the result.
Listing 5.6 stream_read.js: Implementing a Readable stream object

01 var stream = require('stream');
02 var util = require('util');
03 util.inherits(Answers, stream.Readable);
04 function Answers(opt) {
05   stream.Readable.call(this, opt);
06   this.quotes = ["yes", "no", "maybe"];
07   this._index = 0;
08 }
09 Answers.prototype._read = function() {
10   if (this._index >= this.quotes.length){
11     this.push(null);
12   } else {
13     this.push(this.quotes[this._index]);
14     this._index += 1;
15   }
16 };
17 var r = new Answers();
18 console.log("Direct read: " + r.read().toString());
19 r.on('data', function(data){
20   console.log("Callback read: " + data.toString());
21 });
22 r.on('end', function(data){
23   console.log("No more answers.");
24 });

Listing 5.6 Output stream_read.js: Implementing a custom Readable object

C:\books\node\ch05>node stream_read.js
Direct read: yes
Callback read: no
Callback read: maybe
No more answers.
Writable Streams

Writable streams are designed to provide a mechanism to write data into a form that can easily be consumed in another area of code. Some common examples of Writable streams are
HTTP requests on the client
HTTP responses on the server
fs write streams
zlib streams
crypto streams
TCP sockets
Child process stdin
process.stdout, process.stderr
Writable streams provide the write(chunk, [encoding], [callback]) method to write data into the stream, where chunk contains the data to write, encoding specifies the string encoding if necessary, and callback specifies a callback function to execute when the data has been fully flushed. The write() function returns true if the data was written successfully. Writable streams also expose the following events:
drain: After a write() call returns false, the drain event is emitted to notify listeners when it is okay to begin writing more data.
finish: Emitted when end() is called on the Writable object; all data is flushed and no more data will be accepted.
pipe: Emitted when the pipe() method is called on a Readable stream to add this Writable as a destination.
unpipe: Emitted when the unpipe() method is called on a Readable stream to remove this Writable as a destination.
Writable stream objects also provide a number of methods that allow you to write and manipulate them. Table 5.5 lists the methods available on a Writable stream object.

Table 5.5 Methods available on Writable stream objects
Method | Description
write(chunk, [encoding], [callback]) | Writes the data chunk to the stream object's data location. The data can be a String or Buffer; if encoding is specified, it is used to encode string data.
end([chunk], [encoding], [callback]) | Same as write(), except it puts the Writable into a state where it no longer accepts data and emits the finish event.
To implement your own custom Writable stream object, you need to first inherit the functionality for Writable streams. The simplest way to do that is to use the util module's inherits() method:

var util = require('util');
util.inherits(MyWritableStream, stream.Writable);

Then, in your object's constructor, call:

stream.Writable.call(this, opt);
You also need to implement a _write(data, encoding, callback) method that stores the data for the Writable object. Listing 5.7 illustrates the basics of implementing and writing to a Writable stream. Listing 5.7 Output shows the result.
Listing 5.7 stream_write.js: Implementing a Writable stream object

01 var stream = require('stream');
02 var util = require('util');
03 util.inherits(Writer, stream.Writable);
04 function Writer(opt) {
05   stream.Writable.call(this, opt);
06   this.data = new Array();
07 }
08 Writer.prototype._write = function(data, encoding, callback) {
09   this.data.push(data.toString('utf8'));
10   console.log("Adding: " + data);
11   callback();
12 };
13 var w = new Writer();
14 for (var i=1; i<=5; i++){
15   w.write("Item" + i, 'utf8');
16 }
17 w.end("ItemLast");
18 console.log(w.data);

Listing 5.7 Output stream_write.js: Implementing a custom Writable object

C:\books\node\ch05>node stream_write.js
Adding: Item1
Adding: Item2
Adding: Item3
Adding: Item4
Adding: Item5
Adding: ItemLast
[ 'Item1', 'Item2', 'Item3', 'Item4', 'Item5', 'ItemLast' ]
Duplex Streams

A Duplex stream combines Readable and Writable functionality. A good example of a duplex stream is a TCP socket connection. You can read from and write to the socket connection once it has been created.
To implement your own custom Duplex stream object, you need to first inherit the functionality for Duplex streams. The simplest way to do that is to use the util module's inherits() method:

var util = require('util');
util.inherits(MyDuplexStream, stream.Duplex);

Then, in your object's constructor, call:

stream.Duplex.call(this, opt);
The opt parameter when creating a Duplex stream accepts an object with the property allowHalfOpen set to true or false. If this option is true, the readable side stays open even after the writable side has ended, and vice versa. If this option is set to false, ending the writable side also ends the readable side, and vice versa.
When you implement a Duplex stream, you need to implement both a _read(size) and a _write(data, encoding, callback) method when prototyping your Duplex class.
Listing 5.8 illustrates the basics of writing to and reading from a Duplex stream. The example is basic but shows the main concepts. The Duplexer() class inherits from the Duplex stream and implements a rudimentary _write() function that stores data in an array in the object. The _read() function uses shift() to get the first item in the array; it pushes null to end the stream if the item equals "stop", pushes the item if there is a value, or sets a timeout timer to call back into _read() if there is no value.
In Listing 5.8 Output, notice that the first two writes, "I think, " and "therefore ", are read together. This is because both were pushed to the Readable side before the data event was triggered.
Listing 5.8 stream_duplex.js: Implementing a Duplex stream object

01 var stream = require('stream');
02 var util = require('util');
03 util.inherits(Duplexer, stream.Duplex);
04 function Duplexer(opt) {
05   stream.Duplex.call(this, opt);
06   this.data = [];
07 }
08 Duplexer.prototype._read = function readItem(size) {
09   var chunk = this.data.shift();
10   if (chunk == "stop"){
11     this.push(null);
12   } else {
13     if(chunk){
14       this.push(chunk);
15     } else {
16       setTimeout(readItem.bind(this), 500, size);
17     }
18   }
19 };
20 Duplexer.prototype._write = function(data, encoding, callback) {
21   this.data.push(data);
22   callback();
23 };
24 var d = new Duplexer();
25 d.on('data', function(chunk){
26   console.log('read: ', chunk.toString());
27 });
28 d.on('end', function(){
29   console.log('Message Complete');
30 });
31 d.write("I think, ");
32 d.write("therefore ");
33 d.write("I am.");
34 d.write("Rene Descartes");
35 d.write("stop");

Listing 5.8 Output stream_duplex.js: Implementing a custom Duplex object

C:\books\node\ch05>node stream_duplex.js
read:  I think, therefore 
read:  I am.
read:  Rene Descartes
Message Complete
Transform Streams

Another type of stream is the Transform stream. A Transform stream extends the Duplex stream but modifies the data between the Writable stream and the Readable stream. This can be useful when you need to modify data from one system to another. Some examples of Transform streams are
zlib streams
crypto streams
A major difference between Duplex and Transform streams is that for Transform streams you do not need to implement the _read() and _write() prototype methods; these are provided as pass-through functions. Instead, you implement the _transform(chunk, encoding, callback) and _flush(callback) methods. The _transform() method should accept the data from write() requests, modify it, and then push() out the modified data.
Listing 5.9 illustrates the basics of implementing a Transform stream. The stream accepts JSON strings, converts them to objects, and then emits a custom event named object that sends the object to any listeners. The _transform() function also modifies the object to include a handled property and then sends the string form on. Notice that lines 18–21 implement the object event handler function that displays certain attributes. In Listing 5.9 Output, notice that the JSON strings now include the handled property.
Listing 5.9 stream_transform.js: Implementing a Transform stream object

01 var stream = require("stream");
02 var util = require("util");
03 util.inherits(JSONObjectStream, stream.Transform);
04 function JSONObjectStream (opt) {
05   stream.Transform.call(this, opt);
06 }
07 JSONObjectStream.prototype._transform = function (data, encoding, callback) {
08   var object = data ? JSON.parse(data.toString()) : {};
09   this.emit("object", object);
10   object.handled = true;
11   this.push(JSON.stringify(object));
12   callback();
13 };
14 JSONObjectStream.prototype._flush = function(cb) {
15   cb();
16 };
17 var tc = new JSONObjectStream();
18 tc.on("object", function(object){
19   console.log("Name: %s", object.name);
20   console.log("Color: %s", object.color);
21 });
22 tc.on("data", function(data){
23   console.log("Data: %s", data.toString());
24 });
25 tc.write('{"name":"Carolinus", "color": "Green"}');
26 tc.write('{"name":"Solarius", "color": "Blue"}');
27 tc.write('{"name":"Lo Tae Zhao", "color": "Gold"}');
28 tc.write('{"name":"Ommadon", "color": "Red"}');

Listing 5.9 Output stream_transform.js: Implementing a custom Transform object

C:\books\node\ch05>node stream_transform.js
Name: Carolinus
Color: Green
Data: {"name":"Carolinus","color":"Green","handled":true}
Name: Solarius
Color: Blue
Data: {"name":"Solarius","color":"Blue","handled":true}
Name: Lo Tae Zhao
Color: Gold
Data: {"name":"Lo Tae Zhao","color":"Gold","handled":true}
Name: Ommadon
Color: Red
Data: {"name":"Ommadon","color":"Red","handled":true}
Piping Readable Streams to Writable Streams

One of the most useful things you can do with stream objects is to chain Readable streams to Writable streams using the pipe(writableStream, [options]) function. This does exactly what the name implies: the output from the Readable stream is directly input into the Writable stream. The options parameter accepts an object with the end property set to true or false. When end is true, the Writable stream ends when the Readable stream ends. This is the default behavior. For example:
readStream.pipe(writeStream, {end:true});
You can also break the pipe programmatically using the unpipe(destinationStream) method. Listing 5.10 implements a Readable stream and a Writable stream and then uses the pipe() function to chain them together. To show you the basic process, the data input from the _write() method is output to the console in Listing 5.10 Output.
Listing 5.10 stream_piped.js: Piping a Readable stream into a Writable stream

01 var stream = require('stream');
02 var util = require('util');
03 util.inherits(Reader, stream.Readable);
04 util.inherits(Writer, stream.Writable);
05 function Reader(opt) {
06   stream.Readable.call(this, opt);
07   this._index = 1;
08 }
09 Reader.prototype._read = function(size) {
10   var i = this._index++;
11   if (i > 10){
12     this.push(null);
13   } else {
14     this.push("Item " + i.toString());
15   }
16 };
17 function Writer(opt) {
18   stream.Writable.call(this, opt);
19   this._index = 1;
20 }
21 Writer.prototype._write = function(data, encoding, callback) {
22   console.log(data.toString());
23   callback();
24 };
25 var r = new Reader();
26 var w = new Writer();
27 r.pipe(w);

Listing 5.10 Output stream_piped.js: Implementing stream piping

C:\books\node\ch05>node stream_piped.js
Item 1
Item 2
Item 3
Item 4
Item 5
Item 6
Item 7
Item 8
Item 9
Item 10
Zlib
When working with large systems or moving large amounts of data around, it is helpful to be able to compress and decompress the data. Node.js provides an excellent library in the Zlib module that allows you to easily and efficiently compress and decompress data in buffers.

Keep in mind that compressing data takes CPU cycles, so you should be certain of the benefits of compressing the data before incurring the compression/decompression cost. The compression methods supported by Zlib are
gzip/gunzip: Standard gzip compression
deflate/inflate: Standard deflate compression algorithm based on Huffman coding
deflateRaw/inflateRaw: Deflate compression algorithm on a raw buffer
The Zlib module provides several helper functions that make it easy to compress and decompress data buffers. These all use the same basic format of function(buffer, callback), where function is the compression/decompression method, buffer is the buffer to be compressed/decompressed, and callback is the callback function executed after the compression/decompression occurs.
The simplest way to illustrate buffer compression/decompression is to show you some examples. Listing 5.11 provides several compression/decompression examples, and the size result of each example is shown in Listing 5.11 Output.
Listing 5.11 zlib_buffers.js: Compressing/decompressing buffers using the Zlib module

01 var zlib = require("zlib");
02 var input = '...............text...............';
03 zlib.deflate(input, function(err, buffer) {
04   if (!err) {
05     console.log("deflate (%s): ", buffer.length, buffer.toString('base64'));
06     zlib.inflate(buffer, function(err, buffer) {
07       if (!err) {
08         console.log("inflate (%s): ", buffer.length, buffer.toString());
09       }
10     });
11     zlib.unzip(buffer, function(err, buffer) {
12       if (!err) {
13         console.log("unzip deflate (%s): ", buffer.length, buffer.toString());
14       }
15     });
16   }
17 });
18
19 zlib.deflateRaw(input, function(err, buffer) {
20   if (!err) {
21     console.log("deflateRaw (%s): ", buffer.length, buffer.toString('base64'));
22     zlib.inflateRaw(buffer, function(err, buffer) {
23       if (!err) {
24         console.log("inflateRaw (%s): ", buffer.length, buffer.toString());
25       }
26     });
27   }
28 });
29
30 zlib.gzip(input, function(err, buffer) {
31   if (!err) {
32     console.log("gzip (%s): ", buffer.length, buffer.toString('base64'));
33     zlib.gunzip(buffer, function(err, buffer) {
34       if (!err) {
35         console.log("gunzip (%s): ", buffer.length, buffer.toString());
36       }
37     });
38     zlib.unzip(buffer, function(err, buffer) {
39       if (!err) {
40         console.log("unzip gzip (%s): ", buffer.length, buffer.toString());
41       }
42     });
43   }
44 });

Listing 5.11 Output zlib_buffers.js: Compressing/decompressing buffers

C:\books\node\ch05>node zlib_buffers.js
deflate (18):  eJzT00MBJakVJagiegB9Zgcq
deflateRaw (12):  09NDASWpFSWoInoA
gzip (30):  H4sIAAAAAAAAC9PTQwElqRUlqCJ6AIq+x+AiAAAA
inflate (34):  ...............text...............
unzip deflate (34):  ...............text...............
inflateRaw (34):  ...............text...............
gunzip (34):  ...............text...............
unzip gzip (34):  ...............text...............
Compressing/decompressing streams using Zlib is slightly different from compressing/decompressing buffers. Instead, you use the pipe() function to pipe the data from one stream through the compression/decompression object into another stream. This can apply to compressing any Readable stream into a Writable stream.
A good example of doing this is compressing the contents of a file using fs.ReadStream and fs.WriteStream. Listing 5.12 shows an example of compressing the contents of a file using a zlib.Gzip() object and then decompressing it back using a zlib.Gunzip() object.
Listing 5.12 zlib_file.js: Compressing/decompressing a file stream using the Zlib module

01 var zlib = require("zlib");
02 var fs = require('fs');
03 var gzip = zlib.createGzip();
04 var inFile = fs.createReadStream('zlib_file.js');
05 var outFile = fs.createWriteStream('zlib_file.gz');
06 inFile.pipe(gzip).pipe(outFile);
07 outFile.on('finish', function() {
08   var gunzip = zlib.createGunzip();
09   var inFile = fs.createReadStream('zlib_file.gz');
10   var outFile = fs.createWriteStream('zlib_file.unzipped');
11   inFile.pipe(gunzip).pipe(outFile);
12 });

Note that the decompression step waits for the write stream's finish event; starting to read zlib_file.gz before the compressed data has been fully flushed would be a race condition.
At the heart of most intense web applications and services is a lot of data streaming from one system to another. In this chapter, you learned how to use functionality built into Node.js to work with JSON data, manipulate binary buffer data, and utilize data streams. You also learned about compressing buffered data as well as running data streams through compression/decompression.