
Events

Many objects in the Node API emit events; these objects are instances of events.EventEmitter. Any object can extend EventEmitter, providing Node developers with a simple and uniform way to build tight, asynchronous interfaces to object methods.

The following code sets Node's EventEmitter object as the prototype of a function constructor we define. Each constructed instance has the EventEmitter object exposed on its prototype chain, providing a natural reference to the event API. The Counter instance's methods emit events, and the code that follows listens for them. After making a Counter, we listen for the incremented event, specifying a callback Node will call when the event happens. Then, we call increment twice. Each time, our Counter increments the internal count it holds and then emits the incremented event. This calls our callback, giving it the current count, which our callback logs:

// File counter.js
// Load Node's 'events' module, and point directly to EventEmitter there
const EventEmitter = require('events').EventEmitter;
// Define our Counter function
const Counter = function(i) {      // Takes a starting number
  this.increment = function() {    // The counter's increment method
    i++;                           // Increment the count we hold
    this.emit('incremented', i);   // Emit an event named incremented
  }
}
// Base our Counter on Node's EventEmitter
Counter.prototype = new EventEmitter(); // We did this afterwards, not before!
// Now that we've defined our objects, let's see them in action
// Make a new Counter starting at 10
const counter = new Counter(10);
// Define a callback function which logs the number n you give it
const callback = function(n) {
  console.log(n);
}
// Counter is an EventEmitter, so it comes with addListener
counter.addListener('incremented', callback);
counter.increment(); // 11
counter.increment(); // 12

The following is the output for counter.js:

$ node counter.js
11
12

To remove the event listeners bound to counter, use this:

counter.removeListener('incremented', callback);

For consistency with browser-based JavaScript, counter.on and counter.addListener are interchangeable.
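In newer code, you will often see the same pattern expressed with ES2015 class syntax, extending EventEmitter directly rather than assigning to the constructor's prototype. The following is a minimal sketch of counter.js written that way:

// A class-based Counter; extends EventEmitter directly
const EventEmitter = require('events').EventEmitter;

class Counter extends EventEmitter {
  constructor(i) {
    super();        // Set up EventEmitter's internal listener state
    this.count = i; // Keep the count on the instance
  }
  increment() {
    this.count++;
    this.emit('incremented', this.count); // Same event as before
  }
}

const counter = new Counter(10);
counter.on('incremented', n => console.log(n)); // on is interchangeable with addListener
counter.increment(); // 11
counter.increment(); // 12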

Node brought EventEmitter to JavaScript and made it an object your objects can extend. This greatly increases the possibilities available to developers. With EventEmitter, Node can handle I/O data streams in an event-oriented manner, performing long-running tasks while keeping true to Node's principles of asynchronous, non-blocking programming:

// File stream.js
// Use Node's stream module, and get Readable inside
let Readable = require('stream').Readable;
// Make our own readable stream, named r
let r = new Readable;
// Start the count at 0
let count = 0;
// Downstream code will call r's _read function when it wants some data from r
r._read = function() {
  count++;
  if (count > 10) {       // After our count has grown beyond 10
    return r.push(null);  // Push null downstream to signal we've got no more data
  }
  setTimeout(() => r.push(count + '\n'), 500); // A half second from now, push our count on a line
};
// Have our readable send the data it produces to standard out
r.pipe(process.stdout);

The following is the output for stream.js:

$ node stream.js
1
2
3
4
5
6
7
8
9
10

This example creates a readable stream r, and pipes its output to standard out. Every 500 milliseconds, the code increments a counter and pushes a line of text with the current count downstream. Try running the program yourself, and you'll see the series of numbers appear on your terminal.

On what would be the 11th count, r pushes null downstream, indicating that it has no more data to send. This closes the stream, and with nothing more to do, Node exits the process.

Subsequent chapters will explain streams in more detail. Here, just note how pushing data onto a stream causes an event to fire, how you can assign a custom callback to handle this event, and how the data flows downstream.
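For instance, if you would rather handle those events yourself than pipe the stream, you could replace the r.pipe(process.stdout) line in stream.js with listeners. This is a minimal sketch, assuming the same r as above:

// Consume r by listening for its events rather than piping it
r.on('data', chunk => process.stdout.write(chunk));     // Called each time r pushes data downstream
r.on('end', () => console.log('The stream has ended')); // Called once r pushes null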

Node consistently implements I/O operations as asynchronous, evented data streams. This design choice enables Node's excellent performance. Instead of creating a thread (or spinning up an entire process) for a long-running task like a file upload that a stream may represent, Node needs to commit only the resources required to handle callbacks. Additionally, in the long stretches of time between the short moments when the stream is pushing data, Node's event loop is free to process other instructions.
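You can observe this for yourself. If you append something like the following lines to stream.js (a sketch, assuming the r from the example above), the interval messages will keep interleaving with the counts, because the half-second waits never block the event loop:

// While r is waiting on its timeouts, the event loop is free to run this interval
const tick = setInterval(() => console.log('event loop is free'), 200);
r.on('end', () => clearInterval(tick)); // Stop the interval once the stream has finished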

As an exercise, re-implement stream.js to send the data r produces to a file instead of the terminal. You'll need to make a new writable stream w, using Node's fs.createWriteStream:

// File stream2file.js
// Bring in Node's file system module
const fs = require('fs');
// Make the file counter.txt we can fill by writing data to writable stream w
const w = fs.createWriteStream('./counter.txt', { flags: 'w', mode: 0o666 });
...
// Put w beneath r instead
r.pipe(w);
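If you want to check your work against one possible answer, here is a complete sketch of stream2file.js, assuming the elided lines are the same Readable setup used in stream.js:

// File stream2file.js -- one possible complete version
const fs = require('fs');
const Readable = require('stream').Readable;

// The same readable stream r as in stream.js
let r = new Readable();
let count = 0;
r._read = function() {
  count++;
  if (count > 10) {
    return r.push(null); // No more data; this ends the stream
  }
  setTimeout(() => r.push(count + '\n'), 500); // Push the next count after half a second
};

// Stream the counts into counter.txt instead of the terminal
const w = fs.createWriteStream('./counter.txt', { flags: 'w', mode: 0o666 });
r.pipe(w);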