
Understanding the event architecture of Node.js


Most Node objects — such as HTTP requests, responses, and streams — inherit from the EventEmitter module, through which they can emit and listen for events.


 const EventEmitter = require('events') 

The simplest form of event management is the callback style of some popular Node.js functions, for example fs.readFile. In this style, an event is fired exactly once (when Node is ready to invoke the callback), and the callback acts as the event handler. Let's first analyze this basic form of event-driven architecture.


Call me when you're ready, Node!


Initially, Node handled asynchronous events using callbacks. That was a long time ago, before native support for promises and the async/await feature appeared in JavaScript. Callbacks are simply functions that you pass to other functions. This is possible in JavaScript because functions are first-class objects.


It is important to understand that callbacks are not indicators of an asynchronous call in the code. A function can call a callback both synchronously and asynchronously. For example, the host function fileSize accepts a callback function cb and calls it either synchronously or asynchronously, depending on a condition:


 function fileSize(fileName, cb) {
   if (typeof fileName !== 'string') {
     return cb(new TypeError('argument should be string')); // Sync
   }
   fs.stat(fileName, (err, stats) => {
     if (err) { return cb(err); } // Async
     cb(null, stats.size);        // Async
   });
 }

This is a bad approach that leads to unexpected errors. Write host functions that always call their callbacks either synchronously or asynchronously, never both.


Let's look at a simple example of a typical asynchronous Node function written in a callback style:


 const readFileAsArray = function(file, cb) {
   fs.readFile(file, function(err, data) {
     if (err) {
       return cb(err);
     }
     const lines = data.toString().trim().split('\n');
     cb(null, lines);
   });
 };

readFileAsArray takes a file path and a callback function. It reads the contents of the file, splits it into an array of lines, and calls the callback with that array. Here's how to use it. Suppose the file numbers.txt is in the same directory and contains:


 10
 11
 12
 13
 14
 15

If our task is to count the odd numbers in this file, readFileAsArray simplifies the code:


 readFileAsArray('./numbers.txt', (err, lines) => {
   if (err) throw err;
   const numbers = lines.map(Number);
   const oddNumbers = numbers.filter(n => n % 2 === 1);
   console.log('Odd numbers count:', oddNumbers.length);
 });

This code reads the numeric content as an array of strings, parses them as numbers, and counts the odd ones.
It follows the typical Node callback style: the callback takes an error-first argument err, which can be null, and we pass the callback as the last argument to the host function. Always follow this convention in your own functions, because users will likely count on it. Let your host function receive the callback as its last argument, and let the callback expect an error object as its first argument.


Modern JS alternatives to callbacks


Modern JavaScript has promise objects, which can serve as an alternative to callbacks for asynchronous APIs. Instead of passing a callback as an argument and handling errors in the same place, a promise lets you handle success and error cases separately, and also lets you chain multiple asynchronous calls instead of nesting them.


If the readFileAsArray function supports promises, then we can use it as follows:


 readFileAsArray('./numbers.txt')
   .then(lines => {
     const numbers = lines.map(Number);
     const oddNumbers = numbers.filter(n => n % 2 === 1);
     console.log('Odd numbers count:', oddNumbers.length);
   })
   .catch(console.error);

Instead of passing a callback, we call .then on the return value of the host function. .then gives us access to the same lines array that we receive in the callback version, so we can process it as before. To handle errors, we add a .catch call to the result, which gives us access to the error when it occurs.


Thanks to the Promise object in modern JavaScript, it has become easier for a host function to support a promise interface. Here is the readFileAsArray function, modified to support the promise interface in addition to the callback interface it already supports:


 const readFileAsArray = function(file, cb = () => {}) {
   return new Promise((resolve, reject) => {
     fs.readFile(file, function(err, data) {
       if (err) {
         reject(err);
         return cb(err);
       }
       const lines = data.toString().trim().split('\n');
       resolve(lines);
       cb(null, lines);
     });
   });
 };

The function returns a Promise object, which wraps the asynchronous fs.readFile call. The promise executor receives two functions as arguments: resolve and reject. Where we would call the callback with an error, we call reject; where we would call the callback with data, we call resolve.


The only other change is the default value for the callback argument, needed when the code is used through the promise interface. A simple empty function, () => {}, works as that default.


Using promises with async / await


Adding a promise interface makes your code much easier to work with when you need to use an asynchronous function in a loop. With callbacks, that situation gets complicated. Promises improve things a bit, as do generator functions. A more recent alternative for working with asynchronous code is the async function, which lets you treat asynchronous code as if it were synchronous, greatly improving readability.


Here's how to use the readFileAsArray function using async / await:


 async function countOdd() {
   try {
     const lines = await readFileAsArray('./numbers.txt');
     const numbers = lines.map(Number);
     const oddCount = numbers.filter(n => n % 2 === 1).length;
     console.log('Odd numbers count:', oddCount);
   } catch (err) {
     console.error(err);
   }
 }
 countOdd();

First we create the async function — a normal function prefixed with the keyword async. Inside it, we call readFileAsArray with the await keyword, as if it returned the lines variable synchronously, and then continue the code as though the whole call were synchronous. Finally, we invoke the async function itself. The result is simple and readable. To handle errors, we wrap the await call in a try/catch block.


Thanks to the async/await feature, we didn't need a special API (like .then and .catch). We only labeled functions differently and used plain JavaScript.


We can use async/await with any function that supports the promise interface — but not with callback-style asynchronous functions (such as setTimeout).
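A common workaround, sketched here, is to wrap a callback-style function like setTimeout in a promise so it becomes awaitable (the sleep helper name below is a hypothetical choice):

```javascript
// setTimeout takes a plain callback, not an error-first one, so it
// cannot be awaited directly. Wrapping it in a promise fixes that.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function demo() {
  await sleep(50); // pauses ~50ms without blocking the event loop
  return 'done';
}

// Usage: demo().then(console.log);
```

The same wrapping trick works for any callback-style API, which is also what util.promisify automates for error-first callbacks.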


EventEmitter module


An EventEmitter is a module that facilitates communication between objects in Node. It is the core of Node's asynchronous, event-driven architecture. Many of Node's built-in modules inherit from EventEmitter.


Its idea is simple: emitter objects emit named events, which cause previously registered listener functions to be invoked. So an emitter object has two main features: emitting named events, and registering (and unregistering) listener functions.



To work with EventEmitter, you need to create a class that extends it.


 class MyEmitter extends EventEmitter { } 

Emitters are the objects we instantiate from EventEmitter-based classes:


 const myEmitter = new MyEmitter(); 

At any point in the emitter's life cycle, we can use the emit function to fire any named event:


 myEmitter.emit('something-happened'); 

Emitting an event signals that some condition has occurred — usually a change in the state of the emitting object. Using the on method, we can register listener functions to be executed every time the emitter fires their associated named event.


Events !== asynchrony


Take a look at an example:


 const EventEmitter = require('events');

 class WithLog extends EventEmitter {
   execute(taskFunc) {
     console.log('Before executing');
     this.emit('begin');
     taskFunc();
     this.emit('end');
     console.log('After executing');
   }
 }

 const withLog = new WithLog();
 withLog.on('begin', () => console.log('About to execute'));
 withLog.on('end', () => console.log('Done with execute'));
 withLog.execute(() => console.log('*** Executing task ***'));

The WithLog class is an emitter. It defines one instance method, execute, which takes a single argument — the task function — and wraps its execution in log statements, emitting events before and after the execution.


To see the order in which everything runs, we register listeners for both named events and execute a sample task that triggers the whole chain.


Result:


 Before executing
 About to execute
 *** Executing task ***
 Done with execute
 After executing

What I want to note about the result of code execution: there is nothing asynchronous here.



Just like plain old callbacks, events don't imply synchronous or asynchronous code. This is important, because if we pass an asynchronous taskFunc to execute, the emitted events will no longer be accurate.


You can emulate this situation by calling setImmediate :


 // ...
 withLog.execute(() => {
   setImmediate(() => {
     console.log('*** Executing task ***');
   });
 });

Now the result will be:


 Before executing
 About to execute
 Done with execute
 After executing
 *** Executing task ***

This is wrong. The lines emitted after the asynchronous call — “Done with execute” and “After executing” — appear in the wrong order.


To generate an event after the completion of the asynchronous function, we need to combine callbacks (or promises) with this event-driven communication. This is demonstrated in the example below.


One advantage of using events instead of plain callbacks is that we can react to the same signal many times by registering multiple listeners. To achieve the same with callbacks, we would have to write more logic inside the single available callback. Events are a great way to let numerous external plugins add functionality on top of the application core. You can think of them as hook points for customizing behavior around state changes.


Asynchronous events


Let's transform our synchronous example into something asynchronous and a little more useful.


 const fs = require('fs');
 const EventEmitter = require('events');

 class WithTime extends EventEmitter {
   execute(asyncFunc, ...args) {
     this.emit('begin');
     console.time('execute');
     asyncFunc(...args, (err, data) => {
       if (err) {
         return this.emit('error', err);
       }
       this.emit('data', data);
       console.timeEnd('execute');
       this.emit('end');
     });
   }
 }

 const withTime = new WithTime();
 withTime.on('begin', () => console.log('About to execute'));
 withTime.on('end', () => console.log('Done with execute'));
 withTime.execute(fs.readFile, __filename);

The WithTime class executes asyncFunc and reports the time it takes using console.time and console.timeEnd calls. It emits the right sequence of events before and after the execution, and also emits error/data events to work with the usual asynchronous call signals.


We instantiate a withTime emitter and pass it the asynchronous function fs.readFile. Instead of handling the file data with a callback, we can now listen for the data event.


Running this code, we get the correct sequence of events as expected, along with the execution-time report:


 About to execute
 execute: 4.507ms
 Done with execute

Note that we had to combine a callback with the emitter to achieve this. If asyncFunc also supported promises, we could do the same with async/await:


 class WithTime extends EventEmitter {
   async execute(asyncFunc, ...args) {
     this.emit('begin');
     try {
       console.time('execute');
       const data = await asyncFunc(...args);
       this.emit('data', data);
       console.timeEnd('execute');
       this.emit('end');
     } catch (err) {
       this.emit('error', err);
     }
   }
 }

I don't know about you, but to me this reads much better than the callback-based code or the .then/.catch chains. The async/await feature brings us as close as possible to plain JavaScript, which I consider a great achievement.


Event arguments and errors


In the previous example, two events were emitted with extra arguments. The error event was emitted with an error object:


 this.emit('error', err); 

The data event was emitted with a data object:


 this.emit('data', data); 

After the event name, we can pass as many arguments as needed, and all of them will be available inside the listener functions registered for that named event.


For example, for the data event, a registered listener function gets access to the data argument that was passed when the event was emitted — and that data object is exactly what asyncFunc exposes.


 withTime.on('data', (data) => { // do something with data }); 

The error event is usually special. In our callback-based example, if we don't handle the error event with a listener, the Node process exits.


To demonstrate this behavior, let's call the execute method again with a bad argument:


 class WithTime extends EventEmitter {
   execute(asyncFunc, ...args) {
     console.time('execute');
     asyncFunc(...args, (err, data) => {
       if (err) {
         return this.emit('error', err); // Not Handled
       }
       console.timeEnd('execute');
     });
   }
 }

 const withTime = new WithTime();
 withTime.execute(fs.readFile, '');          // BAD CALL
 withTime.execute(fs.readFile, __filename);

The first execute call triggers an error, and the Node process crashes and exits:


 events.js:163
       throw er; // Unhandled 'error' event
       ^

 Error: ENOENT: no such file or directory, open ''

This crash affects the second execute call, which may never run at all.


If you register a listener for a special error event, the behavior of the Node process changes. For example:


 withTime.on('error', (err) => { // do something with err, for example log it somewhere console.log(err) }); 

In this case, the error from the first execute call is reported, but the Node process no longer crashes and exits. The second execute call completes normally:


 { Error: ENOENT: no such file or directory, open ''
   errno: -2,
   code: 'ENOENT',
   syscall: 'open',
   path: '' }
 execute: 4.276ms

Note that Node currently behaves differently with promise-based functions: it only emits a warning, though this will eventually change:


 UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): Error: ENOENT: no such file or directory, open ''
 DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

Another way to handle exceptions from emitted errors is to register a listener for the process's global uncaughtException event. However, catching errors globally through that event is a bad idea.


The standard advice regarding uncaughtException is: avoid using it. But if you must (for example, to report what happened or to do cleanup), let the process exit anyway:


 process.on('uncaughtException', (err) => {
   // Something went unhandled.
   // Do any cleanup and exit anyway!
   console.error(err); // don't do just that.
   // FORCE exit the process too.
   process.exit(1);
 });

However, imagine that several error events fire at about the same time. This means the uncaughtException listener above would run several times, which can be a problem for cleanup code — for example, when multiple calls attempt to shut down a database connection.


The EventEmitter module provides a once method, which signals that the listener should be invoked just one time. This method is practical with uncaughtException, because on the first uncaught exception we start the cleanup knowing that the process is going to exit anyway.


Order of listeners


If several listeners are registered for the same event, they are invoked in order: the first listener registered is the first one called.


 // First
 withTime.on('data', (data) => {
   console.log(`Length: ${data.length}`);
 });

 // Second
 withTime.on('data', (data) => {
   console.log(`Characters: ${data.toString().length}`);
 });

 withTime.execute(fs.readFile, __filename);

If you run this code, the “Length” line is logged first and then “Characters”, because that is the order in which we defined the listeners.


If you need to define a new listener that is called first, you can use the prependListener method:


 // First
 withTime.on('data', (data) => {
   console.log(`Length: ${data.length}`);
 });

 // Second
 withTime.prependListener('data', (data) => {
   console.log(`Characters: ${data.toString().length}`);
 });

 withTime.execute(fs.readFile, __filename);

In this case, the “Characters” line appears in the log first.


Finally, if you need to remove the listener, use the removeListener method.


That's all.



Source: https://habr.com/ru/post/330048/

