
JavaScript Promises

Hello everyone, and happy holidays once again. Everyday work is picking up speed, and with it grows our hunger for information. The world of front-end development never sleeps and has plenty of surprises in store for the coming year — believe me, nobody will be bored. One of the new features that browser vendors, together with the groups writing the specifications, are preparing for us is JavaScript Promises (hereinafter simply promises, please don't hit me too hard): the asynchronous coding pattern so many people love is getting native support. What promises are and what they are good for, you can read in the following (slightly loose) translation of the excellent article by Jake Archibald.



Ladies and gentlemen, get ready for a great event in the world of web development ...
[Drumroll]

Promises have become native in JavaScript!

[Fireworks explode overhead, the crowd goes wild]

At this point you probably fall into one of two camps: you're cheering along with the crowd, or you're wondering what all the fuss is about. If you're in the second camp, this article is for you.


What's all the fuss about?


JavaScript is single-threaded, which means that two pieces of code cannot run at the same time; they run one after another. In browsers, JavaScript shares a thread with a load of other work. What that work is differs from browser to browser, but typically JavaScript is in the same queue as painting, updating styles, and handling user actions (such as highlighting text and interacting with form controls). Activity in any one of these things delays all the others.
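To make that concrete, here is a small contrived sketch (the .btn selector and the three-second duration are made up for illustration): while the loop below runs, the page cannot repaint or respond to other input, because the script occupies the same thread.

document.querySelector('.btn').addEventListener('click', function() {
  var end = Date.now() + 3000;
  while (Date.now() < end) {
    // busy-wait for ~3 seconds; nothing else on this thread can run,
    // so rendering and other interaction are frozen meanwhile
  }
  console.log('Finally free again');
});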

As a human being, you are multithreaded. You can type with several fingers at once while listening to a colleague's phone conversation at the next desk. The only blocking function we have to deal with is sneezing: all current activity is suspended for the duration of the sneeze. That's pretty annoying, especially when you're driving and talking on the phone and a sneeze ambushes you. You don't want to write code that behaves like that.

To work around this, you have probably used events and callbacks. For example:

var img1 = document.querySelector('.img-1');

img1.addEventListener('load', function() {
  // yay, the image loaded
});

img1.addEventListener('error', function() {
  // argh, the image failed to load
});


There is no blocking code here. We grab the image, attach a couple of listeners, and that's it. JavaScript can stop executing until one of those events fires.

Unfortunately, in the example above it's possible that the events fired before we started listening for them, so we have to work around that using the image's complete property:

var img1 = document.querySelector('.img-1');

function loaded() {
  // yay, the image loaded
}

if (img1.complete) {
  loaded();
} else {
  img1.addEventListener('load', loaded);
}

img1.addEventListener('error', function() {
  // argh, the image failed to load
});


Messy, I tell you. And it still doesn't catch images that errored before we got a chance to listen for the error; the DOM politely gives us no way to do that. Also, this is just one image: imagine having to catch the moment when a whole pack of images has finished loading.

Events are not always the best choice.


Listeners are great for things that can happen many times on the same object — keyup, touchstart, and so on. With those events you don't really care about anything that happened before you attached the handler. But when you need to catch a one-off asynchronous operation with an uncertain outcome (success / failure), ideally you would want something like this:

img1.callThisIfLoadedOrWhenLoaded(function() {
  // loaded
}).orIfFailedCallThis(function() {
  // failed
});

// and…
whenAllTheseHaveLoaded([img1, img2]).callThis(function() {
  // all loaded
}).orIfSomeFailedCallThis(function() {
  // one or more failed
});


This is exactly what promises do, only with clearer, more semantic names. If HTML image elements had a ready method that returned a promise, we could do this:

img1.ready().then(function() {
  // loaded
}, function() {
  // failed
});

// and…
Promise.all([img1.ready(), img2.ready()]).then(function() {
  // all loaded
}, function() {
  // one or more failed
});


At their very core, promises are a bit like event listeners, except:

A promise can only succeed or fail once. It cannot succeed or fail twice, and it cannot switch from success to failure or vice versa.
If a promise has already succeeded or failed and you add a success/failure callback later, the correct callback is still called, even though the outcome happened earlier.

This is extremely useful for asynchronous operations with success/failure semantics, because you're less interested in the exact moment something became available and more interested in reacting to the outcome.
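A minimal sketch of that second point (the value and the timeout are arbitrary): a callback attached long after the promise has settled still receives the result, which is exactly what you want for one-off async outcomes.

var promise = Promise.resolve('the result');

// Attach the callback a second later — long after the promise settled.
// Unlike a missed event, the callback still fires with the value.
setTimeout(function() {
  promise.then(function(value) {
    console.log(value); // "the result"
  });
}, 1000);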

Promise Terminology


Domenic Denicola proof-read the first draft of this article and graded me an F for terminology. As punishment he made me read " States and Fates " a hundred times and write an apology letter to my parents. Despite that, I still get a lot of the terminology mixed up, but here are the basics:

A promise can be:

fulfilled — the action relating to the promise succeeded
rejected — the action relating to the promise failed
pending — it hasn't fulfilled or rejected yet
settled — it has fulfilled or rejected

The specification also uses the term thenable to describe an object that is promise-like, in that it has a then method. This term reminds me of ex-England football manager Terry Venables, so I will use it as seldom as possible.
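For illustration only, here is a sketch of a hand-rolled thenable (the object and its message are made up) being assimilated into a real promise via Promise.resolve:

// Any object with a then(onFulfilled, onRejected) method is "thenable".
var thenable = {
  then: function(onFulfilled, onRejected) {
    setTimeout(function() {
      onFulfilled('hello from a thenable');
    }, 100);
  }
};

// Promise.resolve follows the thenable and gives us a genuine promise:
Promise.resolve(thenable).then(function(value) {
  console.log(value); // "hello from a thenable"
});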

Promises are built into JavaScript!


Promises have been around for a while in the form of libraries, such as Q, when, WinJS, and RSVP.js.



The libraries above and the promises built into JavaScript all follow the behaviour described in the standardized Promises/A+ specification. If you're a jQuery user, it has something similar in spirit called Deferred. However, Deferreds are slightly incompatible with Promises/A+, which makes them subtly different and less useful, so beware. jQuery also has a Promise type, but it is just a subset of Deferred and has the same problems.

Although all of these promise implementations follow the standardized behaviour, their APIs differ. The native promise API is most similar to RSVP.js. Here's how you create a promise:

var promise = new Promise(function(resolve, reject) {
  // do something, possibly asynchronous, then…

  if (/* everything worked out fine */) {
    resolve("Stuff worked!");
  } else {
    reject(Error("It broke"));
  }
});


The promise constructor takes one argument: a callback with two parameters, resolve and reject. It's simple: do your (possibly asynchronous) work inside the callback, then call resolve if everything worked, or reject otherwise.

Like throw in good old JavaScript, it's customary but not required to reject with an Error object. The benefit of Error objects is that they capture a stack trace, which makes debugging in the console much more pleasant.
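A quick sketch of the difference (the messages are made up): rejecting with an Error gives the handler a stack trace to inspect, while rejecting with a bare string does not.

var withError = new Promise(function(resolve, reject) {
  reject(Error('It broke'));   // carries a stack trace
});

var withString = new Promise(function(resolve, reject) {
  reject('It broke');          // just a string, no stack
});

withError.catch(function(err) {
  console.log(err.message); // "It broke"
  console.log(err.stack);   // points at the line where the Error was created
});

withString.catch(function(err) {
  console.log(err);         // "It broke" — nothing more to go on
});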

That promise can then be used like this:

promise.then(function(result) {
  console.log(result); // "Stuff worked!"
}, function(err) {
  console.log(err); // Error: "It broke"
});


Promise standardization started in the DOM as "Futures", was later renamed "Promises", and finally moved into the JavaScript specification itself. Having promises live in core JavaScript rather than in the DOM is great, because they then become available in non-browser environments such as Node.js.

Although they are now purely a JavaScript feature, the DOM isn't shy about using them. In fact, all new DOM APIs with asynchronous success/failure semantics will use promises. This is already happening with Quota Management, Font Load Events, ServiceWorker, Web MIDI, Streams, and other APIs.

Browser Support


At the moment, support for promises in browsers, frankly, is slightly limited.

They're in Chrome: download Canary and promises are enabled by default. And if you're a soldier in the Firefox ranks, grab the latest nightly build, which also has promises.

At the moment, neither implementation is fully complete. You can track Firefox's progress on bugzilla, and the Chrome features dashboard to keep up with the latest developments.

To bring incomplete implementations up to spec, or to add promises to other browsers and Node.js, use a polyfill.
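As a rough sketch of how you might wire that up (the script path below is hypothetical — point it at whichever polyfill implementation you prefer):

// Load a promise polyfill only when the browser lacks a native implementation.
if (typeof window.Promise === 'undefined') {
  var script = document.createElement('script');
  script.src = '/js/promise-polyfill.js'; // hypothetical path to your chosen polyfill
  document.head.appendChild(script);
}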

Compatibility with other libraries


The JavaScript promise API treats anything with a then method as promise-like (thenable, in promise terminology). There is also a Promise.cast method that smooths over the differences between native promises and other promise-like objects. So if you use a library that returns Q promises, that's fine: they will work happily with native JavaScript promises.

But, as I warned, jQuery Deferreds are slightly… different. Fortunately, you can cast them to standard promises:

 var jsPromise = Promise.cast($.ajax('/whatever.json')); 


Here, jQuery's $.ajax returns a Deferred. Since it has a then method, Promise.cast can turn it into a real promise. However, Deferreds sometimes pass multiple arguments to their callbacks:

var jqDeferred = $.ajax('/whatever.json');

jqDeferred.then(function(response, statusText, xhrObj) {
  // ...
}, function(xhrObj, textStatus, err) {
  // ...
});

…whereas native JS promises ignore all but the first argument:

jsPromise.then(function(response) {
  // ...
}, function(xhrObj) {
  // ...
});


Thankfully, in most cases this is what you want, or at least gives you access to what you want. It's also worth knowing that jQuery doesn't follow the convention of passing Error objects into its rejections.
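If you want downstream handlers to be able to rely on err.message and err.stack, one option is to normalise the jQuery rejection yourself. A sketch under that assumption (the message text is made up):

var jsPromise = Promise.cast($.ajax('/whatever.json')).catch(function(xhrObj) {
  // Re-throw as a proper Error so later catch blocks get a stack trace
  throw Error('Request failed: ' + (xhrObj.statusText || 'unknown error'));
});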

Asynchronous code gets easier


So, let's code some things up. Say we want to:

  1. Show a spinner to indicate that something is loading
  2. Fetch some JSON for a story, which gives us a title and a list of URLs for each chapter
  3. Add the title to the page
  4. Fetch all the chapters
  5. Add them to the page
  6. Stop the spinner


... and also notify the user if something went wrong along the way. We want to stop the indicator after an error in order not to make the user dizzy from perpetual rotation.

Of course, you wouldn't use JavaScript to deliver a story — serving it as HTML is faster — but this pattern is very common when dealing with third-party APIs: multiple data fetches, then doing something once you have all the data.

Before we begin, let's see how we pull data from the network.

Promisifying XMLHttpRequest


Old APIs will be updated to use promises where it is possible to do so without breaking backward compatibility. XMLHttpRequest is a prime candidate, but in the meantime let's write a simple function for making a GET request:

function get(url) {
  // Return a new promise.
  return new Promise(function(resolve, reject) {
    // Do the usual XHR stuff
    var req = new XMLHttpRequest();
    req.open('GET', url);

    req.onload = function() {
      // This is called even on 404 etc,
      // so check the status
      if (req.status == 200) {
        // Resolve the promise with the response text
        resolve(req.response);
      } else {
        // Otherwise reject with the status text,
        // which will hopefully be a meaningful error
        reject(Error(req.statusText));
      }
    };

    // Handle network errors
    req.onerror = function() {
      reject(Error("Network Error"));
    };

    // Make the request
    req.send();
  });
}


Now let's use it:

get('story.json').then(function(response) {
  console.log("Success!", response);
}, function(error) {
  console.error("Failed!", error);
});


Now we can make HTTP requests without typing XMLHttpRequest, which makes me very happy, because the less I have to see XMLHttpRequest's infuriating camel-casing, the better my life is.

Chaining


then isn't the end of the story: you can chain then calls together to transform values or to run additional asynchronous actions one after another.

Transforming values

You can transform a value simply by returning the new value:

var promise = new Promise(function(resolve, reject) {
  resolve(1);
});

promise.then(function(val) {
  console.log(val); // 1
  return val + 2;
}).then(function(val) {
  console.log(val); // 3
});


For a more practical example, let's go back to:

get('story.json').then(function(response) {
  console.log("Success!", response);
});


The response is JSON, but we're currently receiving it as plain text. We could alter our get function to set the responseType, but we could also solve it in promise land:

get('story.json').then(function(response) {
  return JSON.parse(response);
}).then(function(response) {
  console.log("Yey JSON!", response);
});


Since JSON.parse takes a single argument and returns a transformed value, we can simply pass a reference to it:

get('story.json').then(JSON.parse).then(function(response) {
  console.log("Yey JSON!", response);
});


In fact, we can easily make a getJSON convenience function:

function getJSON(url) {
  return get(url).then(JSON.parse);
}


getJSON still returns a promise — one that fetches a URL and then parses the response as JSON.

Queuing asynchronous actions

You can also chain then calls to run asynchronous actions in sequence.

When you return something from a then callback, a little bit of magic happens. If you return a value, it is passed to the next then's callback. But if you return something promise-like, the next then waits for it, and its callback is only called when that promise settles. For example:

getJSON('story.json').then(function(story) {
  return getJSON(story.chapterUrls[0]);
}).then(function(chapter1) {
  console.log("Got chapter 1!", chapter1);
});


Here we make an asynchronous request to story.json, and when the response gives us a set of URLs, we request the first of them. This is where the advantage of promises over plain callback patterns really starts to show. You can even extract the chapter-fetching logic into a separate method:

var storyPromise;

function getChapter(i) {
  storyPromise = storyPromise || getJSON('story.json');

  return storyPromise.then(function(story) {
    return getJSON(story.chapterUrls[i]);
  });
}

// and using it is simple:
getChapter(0).then(function(chapter) {
  console.log(chapter);
  return getChapter(1);
}).then(function(chapter) {
  console.log(chapter);
});


We don't download story.json until getChapter is called for the first time, and subsequent calls to getChapter reuse the already-fulfilled story promise instead of making additional requests. Oh, those promises!

Error handling


As we saw earlier, then takes two arguments: one for success and one for failure (fulfill and reject, in promise terminology):

get('story.json').then(function(response) {
  console.log("Success!", response);
}, function(error) {
  console.log("Failed!", error);
});


You can also use catch :

get('story.json').then(function(response) {
  console.log("Success!", response);
}).catch(function(error) {
  console.log("Failed!", error);
});


There's nothing special about catch; it's just more readable sugar for then(undefined, func). Note that the two pieces of code above do not behave the same: the latter is equivalent to the following:

get('story.json').then(function(response) {
  console.log("Success!", response);
}).then(undefined, function(error) {
  console.log("Failed!", error);
});


This seemingly small difference is actually very powerful. A promise rejection is passed down the chain of then calls (or catch, which is practically the same) until the first error handler is encountered. With then(func1, func2), only one of func1 and func2 will ever be called, never both. But with then(func1).catch(func2), both can be called if the promise returned from func1 rejects, because they are separate steps in the chain. Consider the following:

asyncThing1().then(function() {
  return asyncThing2();
}).then(function() {
  return asyncThing3();
}).catch(function(err) {
  return asyncRecovery1();
}).then(function() {
  return asyncThing4();
}, function(err) {
  return asyncRecovery2();
}).catch(function(err) {
  console.log("Don't worry about it");
}).then(function() {
  console.log("All done!");
});


Error handling here is very similar to standard try/catch: an error that happens in the "try" part goes straight to the catch. Here's a flowchart of what happens in the code above (I love flowcharts):

[Flowchart: the flow of fulfillment and rejection through the then/catch chain above]

Follow the green arrows for promises that fulfill, and the red ones for promises that reject.

JavaScript exceptions and promises

Rejections happen not only when you explicitly reject a promise, but also implicitly when an exception is thrown inside the constructor callback:

var jsonPromise = new Promise(function(resolve, reject) {
  // JSON.parse throws an error if you feed it
  // invalid JSON, so this implicitly rejects:
  resolve(JSON.parse("This ain't JSON"));
});

jsonPromise.then(function(data) {
  // This never happens:
  console.log("It worked!", data);
}).catch(function(err) {
  // Instead, this happens:
  console.log("It failed!", err);
});


This means it's useful to do all your promise-related work inside the callback you pass to the constructor, so errors are automatically caught and turned into rejections.
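Here's a contrived contrast to drive that home (both function names are made up): work done before the promise is constructed throws synchronously, while the same work inside the constructor callback becomes a rejection you can catch.

// Throws synchronously — there is no promise yet to turn it into a rejection.
function badGetData() {
  var parsed = JSON.parse("This ain't JSON");
  return Promise.resolve(parsed);
}

// The same parse inside the constructor callback rejects the promise instead.
function goodGetData() {
  return new Promise(function(resolve, reject) {
    resolve(JSON.parse("This ain't JSON"));
  });
}

goodGetData().catch(function(err) {
  console.log('Caught as a rejection:', err.message);
});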

The same goes for exceptions thrown inside then callbacks:

get('/').then(JSON.parse).then(function() {
  // This never happens: '/' is an HTML page, not JSON,
  // so JSON.parse throws
  console.log("It worked!", data);
}).catch(function(err) {
  // Instead, this happens:
  console.log("It failed!", err);
});


Error handling in practice

With our story and its chapters, we can use catch to show the user an error message:

getJSON('story.json').then(function(story) {
  return getJSON(story.chapterUrls[0]);
}).then(function(chapter1) {
  addHtmlToPage(chapter1.html);
}).catch(function() {
  addTextToPage("Failed to show chapter");
}).then(function() {
  document.querySelector('.spinner').style.display = 'none';
});


If the request for story.chapterUrls[0] fails (HTTP 500, the user has gone offline, and so on), all subsequent success callbacks are skipped, including the JSON parsing inside getJSON and the callback that adds the first chapter to the page. Execution jumps straight to the catch callback. As a result, the user sees "Failed to show chapter" if anything in the preceding steps went wrong.

As with standard try/catch, the error is caught and execution continues afterwards, so we still hide the loading spinner, which is what we want. The code above is a non-blocking asynchronous version of:

try {
  var story = getJSONSync('story.json');
  var chapter1 = getJSONSync(story.chapterUrls[0]);
  addHtmlToPage(chapter1.html);
} catch (e) {
  addTextToPage("Failed to show chapter");
}

document.querySelector('.spinner').style.display = 'none';


Perhaps you want to catch the error a little earlier, for example to log what happened, and then rethrow it so it propagates further. We can do that in our getJSON function:

function getJSON(url) {
  return get(url).then(JSON.parse).catch(function(err) {
    console.log("getJSON failed for", url, err);
    throw err;
  });
}


So, we've managed to fetch one chapter, but we want them all. Let's make that happen.

Parallelism and sequencing: getting the best of both


Thinking asynchronously isn't easy. If you're struggling to get started, try writing the code as if it were synchronous:

try {
  var story = getJSONSync('story.json');
  addHtmlToPage(story.heading);

  story.chapterUrls.forEach(function(chapterUrl) {
    var chapter = getJSONSync(chapterUrl);
    addHtmlToPage(chapter.html);
  });

  addTextToPage("All done");
} catch (err) {
  addTextToPage("Argh, broken: " + err.message);
}

document.querySelector('.spinner').style.display = 'none';


It works! But everything runs synchronously and locks up the browser while things download. To make this asynchronous, we use then to make the tasks happen one after another:

getJSON('story.json').then(function(story) {
  addHtmlToPage(story.heading);

  // TODO: fetch and display story.chapterUrls
}).then(function() {
  // And we're all done!
  addTextToPage("All done");
}).catch(function(err) {
  // Catch any error that happened along the way
  addTextToPage("Argh, broken: " + err.message);
}).then(function() {
  // Always hide the spinner
  document.querySelector('.spinner').style.display = 'none';
});


But how do we loop through all the chapters and fetch them in order? This doesn't work:

story.chapterUrls.forEach(function(chapterUrl) {
  // Fetch the chapter
  getJSON(chapterUrl).then(function(chapter) {
    // and add it to the page
    addHtmlToPage(chapter.html);
  });
});


forEach isn't async-aware, so our chapters would be added to the page in whatever order they happen to download — which is roughly how "Pulp Fiction" was written. We're not writing "Pulp Fiction", so let's fix it…

Creating a sequence

We want to turn our chapterUrls array into a sequence of promises. We can do that like this:

// Start off with a promise that always resolves
var sequence = Promise.resolve();

// Loop through our chapter urls
story.chapterUrls.forEach(function(chapterUrl) {
  // Add these actions to the end of the sequence
  sequence = sequence.then(function() {
    return getJSON(chapterUrl);
  }).then(function(chapter) {
    addHtmlToPage(chapter.html);
  });
});


This is the first time we've met Promise.resolve, a factory method that creates a promise fulfilled with whatever value you give it. If you give it something promise-like (anything with a then method), it returns a promise that follows it. If you call Promise.resolve without an argument, as in our example, it returns a promise fulfilled with undefined.

There is also the mirror-image method Promise.reject(val), which returns a rejected promise with the value you give it (or undefined).
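A quick sketch of both factory methods (the values are arbitrary):

Promise.resolve(42).then(function(val) {
  console.log(val); // 42
});

Promise.reject(Error('nope')).catch(function(err) {
  console.log(err.message); // "nope"
});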

We can tidy up the code above using array.reduce:

// Loop through our chapter urls
story.chapterUrls.reduce(function(sequence, chapterUrl) {
  // Add these actions to the end of the sequence
  return sequence.then(function() {
    return getJSON(chapterUrl);
  }).then(function(chapter) {
    addHtmlToPage(chapter.html);
  });
}, Promise.resolve());


This does the same as the previous example, but we no longer need a separate sequence variable left over after the loop. The reduce callback is called for each element in the array. On the first iteration, sequence is the initial value Promise.resolve(); on each subsequent call, sequence is whatever the previous call returned. array.reduce is really handy for boiling an array down to a single value — in this case, a promise.
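If reduce is unfamiliar, here is the same mechanic with plain numbers: the callback receives the running result and the next item, and whatever it returns becomes the running result for the following iteration.

var sum = [1, 2, 3, 4].reduce(function(total, n) {
  // total starts at 0, then carries the return value of the previous call
  return total + n;
}, 0);

console.log(sum); // 10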

Let's put it all together ...

getJSON('story.json').then(function(story) {
  addHtmlToPage(story.heading);

  return story.chapterUrls.reduce(function(sequence, chapterUrl) {
    // Once the last chapter's promise is done…
    return sequence.then(function() {
      // …fetch the next chapter
      return getJSON(chapterUrl);
    }).then(function(chapter) {
      // and add it to the page
      addHtmlToPage(chapter.html);
    });
  }, Promise.resolve());
}).then(function() {
  // And we're all done!
  addTextToPage("All done");
}).catch(function(err) {
  // Catch any error that happened along the way
  addTextToPage("Argh, broken: " + err.message);
}).then(function() {
  // Always hide the spinner
  document.querySelector('.spinner').style.display = 'none';
});


And there we have it: a fully asynchronous version of our plan. But we can do better. At the moment, loading our page looks like this:

[Animation: the chapters downloading one after another]

Browsers have long been able to download several things at once, so we lose performance by fetching chapters one after another. What we want is to download them all at the same time, then process them when they've all arrived. Fortunately, there's an API for this right out of the box:

Promise.all(arrayOfPromises).then(function(arrayOfResults) {
  // ...
});

Promise.all takes an array of promises and returns a single promise that fulfills only when all of them have fulfilled. Its then callback receives an array of results in the same order as the promises you passed in.
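A tiny sketch of that ordering guarantee (the delay helper below is made up for illustration): even though the second promise settles first, the results arrive in the order the promises were passed in. Applied to our story, that gives us:

function delay(ms, value) {
  return new Promise(function(resolve) {
    setTimeout(function() { resolve(value); }, ms);
  });
}

Promise.all([delay(500, 'slow'), delay(10, 'fast')]).then(function(results) {
  console.log(results); // ["slow", "fast"] — input order, not completion order
});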

getJSON('story.json').then(function(story) {
  addHtmlToPage(story.heading);

  // Take an array of promises and wait on them all
  return Promise.all(
    // Map our array of chapter urls to
    // an array of chapter json promises
    story.chapterUrls.map(getJSON)
  );
}).then(function(chapters) {
  // Now we have the chapter JSONs in order! Loop through…
  chapters.forEach(function(chapter) {
    // …and add them to the page
    addHtmlToPage(chapter.html);
  });
  addTextToPage("All done");
}).catch(function(err) {
  // Catch any error that happened so far
  addTextToPage("Argh, broken: " + err.message);
}).then(function() {
  document.querySelector('.spinner').style.display = 'none';
});


Depending on the connection, this can be much faster than loading the chapters one by one, and it's less code than our previous attempt. The chapters can download in any order, but they appear on screen in the right sequence.

[Animation: all chapters downloading in parallel, displayed once they have all arrived]

Still, we can improve perceived performance a little more. When the first chapter arrives, we can add it to the page so the user can start reading while the remaining chapters download. If the third chapter arrived before the others, we wouldn't want to show it, because the user wouldn't realize a couple of chapters are missing. So we want to display the chapters in order, as soon as each one is available.

To do that, we fetch the JSON for all our chapters at the same time, then create a sequence that adds them to the document in order:

getJSON('story.json').then(function(story) {
  addHtmlToPage(story.heading);

  // Map our array of chapter urls to
  // an array of chapter json promises.
  // This makes sure they all download in parallel.
  return story.chapterUrls.map(getJSON)
    .reduce(function(sequence, chapterPromise) {
      // Use reduce to chain the promises together,
      // adding content to the page for each chapter
      return sequence.then(function() {
        // Wait for everything in the sequence so far,
        // then wait for this chapter to arrive
        return chapterPromise;
      }).then(function(chapter) {
        addHtmlToPage(chapter.html);
      });
    }, Promise.resolve());
}).then(function() {
  addTextToPage("All done");
}).catch(function(err) {
  // Catch any error that happened along the way
  addTextToPage("Argh, broken: " + err.message);
}).then(function() {
  document.querySelector('.spinner').style.display = 'none';
});


And so we've killed two birds with one stone: all the content is requested at the same time, but on a slow connection the user gets to see the first chapter that much sooner.

[Animation: chapters downloading in parallel and appearing on the page in order as each becomes available]

In this small example the chapters appear almost instantly; the real gain from this technique shows up with larger amounts of data.

Reproducing all of the above with Node.js-style callbacks and events takes roughly twice as much code and is harder to follow. However, this isn't the end of the promise story. Let's see how they work in tandem with other new ES6 features…

Small bonus: promises and generators


This next bit involves a handful of other new ES6 features, but you don't need them to understand and start using promises today. Treat it as a trailer for an upcoming blockbuster.

ES6 also gives us generators. They let a function exit at a particular point, like return does, but later resume from that same point with the same state:

function *addGenerator() {
  var i = 0;
  while (true) {
    i += yield i;
  }
}


Notice the asterisk before the function name: it marks the function as a generator. The yield keyword is our return/resume point. We can use the function above like this:

var adder = addGenerator();
adder.next().value;   // 0
adder.next(5).value;  // 5
adder.next(5).value;  // 10
adder.next(5).value;  // 15
adder.next(50).value; // 65


But what do generators give us when working with promises? You can use this exit/resume behaviour to write asynchronous code that looks like synchronous code. Don't worry about understanding every line of the following example; it's a helper function that lets us use yield to wait for a promise to settle:

function spawn(generatorFunc) {
  function continuer(verb, arg) {
    var result;
    try {
      result = generator[verb](arg);
    } catch (err) {
      return Promise.reject(err);
    }
    if (result.done) {
      return result.value;
    } else {
      return Promise.cast(result.value).then(onFulfilled, onRejected);
    }
  }

  var generator = generatorFunc();
  var onFulfilled = continuer.bind(continuer, "next");
  var onRejected = continuer.bind(continuer, "throw");
  return onFulfilled();
}


…which is essentially the same helper Q provides, adapted to native promises. With it, we can take our final best-of-both chapters example and turn it into this:

spawn(function *() {
  try {
    // 'yield' effectively does an async wait,
    // returning the result of the promise
    let story = yield getJSON('story.json');
    addHtmlToPage(story.heading);

    // Map our array of chapter urls to
    // an array of chapter json promises.
    // This makes sure they all download in parallel.
    let chapterPromises = story.chapterUrls.map(getJSON);

    for (let chapterPromise of chapterPromises) {
      // Wait for each chapter to be ready, then add it to the page
      let chapter = yield chapterPromise;
      addHtmlToPage(chapter.html);
    }

    addTextToPage("All done");
  } catch (err) {
    // try/catch just works, rejected promises are thrown here
    addTextToPage("Argh, broken: " + err.message);
  }
  document.querySelector('.spinner').style.display = 'none';
});


This works exactly as before, but it's much easier to read. It runs in Chrome Canary today if you enable "Enable Experimental JavaScript" in about:flags.

This example combines a whole bunch of ES6 features: promises, generators, let, and for-of. When we yield a promise, the spawn helper waits for it to settle: a fulfilled value becomes the result of the yield expression, and a rejection is thrown, so plain try/catch just works for asynchronous code.

The future is near.

Source: https://habr.com/ru/post/209662/
