
An explanatory conversation about asynchronous programming in JavaScript

Hello everyone!

As you may remember, back in October we translated an interesting article about the use of timers in JavaScript. It sparked a big discussion, and ever since we have wanted to return to the topic and offer you a detailed analysis of asynchronous programming in this language. We are glad we managed to find worthy material and publish it before the end of the year. Enjoy!

Asynchronous programming in JavaScript has evolved in stages: from callbacks to promises, then to generators, and soon to async/await. At each stage, asynchronous programming became a little simpler for those already knee-deep in the language, but for beginners it only grew more intimidating: they had to understand the nuances of each paradigm, learn to use each one and, no less importantly, understand how it all works.

In this article we briefly recall how to use callbacks and promises, give a short introduction to generators, and then help you build an intuitive understanding of exactly how asynchronous programming is organized under the hood with async/await and generators. We hope that afterwards you will be able to confidently apply each paradigm exactly where it is appropriate.
It is assumed that the reader has already used callbacks, promises and generators for asynchronous programming, and is quite familiar with closures and currying in JavaScript.

Callback hell

In the beginning there were callbacks. JavaScript has no synchronous I/O, and blocking is not supported at all. So, to organize any I/O, or to postpone any action, the following strategy was chosen: the code that needs to run asynchronously is passed as a function to be executed later, somewhere further down the event loop. One callback is not so bad, but code grows, and callbacks tend to spawn more callbacks. The result is something like this:

    getUserData(function doStuff(e, a) {
      getMoreUserData(function doMoreStuff(e, b) {
        getEvenMoreUserData(function doEvenMoreStuff(e, c) {
          getYetMoreUserData(function doYetMoreStuff(e, d) {
            console.log('Welcome to callback hell!');
          });
        });
      });
    });

Apart from the goosebumps such fractal code induces, there is another problem: we have now delegated control of our do*Stuff logic to other functions ( get*UserData() ), whose source code you may not have, and you cannot be sure they will actually call your callback. Great, isn't it?

Promises

Promises undo the inversion of control introduced by callbacks and help unravel the callback tangle into a flat chain.
Now the previous example can be converted to something like this:

    getUserData()
      .then(doStuff)
      .then(getMoreUserData)
      .then(doMoreStuff)
      .then(getEvenMoreUserData)
      .then(doEvenMoreStuff)
      .then(getYetMoreUserData)
      .then(doYetMoreStuff);

Much nicer, eh?

But wait! Let's look at a more lifelike (though still largely contrived) callback example:

    // Assume fetchJson() makes a GET request to the given URL, parses the
    // response body as JSON and passes the result to the callback, or passes
    // an error as the first argument if something went wrong.
    function fetchJson(url, callback) { ... }

    fetchJson('/api/user/self', function (e, user) {
      fetchJson('/api/interests?userId=' + user.id, function (e, interests) {
        var recommendations = [];
        interests.forEach(function (interest) {
          fetchJson('/api/recommendations?topic=' + interest, function (e, recommendation) {
            recommendations.push(recommendation);
            if (recommendations.length == interests.length) {
              render(user, interests, recommendations);
            }
          });
        });
      });
    });

So: we fetch the user profile, then the user's interests, then, based on those interests, fetch recommendations and, finally, having collected all the recommendations, render the page. A pile of callbacks you could almost be proud of, but still rather shaggy. Never mind, let's apply promises here and everything will be fine. Right?

Let's change our fetchJson() method to return a promise rather than accept a callback. The promise resolves with the response body parsed as JSON.

    fetchJson('/api/user/self')
      .then(function (user) {
        return fetchJson('/api/user/interests?userId=' + user.id);
      })
      .then(function (interests) {
        return Promise.all(interests.map(i => fetchJson('/api/recommendations?topic=' + i)));
      })
      .then(function (recommendations) {
        render(user, interests, recommendations);
      });

Beautiful, isn't it? So what is wrong with this code now?

...Oops!...
We have no access to the user or the interests in the last function of the chain, so nothing works! What to do? Let's try nesting the promises:

    fetchJson('/api/user/self')
      .then(function (user) {
        return fetchJson('/api/user/interests?userId=' + user.id)
          .then(interests => ({ user: user, interests: interests }));
      })
      .then(function (blob) {
        return Promise.all(blob.interests.map(i => fetchJson('/api/recommendations?topic=' + i)))
          .then(recommendations => ({
            user: blob.user,
            interests: blob.interests,
            recommendations: recommendations
          }));
      })
      .then(function (bigBlob) {
        render(bigBlob.user, bigBlob.interests, bigBlob.recommendations);
      });

Yes... this looks a lot more awkward than we hoped. Wasn't it precisely because of these nesting dolls that we set out to escape callback hell in the first place? What now?

The code can be tidied up a bit by leaning on closures:

    // Declared out here so the callbacks below can close over them
    var user, interests;

    fetchJson('/api/user/self')
      .then(function (fetchedUser) {
        user = fetchedUser;
        return fetchJson('/api/user/interests?userId=' + user.id);
      })
      .then(function (fetchedInterests) {
        interests = fetchedInterests;
        return Promise.all(interests.map(i => fetchJson('/api/recommendations?topic=' + i)));
      })
      .then(function (recommendations) {
        render(user, interests, recommendations);
      })
      .then(function () {
        console.log('We are done!');
      });

Yes, now everything is almost the way we wanted, with one caveat. Did you notice that we named the arguments inside the callbacks fetchedUser and fetchedInterests , not user and interests ? If so, you are very observant!

The flaw in this approach is that you must be very, very careful not to give anything in the inner functions the same name as the out-of-scope variables you intend to use from the closure. Even if you are skilled enough to avoid shadowing, reaching for a variable so far up the closure chain still feels rather dangerous, and that is definitely not good.
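To make the pitfall concrete, here is a tiny hypothetical sketch (the step helper is invented purely for illustration) of a parameter shadowing the outer variable:

```javascript
// `step` is a made-up stand-in for an asynchronous operation that
// happens to call back synchronously, to keep the example self-contained.
function step(value, callback) { callback(value); }

var user; // the outer variable we intend to fill in

step({ id: 1 }, function (user) { // oops: the parameter shadows the outer `user`
  user = user; // this touches only the parameter, not the outer variable
});

console.log(user); // undefined – the outer `user` was never assigned
```

Renaming the parameter (say, to fetchedUser) is exactly what saves the promise chain above from this bug.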

Asynchronous generators

Generators to the rescue! With generators, all the fuss disappears. Just magic. Really. Just look:

    co(function* () {
      var user = yield fetchJson('/api/user/self');
      var interests = yield fetchJson('/api/user/interests?userId=' + user.id);
      var recommendations = yield Promise.all(
        interests.map(i => fetchJson('/api/recommendations?topic=' + i)));
      render(user, interests, recommendations);
    });

That's all. It works. Don't you well up a little when you see how beautiful generators are? Don't you regret being so short-sighted as to learn JavaScript before generators appeared in it? I confess such a thought once visited me.
But... how does all this work? Is it really magic?

Of course not. Time to pull back the curtain.

Generators

In our example, generators look easy to use, but a lot is actually going on. To understand asynchronous generators in more detail, we need a better grasp of how generators work and how they make execution that is really asynchronous look synchronous.

As the name implies, a generator generates values:

    function* counts(start) {
      yield start + 1;
      yield start + 2;
      yield start + 3;
      return start + 4;
    }

    const counter = counts(0);
    console.log(counter.next()); // {value: 1, done: false}
    console.log(counter.next()); // {value: 2, done: false}
    console.log(counter.next()); // {value: 3, done: false}
    console.log(counter.next()); // {value: 4, done: true}
    console.log(counter.next()); // {value: undefined, done: true}

It's pretty simple, but let's still talk through what happens here:

  1. const counter = counts(0); - we initialize the generator and store it in the counter variable. The generator is suspended; none of the code in its body has run yet.
  2. console.log(counter.next()); - yield start + 1 is evaluated, 1 is returned as value , and done is false , because there are more yields ahead.
  3. console.log(counter.next()); - now 2!
  4. console.log(counter.next()); - now 3! Finished? Not quite: execution is merely suspended at yield start + 3; to finish, we must call next() again.
  5. console.log(counter.next()); - now 4, and it is returned rather than yielded, so the function exits and the generator is done.
  6. console.log(counter.next()); - the generator has already finished! It has nothing left to say except "everything is done".

So we've figured out how generators produce values! But wait, here is a shocking truth: generators can not only spit values out, they can also swallow them!

    function* printer() {
      console.log("We are starting!");
      console.log(yield);
      console.log(yield);
      console.log(yield);
      console.log("We are done!");
    }

    const counter = printer();
    counter.next(1); // We are starting!
    counter.next(2); // 2
    counter.next(3); // 3
    counter.next(4); // 4, then: We are done!
    counter.next(5); // (nothing – the generator has finished)

Huh, what?! The generator consumes values instead of generating them. How is that possible?

The secret is in next() . It not only returns values from the generator, it can also send values back into it. If you give next() an argument, the yield expression the generator is currently suspended on actually evaluates to that argument. That is why the argument of the very first counter.next(1) is simply discarded: there is no pending yield for it to resolve.

It is as if the calling code and the generator code were cooperating as partners, passing values to each other as they execute and waiting for one another. It is almost as if JavaScript generators were designed with cooperatively executed concurrent procedures, also known as "coroutines", in mind. Rather reminiscent of co() , right?

But let's not get ahead of ourselves. What matters here is that the reader intuitively grasps the essence of generators and asynchronous programming, and the best way to achieve that is to build a generator yourself. Not to write a generator function, and not to use a ready-made one, but to recreate the machinery of a generator itself.

Inside the generator: generating generators

Well, I honestly do not know exactly what generator internals look like in the various JS runtimes. But that is not so important: generators conform to an interface. A "constructor" to instantiate the generator; a next(value?: any) method, with which we tell the generator to continue and hand it values; a throw(error) method for when an error should be raised instead of a value; and finally return() , which we will keep quiet about for now. If the interface is honored, everything is fine.

So, let's try building the counts() generator above in pure ES5, without the function* keyword. For now we can ignore throw() and the value passed to next() , because counts() does not accept any input. How do we do it?

Well, JavaScript has another mechanism for suspending and resuming execution: closures! Look familiar?

    function makeCounter() {
      var count = 1;
      return function () {
        return count++;
      };
    }

    var counter = makeCounter();
    console.log(counter()); // 1
    console.log(counter()); // 2
    console.log(counter()); // 3

If you have used closures before, I'm sure you have written something like this. The function returned by makeCounter can produce an infinite sequence of numbers, just like a generator.

However, this function does not conform to the generator interface, and it cannot be applied directly to our counts() example, which yields four values and terminates. What do we need for a general approach to writing generator-like functions?

Closures, state machines and hard labor!

    function counts(start) {
      let state = 0;
      let done = false;

      function go() {
        let result;
        switch (state) {
          case 0:
            result = start + 1;
            state = 1;
            break;
          case 1:
            result = start + 2;
            state = 2;
            break;
          case 2:
            result = start + 3;
            state = 3;
            break;
          case 3:
            result = start + 4;
            done = true;
            state = -1;
            break;
          default:
            break;
        }
        return {done: done, value: result};
      }

      return { next: go };
    }

    const counter = counts(0);
    console.log(counter.next()); // {value: 1, done: false}
    console.log(counter.next()); // {value: 2, done: false}
    console.log(counter.next()); // {value: 3, done: false}
    console.log(counter.next()); // {value: 4, done: true}
    console.log(counter.next()); // {value: undefined, done: true}

Run this code and you will see exactly the same results as the generator version. Nice, right?
So we have reproduced the producing side of a generator; shall we analyze the consuming side?
In fact, there are not many differences.

    function printer() {
      let state = 0;
      let done = false;

      function go(input) {
        let result;
        switch (state) {
          case 0:
            console.log("We are starting!");
            state = 1;
            break;
          case 1:
            console.log(input);
            state = 2;
            break;
          case 2:
            console.log(input);
            state = 3;
            break;
          case 3:
            console.log(input);
            console.log("We are done!");
            done = true;
            state = -1;
            break;
          default:
            break;
        }
        return {done: done, value: result};
      }

      return { next: go };
    }

    const counter = printer();
    counter.next(1); // We are starting!
    counter.next(2); // 2
    counter.next(3); // 3
    counter.next(4); // 4, then: We are done!
    counter.next(5); // (nothing – the machine is done)

All we had to do was add input as an argument to go , and values flow in. Looks like magic again, almost like generators, doesn't it?

Hooray! We have recreated the generator both as a producer and as a consumer. Why not combine the two roles in one? Here is another rather artificial generator example:

    function* adder(initialValue) {
      let sum = initialValue;
      while (true) {
        sum += yield sum;
      }
    }

Since we are all generator experts by now, we can see that this generator adds the value passed in next(value) to sum , and then yields sum. It works exactly as we expect:

    const add = adder(0);
    console.log(add.next());  // 0
    console.log(add.next(1)); // 1
    console.log(add.next(2)); // 3
    console.log(add.next(3)); // 6

Cool. Now let's implement the same interface as an ordinary function!

    function adder(initialValue) {
      let state = 'initial';
      let done = false;
      let sum = initialValue;

      function go(input) {
        let result;
        switch (state) {
          case 'initial':
            result = initialValue;
            state = 'loop';
            break;
          case 'loop':
            sum += input;
            result = sum;
            state = 'loop';
            break;
          default:
            break;
        }
        return {done: done, value: result};
      }

      return { next: go };
    }

    function runner() {
      const add = adder(0);
      console.log(add.next());  // 0
      console.log(add.next(1)); // 1
      console.log(add.next(2)); // 3
      console.log(add.next(3)); // 6
    }

    runner();

Wow, we have implemented a full-fledged coroutine.

A couple of things about generators remain to be discussed. How do exceptions work? Exceptions that occur inside a generator are simple: next() lets the exception propagate to the caller, and the generator dies. Passing an exception into the generator is done with the throw() method, which we left out above.
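The first half of that contract is easy to see in a minimal sketch (fragile is a made-up example name):

```javascript
function* fragile() {
  yield 1;
  throw new Error('boom');
}

const gen = fragile();
console.log(gen.next()); // {value: 1, done: false}

try {
  gen.next(); // the throw inside the generator propagates out of next()
} catch (e) {
  console.log(e.message); // boom
}

console.log(gen.next()); // {value: undefined, done: true} – the generator is dead
```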

Let's enrich our adder with a cool new ability: if the caller throws an exception into the generator, it rolls back to the previous value of sum.

    function* adder(initialValue) {
      let sum = initialValue;
      let lastSum = initialValue;
      let temp;
      while (true) {
        try {
          temp = sum;
          sum += yield sum;
          lastSum = temp;
        } catch (e) {
          sum = lastSum;
        }
      }
    }

    const add = adder(0);
    console.log(add.next());  // 0
    console.log(add.next(1)); // 1
    console.log(add.next(2)); // 3
    console.log(add.throw(new Error('BOO!'))); // 1
    console.log(add.next(4)); // 5

A programming challenge: propagating errors into our generator

So, how do we implement throw() ?

Easily! An error is just another value; we can pass it into go() as a second argument. Actually, a little care is needed here. When throw(e) is called, the pending yield behaves exactly as if we had written throw e at that spot. This means every state of our state machine must check for an incoming error, and blow up if we cannot handle it.

Let's start from the previous adder implementation, copied here as a template.

Template

Solution
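The solution code did not survive in this copy of the article, so here is one possible sketch of what it might have looked like: the adder state machine extended with a throw() method that routes the error into go() as a second argument, mirroring the rollback behavior of the generator version above:

```javascript
function adder(initialValue) {
  let state = 'initial';
  let done = false;
  let sum = initialValue;
  let lastSum = initialValue;
  let temp = initialValue;

  function go(input, error) {
    let result;
    switch (state) {
      case 'initial':
        if (error) throw error;  // no "try/catch" is active yet: blow up
        result = sum;
        state = 'loop';
        break;
      case 'loop':
        if (error) {
          sum = lastSum;         // the generator's catch block: roll back
        } else {
          sum += input;
          lastSum = temp;        // remember the pre-add sum, as the generator did
        }
        temp = sum;
        result = sum;
        state = 'loop';
        break;
      default:
        break;
    }
    return { done: done, value: result };
  }

  return {
    next: (input) => go(input),
    throw: (error) => go(undefined, error)
  };
}

const add = adder(0);
console.log(add.next());  // {value: 0, done: false}
console.log(add.next(1)); // {value: 1, done: false}
console.log(add.next(2)); // {value: 3, done: false}
console.log(add.throw(new Error('BOO!'))); // {value: 1, done: false}
console.log(add.next(4)); // {value: 5, done: false}
```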

Boom! We have implemented a pair of coroutines capable of passing messages and exceptions to each other, just like a real generator.

But the situation is getting worse, isn't it? The state machine implementation is drifting further and further from the generator version. Not only is the code cluttered with error handling, it is also complicated by the long while loop we squeezed in here: to convert a while loop, you have to "weave" it into states, so our case 1 actually spans about two and a half iterations of the while, because yield suspends in the middle of one. Finally, extra code is needed to push exceptions from the caller in and back out when there is no try/catch inside the generator to handle them.

You did it! We have completed a detailed tour of possible generator implementations, and I hope you now understand much better how generators work.


Now that we understand generators better, I propose a potentially convenient way of thinking about them: they are syntactic constructs that let you write cooperatively executed procedures which pass values to each other, one at a time, through a channel (the yield instruction). This will be useful in the next section, where we will build a co() implementation out of coroutines.

Inverting control with coroutines

Now that we have mastered generators, let's think about how to use them for asynchronous programming. Writing generators as such does not mean that promises inside generators will resolve automatically. But then, generators are not meant to work alone: they must interact with another program, the main procedure, the one that calls .next() and .throw() .

What if we put our business logic not in the main procedure, but in the generator? Whenever the business logic hits some asynchronous value, say a promise, the generator says: "I don't want to deal with this nonsense, wake me up when it resolves", suspends itself, and yields the promise to the servicing procedure. The servicing procedure replies: "OK, I'll call you back", registers a callback on that promise, returns, and waits for the event loop to fire (that is, for the promise to resolve). When that happens, the procedure announces "hey, your turn" and sends the value into the sleeping generator via .next() . It then waits for the generator to do its bit, meanwhile attending to other asynchronous tasks... and so on. You have just heard the sad story of a procedure living in service of a generator.

So, back to the main topic. Now that we know how generators and promises work, building such a servicing procedure is not hard. The servicing procedure will present itself to the outside world as a promise, instantiate and drive the generator, and then deliver the final result of our main procedure through the promise's .then() callback.

Next, let's return to co() and discuss it in more detail. co() is a servicing procedure that takes on the grunt work so that the generator can deal only with seemingly synchronous values. Looks much more logical now, right?

    co(function* () {
      var user = yield fetchJson('/api/user/self');
      var interests = yield fetchJson('/api/user/interests?userId=' + user.id);
      var recommendations = yield Promise.all(
        interests.map(i => fetchJson('/api/recommendations?topic=' + i)));
      render(user, interests, recommendations);
    });

Now that we have seen co() in action, let's write it ourselves.

A programming challenge: writing co()

Great! We now know everything we need to implement co() . Here is what co() has to do:

  1. Instantiate the generator.
  2. Call .next() and get back {done: false, value: [a Promise]}
  3. Register a callback on that promise; when it resolves, call .next() again, passing in the resolved value.
  4. Repeat from step 2 until the generator finishes.
  5. When the result is {done: true, value: ...} , resolve the promise returned by co() with that value.

Now, knowing what co() has to do, try implementing it yourself:

Template

    function deferred(val) {
      return new Promise((resolve, reject) => resolve(val));
    }

    co(function* asyncAdds() {
      console.log(yield deferred(1));
      console.log(yield deferred(2));
      console.log(yield deferred(3));
    });

    function co(generator) {
      return new Promise((resolve, reject) => {
        // your code here
      });
    }

Solution
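The solution block is likewise missing here; the following is one plausible sketch of a minimal co() : pump the generator with .next() , treat each yielded value as a promise, and resolve once the generator reports done. (Error handling is deliberately left out; that is the next challenge.)

```javascript
function co(generator) {
  return new Promise((resolve, reject) => {
    const gen = generator();

    function step(value) {
      const result = gen.next(value);  // resume the generator with the last value
      if (result.done) {
        resolve(result.value);         // the generator returned: settle co()'s promise
        return;
      }
      // Treat whatever was yielded as a promise and wait for it.
      Promise.resolve(result.value).then(step, reject);
    }

    step();
  });
}

// Hypothetical helper mirroring the template above.
function deferred(val) {
  return new Promise((resolve) => resolve(val));
}

co(function* () {
  console.log(yield deferred(1)); // 1
  console.log(yield deferred(2)); // 2
  console.log(yield deferred(3)); // 3
});
```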

Not too hard, right? About 10 lines of code give us a co() that works for our example. But we are not done yet. What about errors?

A programming challenge: teaching co() to handle errors

As you remember, when a promise is rejected, co() should pass the error back into the generator, via the .throw() method, so the generator gets a chance to react.

Template

    function deferred(val) {
      return new Promise((resolve, reject) => resolve(val));
    }

    function deferReject(e) {
      return new Promise((resolve, reject) => reject(e));
    }

    co(function* asyncAdds() {
      console.log(yield deferred(1));
      try {
        console.log(yield deferReject(new Error('To fail, or to not fail.')));
      } catch (e) {
        console.log('To not fail!');
      }
      console.log(yield deferred(3));
    });

    function co(generator) {
      return new Promise((resolve, reject) => {
        // your code here
      });
    }

Solution
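The original solution is missing as well, so here is a sketch of how it might look: the same pump, but the promise's rejection handler now calls gen.throw() , and any exception that escapes the generator rejects the promise returned by co() :

```javascript
function co(generator) {
  return new Promise((resolve, reject) => {
    const gen = generator();

    // Advance the generator by running `resume`, which calls either
    // gen.next(value) or gen.throw(error).
    function tick(resume) {
      let result;
      try {
        result = resume();  // an error the generator does not catch pops out here...
      } catch (e) {
        reject(e);          // ...and rejects the promise returned by co()
        return;
      }
      if (result.done) {
        resolve(result.value);
        return;
      }
      Promise.resolve(result.value).then(
        (value) => tick(() => gen.next(value)),  // resolved: feed the value back in
        (error) => tick(() => gen.throw(error))  // rejected: throw it into the generator
      );
    }

    tick(() => gen.next());
  });
}
```

Run against the asyncAdds template above, this version should print 1, then "To not fail!", then 3: the try/catch inside the generator swallows the rejection and execution continues.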

Not much changes. Instead of only calling .next() in the onResolve() handler, we also add an onReject() handler that calls .throw() . If the generator wraps the yield in try/catch , it handles the error itself; if there is no try/catch , the exception escapes the generator and rejects the promise returned by co() .

And with that, we have written co() ! Congratulations! Our co() drives the generator, feeding it values and errors, just like the real thing. Can it get any better?

The last stop: async/await

co() is great, but wouldn't it be nice if the language did all this for us? That is exactly what async/await is! Everything co() does for a generator, the runtime does for an async function and its await s.

An async function is essentially a generator wrapped in something like co() , and await plays the role of yield . An await is only allowed inside an async function, and an async function always returns a promise.
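Two points from that description can be sketched directly (the function names here are illustrative):

```javascript
// An async function always returns a promise, even when it
// returns a plain value...
async function fortyTwo() {
  return 42;
}

// ...and await unwraps that promise, much as yield did under co().
async function main() {
  const n = await fortyTwo(); // n is the number 42, not a promise
  console.log(n);             // 42
}

main();
```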

In fact, async/await maps onto co() almost mechanically: replace the co(function* () {...}) wrapper with async function , every yield with await , and drop the * . So this:

    co(function* () {
      var user = yield fetchJson('/api/user/self');
      var interests = yield fetchJson('/api/user/interests?userId=' + user.id);
      var recommendations = yield Promise.all(
        interests.map(i => fetchJson('/api/recommendations?topic=' + i)));
      render(user, interests, recommendations);
    });

Turns into:

    (async function () {
      var user = await fetchJson('/api/user/self');
      var interests = await fetchJson('/api/user/interests?userId=' + user.id);
      var recommendations = await Promise.all(
        interests.map(i => fetchJson('/api/recommendations?topic=' + i)));
      render(user, interests, recommendations);
    })();

Almost a word-for-word translation.




JavaScript engines, in essence, implement a "built-in" co() of their own, and that is what makes async/await work. Magic? Not at all: just generators and promises. Right.

Source: https://habr.com/ru/post/434360/

