Transducers were announced back in 2014, and since then a considerable number of articles have been written about them (one, two), but after none of them could I say that transducers were crystal clear to me. After each article I felt I had understood something complicated, yet it still remained difficult. And then one day something clicked in my head: "I have already seen this pattern, it just had a different name!"
Suppose there is an array scores containing the results of my soccer games, where each game is an object with the fields gameID, my, and others. The task is to find the IDs of the first two games I won. The initial data for the task (the answer for this data is [1, 3]):
```javascript
const scores = [
  { gameID: 0, my: 1, others: 2 },
  { gameID: 1, my: 2, others: 1 },
  { gameID: 2, my: 0, others: 3 },
  { gameID: 3, my: 3, others: 2 },
  { gameID: 4, my: 3, others: 1 },
  { gameID: 5, my: 0, others: 0 },
  { gameID: 6, my: 4, others: 1 },
]
```
Let's start from afar and return to those distant days when we wrote ordinary imperative loops and suffered from mutable state (granted, in this example it causes no problems):
```javascript
const result = [] // mutable state, filled inside the loop
for (let i = 0; i < scores.length && result.length < 2; i++) {
  const { gameID, my, others } = scores[i]
  if (my > others) {
    result.push(gameID)
  }
}
```
It's all pretty simple, nice, and fast, but imperative and mutable. It is important to note that no unnecessary iterations are performed: the loop ends as soon as the result array has the required number of elements.
The next solution is immutable, more expressive, "modern".
```javascript
const result = scores
  .filter(({ my, others }) => my > others) // keep only the wins
  .map(({ gameID }) => gameID)             // extract the game IDs
  .slice(0, 2)                             // take the first two
```
This is probably how most JS developers would solve it these days. But this solution has one important problem: where the previous solution made only a single (and not even a complete) pass, we now have two full passes. If scores contained a million elements, the predicate in filter would be called a million times, the function in map would be applied a smaller but still large number of times, and in the end we would still take only the first two elements. Of course, premature optimization is the root of all evil, but this is beyond the pale.
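To see the wasted work concretely, here is a rough sketch (the counters are purely illustrative, not part of any solution) that counts how many times each callback runs on our seven-game array:

```javascript
const scores = [
  { gameID: 0, my: 1, others: 2 },
  { gameID: 1, my: 2, others: 1 },
  { gameID: 2, my: 0, others: 3 },
  { gameID: 3, my: 3, others: 2 },
  { gameID: 4, my: 3, others: 1 },
  { gameID: 5, my: 0, others: 0 },
  { gameID: 6, my: 4, others: 1 },
]

// Instrumented callbacks: count the calls, keep the behaviour.
let filterCalls = 0
let mapCalls = 0

const result = scores
  .filter(({ my, others }) => { filterCalls += 1; return my > others })
  .map(({ gameID }) => { mapCalls += 1; return gameID })
  .slice(0, 2)

// The predicate runs 7 times and the mapper 4 times (once per win),
// even though only the first two results are ever needed.
```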
A fold (reduce) can express [almost] any operation on arrays. This solution makes only one pass:
```javascript
const result = scores.reduce((result, { my, others, gameID }) => {
  // add the game ID only if the game was won
  // and we do not yet have enough results
  if (my > others && result.length < 2) {
    return result.concat([gameID])
  }
  return result
}, [])
```
This is somewhat similar to the loop solution, but here we pass the intermediate state explicitly and immutably. One problem remains: instead of two passes we now have one, but it is still a full pass. With a million elements we would walk through the entire million even after we had received the required number of results, because the standard reduce cannot exit the loop with break. So let's write our own reduce, which can stop early when a reduced value is returned (an idea borrowed from Clojure).
```javascript
// A wrapper marking a value as final.
class Reduced {
  constructor(value) {
    this.value = value
  }
}

const isReduced = value => value instanceof Reduced

const reduced = value => isReduced(value) ? value : new Reduced(value)

// A reduce that understands reduced values.
const reduce = (reducer, state, array) => {
  // If the state is a reduced value, unwrap it
  // and return, ignoring the rest of the array.
  if (isReduced(state)) {
    return state.value
  }
  if (array.length === 0) {
    return state
  }
  // A recursive reduce; in production code
  // a plain loop would be preferable.
  const [x, ...xs] = array
  return reduce(reducer, reducer(state, x), xs)
}

const result = reduce((result, { my, others, gameID }) => {
  if (my > others) {
    if (result.length < 2) {
      result = result.concat([gameID])
    }
    // If enough results are collected, stop.
    if (result.length >= 2) {
      result = reduced(result)
    }
  }
  return result
}, [], scores)
```
Wow, now everything is fast again (one pass, and only until we get the right number of elements), but is it beautiful? I believe the code above is scary: too much of it has nothing to do with the task itself.
The reducer above can be split into 4 small functions called in a chain, where each function calls the next:

- filterWins filters games by result, passing only the won games further down the chain.
- mapGameID extracts the game's ID from the game.
- firstTwo checks the number of results obtained. If there are fewer than two, it calls the next function and gets the new results, then ends the loop once the required number has been collected.
- appendToResult appends the game ID to the results array and returns it.

```javascript
// next is the next reducer in the chain.
const filterWins = next => (result, score) =>
  // call next only if the game was won
  score.my > score.others ? next(result, score) : result

const mapGameID = next => (result, { gameID }) =>
  // pass the ID on instead of the whole game
  next(result, gameID)

// A stateful function.
const firstTwo = next => {
  // The state is local to each resulting reducer.
  // n is how many elements we still need to take.
  let n = 2
  return (result, score) => {
    // If more elements are needed, call the next reducer.
    if (n > 0) {
      result = next(result, score)
      n -= 1
    }
    // If enough elements are collected, stop.
    if (n <= 0) {
      result = reduced(result)
    }
    return result
  }
}

const appendToResult = (result, gameID) =>
  result.concat([gameID])

const result = reduce(
  // Assemble the reducer from the chain of functions,
  // each of which calls the next one.
  filterWins(mapGameID(firstTwo(appendToResult))),
  [],
  scores,
)
```
It seems we simplified things by breaking them into functions, yet it still turned out scary. I would not want to write this every time. But we will fix that a bit later, when we unify the code for reuse.
The pattern may look familiar, because this is exactly how middleware works. This is middleware. And these are transducers, because transducers are middleware: a transducer is a function that takes a reducer and returns a new one (with additional logic before or after calling the original reducer).
Those hearing about middleware for the first time can read about it here: express.js, laravel; and just yesterday I tried to explain it in my own words: my post.
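To make the analogy tangible, here is a tiny hypothetical express-style sketch (the names withAuth, withLogging, and the handler are invented for the example): each middleware takes the next handler and returns a new handler with extra logic around it, which is exactly the shape of a transducer:

```javascript
// Middleware: next handler in, new handler out.
const withLogging = next => req => {
  const res = next(req)           // call the next handler in the chain
  return { ...res, logged: true } // add behaviour after it
}

const withAuth = next => req =>
  req.user ? next(req) : { status: 401 } // short-circuit, like a filter

// The innermost handler, analogous to the final reducer.
const handler = req => ({ status: 200 })

// Build the chain by wrapping, just like filterWins(mapGameID(...)).
const app = withAuth(withLogging(handler))

const ok = app({ user: "alice" }) // { status: 200, logged: true }
const denied = app({})            // { status: 401 }
```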
In this solution we have three transducers: filterWins, mapGameID, and firstTwo, and we successively apply them to the reducer appendToResult, creating an increasingly complex reducer.
And now let's unify our transducers:
```javascript
// pred is a predicate, a function returning true or false.
// If it returns true, the next reducer is called,
// otherwise the element is skipped.
const filter = pred => next => (acc, x) =>
  pred(x) ? next(acc, x) : acc

// mapper is a function that transforms an element.
const map = mapper => next => (acc, x) =>
  next(acc, mapper(x))

// Passes only the first n elements through,
// then stops the reduction after the n-th element.
const take = n => next => (acc, x) => {
  if (n > 0) {
    acc = next(acc, x)
    n -= 1
  }
  if (n <= 0) {
    acc = reduced(acc)
  }
  return acc
}
```
At the same time, let's rename the reducer:
```javascript
const append = (xs, x) => xs.concat([x])
```
And also add a helper for function composition, to get rid of the extra brackets:
```javascript
const compose = (...fns) => x => fns.reduceRight((x, fn) => fn(x), x)
```
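A quick sanity check of the order of application (the inc and double helpers are just for illustration): compose(f, g)(x) is f(g(x)), so on a plain value the rightmost function runs first. With transducers the intuition flips, because each transducer wraps the reducer produced by the ones to its right, so the leftmost transducer sees each element first:

```javascript
const compose = (...fns) => x => fns.reduceRight((x, fn) => fn(x), x)

const inc = x => x + 1
const double = x => x * 2

// reduceRight applies double first, then inc: inc(double(5))
const eleven = compose(inc, double)(5) // → 11
```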
Let's add one more helper, which applies the transducer to the reducer and calls reduce with the resulting reducer:
```javascript
const transduce = (transducer, reducer, init, array) =>
  reduce(transducer(reducer), init, array)
```
Finally, let's solve the problem using all these functions:
```javascript
const firstTwoWins = compose(
  filter(({ my, others }) => my > others),
  map(({ gameID }) => gameID),
  take(2),
)

const result = transduce(firstTwoWins, append, [], scores)
```
We ended up with a beautiful, fast, functional, immutable, ready-to-reuse solution.
Transducers are a very simple pattern: a special case of the much more widely known middleware pattern (hence the title of this article), whose essence is building complex reducers through composition. And the resulting reducers, in turn, are very versatile: they can be used for processing collections, for streams, or with Redux.
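To back up the versatility claim, here is a sketch that feeds the very same firstTwoWins transducer a different reducer, a running sum instead of an array (the definitions from above are repeated so the snippet stands on its own):

```javascript
class Reduced { constructor(value) { this.value = value } }
const isReduced = value => value instanceof Reduced
const reduced = value => isReduced(value) ? value : new Reduced(value)

const reduce = (reducer, state, array) => {
  if (isReduced(state)) return state.value
  if (array.length === 0) return state
  const [x, ...xs] = array
  return reduce(reducer, reducer(state, x), xs)
}

const filter = pred => next => (acc, x) => pred(x) ? next(acc, x) : acc
const map = mapper => next => (acc, x) => next(acc, mapper(x))
const take = n => next => (acc, x) => {
  if (n > 0) { acc = next(acc, x); n -= 1 }
  if (n <= 0) { acc = reduced(acc) }
  return acc
}

const compose = (...fns) => x => fns.reduceRight((x, fn) => fn(x), x)
const transduce = (transducer, reducer, init, array) =>
  reduce(transducer(reducer), init, array)

const scores = [
  { gameID: 0, my: 1, others: 2 },
  { gameID: 1, my: 2, others: 1 },
  { gameID: 2, my: 0, others: 3 },
  { gameID: 3, my: 3, others: 2 },
  { gameID: 4, my: 3, others: 1 },
  { gameID: 5, my: 0, others: 0 },
  { gameID: 6, my: 4, others: 1 },
]

const firstTwoWins = compose(
  filter(({ my, others }) => my > others),
  map(({ gameID }) => gameID),
  take(2),
)

// Same pipeline, different target: sum the first two winning IDs,
// still stopping early once two wins have been seen.
const sum = (a, b) => a + b
const total = transduce(firstTwoWins, sum, 0, scores) // → 4 (1 + 3)
```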
It is not necessary to write all the code for working with transducers by hand every time, because transducers have long been implemented in many data-processing libraries (for example, in ramda.js or in the standard Clojure library):
```javascript
const scores = [
  { gameID: 0, my: 1, others: 2 },
  { gameID: 1, my: 2, others: 1 },
  { gameID: 2, my: 0, others: 3 },
  { gameID: 3, my: 3, others: 2 },
  { gameID: 4, my: 3, others: 1 },
  { gameID: 5, my: 0, others: 0 },
  { gameID: 6, my: 4, others: 1 },
]

{
  const firstTwoWins = compose(
    filter(({ my, others }) => my > others),
    pluck("gameID"),
    take(2),
  )
  const result = transduce(firstTwoWins, flip(append), [], scores)
}

// or, using into:
{
  const firstTwoWins = into([], compose(
    filter(({ my, others }) => my > others),
    pluck("gameID"),
    take(2),
  ))
  const result = firstTwoWins(scores)
}
```
```clojure
(def scores
  [{:game-id 0, :my 1, :others 2}
   {:game-id 1, :my 2, :others 1}
   {:game-id 2, :my 0, :others 3}
   {:game-id 3, :my 3, :others 2}
   {:game-id 4, :my 3, :others 1}
   {:game-id 5, :my 0, :others 0}
   {:game-id 6, :my 4, :others 1}])

(defn win? [{:keys [my others]}]
  (> my others))

(def first-two-wins
  (comp (filter win?)
        (map :game-id)
        (take 2)))

(def result
  (transduce first-two-wins conj [] scores))
;; [1 3]
```
Source: https://habr.com/ru/post/325388/