Using transducers
I’ve been working with Clojure professionally for four years now, and I’ve made some posts about the language in the past. Clojure is a great language, although not without its fair share of things to consider. In other words, I don’t see Clojure as an ideal language by any means, and it’s not suitable for every type of project. Still, it’s my favorite language, and I enjoy writing and reading code in it.
Now, I already made a post about transducers before. In that post I tried to explain what transducers are, how they work, and, more importantly, how they compose. I’ve had a good grasp of transducers since then, and I use them when appropriate. However, I still see some confusion around transducers in the community, even though I’m barely involved in it, and I see it at work sometimes too. So this post isn’t going to dwell on the technical details behind transducers; it’s more about how and when you should use them.
Don’t be lazy until it matters
Clojure has a weird aspect to it that trips up a lot of people - Clojure is lazy by default. Often I see this kind of code:
(clojure.string/join separator (map foo bar))
This is just an example, of course, but the pattern is not uncommon - we build a sequence and consume it.
Ideally, though, this code should use mapv instead of map, because map is lazy. Lazy sequences in Clojure come with a bit of overhead, and in this case we’ll be consuming everything immediately, so the overhead isn’t justified.
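For the snippet above, the eager version is a one-word change (foo, bar, and separator are the same placeholders):
(clojure.string/join separator (mapv foo bar))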
I’ll admit, this particular example can benefit from consuming a lazy sequence if the sequence is large enough.
That way we don’t hold both the string and the whole sequence in memory - only the string and one element of the sequence at a time.
Often, however, programmers just type out map because it’s a familiar concept, without thinking about the fact that it’s lazy, or whether the laziness is beneficial here.
Now, that’s not about transducers, really, but it’ll make sense in a few minutes why I bring this up.
The spear of lazy poisoning
Clojure has a lot of convenience features in its core library, and I think that’s partly the reason why people, myself included, like the language.
Two such conveniences are the -> and ->> macros.
They’re hard to understand at first, but once you get what they do, I think it’s a safe bet that you’re going to use them pretty much everywhere.
They make the code easier to read.
Now, these macros don’t do anything crazy, they just make deeply nested expressions appear unnested.
For example, here’s an expression that maps over a range of ten numbers, incrementing each one, keeping the odd results, and adding them together:
(reduce + (filter odd? (map inc (range 10))))
Notice how the expression is written pretty much backward from what I’ve described.
First comes reduce, then filter, then map, and, finally, range.
Lisps are known for this kind of notation, and you often read the code backward.
In a more mainstream language, this would probably have been written like this:
[0,1,2,3,4,5,6,7,8,9].map(x=>x+1).filter(x=>x%2!=0).reduce((res,x)=>res+x)
As you can see, the order of operations feels more logical, even though it’s just a syntactic abstraction. Methods, really, are just functions whose first argument is whatever appeared before the dot. In essence, the code above is equivalent to this, even though it’s not valid JavaScript:
reduce(filter(map([0,1,2,3,4,5,6,7,8,9], x=>x+1), x=>x%2!=0), (res,x)=>res+x)
So we’re back to the start: reduce is executed last but appears first.
The . allows us to express this more sequentially, and that’s what Clojure’s -> and ->> are for.
The original code can be rewritten as:
(->> (range 10) (map inc) (filter odd?) (reduce +))
The difference between -> and ->> is only where the previous expression is placed in the next expression.
Thread-first (->) puts it as the first argument, and thread-last (->>) puts it as the last: (-> 42 (+ 1 2)) expands to (+ 42 1 2), and (->> 42 (+ 1 2)) expands to (+ 1 2 42).
This repeats for all forms in the threading macro.
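You can check the expansion in the REPL; a single macroexpand-1 shows the whole threading, since ->> threads all forms in one expansion:
(macroexpand-1 '(->> (range 10) (map inc) (filter odd?) (reduce +)))
;; => (reduce + (filter odd? (map inc (range 10))))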
Now, that was a pretty long tangent, but I felt that this needed explaining, because we’re going to see the use of ->> in the following examples.
Why did I call this section “The spear of lazy poisoning”?
Well, I think that ->> looks like a spear, or a harpoon, and it often poisons the code with unnecessary laziness.
You can even see that in the example above - both map and filter are lazy, yet the reduce is eager, and all the laziness up to that point was unnecessary because the data we’re processing is finite.
This incurs the laziness overhead, which is mostly computational.
We can, of course, use mapv and filterv - the eager versions of map and filter that produce vectors instead - but that would also hurt us:
(->> (range 10) (mapv inc) (filterv odd?) (reduce +))
Now we construct a whole vector only to discard half of it in the next step, and then discard the rest in the reduce.
That’s a memory overhead we’d also like to avoid.
So what should we do?
Transducers
Transducers are used by the transduce function, which is like reduce except it doesn’t accept just a reducing function - it also accepts a transforming function.
Before we go into that, let’s go back to the lazy example for a moment:
(->> (range 10) (map inc) (filter odd?) (reduce +))
I think it’s important to build an intuition for how it works.
The operations are lazy, so by the time we get to the reduce, nothing has happened yet.
What we get in the reduce is a lazy composition of two operations:
1. Take one item from the range sequence, and process it with inc;
2. Take one item from the sequence built by map, and filter it with odd?;
   a. If the result is not truthy, go to step 1;
3. Reduce it to the result using +.
This means that, in reality, the sequence that the reduce is processing looks kinda like [(odd? (inc 0)), (odd? (inc 1)), (odd? (inc 2)), ...], with the exception that filtered items magically disappear from it.
These expressions are lazy, and not computed until they get into the reduce, which realizes them.
This, of course, isn’t how it actually works, but you can use this model for building intuition.
If you want to read more about lazy sequences, I have a post describing the implementation of a simple lazy sequence. It’s in Lua, but the code should be simple enough to understand even if you don’t know the language.
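For a rough sketch of the same idea in Clojure itself, here’s what a hand-rolled lazy map step could look like (lazy-inc is a hypothetical helper, just for illustration):
(defn lazy-inc
  "Increments each element of coll on demand, one element at a time."
  [coll]
  (lazy-seq
   (when-let [s (seq coll)]
     (cons (inc (first s)) (lazy-inc (rest s))))))

(take 3 (lazy-inc (range))) ;; => (1 2 3) - the rest is never computed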
Now, the key part here is that we’re composing operations by delaying them - we could say that we’re building a recipe.
Each operation is lazy, but, as you can see, we don’t need every operation to be lazy - once we get a number, we can process it eagerly from there.
In other words, if we could combine both map and filter into a single step, we could avoid the overhead of multiple lazy compositions.
Now, Clojure has such a function; it’s called keep:
(->> (range 10)
     (keep #(let [x (inc %)]
              (when (odd? x)
                x)))
     (reduce +))
However, keep is also lazy, and in this case, again, there’s no point in being lazy.
We may also need something more general: we don’t always want to combine map and filter - sometimes we need to combine multiple map calls, or even other operations, like partitioning.
That’s where transducers come in nicely.
Clojure developers actually wanted to solve a different problem when they introduced transducers, and I’ll get to it a bit later. For now, we can rewrite our code to be both eager and memory-efficient:
(loop [items (seq (range 10))
       result 0]
  (if items
    (let [x (inc (first items))]
      (recur (next items)
             (if (odd? x)
               (+ result x)
               result)))
    result))
Just kidding.
That’s a lot of machinery for describing the core concepts of the language.
Here’s how to do it with transduce:
(->> (range 10) (transduce (comp (map inc) (filter odd?)) +))
As you can see, it is different from reduce in that both map and filter are not involved with the collection processing at all.
Instead, they go as a parameter to the transduce function, and we use comp to compose the two into a single step that does everything.
Importantly, both the map and filter calls are made without any collection supplied to them - called this way, these functions return transducers.
I tried my best to explain how transducer composition works in my previous post on the matter, so read it if you want the details.
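You can see the arity difference in the REPL - a transducer is just a function:
(fn? (map inc))          ;; => true - without a collection, map returns a transducer
(seq? (map inc [1 2 3])) ;; => true - with a collection, it returns a lazy sequence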
However, that’s a lot to take in, as transducers are quite involved, so here’s a simpler intuition for you.
If you want to refactor any transducible expression into a transducer, first transform it into the ->> form, and then replace the ->> with comp:
(reduce + (filter odd? (map inc (range 10))))

;; rewrite using ->>:
(->> (range 10)
     (map inc)
     (filter odd?)
     (reduce +))

;; replace ->> with comp
(comp (range 10)
      (map inc)
      (filter odd?)
      (reduce +))
We’re halfway there, though. This code is obviously wrong, but what’s left to do is to understand what it does and finish the transformation. The first step is always the same: we need to remove the data that we’re processing from this expression:
(range 10) ; just move it aside for now

(comp (map inc)
      (filter odd?)
      (reduce +))
Finally, reduce isn’t transducible, so we need to move it out and replace it with something that can accept the transducer, like transduce:
(transduce
 (comp (map inc)
       (filter odd?))
 +
 (range 10)) ;; with transduce we know where to put the data
That’s basically it.
You can apply this method to pretty much any ->> expression, as long as the functions called there have arities that return transducers.
More examples
I know, the example above was weird, so here’s a different one:
(->> (get-users)
     (filter active-user?)
     (map fetch-capabilities)
     (filter admin?)
     (map fetch-notification-settings)
     (notify))
In this example, we build a sequence of users who have the administrator role on the server, adding some extra data while we’re at it.
We then notify each such user.
This is not real-world code, though I have seen something like it in production.
The notify, again, consumes all of the users, so we don’t really need the laziness here - unless get-users returns an infinite sequence of users, which it doesn’t.
But even if it did, we wouldn’t need all of the steps to be lazy, so we can go about this in two ways.
If the list of users is short, we can store it in a vector. Let’s refactor the code:
;; start with `comp`:
(comp (get-users)
      (filter active-user?)
      (map fetch-capabilities)
      (filter admin?)
      (map fetch-notification-settings)
      (notify))

;; move out data and consumer:
(get-users)

(comp (filter active-user?)
      (map fetch-capabilities)
      (filter admin?)
      (map fetch-notification-settings))

(notify)
;; Since we want a vector, we can use `into`:
(notify
 (into []
       (comp (filter active-user?)
             (map fetch-capabilities)
             (filter admin?)
             (map fetch-notification-settings))
       (get-users)))

;; let's bring back ->> to bring nesting down a bit
(->> (get-users)
     (into []
           (comp (filter active-user?)
                 (map fetch-capabilities)
                 (filter admin?)
                 (map fetch-notification-settings)))
     (notify))
If we still want to consume users lazily, we can use sequence:
(->> (get-users)
     (sequence ;; now the result is a lazy sequence
      (comp (filter active-user?)
            (map fetch-capabilities)
            (filter admin?)
            (map fetch-notification-settings)))
     (notify))
It’s a small change from the previous example, but it has benefits over the original one.
In the original example, all of the processing was done lazily; however, we only needed to consume users lazily - the transformation itself can be done eagerly for each user.
We just don’t want to do the transformation in the notify function, because that isn’t really its job.
The sequence function consumes input lazily, applying the composed transformation to every element, and produces a lazy sequence as a result.
Thus we only have the overhead of one lazy sequence instead of four.
Your profiling flame graphs will look much nicer this way.
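As a standalone illustration with the numeric pipeline from earlier:
(sequence (comp (map inc) (filter odd?)) (range 10)) ;; => (1 3 5 7 9)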
The beauty of this is that ->> transforms to comp almost one-to-one.
The problem is that this is counter-intuitive, because for ordinary functions the composition order is the opposite.
So keep that in mind.
For ordinary functions, composition works like this:
((comp a b c d) x) ; same as the expression below
(a (b (c (d x))))
For transducers, it works exactly the same, except the argument we’re composing over is not a value - it’s a reducing function.
The resulting function ends up processing values in the order they appear in the code, which is backward from ordinary composition - that’s one of the reasons people tend to dislike transducers.
My advice is just to remember that the order for transducer composition is the same as in ->>.
Other contexts
Now, the reason transducers are a thing is that, when working on other libraries in the Clojure ecosystem, the authors noticed that while mapping, filtering, and reducing are core concepts, their implementations couldn’t be reused.
One such example is the clojure.core.async library.
At one point they needed to map a function over a channel, but you can’t really do that.
Items appear on channels only when someone puts them, so map can’t pull on the channel.
Moreover, map would need an extra channel, because it can’t put items back onto the original one.
Originally, core.async had its own versions of map and filter, called map> and filter>.
However, these were deprecated in favor of transducers.
Say get-users doesn’t return a lazy sequence, but instead returns a core.async channel.
How would we need to change our processing pipeline from the lazy variant?
;; assuming (require '[clojure.core.async :as async])
(def user-pipeline
  (comp (filter active-user?)
        (map fetch-capabilities)
        (filter admin?)
        (map fetch-notification-settings)))

(let [admins (async/chan 1 user-pipeline)]
  (async/pipe (get-users) admins)
  (notify admins))
Imagine we’ve changed notify to accept a channel.
Now we can create a channel with our transformation pipeline as a transducer.
As a matter of fact, core.async provides a way of running such actions in parallel:
(let [admins (async/chan)]
  (async/pipeline 10 admins user-pipeline (get-users))
  (notify admins))
Extracting the transformation into a separate binding makes the code a bit more concise and reusable.
And that’s exactly what transducers are about - we can abstract the action away from how the data arrives and how it should be stored.
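To see a channel transducer end to end, here’s a small self-contained sketch with the numeric pipeline from earlier (assuming core.async 1.2+, where onto-chan! exists):
(require '[clojure.core.async :as async])

(let [c (async/chan 1 (comp (map inc) (filter odd?)))]
  (async/onto-chan! c [0 1 2 3 4]) ;; puts the items, then closes the channel
  (async/<!! (async/into [] c)))   ;; => [1 3 5]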
Here’s another example, with the manifold library:
;; assuming (require '[manifold.stream :as s])
(->> (get-users)
     (s/->source)
     (s/transform user-pipeline)
     (notify))
The same transformation pipeline works for manifold.stream because it also accepts transducers, via the transform function.
We can then modify notify to accept a stream instead of a channel, and even have interop between both libraries.
So, there you have it - we can process eager collections, lazy sequences, asynchronous channels, and manifold streams using the same data pipeline:
(->> (get-users-seq)
     (into [] user-pipeline)
     (eager-notify))

(->> (get-users-seq)
     (sequence user-pipeline)
     (lazy-notify))

(let [admins (async/chan 1 user-pipeline)]
  (async/pipe (get-users-chan) admins)
  (chan-notify admins))

(->> (get-users-chan)
     (s/->source)
     (s/transform user-pipeline)
     (stream-notify))
Final thoughts
While transducers are great, there are always things to consider.
Let’s start with the elephant in the room - code using transducers is often ugly. Like, let’s go back a bit and compare these two examples:
(->> (get-users)
     (filter active-user?)
     (map fetch-capabilities)
     (filter admin?)
     (map fetch-notification-settings)
     (notify))

(notify
 (into []
       (comp (filter active-user?)
             (map fetch-capabilities)
             (filter admin?)
             (map fetch-notification-settings))
       (get-users)))
The first one is so slick, yet inefficient. The second one looks like a tool that can do the job efficiently, but you’re going to hurt yourself in the process.
Well, we can refactor the second one a bit more, moving the comp call out and using ->> on the result, but honestly, it’s not often that I see comp moved out and defined as a binding.
More often it’s written in this crude form.
However, this can be fixed for many scenarios with macros.
There’s a library, injest, that provides x> and x>> macros that automatically use transducers.
Here’s what they have in the readme:
(->> (range 10000000)
     (map inc)
     (filter odd?)
     (mapcat #(do [% (dec %)]))
     (partition-by #(= 0 (mod % 5)))
     (map (partial apply +))
     ;; (mapv dec)
     (map (partial + 10))
     (map #(do {:temp-value %}))
     (map :temp-value)
     (filter even?)
     (apply +)
     time)
;; "Elapsed time: 8275.319295 msecs"
;; 5000054999994
(x>> (range 10000000)
     (map inc)
     (filter odd?)
     (mapcat #(do [% (dec %)]))
     (partition-by #(= 0 (mod % 5)))
     (map (partial apply +))
     ;; (mapv dec)
     (map (partial + 10))
     (map #(do {:temp-value %}))
     (map :temp-value)
     (filter even?)
     (apply +)
     time)
;; "Elapsed time: 2913.851103 msecs"
;; 5000054999994
That’s a two-to-three-times speedup from a one-character change. If we expand the macro, we’ll see that it simply builds a transducer with some internal machinery:
(->> (range 10000000)
     ((injest.impl/xfn [[clojure.core/map inc]
                        [clojure.core/filter odd?]
                        [clojure.core/mapcat (fn* [p1__8713#] (do [p1__8713# (dec p1__8713#)]))]
                        [clojure.core/partition-by (fn* [p1__8714#] (= 0 (mod p1__8714# 5)))]
                        [clojure.core/map (partial apply +)]
                        [clojure.core/map (partial + 10)]
                        [clojure.core/map (fn* [p1__8715#] (do {:temp-value p1__8715#}))]
                        [clojure.core/map :temp-value]
                        [clojure.core/filter even?]]))
     (clojure.core/apply +)
     time)
The result of calling injest.impl/xfn is a function that builds a transducer and passes it into a sequence call.
So there’s no magic, apart from a clever way of transforming the code into this form.
But, there’s another problem with transducers - not every transformation pipeline can be described as one.
Notice that there’s a commented-out call to mapv.
Unlike map, mapv doesn’t return a transducer and is always eager.
If we were to refactor this code into comp, we would get an error.
The x>> macro handles this by breaking the transducer into two parts:
(->> (range 10000000)
     ((injest.impl/xfn [[clojure.core/map inc]
                        [clojure.core/filter odd?]
                        [clojure.core/mapcat (fn* [p1__8724#] (do [p1__8724# (dec p1__8724#)]))]
                        [clojure.core/partition-by (fn* [p1__8725#] (= 0 (mod p1__8725# 5)))]
                        [clojure.core/map (partial apply +)]]))
     (clojure.core/mapv dec)
     ((injest.impl/xfn [[clojure.core/map (partial + 10)]
                        [clojure.core/map (fn* [p1__8726#] (do {:temp-value p1__8726#}))]
                        [clojure.core/map :temp-value]
                        [clojure.core/filter even?]]))
     (clojure.core/apply +)
     time)
That’s another clever trick, and it certainly helps.
Now, I don’t use this library in practice.
Instead, I tend to write transducers by hand, because I want them to be understandable.
It’s easier to explain to colleagues once how comp works with transducers than to close my eyes and trust that the library will do everything for me.
Maybe I’m old-fashioned.
However, there’s another difference we need to remember: transducers have slightly different laziness semantics.
You can create lazy sequences using transducers with the sequence function, but you should be mindful that every step of your transducer is still executed eagerly for each element that gets realized.
Finally, not all functions have a transducer counterpart. There are libraries, again, that provide more transducers, but I’d much rather see more of them in Clojure core.
And sometimes there’s no need for a transducer at all - lazy transformations can be fine, or even exactly what you need in a particular context.
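For instance, when the input is unbounded, a lazy map is exactly the right tool:
(->> (iterate inc 0) ;; an infinite sequence
     (map #(* % %))
     (take 5))
;; => (0 1 4 9 16)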
So don’t just rush out replacing all consecutive calls to map and filter with a transducer.
Although, if you want to, here’s a clj-kondo hook that issues a warning when you use too many lazy transformations in a row:
(ns hooks.thread-macros
  (:require [clj-kondo.hooks-api :as api]))

(def consecutive-threshold 2)

(def transduceable
  #{'map
    'filter
    ;; add more functions if you need
    })

(defn consecutive-threading [{:keys [node]}]
  ;; drop the macro name and the initial value, keeping only the threaded forms
  (let [[_ & forms] (rest (:children node))]
    (loop [forms forms
           counter 0]
      (when-let [form (first forms)]
        (if (contains? transduceable (some-> form :children first api/sexpr))
          (if (>= counter consecutive-threshold)
            (api/reg-finding!
             (assoc (meta node)
                    :message "prefer transducers to consecutive maps or filters in threading macros"
                    :type :expensive-threading))
            (recur (next forms)
                   (inc counter)))
          ;; a non-transduceable form resets the streak
          (recur (next forms)
                 0))))))
{:linters {:expensive-threading {:level :warning}}
 :hooks {:analyze-call {clojure.core/->> hooks.thread-macros/consecutive-threading}}}
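With the hook registered, a pipeline like this one (users is a hypothetical binding) gets flagged, since the third transduceable call in a row pushes the counter past the threshold:
(->> users
     (map :name)    ;; 1st consecutive transduceable call
     (filter some?) ;; 2nd
     (map str))     ;; 3rd - this is where the hook reports the warning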
That’s all from me, and I hope you’ve found this post interesting. Reach out if you have any interesting examples where transducers were hard to apply, or where they made the code more robust. And I hope transducers get a better rep in the future.