Koan: YCombinator

Felix Bogdanski

since 01.06.2016


We all work within some industry and often it's about time and money. But essentially, coding to me is art and we should have fun practicing our art. So, this time, let's talk about the art of coding. For this, I picked a very beautiful and elegant piece of code – the YCombinator written in Scala1:

object YCombinator {
  def apply[A, B](f: (A => B) => (A => B)): A => B = f(apply(f))(_)
}

You might have a rough idea of what this YCombinator does, but can't grasp the whole concept of it, e.g. when trying to draw the call stack in your head. What can we do with this baffling thing?

Let's start with a recursive implementation of the factorial function n! = n * (n - 1) * (n - 2) * ... * 1:

object Factorial {
  def apply(n: Int): Int = if (n > 1) n * apply(n - 1) else 1
}

val a = Factorial(5) // is 120

This is straightforward. Since defs are methods, i.e. functions with a this-pointer and a well-known memory location, we can express this recursive pattern by safely referring to apply from within apply. Now, what if we couldn't refer to our own logic within our logic, because we simply had no name to refer to? Could we still express the factorial function in a recursive manner?

import scala.annotation.tailrec

object FactorialTailrec {
  def apply(n: Int): Int = {
    @tailrec def fac(i: Int, k: Int): Int = if (i > 1) fac(i - 1, k * i) else k
    fac(n, 1)
  }
}

val b = FactorialTailrec(5)

If we try a nested tail-recursive version, we can express apply without using apply, but we still need to call fac within fac. Since this is the very same pattern, just on a different level, all we can really do to express the factorial function without a reference to its own name is what the compiler would do with @tailrec: eliminate the recursion through an imperative while-loop!

object FactorialIter {
  def apply(n: Int): Int = {
    var k = 1
    var i = n
    while (i > 1) {
      k = k * i
      i = i - 1
    }
    k
  }
}

val c = FactorialIter(5)

What do we get? Well, a longer but working factorial function that does not refer to itself!2 And since even the most elegant functional wizardry at some point has to be translated into imperative assembler instructions, we could argue that this is the only way to express recursive functions without a name.

Is it, really? Let's go back to our YCombinator and try to define the factorial function with it:

object YCombinator {
  def apply[A, B](f: (A => B) => (A => B)): A => B = f(apply(f))(_)
}

val d = YCombinator[Int, Int](a => b => if (b > 1) b * a(b - 1) else 1)(5) // applied immediately: 5! = 120

This variant of the factorial function is recursive, but it has no name to refer to within its own definition. All it has are the parameters a and b. Apparently, we can use the YCombinator to express nameless, anonymous, recursive functions. Granted. But how does this work? Let's step into the beauty of it.
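Before dissecting the combinator, it helps to isolate the argument we just passed and give it a name of its own, purely so we can look at its shape. The following is a small sketch with a hypothetical name facTemplate; the YCombinator definition is repeated so the snippet stands alone:

```scala
object YCombinator {
  def apply[A, B](f: (A => B) => (A => B)): A => B = f(apply(f))(_)
}

// the outer Int => Int is the continuation a handed in by the combinator,
// the inner (b: Int) => ... is one step of the factorial itself
val facTemplate: (Int => Int) => (Int => Int) =
  a => b => if (b > 1) b * a(b - 1) else 1

val e = YCombinator(facTemplate)(5) // 120, same as before
```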

The function f: (A => B) => (A => B) can be considered a template. The first function A => B is the continuation of the second function A => B. Using the YCombinator, we can define the recursive factorial in terms of (a: Int => Int) => ((b: Int) => ...), where a carries the same logic as (b => ...), but is never the same instance. The job of the YCombinator is to pre-set this logic a for all subsequent calls from inside (b => ...). If we write:

object Overflominator {
  def apply[A, B](f: (A => B) => (A => B)): A => B = f(apply(f))
}

This would compile, because it is identical at the type level. But it throws a StackOverflowError as soon as our factorial function is passed at run time: f(apply(f)) eagerly evaluates its argument apply(f), which eagerly evaluates apply(f) again, and so on, without end. The heart of the YCombinator is the partial application (_) in tail position, which delays this evaluation until an argument actually arrives. Let's rewrite it to explicitly show the ad-hoc application, which tames the recursion:

object ExplicitYCombinator {
  def apply[A, B](f: (A => B) => (A => B)): A => B = a => f(apply(f))(a) // instead of (_)
}

So calling the YCombinator with our factorial function gives us a function object. What happens when we call this function object with a number? Before it computes anything with that number, it creates its own logic as a fresh function object X, which the factorial function A can use to continue, if need be, or to terminate. When the current factorial A needs to continue, it calls X. When X is called, it does the same thing before handing out the desired next factorial function B: it prepares B with its own logic as function object Y and then returns B. So when A calls B via X, B can continue the computation through the very same recursive mechanism Y offers, or terminate. When it terminates, the call stack unwinds and yields the desired result.
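This layer-by-layer unrolling can be made visible. Here is a sketch that instruments the explicit form of the combinator with a println per layer (TracedYCombinator is a hypothetical name, not part of the original):

```scala
object TracedYCombinator {
  def apply[A, B](f: (A => B) => (A => B)): A => B = a => {
    // one line is printed each time a new layer of the recursion is unrolled
    println(s"unrolling one layer for argument $a")
    f(apply(f))(a)
  }
}

val t = TracedYCombinator[Int, Int](a => b => if (b > 1) b * a(b - 1) else 1)(3)
// prints the message for 3, then 2, then 1; t is 6
```

Each printed line corresponds to one of the fresh function objects (X, Y, ...) described above being called with the next argument.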

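The combinator is of course not tied to the factorial. Any recursive logic can be phrased as an anonymous template; as a final sketch, here is a Fibonacci function expressed the same way (the YCombinator definition is repeated so the snippet stands alone):

```scala
object YCombinator {
  def apply[A, B](f: (A => B) => (A => B)): A => B = f(apply(f))(_)
}

// anonymous recursive Fibonacci: a is the continuation provided by the combinator
val fib = YCombinator[Int, Int](a => b => if (b < 2) b else a(b - 1) + a(b - 2))
val f10 = fib(10) // 55
```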
Thanks for reading.


1: The YCombinator can be defined in a way that is closer to Curry's original formulation. In that implementation, the recursion is realized by deftly interweaving a combinator case class instead of using native recursion. I find that version a little less intuitive to explain in plain English, so I picked the natively recursive version to illustrate the concept. Although they differ, both implementations carry out the same idea.

2: A true zen master would question this, of course.