Transform stacks

What is an “effect stack”?

There is an abuse of language here. The name “stack” comes from “monad stack” used when talking about monad transformers. With Eff though, effects are modelled differently, as a tree of effects.

For example, four effects T1, T2, T3, T4 are represented at the type level as:

Fx.fx4[T1, T2, T3, T4]

// or

FxAppend[
  Fx1[T1],
  Fx3[T2, T3, T4]
]

So every time we manipulate effects at the type level we modify a tree of effects. For example, interpreting the effect T3 would leave us with the tree:

FxAppend[
  Fx1[T1],
  Fx2[T2, T4]
]

This code should prove it:
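A sketch of that proof, using placeholder effect constructors T1 to T4 (hypothetical names, for illustration only). Note that, as the next paragraph explains, the compiler may not accept this exact formulation:

```scala
import org.atnos.eff._

// placeholder effect constructors, for illustration only
trait T1[A]; trait T2[A]; trait T3[A]; trait T4[A]

type S = Fx.fx4[T1, T2, T3, T4]

// interpreting T3 out of S should leave FxAppend[Fx1[T1], Fx2[T2, T4]]
implicitly[Member.Aux[T3, S, FxAppend[Fx1[T1], Fx2[T2, T4]]]]
```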

Unfortunately the compiler has some difficulties with it, so you can either get the member value by using the implicit definitions “manually”, or you can simply summon the member instance without the Aux part:
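For example, summoning the plain Member instance (a sketch with the same hypothetical T1 to T4 effects):

```scala
import org.atnos.eff._

// placeholder effect constructors, for illustration only
trait T1[A]; trait T2[A]; trait T3[A]; trait T4[A]

type S = Fx.fx4[T1, T2, T3, T4]

// summon the instance without specifying the Aux part;
// its Out type member still records the remaining effects
val member: Member[T3, S] = implicitly[Member[T3, S]]
```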

More importantly, the compiler is still able to track the right types resulting from the interpretation of a given effect, so the following compiles fine:

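For instance, here is a sketch (assuming kind-projector’s `*` syntax) in which interpreting Option out of a three-effect stack is typed with the exact two-effect remainder:

```scala
import org.atnos.eff._, all._, syntax.all._
import cats.Eval

type S = Fx.fx3[Option, Either[String, *], Eval]

// the compiler knows that interpreting Option out of S
// leaves exactly Fx2[Either[String, *], Eval]
def runLocal[A](e: Eff[S, A]): Eff[Fx2[Either[String, *], Eval], Option[A]] =
  e.runOption
```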

Transform an effect to another

Change the effect

A typical use case for this is to transform a stack having a Reader[S, *] effect into a stack having a Reader[B, *] effect, where S is “contained” in B (meaning that there is a mapping from B, “big”, to S, “small”). Here is an example:
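A sketch of what this can look like, reconstructed around the `Some(hello world)` result shown by the original example (the Conf type, the reader aliases, and the host extraction are illustrative assumptions):

```scala
import org.atnos.eff._, all._, syntax.all._
import cats.~>
import cats.data.Reader

case class Conf(host: String, port: Int)

type ReaderHost[A] = Reader[String, A] // "small" reader
type ReaderConf[A] = Reader[Conf, A]   // "big" reader

type S1 = Fx.fx2[ReaderHost, Option]
type S2 = Fx.fx2[ReaderConf, Option]

val readHost: Eff[S1, String] = for {
  h <- ask[S1, String]
  r <- OptionEffect.some[S1, String]("hello " + h)
} yield r

// natural transformation: run the "small" reader
// with the host extracted from the "big" Conf
def hostToConf: ReaderHost ~> ReaderConf = new (ReaderHost ~> ReaderConf) {
  def apply[A](r: ReaderHost[A]): ReaderConf[A] =
    Reader((c: Conf) => r.run(c.host))
}

val eff: Eff[S2, String] = readHost.transform(hostToConf)

val result = eff.runReader(Conf("world", 8080)).runOption.run
// Some(hello world)
```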

There are also specialized versions of transform for Reader and State.

Translate an effect into multiple others

A common thing to do is to translate effects (a webservice DSL for example) into multiple others (TimedFuture, Eval, Either, etc…).

For example you might have this stack:

type S = Fx.fx3[Authenticated, TimedFuture, Either[AuthError, *]]

And you want to write an interpreter which will translate authentication actions into TimedFuture and Either:
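Here is a sketch of such an interpreter using translate from org.atnos.eff.interpret (the Authenticated DSL, AuthError, and the authenticate service call are assumptions made for illustration):

```scala
import org.atnos.eff._, all._, future._, interpret._, syntax.all._

// hypothetical authentication DSL
sealed trait Authenticated[A]
case class Authenticate(token: String) extends Authenticated[Unit]

case class AuthError(message: String)

type _error[R] = Either[AuthError, *] |= R

// assumed service call returning a future Either
def authenticate(token: String): TimedFuture[Either[AuthError, Unit]] = ???

def runAuth[R, U, A](e: Eff[R, A])(implicit
  authenticated: Member.Aux[Authenticated, R, U],
  future: _future[U],
  either: _error[U]): Eff[U, A] =

  translate(e)(new Translate[Authenticated, U] {
    def apply[X](ax: Authenticated[X]): Eff[U, X] =
      ax match {
        case Authenticate(token) =>
          for {
            result <- send(authenticate(token)) // uses `future`
            value  <- fromEither(result)        // uses `either`
          } yield value
      }
  })
```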

The call to send above needs to send a TimedFuture value in the stack U. This is possible because TimedFuture is an effect in U, as evidenced by future.

Furthermore, authenticate returns an Either[AuthError, *] value. We can “collapse” it into U because Either[AuthError, *] is an effect of U as evidenced by either.

You might wonder why we don’t use a more direct type signature like:

def runAuth2[R, U :_future :_error, A](e: Eff[R, A])(
  implicit authenticated: Member.Aux[Authenticated, R, U]): Eff[U, A]

The reason is that scalac desugars this to:

def runAuth2[R, U, A](e: Eff[R, A])(
  implicit future:        _future[U],
           either:        _error[U],
           authenticated: Member.Aux[Authenticated, R, U]): Eff[U, A] =

And then authenticated is last in the list of implicit parameters and cannot be used to guide type inference.

Interpret an effect “locally”

Let’s say you have a method to run database queries:
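A sketch of what runDb could look like, with a hypothetical Db query DSL (GetEntityName is invented for illustration):

```scala
import org.atnos.eff._, all._, interpret._, syntax.all._
import cats.data.Writer
import cats.Eval

// hypothetical database DSL
sealed trait Db[A]
case class GetEntityName(id: Int) extends Db[String]

type _db[R] = Db |= R
type WriterString[A] = Writer[String, A]
type _writerString[R] = WriterString |= R

// run Db queries inside Eval, logging each query with WriterString
def runDb[R, U, A](queries: Eff[R, A])(implicit
  db: Member.Aux[Db, R, U],
  eval: _eval[U],
  writer: _writerString[U]): Eff[U, A] =

  translate(queries)(new Translate[Db, U] {
    def apply[X](q: Db[X]): Eff[U, X] =
      q match {
        case GetEntityName(id) =>
          for {
            _ <- tell[U, String](s"querying entity $id")
            r <- delay[U, String](s"entity-$id")
          } yield r
      }
  })
```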

The database queries (the Db effect) are being executed by the runDb method inside the Eval effect, and they use a WriterString effect to log what is being executed.

However, you know that some clients of your component don’t care about the logs and don’t want the WriterString effect, which they consider an implementation detail.

So you’d like to provide this additional method:
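The signature could look like this (Db is the hypothetical query DSL; the point is that U is only required to contain Eval, not WriterString):

```scala
import org.atnos.eff._, all._

sealed trait Db[A] // hypothetical query DSL

// note: no WriterString requirement on U
def executeOnDb[R, U, A](queries: Eff[R, A])(implicit
  db: Member.Aux[Db, R, U],
  eval: _eval[U]): Eff[U, A] = ???
```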

How can you implement executeOnDb with runDb?
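One way to sketch it, assuming a runDb interpreter that requires Eval and WriterString in its target stack (exact implicit resolution may vary with the eff version): prepend WriterString to the stack, run the queries there, then run the writer and drop the logs.

```scala
import org.atnos.eff._, all._, syntax.all._
import cats.data.Writer
import cats.Eval

sealed trait Db[A] // hypothetical query DSL
type WriterString[A] = Writer[String, A]
type _writerString[R] = WriterString |= R

// assumed interpreter: runs Db with Eval, logging via WriterString
def runDb[R, U, A](queries: Eff[R, A])(implicit
  db: Member.Aux[Db, R, U],
  eval: _eval[U],
  writer: _writerString[U]): Eff[U, A] = ???

def executeOnDb[R, U, A](queries: Eff[R, A])(implicit
  db: Member.Aux[Db, R, U],
  eval: _eval[U]): Eff[U, A] = {

  // "local" stack: the original stack with WriterString prepended
  type S = Fx.prepend[WriterString, R]

  // run Db inside S, then run the writer and discard the logs
  runDb(queries.into[S]).runWriter.map(_._1)
}
```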

You create a “local” stack containing the WriterString effect using the prepend method, run the Db effect there, and discard the logs, finally returning only Eff[U, A].

Merge stacks

We can create effects for a given effect stack, for example to interact with a Hadoop cluster. We can also define another stack, for storing and retrieving data on S3.

  import org.atnos.eff._
  import all._
  import cats.data._
  import cats.Eval

  object HadoopStack {

    case class HadoopConf(mappers: Int)

    type HadoopReader[A] = Reader[HadoopConf, A]
    type WriterString[A] = Writer[String, A]
    type Hadoop = Fx.fx3[HadoopReader, WriterString, Eval]

    def readFile(path: String): Eff[Hadoop, String] =
      for {
        c <- ask[Hadoop, HadoopConf]
        _ <- tell[Hadoop, String]("Reading from " + path)
      } yield c.mappers.toString

    def runHadoopReader[R, U, A](conf: HadoopConf)(e: Eff[R, A])(using Member.Aux[HadoopReader, R, U]): Eff[U, A] =
      ReaderEffect.runReader(conf)(e)

  }

  object S3Stack {

    case class S3Conf(bucket: String)

    type S3Reader[A] = Reader[S3Conf, A]
    type WriterString[A] = Writer[String, A]

    type S3 = Fx.fx3[S3Reader, WriterString, Eval]

    def writeFile(key: String, content: String): Eff[S3, Unit] =
      for {
        c <- ask[S3, S3Conf]
        _ <- tell[S3, String]("Writing to bucket " + c.bucket + ": " + content)
      } yield ()

    def runS3Reader[R, U, A](conf: S3Conf)(e: Eff[R, A])(using Member.Aux[S3Reader, R, U]): Eff[U, A] =
      ReaderEffect.runReader(conf)(e)
  }

So what happens when you want to use both S3 and Hadoop? As you can see from the definitions above, those 2 stacks share some common effects, so the resulting stack we want to work with is:
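The merged stack could be defined like this (since WriterString is defined in both objects, one of the two imports is masked):

```scala
import org.atnos.eff._
import cats.Eval
import HadoopStack._
// mask S3Stack's duplicate WriterString definition
import S3Stack.{WriterString => _, _}

type HadoopS3 = Fx.fx4[S3Reader, HadoopReader, WriterString, Eval]
```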

Then we can use the into method to inject effects from each stack into this common stack:
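A sketch of the combined program; the file path and configuration values are taken from the result the original example produced:

```scala
import org.atnos.eff._, syntax.all._
import cats.Eval
import HadoopStack._
import S3Stack.{WriterString => _, _}

type HadoopS3 = Fx.fx4[S3Reader, HadoopReader, WriterString, Eval]

// inject each stack's actions into the common stack with `into`
val action: Eff[HadoopS3, Unit] = for {
  s <- readFile("/tmp/data").into[HadoopS3]
  _ <- writeFile("key", s).into[HadoopS3]
} yield ()

// interpret both readers, collect the logs, and evaluate
val result =
  action
    .runReader(S3Conf("bucket"))
    .runReader(HadoopConf(10))
    .runWriter
    .runEval
    .run
// ((),List(Reading from /tmp/data, Writing to bucket bucket: 10))
```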

You can find a fully working example of this approach in src/test/org/atnos/example/StacksSpec.