§Handling asynchronous results

§Make controllers asynchronous

Internally, Play Framework is asynchronous from the bottom up. Play handles every request in an asynchronous, non-blocking way.

The default configuration is tuned for asynchronous controllers. In other words, application code should avoid blocking in controllers, i.e., making the controller code wait for an operation to complete. Common examples of such blocking operations are JDBC calls, streaming APIs, HTTP requests and long computations.

Although it’s possible to increase the number of threads in the default execution context to allow more concurrent requests to be processed by blocking controllers, following the recommended approach of keeping the controllers asynchronous makes it easier to scale and to keep the system responsive under load.

§Creating non-blocking actions

Because of the way Play works, action code must be as fast as possible, i.e., non-blocking. So what should we return as a result if we are not yet able to compute it? The answer is a future result!

A Future[Result] will eventually be redeemed with a value of type Result. By giving a Future[Result] instead of a normal Result, we are able to quickly generate the result without blocking. Play will then serve the result as soon as the promise is redeemed.

The web client will be blocked while waiting for the response, but nothing will be blocked on the server, and server resources can be used to serve other clients.
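
For instance, here is a minimal sketch (using the Action.async builder described below) that returns an already-completed Future[Result] without blocking:

import scala.concurrent.Future

def immediate = Action.async {
  // Future.successful wraps an existing Result in an already-redeemed Future
  Future.successful(Ok("This result was available immediately"))
}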

Using a Future is only half of the picture though! If you are calling out to a blocking API such as JDBC, then you will still need to have your Future run on a different execution context, to move it off Play’s rendering thread pool. You can do this by creating a subclass of play.api.libs.concurrent.CustomExecutionContext with a reference to the custom dispatcher.

import javax.inject.Inject

import scala.concurrent.{ ExecutionContext, Future }

import akka.actor.ActorSystem
import play.api.libs.concurrent.CustomExecutionContext
import play.api.mvc._

// Make sure to bind the new context class to this trait using one of the custom
// binding techniques listed on the "Scala Dependency Injection" documentation page
trait MyExecutionContext extends ExecutionContext

class MyExecutionContextImpl @Inject() (system: ActorSystem)
    extends CustomExecutionContext(system, "my.executor")
    with MyExecutionContext

class HomeController @Inject() (myExecutionContext: MyExecutionContext, val controllerComponents: ControllerComponents)
    extends BaseController {
  def index = Action.async {
    Future {
      // Call some blocking API
      Ok("result of blocking call")
    }(myExecutionContext)
  }
}

Please see ThreadPools for more information on using custom execution contexts effectively.
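
The dispatcher name passed to CustomExecutionContext above ("my.executor") must correspond to an Akka dispatcher configured in application.conf. A minimal sketch, where the executor type and pool size are illustrative and should be tuned to the blocking API in question:

# conf/application.conf
my.executor {
  executor = "thread-pool-executor"
  throughput = 1
  thread-pool-executor {
    fixed-pool-size = 20
  }
}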

§How to create a Future[Result]

To create a Future[Result] we need another future first: the future that will give us the actual value we need to compute the result:


val futurePIValue: Future[Double] = computePIAsynchronously()
val futureResult: Future[Result] = futurePIValue.map { pi =>
  Ok("PI value computed: " + pi)
}

All of Play’s asynchronous API calls give you a Future. This is the case whether you are calling an external web service using the play.api.libs.WS API, or using Akka to schedule asynchronous tasks or to communicate with actors using play.api.libs.Akka.
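
For example, here is a sketch of a controller (the class name is illustrative) that calls an external web service with an injected WSClient and maps the resulting Future[WSResponse] into a Result:

import javax.inject.Inject

import scala.concurrent.ExecutionContext

import play.api.libs.ws.WSClient
import play.api.mvc._

class ProxyController @Inject() (ws: WSClient, val controllerComponents: ControllerComponents)(
    implicit ec: ExecutionContext
) extends BaseController {
  def proxy = Action.async {
    // get() returns a Future[WSResponse]; map turns it into a Future[Result]
    ws.url("https://example.com").get().map { response =>
      Ok(response.body)
    }
  }
}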

Here is a simple way to execute a block of code asynchronously and to get a Future:

val futureInt: Future[Int] = scala.concurrent.Future {
  intensiveComputation()
}

Note: It’s important to understand which thread code runs on with futures. In the two code blocks above, Play’s default execution context is assumed to be in implicit scope. It is an implicit parameter that gets passed to every method on the Future API that accepts callbacks. The execution context will often be equivalent to a thread pool, though not necessarily.
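
One common way to put the default execution context in implicit scope is to inject it into the controller constructor; a minimal sketch (the controller name is illustrative):

import javax.inject.Inject

import scala.concurrent.{ ExecutionContext, Future }

import play.api.mvc._

class AsyncController @Inject() (val controllerComponents: ControllerComponents)(
    implicit ec: ExecutionContext
) extends BaseController {
  def compute = Action.async {
    // The implicit ec is picked up by Future.apply and by map/recover callbacks
    Future { intensiveComputation() }.map(i => Ok("Got result: " + i))
  }
}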

You can’t magically turn synchronous IO into asynchronous by wrapping it in a Future. If you can’t change the application’s architecture to avoid blocking operations, at some point that operation will have to be executed, and that thread is going to block. So in addition to enclosing the operation in a Future, it’s necessary to configure it to run in a separate execution context that has been configured with enough threads to deal with the expected concurrency. See Understanding Play thread pools for more information, and download the Play example templates that show database integration.

It can also be helpful to use Actors for blocking operations. Actors provide a clean model for handling timeouts and failures, setting up blocking execution contexts, and managing any state that may be associated with the service. Actors also provide patterns like ScatterGatherFirstCompletedRouter to address simultaneous cache and database requests, and they allow remote execution on a cluster of backend servers. But an Actor may be overkill depending on what you need.
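
For example, here is a sketch using Akka’s ask pattern, assuming an ActorRef (computeActor) bound elsewhere, e.g. via AkkaGuiceSupport, and an actor that replies with an Int:

import javax.inject.{ Inject, Named }

import scala.concurrent.ExecutionContext
import scala.concurrent.duration._

import akka.actor.ActorRef
import akka.pattern.ask
import akka.util.Timeout
import play.api.mvc._

class ComputeController @Inject() (
    @Named("compute-actor") computeActor: ActorRef,
    val controllerComponents: ControllerComponents
)(implicit ec: ExecutionContext) extends BaseController {
  implicit val timeout: Timeout = 2.seconds

  def compute = Action.async {
    // ask (?) returns a Future[Any]; mapTo narrows it to the expected reply type
    (computeActor ? "compute").mapTo[Int].map(i => Ok("Got result: " + i))
  }
}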

§Returning futures

So far we have been using the Action.apply builder method to build actions; to send an asynchronous result we need to use the Action.async builder method instead:

def index = Action.async {
  val futureInt = scala.concurrent.Future { intensiveComputation() }
  futureInt.map(i => Ok("Got result: " + i))
}

§Actions are asynchronous by default

Play actions are asynchronous by default. For instance, in the controller code below, the { Ok(...) } part of the code is not the method body of the controller. It is an anonymous function that is being passed to the Action object’s apply method, which creates an object of type Action. Internally, the anonymous function that you wrote will be called and its result will be enclosed in a Future.

def echo = Action { request =>
  Ok("Got request [" + request + "]")
}

Note: Both Action.apply and Action.async create Action objects that are handled internally in the same way. There is a single kind of Action, which is asynchronous, and not two kinds (a synchronous one and an asynchronous one). The .async builder is just a facility to simplify creating actions based on APIs that return a Future, which makes it easier to write non-blocking code.

§Handling time-outs

It is often useful to handle time-outs properly, to avoid having the web browser block and wait if something goes wrong. You can use play.api.libs.concurrent.Futures to wrap a Future in a non-blocking timeout.

import scala.concurrent.duration._
import play.api.libs.concurrent.Futures._

def index = Action.async {
  // You will need an implicit Futures for withTimeout() -- you usually get
  // that by injecting it into your controller's constructor
  intensiveComputation()
    .withTimeout(1.seconds)
    .map { i =>
      Ok("Got result: " + i)
    }
    .recover {
      case e: scala.concurrent.TimeoutException =>
        InternalServerError("timeout")
    }
}
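
As mentioned in the comment above, the implicit Futures (along with an execution context for map and recover) is typically obtained by constructor injection; a minimal sketch (the controller name is illustrative):

import javax.inject.Inject

import scala.concurrent.ExecutionContext

import play.api.libs.concurrent.Futures
import play.api.mvc._

class TimeoutController @Inject() (
    val controllerComponents: ControllerComponents
)(implicit futures: Futures, ec: ExecutionContext) extends BaseController {
  // `futures` and `ec` are now in implicit scope for withTimeout(), map and recover
}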

Note: Timeout is not the same as cancellation. Even in the case of a timeout, the given future will still complete, even though that completed value is not returned.

Next: Streaming HTTP responses