I'm trying to create a blaze client with a limited number of threads, like this:
object ReactiveCats extends IOApp {

  private val PORT = 8083
  private val DELAY_SERVICE_URL = "http://localhost:8080"

  // trying to create a client with a limited number of threads
  val clientPool: ExecutorService = Executors.newFixedThreadPool(64)
  val clientExecutor: ExecutionContextExecutor = ExecutionContext.fromExecutor(clientPool)

  private val httpClient = BlazeClientBuilder[IO](clientExecutor).resource

  private val httpApp = HttpRoutes.of[IO] {
    case GET -> Root / delayMillis =>
      httpClient.use { client =>
        client
          .expect[String](s"$DELAY_SERVICE_URL/$delayMillis")
          .flatMap(response => Ok(s"ReactiveCats: $response"))
      }
  }.orNotFound

  // trying to create the server on a fixed thread pool
  val serverPool: ExecutorService = Executors.newFixedThreadPool(64)
  val serverExecutor: ExecutionContextExecutor = ExecutionContext.fromExecutor(serverPool)

  // start server
  override def run(args: List[String]): IO[ExitCode] =
    BlazeServerBuilder[IO](serverExecutor)
      .bindHttp(port = PORT, host = "localhost")
      .withHttpApp(httpApp)
      .serve
      .compile
      .drain
      .as(ExitCode.Success)
}
Full code and load-tests.
But the load-test results look as if requests are handled one thread per request:
How do I restrict the number of threads for my blaze client?
There are two obvious things that are wrong with your code:

1. You're calling the use method of the httpClient Resource inside the HTTP route, meaning that every time the route is called, it will create, use and destroy the http client. You should instead create it once during startup.
2. Executors, like any other resource (e.g. file handles etc.), should always be allocated using Resource.make, like so:
val clientPool: Resource[IO, ExecutorService] = Resource.make(IO(Executors.newFixedThreadPool(64)))(ex => IO(ex.shutdown()))
val clientExecutor: Resource[IO, ExecutionContextExecutor] = clientPool.map(ExecutionContext.fromExecutor)
private val httpClient = clientExecutor.flatMap(ex => BlazeClientBuilder[IO](ex).resource)
The first problem can easily be fixed by allocating the httpClient before building the HTTP app:
private def httpApp(client: Client[IO]): Kleisli[IO, Request[IO], Response[IO]] =
  HttpRoutes.of[IO] {
    case GET -> Root / delayMillis =>
      client
        .expect[String](s"$DELAY_SERVICE_URL/$delayMillis")
        .flatMap(response => Ok(s"ReactiveCats: $response"))
  }.orNotFound
…
override def run(args: List[String]): IO[ExitCode] =
  httpClient.use { client =>
    BlazeServerBuilder[IO](serverExecutor)
      .bindHttp(port = PORT, host = "localhost")
      .withHttpApp(httpApp(client))
      .serve
      .compile
      .drain
      .as(ExitCode.Success)
  }
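By the same rule, the server executor could itself be allocated as a Resource and combined with the client resource. A sketch under that assumption, reusing the names from the snippets above (serverExecutorRes is a hypothetical name introduced here):

// the server pool, allocated the same way as the client pool above
val serverExecutorRes: Resource[IO, ExecutionContextExecutor] =
  Resource
    .make(IO(Executors.newFixedThreadPool(64)))(pool => IO(pool.shutdown()))
    .map(ExecutionContext.fromExecutor)

override def run(args: List[String]): IO[ExitCode] =
  (for {
    serverEc <- serverExecutorRes
    client   <- httpClient
  } yield (serverEc, client)).use { case (serverEc, client) =>
    BlazeServerBuilder[IO](serverEc)
      .bindHttp(port = PORT, host = "localhost")
      .withHttpApp(httpApp(client))
      .serve
      .compile
      .drain
      .as(ExitCode.Success)
  }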
Another potential problem is that you're using IOApp, and it comes with its own thread pool. The best way to fix that is probably to mix in the IOApp.WithContext trait and implement this method:
override protected def executionContextResource: Resource[SyncIO, ExecutionContext] = ???
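For example, a minimal sketch of such an implementation (assuming cats-effect 2.x, where IOApp.WithContext is available; the pool size of 64 is just an illustration):

import java.util.concurrent.Executors
import scala.concurrent.ExecutionContext
import cats.effect.{ExitCode, IO, IOApp, Resource, SyncIO}

object ReactiveCats extends IOApp.WithContext {
  // use a fixed pool of 64 threads as the main compute pool instead of IOApp's default
  override protected def executionContextResource: Resource[SyncIO, ExecutionContext] =
    Resource
      .make(SyncIO(Executors.newFixedThreadPool(64)))(pool => SyncIO(pool.shutdown()))
      .map(ExecutionContext.fromExecutor)

  override def run(args: List[String]): IO[ExitCode] =
    IO.pure(ExitCode.Success) // your server wiring goes here
}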
Copied from my comment.
The answer to the performance issue is a properly configured Blaze client - for me this was the .withMaxWaitQueueLimit(1024) parameter.
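For illustration, the setting can be applied on the builder along the lines of the following sketch, reusing the Resource-based clientExecutor from above (the connection count and queue limit are example values, not tuned recommendations):

// imports assume http4s 0.21.x; the BlazeClientBuilder package moved in later versions
import org.http4s.client.Client
import org.http4s.client.blaze.BlazeClientBuilder

private val httpClient: Resource[IO, Client[IO]] =
  clientExecutor.flatMap { ec =>
    BlazeClientBuilder[IO](ec)
      .withMaxTotalConnections(64)   // example cap on concurrent connections
      .withMaxWaitQueueLimit(1024)   // let more requests queue while waiting for a free connection
      .resource
  }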