In order to populate a system with data en masse before running performance scripts, our ideal use case would be to do so with Gatling. The data need not differ beyond having a unique primary ID.
object CreateObjects {
  val create = repeat(4, "n") {
    exec(http("Create Object")
      .post("/our/api/objects")
      .body(ELFileBody("CreateObject_0001_request.txt"))
      .check(status.is(201)))
  }
}

val createObjects = scenario("ModularSimulation").exec(CreateObjects.create)

setUp(
  createObjects.inject(atOnceUsers(1))
).protocols(httpProtocol)
The above example can create any number of objects by changing the value of the repeat; however, at large scales (e.g. 100,000 objects) it becomes impractical to do this linearly. So what I would like to do is have a shared pool of objects to be created by, say, 100 users.
This is, of course, the use case for a feeder. Rather than generate a static .csv file or use Redis, it seems simplest to use a simple iterative loop (e.g. 0 to 100000).
I know (from the documentation and other questions) that Feeder is a type alias for Iterator[Map[String, T]], so I presume this should be very straightforward, but I can't seem to find a simple example of this most basic case.
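Roughly speaking, I imagine something like the following should work (the objectId key is just a placeholder I made up):

// Presumably a feeder can be built straight from a numeric range,
// since an Iterator[Map[String, T]] is all that should be required
val idFeeder = (0 until 100000).iterator.map(i => Map("objectId" -> i))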
Any help would be much appreciated.
I'm not sure what you want to achieve, but working with feeders is easy. Suppose you have:
import scala.util.Random

// An infinite feeder generating random strings,
// accessible under "yourInfiniteSessionKey" in the session
val infiniteFeeder = Iterator.continually(
  Map("yourInfiniteSessionKey" -> Random.alphanumeric.take(20).mkString)
)
// A finite feeder (in the sense of possible values),
// accessible under "yourFiniteSessionKey" in the session;
// it contains just 2 values for illustration purposes
val finiteFeeder = (for (i <- 0 until 2) yield {
  Map("yourFiniteSessionKey" -> s"I'm finite $i")
})
// A fixed feeder (again in the sense of possible values),
// accessible under "yourFixedSessionKey" in the session;
// again it contains just 2 values for illustration purposes
val fixedFeeder = Array(
  Map("yourFixedSessionKey" -> "I'm fixed 1"),
  Map("yourFixedSessionKey" -> "I'm fixed 2")
)
val scn = scenario("Feeding")
  .feed(infiniteFeeder)
  .feed(finiteFeeder)
  .feed(fixedFeeder)
  .exec(http("root")
    .get("/${yourInfiniteSessionKey}/${yourFiniteSessionKey}/${yourFixedSessionKey}"))
Just for the sake of example, our scenario takes a value from each of our feeders and uses them to compose the path of a GET request. In this case fixedFeeder is an Array[Map[String, _]] and finiteFeeder is an IndexedSeq[Map[String, _]]. If the number of items in your finite feeders matches the number of users in your setup, it is fine and you can run the scenario, e.g. like this:
setUp(
scn.inject(atOnceUsers(2))
).protocols(httpConf)
The setup has just two virtual users, so it will run without an issue. When you have more virtual users in your setup, e.g.:
setUp(
scn.inject(constantUsersPerSec(1) during(30.seconds)) // equals 30 virtual users
).protocols(httpConf)
you will run into a problem with the finite/fixed feeders: Gatling will complain and your simulation will stop like this:
[error] java.lang.IllegalStateException: Feeder is now empty, stopping engine
[error] at io.gatling.core.action.SingletonFeed.feed(SingletonFeed.scala:59)
[error] at io.gatling.core.action.SingletonFeed$$anonfun$receive$1.applyOrElse(SingletonFeed.scala:28)
[error] at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
The good news is that you can make your finite/fixed feeders infinite using one of the API methods of Gatling's RecordSeqFeederBuilder, e.g.:
// Now the feeder built from fixed values is infinite
val fixedFeeder = Array(
  Map("yourFixedSessionKey" -> "I'm fixed 1"),
  Map("yourFixedSessionKey" -> "I'm fixed 2")
).circular // go back to the top of the array once the end is reached
You can also use such API methods directly in the scenario definition and leave your fixed/finite feeders untouched, like this:
val scn = scenario("Feeding")
  .feed(infiniteFeeder)
  .feed(finiteFeeder.random)   // finiteFeeder is now infinite
  .feed(fixedFeeder.circular)  // fixedFeeder is infinite too
  .exec(http("root")
    .get("/${yourInfiniteSessionKey}/${yourFiniteSessionKey}/${yourFixedSessionKey}"))
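To tie this back to your original use case, here is a minimal sketch of a shared numeric feeder for creating 100,000 objects with 100 users; the objectId key, the repeat count of 1000, and the reuse of httpProtocol and the request template from your question are just illustrative assumptions:

// A shared pool of 100,000 IDs; all virtual users draw from the same feeder,
// so each ID is handed out exactly once
val idFeeder = (0 until 100000).iterator.map(i => Map("objectId" -> i))

val createObjects = scenario("CreateObjects")
  .repeat(1000) {                 // 100 users x 1000 iterations = 100,000 objects
    feed(idFeeder)                // take the next ID on every iteration
      .exec(http("Create Object")
        .post("/our/api/objects")
        // the template can reference the ID as ${objectId} via Gatling EL
        .body(ELFileBody("CreateObject_0001_request.txt"))
        .check(status.is(201)))
  }

setUp(
  createObjects.inject(atOnceUsers(100))
).protocols(httpProtocol)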
Enjoy