I am trying to benchmark Aerospike. For example, let's say I want to test storing a document (maybe JSON) in Aerospike and then run the same test against Couchbase. What kind of tool or method can I use to compare Aerospike and Couchbase?
I am not very familiar with Couchbase, but I can give you pointers for benchmarking against Aerospike (I work there). YCSB (Yahoo! Cloud Serving Benchmark) would probably be the best tool to use against both. Here is the code to use against Aerospike.
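As a rough sketch of what a YCSB setup approximating the 80/20 workload below could look like: the as.host/as.port/as.namespace property names are what I recall from the YCSB Aerospike binding, and the workload class path (site.ycsb.* vs. com.yahoo.ycsb.*) depends on your YCSB version, so check the binding's README before relying on this:

# myworkload.properties (hypothetical file name)
workload=site.ycsb.workloads.CoreWorkload
recordcount=10000000
operationcount=10000000
readproportion=0.8
updateproportion=0.2
fieldcount=1
fieldlength=1400
requestdistribution=uniform

# load the records, then run the mixed workload
./bin/ycsb load aerospike -s -P myworkload.properties -p as.host=127.0.0.1 -p as.port=3000 -p as.namespace=test -threads 8
./bin/ycsb run aerospike -s -P myworkload.properties -p as.host=127.0.0.1 -p as.port=3000 -p as.namespace=test -threads 8

Pointing the same workload file at YCSB's Couchbase binding would then give both databases an identical stream of operations.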
Now, for benchmarking Aerospike specifically, I would recommend using one of Aerospike's benchmark tools, for example the Java benchmark tool, and using a String (-o S:) or a Java blob (-o B:) with a length comparable to your JSON document.
For example:
./run_benchmarks -h 127.0.0.1 -p 3000 -n test -k 10000000 -b 1 -o B:1400 -w RU,80 -z 8
This runs a workload of 80% reads and 20% writes (-w RU,80) using 8 concurrent threads (-z 8).
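If you want to load the keys first and then run the read/update mix against a populated namespace, the same tool can do an insert-only pass. This is a sketch: -w I is the linear insert workload in the Java benchmark tool, but confirm the exact workload codes with your version's help output:

./run_benchmarks -h 127.0.0.1 -p 3000 -n test -k 10000000 -b 1 -o B:1400 -w I -z 8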
Some important points to consider though:
Finally, I would recommend reading some of the published documents related to benchmarking Aerospike and other NoSQL databases, particularly this one.
Hope this helps!
For Couchbase the process is similar: Couchbase offers a tool for generating traffic on a cluster, cbworkloadgen.
A workload comparable to the one suggested for Aerospike above would be executed as follows (while in the directory specified in the link above for your OS):
./cbworkloadgen -n 127.0.0.1:8091 -u username -p password -j -i 10000000 -r 0.2 -t 8
As with the Aerospike example, this is 80:20 reads/writes (-r 0.2) with 8 concurrent threads (-t 8), and JSON documents (note: not JSON blobs) are stored (-j).
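To keep the payload size comparable to the 1400-byte objects in the Aerospike example, you can also set the item size and target bucket explicitly. This is a sketch: -s/--size and -b/--bucket should be available in cbworkloadgen, but verify against cbworkloadgen --help for your Couchbase version:

./cbworkloadgen -n 127.0.0.1:8091 -u username -p password -b default -j -i 10000000 -s 1400 -r 0.2 -t 8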
If you're benchmarking for a future use case, try to use a setup close to what you'll actually run: multiple nodes with a hardware configuration similar to what you intend to deploy, both for the datastore cluster and for the client side (i.e. multiple client hosts).
Finally, different vendors publish many different benchmarks that may or may not accurately represent the performance you can expect from their software (and how they claim it behaves relative to competitors). It's important to research thoroughly and check any claims against your own benchmarks. A published benchmark may report one level of performance; matching it on your particular setup may be possible, but could require very fine tuning (and a large amount of effort on your part).