I'm experimenting with Redis on my local machine. So far I've got it working, albeit very slowly.
I've got an array of about 14,000 objects, and retrieving them takes just over 3 seconds each time, which is obviously too slow for production.
I have a feeling the majority of the time is spent deserializing the objects, but I'm not sure whether there's anything I can do to correct this.
Can I store them in the first place without serializing them (if that makes sense)? Failing that, is there anything I can do to speed up the deserialization? I've implemented ISerializable but it doesn't seem to make any difference.
For reference, I'm using the ServiceStack client for Redis.
Unless you measure, you will not know.
[Source: a wise dev manager of mine, circa 1992!]
Before pointing fingers at supposed culprits, you should first profile/measure your code to determine exactly where the performance issue lies. Then implement a fix and re-measure. Repeat until performance is satisfactory.
There are many profilers available, including Visual Studio's built-in profiler; others are available as add-ins (e.g. Red Gate's ANTS profiler, JetBrains' dotTrace, Telerik's JustTrace, etc.).
Alternatively, try using Trace.WriteLine(...) and Stopwatch to instrument your code, so you can work out how long the data-access operations take to execute versus how long it takes to deserialize the data.
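As a rough sketch of that instrumentation, you could fetch the raw payload and deserialize it as two separately timed steps. (This is a hedged example: `MyObject` and the `"myobjects"` key are placeholders for your own type and key, and it assumes the objects were stored as a single JSON string via ServiceStack.Text — adjust to however you actually store them.)

```csharp
using System.Collections.Generic;
using System.Diagnostics;
using ServiceStack.Redis;
using ServiceStack.Text;

// Placeholder for your own object type.
public class MyObject { public int Id { get; set; } }

class Program
{
    static void Main()
    {
        using (var redis = new RedisClient("localhost"))
        {
            // Step 1: time the round-trip to Redis, with no deserialization.
            var fetchTimer = Stopwatch.StartNew();
            string json = redis.GetValue("myobjects");
            fetchTimer.Stop();

            // Step 2: time the deserialization of the raw payload.
            var deserializeTimer = Stopwatch.StartNew();
            var items = JsonSerializer.DeserializeFromString<List<MyObject>>(json);
            deserializeTimer.Stop();

            Trace.WriteLine($"Fetch: {fetchTimer.ElapsedMilliseconds} ms, " +
                            $"Deserialize: {deserializeTimer.ElapsedMilliseconds} ms");
        }
    }
}
```

If the second number dominates, the time is going into deserialization rather than Redis itself, and that's where to focus.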
FWIW, I'd be surprised if a recent build of Redis ran as slowly as you're seeing, on Windows or any other OS. Heck, even SQL Server Express (2012) can return 199,000 rows and store them in a CSV in less than 1s.