Redis Error "max number of clients reached"

I am running a Node.js application using the forever npm module.
The Node application also connects to a Redis DB for cache checks. Quite often the API stops working, with the following error in the forever log:

{ ReplyError: Ready check failed: ERR max number of clients reached
    at parseError (/home/myapp/core/node_modules/redis/node_modules/redis-parser/lib/parser.js:193:12)
    at parseType (/home/myapp/core/node_modules/redis/node_modules/redis-parser/lib/parser.js:303:14)
    at JavascriptRedisParser.execute (/home/myapp/ecore/node_modules/redis/node_modules/redis-parser/lib/parser.js:563:20) command: 'INFO', code: 'ERR' }

When I execute the CLIENT LIST command on the Redis server, it shows too many open connections. I have also set timeout = 3600 in my Redis configuration.
I do not have any unclosed Redis connection objects in my application code.
This happens once or twice a week depending on the application load; as a stop-gap solution I restart the node server (which works).
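
For reference, here is how I check the connection count and the relevant limits from redis-cli (host, port, and values below are placeholders, not my real setup):

redis-cli -h <redis-host> -p 6379 CLIENT LIST | wc -l   # count of open client connections
redis-cli -h <redis-host> -p 6379 INFO clients          # shows connected_clients
redis-cli -h <redis-host> -p 6379 CONFIG GET maxclients
redis-cli -h <redis-host> -p 6379 CONFIG GET timeout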

What could be the permanent solution in this case?

asked Feb 24 '20 by vikas kv


People also ask

How many clients can redis handle?

Redis can handle many connections; by default, the maximum number of client connections is set to 10,000. You can change the maximum number of client connections the Redis server accepts by altering the maxclients directive in redis.conf.

Can redis handle multiple connections?

In Redis 2.4 there was a hard-coded limit for the maximum number of clients that could be handled simultaneously. In Redis 2.6 and newer, this limit is dynamic: by default it is set to 10,000 clients, unless otherwise stated by the maxclients directive in redis.conf.
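
For illustration only (the value 20000 is an arbitrary example, and the effective ceiling is still capped by the OS file-descriptor limit), the limit can be raised either in redis.conf or, on servers that allow it, at runtime:

maxclients 20000                        # in redis.conf

redis-cli CONFIG SET maxclients 20000   # at runtime, no restart needed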

What is redis pool size?

To improve performance, go-redis automatically manages a pool of network connections (sockets). By default, the pool size is 10 connections per available CPU as reported by runtime.GOMAXPROCS.


2 Answers

I figured out why: this had nothing to do with Redis itself. Increasing the OS file descriptor limit was only a temporary fix. I was using Redis in a web application, and a new connection was being created for every incoming request.

Whenever the server was restarted, all the connections held open by the Express server were released.

I solved this by creating a global connection object and re-using it; a new connection is created only when necessary.

You can do this by creating a global connection object, connecting once, and making sure the connection is alive each time before you use it. Check whether an existing library already solves this for your programming language. In my case it was Perl with the Dancer framework, and I used a module called Dancer2::Plugin::Redis.

redis_plugin

Returns a Dancer2::Plugin::Redis instance. You can use redis_plugin to pass the plugin instance to third-party modules (backend API) so you can access the existing Redis connection there. You will need to access the actual methods of the plugin instance.

If you are not running a web server but a worker process or some other background job, you can use a simple helper function like this to re-use the connection.
Perl example:

use Redis;

# Open and authenticate a fresh connection to the Redis server.
sub get_redis_connection {
    my $redis = Redis->new(server => "www.example.com:6372", debug => 0);
    $redis->auth('abcdefghijklmnop');
    return $redis;
}

...

## when required: reconnect only if the cached connection is missing or unresponsive
unless ($redisclient && $redisclient->ping) {
    warn "creating new redis connection";
    $redisclient = get_redis_connection();
}
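
Since the question itself is Node.js, here is a minimal sketch of the same re-use pattern with the node_redis (redis) package; the host, port, and module layout are placeholder assumptions, not code from my application:

// redis-client.js -- create the Redis connection once per process and share it
const redis = require('redis');

let client;   // cached connection, created lazily on first use

function getRedisClient() {
    if (!client) {
        client = redis.createClient({ host: 'www.example.com', port: 6372 });
        client.on('error', (err) => console.error('redis error', err));
    }
    return client;   // every caller gets the same client object
}

module.exports = { getRedisClient };

Every module then requires this helper instead of calling redis.createClient itself; node_redis also reconnects on its own after a dropped connection, so the explicit ping check from the Perl version is usually not needed here.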
answered Oct 12 '22 by vikas kv


I was running into this issue in my chat app because I was creating a new Redis instance each time a client connected, rather than creating it once.

// THE WRONG WAY
export const getRedisPubSub = () => new RedisPubSub({
    subscriber: new Redis(REDIS_CONNECTION_CONFIG),
    publisher: new Redis(REDIS_CONNECTION_CONFIG),
});

and wherever I wanted to use the connection I was calling

// THE WRONG WAY
getRedisPubSub();

I fixed it by just creating the connection once when my app loaded.

// assuming RedisPubSub from graphql-redis-subscriptions and Redis from ioredis
import { RedisPubSub } from 'graphql-redis-subscriptions';
import Redis from 'ioredis';

// created once at module load and re-used by every consumer
export const redisPubSub = new RedisPubSub({
    subscriber: new Redis(REDIS_CONNECTION_CONFIG),
    publisher: new Redis(REDIS_CONNECTION_CONFIG),
});

and then I passed the one-time initialized redisPubSub object to my createServer function.
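
For context, a rough sketch of that wiring; the plain http createServer and the './redis-pubsub' module path are assumptions for illustration, not my actual server setup:

// server.js -- import the single shared instance instead of building a new one per request
import { createServer } from 'http';
import { redisPubSub } from './redis-pubsub';   // the module that exports redisPubSub above

const server = createServer(async (req, res) => {
    // example use: publish through the already-open subscriber/publisher pair
    await redisPubSub.publish('heartbeat', { at: Date.now() });
    res.end('ok');
});

server.listen(4000);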

This article helped me see my error: https://docs.upstash.com/troubleshooting/max_concurrent_connections

answered Oct 12 '22 by Joshua Dyck