 

Symfony 2: concurrent requests are blocked

I have a Symfony 2.5 application, and I'm running into some weird problems with request concurrency.

To demonstrate the issue I've created two routes, /time and /sleep. The controller actions are quite simple:

timeAction():
    time();

sleepAction():
    sleep(30);
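
Spelled out in full, the controller looks roughly like this (the class name, bundle namespace, and annotation routing are illustrative, not the exact code I'm running):

    <?php
    // src/Acme/DemoBundle/Controller/ConcurrencyController.php (path and names are illustrative)

    namespace Acme\DemoBundle\Controller;

    use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
    use Symfony\Bundle\FrameworkBundle\Controller\Controller;
    use Symfony\Component\HttpFoundation\Response;

    class ConcurrencyController extends Controller
    {
        /**
         * @Route("/time")
         */
        public function timeAction()
        {
            // Responds with the current Unix timestamp.
            return new Response(time());
        }

        /**
         * @Route("/sleep")
         */
        public function sleepAction()
        {
            // Simulates a long-running request.
            sleep(30);

            return new Response('done');
        }
    }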

When I request the /time route in my browser, it responds immediately with the current timestamp. However, when I first request the /sleep route and then the /time route, the second request just hangs until sleep() has completed; only then does the /time controller respond with the timestamp. In other words, one request is blocking all the others. I didn't even notice this at first, but once you have long-running requests it becomes apparent.

What could be the reason for this?

I'm still going to run some additional tests on my own to dig deeper into the situation, and I will try to update the question with more details.

Asked Mar 19 '23 by Slava Fomin II

2 Answers

Update

It looks like PdoSessionHandler now uses a locking mechanism of its own that prevents concurrent requests, so the old solution will no longer work out of the box.

The official solution to the concurrency problem is to close the session as soon as possible in the request handling cycle. You can do this by calling $session->save() on the session service, or session_write_close() directly.
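
For example (a minimal sketch inside a controller like the one in the question; the method name and response body are illustrative):

    public function sleepAction()
    {
        $session = $this->get('session');
        // ... do all session reads/writes here ...

        // Write the session data and release its lock before the slow part,
        // so other requests from the same client are not blocked on it.
        $session->save();

        sleep(30); // long-running work that no longer touches the session

        return new Response('done');
    }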

However, if you are sure that session data conflicts will not arise in your application, you can safely disable locking in the PDO session handler's configuration:

# services.yml

services:
    session.handler.pdo:
        class: Symfony\Component\HttpFoundation\Session\Storage\Handler\PdoSessionHandler
        public: false
        arguments:
            - "pgsql:host=%database_host%;port=%database_port%;dbname=%database_name%"
            - db_username: "%database_user%"
              db_password: "%database_password%"
              db_table: session
              db_id_col: session_id
              db_data_col: session_value
              db_time_col: session_time
              db_lifetime_col: session_lifetime
              lock_mode: 0 # LOCK_NONE
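
Defining the service alone is not enough: the framework also has to be told to use it as the session handler (a sketch, assuming the standard FrameworkBundle session configuration):

    # app/config/config.yml
    framework:
        session:
            handler_id: session.handler.pdo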

You can read more in this pull request: https://github.com/symfony/symfony/pull/10908

Old solution

Thanks to Crozin, who pointed me in the right direction and helped me solve the problem. I'll put some additional information here that I hope will save someone time in the future.

The issue is also described in the following topics:

  • How do I configure Apache2 to allow multiple simultaneous connections from same IP address?
  • Simultaneous Requests to PHP Script

The problem is that PHP by default uses file-based session handling. In other words, session data is stored in a file on the server's filesystem, and to protect this file from accidental simultaneous writes, a file locking mechanism is used. This is a classic locking problem in computer science: the first request gains a lock on the session file, and all other requests have to wait for that lock to be released. If you have long-running requests in a multi-request environment (simultaneous AJAX requests, multiple frames on a page), the blocking becomes apparent.

The problem can be solved either by calling session_write_close() early (after all session manipulations are done, but before the script finishes), or by switching to another session storage mechanism, such as database session storage.
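
In plain PHP terms, the first option looks like this (a minimal sketch; the session key is just an example):

    <?php
    session_start();

    // Do all session reads/writes first.
    $_SESSION['last_visit'] = time();

    // Flush the data and release the session file lock so that other
    // requests from the same client can be processed in parallel.
    session_write_close();

    // Long-running work that no longer touches the session.
    sleep(30);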

I think that in Symfony 2 the best course of action is to store sessions with the PDO handler (in a database of your choice). Here's the official tutorial on how to set it up:

How to Use PdoSessionHandler to Store Sessions in the Database.

HINT: If you are using Doctrine migrations, you can create a new migration class and add the SQL required to create the session storage table to it.
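
A sketch of such a migration (the class name, version number, and PostgreSQL column types are assumptions; the table and column names just have to match the db_*_col options passed to PdoSessionHandler):

    <?php
    // app/DoctrineMigrations/Version20140319000000.php

    namespace Application\Migrations;

    use Doctrine\DBAL\Migrations\AbstractMigration;
    use Doctrine\DBAL\Schema\Schema;

    class Version20140319000000 extends AbstractMigration
    {
        public function up(Schema $schema)
        {
            // Column names match the handler options: session_id, session_value,
            // session_time and session_lifetime.
            $this->addSql('CREATE TABLE session (
                session_id VARCHAR(128) NOT NULL PRIMARY KEY,
                session_value BYTEA NOT NULL,
                session_time INTEGER NOT NULL,
                session_lifetime INTEGER NOT NULL
            )');
        }

        public function down(Schema $schema)
        {
            $this->addSql('DROP TABLE session');
        }
    }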

With this approach you get a non-blocking session storage mechanism, and your application will also be able to scale horizontally.

Answered Apr 01 '23 by Slava Fomin II


While I think you have found an answer in the comments above, it is worth noting that there is a reason why one request to your server can end up blocking another: both web servers and browsers actively limit the number of open connections to a single host. The HTTP/1.1 spec actually says that a client should not maintain more than 2 (!!!) connections to the same server: http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html#sec8.1.4. Modern browsers usually allow more (see Max parallel http connections in a browser?), but they still limit you.

To get around this, people often set up multiple host names for a single server (for instance using CNAME records) so the limit does not apply.

Answered Apr 01 '23 by naneau