Laravel environment variables leaking between applications when they call each other through GuzzleHttp

I have two Laravel 5.2 applications (let's call them A and B) on my local machine, configured as two different virtual hosts on my local Apache 2.4 development server.

The two applications sometimes call each other through GuzzleHttp.

At one point I wanted to use encryption, and I started getting "The MAC is invalid" exceptions from Laravel's Encrypter.

While investigating the issue, I found that when app A calls app B, app B suddenly picks up app A's encryption key (app.key)! This breaks decryption, because the values in app B were encrypted with app B's own key.

While debugging, I found that the Dotenv library has logic to keep existing environment variables if they are already set, and that $_ENV and $_SERVER do not contain the leaked variables, but getenv() does!
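For illustration, a quick check along these lines (a hypothetical debug snippet dropped into app B while app A is calling it) shows the discrepancy:

```php
<?php
// getenv() reads the PHP worker's process environment, which putenv() mutates,
// whereas $_ENV and $_SERVER are populated fresh for each request.
var_dump(getenv('APP_KEY'));                                        // app A's leaked key appears here
var_dump(isset($_ENV['APP_KEY']) ? $_ENV['APP_KEY'] : null);        // not leaked here
var_dump(isset($_SERVER['APP_KEY']) ? $_SERVER['APP_KEY'] : null);  // not leaked here
```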

I'm a bit confused, because the PHP documentation for putenv() says:

The environment variable will only exist for the duration of the current request.

It seems that if, during the current request, I launch another request through GuzzleHttp, the variables set by Dotenv in app A using putenv() suddenly become available in app B, the application handling that GuzzleHttp request!

I understand that this will not be an issue on production servers, where the config cache is used instead of Dotenv and the two apps will most probably run on different Apache servers, but this behavior is breaking my development process.

How do I configure Laravel or GuzzleHttp or Apache or PHP to prevent this putenv() leakage from app A into app B?

asked Feb 03 '16 by JustAMartin


1 Answer

The problem is that both apps are served by a shared PHP instance, so when one of them sets an environment variable, it becomes visible to the other. I believe phpdotenv treats environment variables as immutable, so once they are set, the other application cannot override them.
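Roughly, that immutable behaviour looks like this (a simplified sketch of the idea, not phpdotenv's actual source):

```php
<?php
// Simplified sketch of "immutable" env loading: if a variable is already
// visible anywhere, the value from the .env file is silently skipped.
function setEnvironmentVariable($name, $value)
{
    // getenv() also sees values that putenv() set earlier in the same PHP
    // worker -- for example by app A -- so app B's own .env value is never applied.
    if (getenv($name) !== false) {
        return;
    }

    putenv("$name=$value");
    $_ENV[$name] = $value;
    $_SERVER[$name] = $value;
}
```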

mod_php (which I presume you are using, since you mentioned Apache) basically embeds a PHP interpreter inside each Apache process. An Apache process is shared between all of your vhosts, which is why you are seeing this issue. You would hit the same problem with nginx and php-fpm, but it is easier to solve on that stack.

Unfortunately, one port can only be bound to one process, so the only way to stick with mod_php and Apache is to place your vhosts on separate port numbers, which means you'll have to include the port number of at least one of them in the URL when accessing it. I don't really use Apache anymore, so I can't give you specific details; it might be as simple as setting different ports in your vhost config and letting Apache handle the rest, but I'll have to defer you to Google.

If you were running nginx/php-fpm, it would probably just be a case of creating a second php-fpm pool listening on a different port or socket, pointing the second vhost at that PHP instance, and away you go.
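For example, a second pool definition could look roughly like this (hypothetical pool name and path; adjust for your PHP version):

```ini
; Hypothetical pool file (e.g. /etc/php/fpm/pool.d/app-b.conf) giving app B
; its own worker processes, and therefore its own process environment.
[app-b]
user = www-data
group = www-data
; Listen on a second port (or a separate unix socket).
listen = 127.0.0.1:9001
pm = dynamic
pm.max_children = 5
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 3
```

The second vhost then sends its PHP requests to that pool (e.g. fastcgi_pass 127.0.0.1:9001; in nginx), while the first vhost keeps using the default pool.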

So in summary you have a few solutions:

  1. Stay with Apache and mod_php, and spend the rest of the week Googling how to do what I described above.
  2. Look into running PHP as a CGI/FastCGI module under Apache, which will then give you the flexibility you need (this is akin to using nginx/php-fpm, but without changing your web server software).
  3. Stop using phpdotenv and find an alternative approach, such as loading your config in .htaccess or inside the vhost so it's available as $_ENV or $_SERVER keys (see the vhost sketch after this list).
  4. Install a dev stack that includes nginx/php-fpm; then it should be easily solvable by creating two PHP pools.
  5. Use virtual machines (possibly look at Vagrant or Docker).
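As an illustration of option 3, here is a rough sketch of per-vhost values set directly in the Apache config (hypothetical host names and a placeholder key); with mod_php these values show up in $_SERVER, and via getenv(), only for requests to that vhost:

```apache
# Hypothetical per-vhost configuration replacing app B's .env file, so each
# app receives its own values through the request environment instead of putenv().
<VirtualHost *:80>
    ServerName app-b.local
    DocumentRoot /var/www/app-b/public

    SetEnv APP_ENV local
    SetEnv APP_DEBUG true
    # Placeholder value -- substitute app B's real application key.
    SetEnv APP_KEY SomeRandomKeyForAppB
</VirtualHost>
```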

Sorry I don't have better news, but unfortunately your WAMP stack is just a little too restrictive out of the box.

answered Sep 22 '22 by Lee