 

Faster alternative to file_get_contents()

Currently I'm using file_get_contents() to submit GET data to an array of sites, but upon execution of the page I get this error:

Fatal error: Maximum execution time of 30 seconds exceeded

All I really want the script to do is start loading the webpage, and then leave. Each webpage may take up to 5 minutes to load fully, and I don't need it to load fully.

Here is what I currently have:

        foreach($sites as $s) //Create one line to read from a wide array
        {
                file_get_contents($s['url']); // Send to the shells
        }

EDIT: To clear up any confusion, this script is being used to start scripts on other servers that return no data.

EDIT: I'm now attempting to use cURL to do the trick, setting a timeout of one second so it sends the data and then stops. Here is my code:

        $ch = curl_init($s['url']); //load the urls
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 1); //Only send the data, don't wait.
        curl_exec($ch); //Execute
        curl_close($ch); //Close it off.

Perhaps I've set the option wrong. I'm looking through some manuals as we speak. Just giving you an update. Thank you to all of you who are helping me so far.

EDIT: Ah, found the problem. I was using CURLOPT_CONNECTTIMEOUT instead of CURLOPT_TIMEOUT. Whoops.
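
In other words, the corrected snippet looks roughly like this (the CURLOPT_RETURNTRANSFER line is optional, just to keep curl_exec() from echoing any response):

        $ch = curl_init($s['url']); //load the url
        curl_setopt($ch, CURLOPT_TIMEOUT, 1); //Give up after one second overall
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE); //Don't echo whatever comes back
        curl_exec($ch); //Execute; returns FALSE on timeout, which is fine here
        curl_close($ch); //Close it off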

However, now the scripts aren't triggering. They each use ignore_user_abort(TRUE); so I can't understand the problem.

Hah, scratch that. It works now. Thanks a lot, everyone.

asked Apr 18 '10 by Rob




2 Answers

There are many ways to solve this.

You could use cURL with its curl_multi_* functions to execute the requests asynchronously. Or use cURL the common way but with a timeout limit of 1 second: the call will return with a timeout error, but the request will still have been fired.

If you don't have cURL installed, you could keep using file_get_contents but fork processes (not so elegant, but it works) using something like ZendX_Console_Process_Unix, so you avoid the waiting between requests.
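
If it helps, here is a rough sketch of that forking idea using PHP's pcntl extension instead of ZendX_Console_Process_Unix (this assumes a CLI script on a Unix-like system with pcntl enabled, and the same $sites array as in the question):

        foreach ($sites as $s) {
                $pid = pcntl_fork();
                if ($pid === -1) {
                        die('Could not fork');
                } elseif ($pid === 0) {
                        //Child process: fire the request, then exit.
                        file_get_contents($s['url']);
                        exit(0);
                }
                //Parent process: move straight on to the next site.
        }

        //Reap the children so they don't linger as zombies.
        while (pcntl_waitpid(0, $status) != -1);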

answered Sep 29 '22 by Franco


As Franco mentioned, and I'm not sure this was picked up on, you specifically want to use the curl_multi functions, not the regular curl ones. This packs multiple curl handles into a curl_multi handle and executes them simultaneously, returning (or not, in your case) the responses as they arrive.

Example at http://php.net/curl_multi_init
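
Very roughly, the approach from that example looks something like this (an untested sketch, assuming the same $sites array as in the question):

        $mh = curl_multi_init();
        $handles = array();

        foreach ($sites as $s) {
                $ch = curl_init($s['url']);
                curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE); //Don't echo responses
                curl_setopt($ch, CURLOPT_TIMEOUT, 5); //Cap each request
                curl_multi_add_handle($mh, $ch);
                $handles[] = $ch;
        }

        //Run all the handles until every request has finished or timed out.
        $running = null;
        do {
                curl_multi_exec($mh, $running);
                curl_multi_select($mh); //Avoid busy-waiting
        } while ($running > 0);

        foreach ($handles as $ch) {
                curl_multi_remove_handle($mh, $ch);
                curl_close($ch);
        }
        curl_multi_close($mh);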

answered Sep 29 '22 by Ian B