It is very common to use include files. I think the feature is overused to keep code tidy, without considering performance. With several includes, the disk has to read each file, and heavy disk access can be a slow process. However, disk I/O is not the main bottleneck or rate-limiting step, since loading the same file with file_get_contents is only a few times faster than include.
I think this is why major websites embed JavaScript directly in the HTML rather than loading it from separate files. Alternatively, it can be a good idea to split a large JS file into several small ones, since parallel HTTP requests can load the whole script faster. But this is different from PHP files, as a PHP script reads its include files one by one as it executes.
How serious can this problem be? If a webpage loads in 0.60 s, can including 10 PHP files push it to 0.70 s?
Although this effect should be negligible, I want to know whether there are approaches to speed up the process. I do not mean PHP opcode caching like APC.
P.S. This question is not about a practical application (a typical case), but about theoretical considerations in general.
Yes, it will slow down if you are including the script from another server.
The include (or require ) statement takes all the text/code/markup that exists in the specified file and copies it into the file that uses the include statement. Including files is very useful when you want to include the same PHP, HTML, or text on multiple pages of a website.
include and its ilk are a necessity. They are similar to import in Java and Python in that they are used for class and function definitions. include should be extremely fast, but using it will still delay script execution compared to not using it at all. include is totally different from file_get_contents(): the latter is a function rather than a language construct and returns a string, whereas include actually executes the code of the included file.
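A minimal sketch of that difference (the file names lib.php and main.php are hypothetical, chosen only for this illustration):

(lib.php)
<?php
// hypothetical helper file used only for this illustration
function greet() {
    return 'hello';
}

(main.php)
<?php
// include executes the file, so greet() becomes callable here:
include 'lib.php';
echo greet(); // prints "hello"

// file_get_contents() merely returns the file's text as a string;
// nothing in it is executed:
$source = file_get_contents('lib.php');
echo strlen($source); // length of the source code, not its output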
Your statement about splitting JS files is incorrect: script downloads from the same domain block parallel downloads, and it is generally recommended to use as few external scripts as possible.
I highly doubt that having multiple includes, assuming all are necessary, is going to hurt the performance of your page. If you are having performance problems, look elsewhere.
If you want to speed up PHP, look into using a PHP compiler.
Consider this:
(index.php)
<?php
for ($i = 0; $i < 100000; $i++) {
    include('somefile.php');
}
(somefile.php)
<?php
// nothing here
index.php takes ~115 seconds (for me) to process 100,000 iterations while including somefile.php, even though somefile.php has nothing in it.
However:
(index.php)
<?php
for ($i = 0; $i < 100000; $i++) {
    // no file included this time
}
index.php now takes 0.002 seconds to complete without an include() construct.
(index.php)
<?php
for ($i = 0; $i < 100000; $i++) {
    echo $i . '<br/>';
}
index.php takes 0.02 seconds to echo 100,000 iterations of $i.
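If you want to reproduce measurements like these yourself, a simple sketch wrapping the loop in microtime() calls (file names as in the example above) would look like:

(index.php)
<?php
$start = microtime(true);
for ($i = 0; $i < 100000; $i++) {
    include('somefile.php');
}
// report elapsed wall-clock time in seconds
printf("%.4f s\n", microtime(true) - $start);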
Of course, this is a pretty extreme example due to the large number of iterations, but it does show that each include() call carries a fixed overhead, so the total delay grows linearly with the number of calls. Consider this the next time you write a process with a large number of iterations, e.g. reading or writing large XML files. In such hot loops it is best to keep your code inline, even if that makes it less "manageable". Not only are you adding ~115 seconds (~2 minutes) of execution time per ~100,000 iterations simply by using an include(), but consider what would happen if that included file (somefile.php) had work of its own to execute. My example merely adds the include() construct; the included file contained nothing.
Now, for a webpage that includes files here and there, the times would be negligible. I was only pointing out that the include() construct requires extra processing regardless of its contents.
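As one mitigation: if the included file only defines functions or classes, you can hoist the include out of the loop and pay its cost once instead of once per iteration (a sketch; helpers.php and process() are hypothetical names):

(index.php)
<?php
include_once('helpers.php'); // the file is read and executed once
for ($i = 0; $i < 100000; $i++) {
    process($i); // calls a function defined in helpers.php
}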