I have a function that generates a table with contents from the DB. Some cells contain custom HTML, which I read in with file_get_contents() through a templating system.
The content is small and identical each time, but the read is performed maybe 15 times (I have a limit of 15 table rows per page). So does file_get_contents() cache the result if it sees that the content is the same?
file_get_contents() reads a file into a string and is the preferred way to do so; it will use memory-mapping techniques, if the server supports them, to enhance performance. It is similar to file(), except that it returns the file as a single string, starting at an optional offset and reading up to length bytes, and it returns false on failure. Its counterpart file_put_contents($file, $data) writes $data (a string, an array, or a stream resource) to $file.
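For reference, a minimal usage sketch of both functions; the paths here are only placeholders for illustration:

// Hypothetical template path used only for illustration.
$templatePath = 'templates/cell.html';

// Read the whole file; file_get_contents() returns false on failure.
$html = file_get_contents($templatePath);
if ($html === false) {
    die("Could not read $templatePath");
}

// Optional offset/length arguments: read 100 bytes starting at byte 10.
$fragment = file_get_contents($templatePath, false, null, 10, 100);

// file_put_contents() is the writing counterpart: it writes the data to the given file.
file_put_contents('output/cell-copy.html', $html);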
file_get_contents() does not have a caching mechanism of its own. However, you can write your own. Here is a draft:
$cache_file = 'content.cache';

if (file_exists($cache_file)) {
    if (time() - filemtime($cache_file) > 86400) {
        // Cache is more than a day old: re-fetch and rewrite it.
        $cache = file_get_contents('YOUR FILE SOURCE');
        file_put_contents($cache_file, $cache);
    } else {
        // Cache is still fresh: read it instead of the original source.
        $cache = file_get_contents($cache_file);
    }
} else {
    // No cache yet: create one.
    $cache = file_get_contents('YOUR FILE SOURCE');
    file_put_contents($cache_file, $cache);
}
UPDATE: the previous if condition was incorrect; it has now been rectified by comparing against the current time. Thanks @Arrakeen.
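For the original question (the same small template read up to 15 times while building one page), a disk cache with a TTL may be more than is needed; a per-request in-memory cache avoids the repeated reads entirely. A minimal sketch, using a hypothetical helper name get_template():

// Read each template file only once per request; later calls return
// the string already held in the static array.
function get_template($path) {
    static $templates = array();

    if (!isset($templates[$path])) {
        $contents = file_get_contents($path);
        if ($contents === false) {
            return false; // propagate the failure to the caller
        }
        $templates[$path] = $contents;
    }

    return $templates[$path];
}

// Usage: every row of the table reuses the same cached string.
for ($i = 0; $i < 15; $i++) {
    $cellHtml = get_template('templates/cell.html');
    // ... merge row data into $cellHtml and append it to the table ...
}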