 

Caching JSON output in PHP

Tags: json, php, caching

I've run into a slight issue. I've been playing with the Facebook and Twitter APIs and getting the JSON output of status search queries without any problem, but reading further into the documentation I realised I could end up being "rate limited".

I was wondering whether it's easy to cache the JSON output each hour so that I can at least try to prevent this from happening, and if so, how it's done. I tried following a YouTube video, but it only showed how to write the contents of a directory listing to a cache.php file; it didn't say whether the same approach works for JSON output, how to apply a 60-minute interval, or how to read the information back out of the cache file.

Any help or code would be very much appreciated, as there seem to be very few tutorials on this sort of thing.

asked Jul 10 '12 by GeordieDave1980

2 Answers

Here's a simple function that adds caching to fetching the contents of a URL:

function getJson($url) {
    // cache files are created like cache/abcdef123456...
    $cacheFile = 'cache' . DIRECTORY_SEPARATOR . md5($url);

    if (file_exists($cacheFile)) {
        $fh = fopen($cacheFile, 'r');
        $size = filesize($cacheFile);
        $cacheTime = trim(fgets($fh));

        // if data was cached recently, return cached data
        if ($cacheTime > strtotime('-60 minutes')) {
            return fread($fh, $size);
        }

        // else delete cache file
        fclose($fh);
        unlink($cacheFile);
    }

    // fetch fresh data from the API; plain file_get_contents is one option,
    // substitute however you normally call Twitter/Facebook (cURL, an SDK, etc.)
    $json = file_get_contents($url);

    $fh = fopen($cacheFile, 'w');
    fwrite($fh, time() . "\n");
    fwrite($fh, $json);
    fclose($fh);

    return $json;
}

It uses the URL to identify cache files; a repeated request for the same URL will be read from the cache the next time. It writes the timestamp to the first line of the cache file, and cached data older than an hour is discarded. It's just a simple example, and you'll probably want to customize it.
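For example, a call might look like this (a minimal sketch; the search URL is just a placeholder, and a writable cache/ directory next to the script is assumed):

// hypothetical usage: the endpoint below is only an example
$json = getJson('https://api.twitter.com/1.1/search/tweets.json?q=php');
$data = json_decode($json, true); // decode the cached or freshly fetched JSON
var_dump($data);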

answered Oct 22 '22 by deceze


It's a good idea to use caching to avoid the rate limit. Here's some example code showing how I did it for Google+ data in some PHP code I wrote recently.

private function getCache($key) {
    $cache_life = intval($this->instance['cache_life']); // minutes
    if ($cache_life <= 0) return null;

    // fully-qualified filename
    $fqfname = $this->getCacheFileName($key);

    if (file_exists($fqfname)) {
        if (filemtime($fqfname) > (time() - 60 * $cache_life)) {
            // The cache file is fresh.
            $fresh = file_get_contents($fqfname);
            $results = json_decode($fresh, true);
            return $results;
        }
        else {
            unlink($fqfname);
        }
    }

    return null;
}

private function putCache($key, $results) {
    $json = json_encode($results);
    $fqfname = $this->getCacheFileName($key);
    file_put_contents($fqfname, $json, LOCK_EX);
}
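The answer doesn't show getCacheFileName(); a minimal sketch of what it might look like, assuming the cache directory is configured alongside cache_life (the cache_dir key and the .json suffix are assumptions, not part of the original code):

private function getCacheFileName($key) {
    // hypothetical helper: hash the key so it is safe to use as a filename,
    // and store the file under a configured cache directory
    $dir = rtrim($this->instance['cache_dir'], DIRECTORY_SEPARATOR);
    return $dir . DIRECTORY_SEPARATOR . md5($key) . '.json';
}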

and to use it:

        // $cacheKey is a value that is unique to the
        // concatenation of all params. A string concatenation
        // might work. 
        $results = $this->getCache($cacheKey);
        if (!$results) {
            // cache miss; must call out
            $results = $this->getDataFromService(....);
            $this->putCache($cacheKey, $results);
        }

answered Oct 22 '22 by Cheeso