
Best way to store unchanging data

Tags: php, mysql

As I work on my games I've been using MySQL databases a lot.

But more recently I've been putting a lot of SELECT results in Memcache, because they're unlikely to change.

Which leads me to wonder, wouldn't it be better to just have a PHP file that defines arrays of data? Sure, this means I have to load in the entire array just to access one row, but I believe that this is insignificant compared to the time saved not having to wait for MySQL.

Am I right on this path of thinking? Should I move my unchanging game data (such as character base stats, compatibility charts, item data, etc.) to PHP files? Or should I keep using MySQL+Memcache?

Which is better, and why?
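For reference, the MySQL+Memcache setup described above is a read-through cache: check the cache first, fall back to the database only on a miss, then store the result. Here is a sketch of that control flow, with a plain array and callables standing in for Memcached and PDO so it runs on its own; all the names (readThrough, the "item_2" key, the fake row) are illustrative, not from the question.

```php
<?php
// Read-through cache sketch: $cache stands in for Memcached::get()/set(),
// and $load stands in for the PDO query against MySQL.
function readThrough(string $key, array &$cache, callable $load) {
    if (array_key_exists($key, $cache)) {
        return $cache[$key];        // cache hit: no MySQL round trip
    }
    $cache[$key] = $load($key);     // cache miss: query once, then store
    return $cache[$key];
}

$cache = [];
$hits  = 0;
$fetch = function () use (&$hits) {
    $hits++;                        // pretend this is the PDO query
    return [ "key" => "something", "value" => "something else" ];
};

$row = readThrough("item_2", $cache, $fetch);
$row = readThrough("item_2", $cache, $fetch);
// $hits is 1: the second call was served from the cache.
```

The trade-off in the question is exactly this: a cache hit skips MySQL entirely, which is why the PHP-array approach (where every "lookup" is a hit by construction) is worth benchmarking against it.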

Niet the Dark Absol asked Aug 04 '13 05:08

1 Answer

It would seem that some benchmarks are in order.

I put together a few scripts to test storing the data as a PHP array, as a CSV file, and in a MySQL DB accessed via PDO.

Array Test

Here are the scripts for the array test:

array-inc.php:

$data = array(
    array(
        "key"   => "something",
        "value" => "something else"
    ),
    array(
        "key"   => "something",
        "value" => "something else"
    ),
    array(
        "key"   => "something",
        "value" => "something else"
    ),
    array(
        "key"   => "something",
        "value" => "something else"
    ),
    array(
        "key"   => "something",
        "value" => "something else"
    )
);

array-test.php:

$start = microtime( TRUE );

for( $i = 0; $i < 10000; ++$i ) {
    include( "array-inc.php" );
    $retrieve = $data[2];
}

$finish = microtime( TRUE );

print $finish - $start;

CSV Test

Here are the scripts for the CSV test:

values.csv:

key,value
something,something else
something,something else
something,something else
something,something else
something,something else

csv.php:

$start = microtime( TRUE );

for( $i = 0; $i < 10000; ++$i ) {
    $fp = fopen( "values.csv", "r" );

    $data = array();
    while(( $line = fgetcsv( $fp )) !== FALSE ) {
        $data[] = $line;
    }
    fclose( $fp ); // close the handle each iteration, or 10,000 handles pile up

    $retrieve = $data[2];
}

$finish = microtime( TRUE );

print $finish - $start;

MySQL Test

And here is the script for the MySQL test (the table has an id, key, and value column with the same five rows and values as the above two):

mysql.php:

// Connection setup (placeholder DSN and credentials -- adjust for your environment):
$pdo = new PDO( "mysql:host=localhost;dbname=test", "user", "password" );

$start = microtime( TRUE );

for( $i = 0; $i < 10000; ++$i ) {
    $query = "SELECT * FROM `values` WHERE id = :id";
    $result = $pdo->prepare( $query );
    $result->execute( array( ":id" => 2 ));
    $retrieve = $result->fetch( PDO::FETCH_ASSOC );
}

$finish = microtime( TRUE );

print $finish - $start;

Each script accesses one element from the stored data and loops through the process 10,000 times so that the time can be measured more accurately. I ran each test three times and received the following values:

  • Array:
    • 1.050, 1.020, 1.114 seconds
  • CSV:
    • 1.264, 1.232, 1.105 seconds
  • MySQL:
    • 1.038, 1.149, 1.171 seconds

It seems that CSV is the clear loser, but Array and MySQL are very close with Array perhaps taking a slight lead in performance. Feel free to tweak the code above to provide results that are more in line with your environment.
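One further tweak worth trying if you go the PHP-array route: index the array by key so a lookup doesn't have to know (or scan for) a row position, and note that with OPcache enabled the compiled include is served from shared memory between requests. A minimal sketch; the data and key names here are made up for illustration:

```php
<?php
// Keyed variant of array-inc.php: lookups are O(1) by name rather than
// by numeric position. With OPcache (bundled with PHP since 5.5), the
// compiled array is cached, so re-including the file is cheap.
$baseStats = [
    "knight" => [ "hp" => 100, "atk" => 12 ],
    "mage"   => [ "hp" => 60,  "atk" => 20 ],
    "rogue"  => [ "hp" => 75,  "atk" => 15 ],
];

$retrieve = $baseStats["mage"];
print $retrieve["atk"]; // prints 20
```

This mirrors what the MySQL `WHERE id = :id` lookup gives you, without the round trip.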

Nick Coons answered Sep 29 '22 00:09