As I work on my games I've been using MySQL databases a lot.
But more recently I've been putting a lot of SELECT
results in Memcache, because they're unlikely to change.
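The pattern I'm describing is a read-through cache: check Memcache first, and only hit MySQL on a miss. A minimal sketch (the function, table, and key names are just illustrative; it assumes the Memcached extension and an already-open PDO connection):

```php
<?php
// Read-through cache sketch. Assumes a Memcached server and a PDO
// connection already exist; names below are hypothetical.
function get_base_stats( Memcached $cache, PDO $pdo, int $charId )
{
    $key = "base_stats:" . $charId;
    $stats = $cache->get( $key );
    // Memcached::get() returns FALSE on a miss; check the result code
    // so a legitimately-stored FALSE isn't mistaken for a miss.
    if( $stats === FALSE && $cache->getResultCode() === Memcached::RES_NOTFOUND ) {
        $stmt = $pdo->prepare( "SELECT * FROM base_stats WHERE character_id = :id" );
        $stmt->execute( array( ":id" => $charId ));
        $stats = $stmt->fetch( PDO::FETCH_ASSOC );
        $cache->set( $key, $stats, 0 ); // 0 = no expiry; fine for static data
    }
    return $stats;
}
```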
Which leads me to wonder, wouldn't it be better to just have a PHP file that defines arrays of data? Sure, this means I have to load in the entire array just to access one row, but I believe that this is insignificant compared to the time saved not having to wait for MySQL.
Am I right on this path of thinking? Should I move my unchanging game data (such as character base stats, compatibility charts, item data, etc.) to PHP files? Or should I keep using MySQL+Memcache?
Which is better, and why?
It would seem that some benchmarks are in order.
I put together a few scripts to test storing the data as a PHP array, as a CSV file, and in a MySQL DB accessed via PDO.
Array Test
Here are the scripts for the array test:
array-inc.php:
<?php
$data = array(
    array( "key" => "something", "value" => "something else" ),
    array( "key" => "something", "value" => "something else" ),
    array( "key" => "something", "value" => "something else" ),
    array( "key" => "something", "value" => "something else" ),
    array( "key" => "something", "value" => "something else" )
);
array-test.php:
<?php
$start = microtime( TRUE );
for( $i = 0; $i < 10000; ++$i ) {
    include( "array-inc.php" );
    $retrieve = $data[2];
}
$finish = microtime( TRUE );
print $finish - $start;
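A common variant of the array approach is to have the include file return the array instead of setting a variable, which avoids polluting the including scope. A self-contained sketch (the file is written inline here purely so the snippet runs on its own; the filename is illustrative):

```php
<?php
// Write an include file that *returns* its data (inline only to keep
// this snippet self-contained; normally the file already exists).
file_put_contents( "array-return.php", '<?php
return array(
    array( "key" => "something", "value" => "something else" ),
    array( "key" => "something", "value" => "something else" ),
    array( "key" => "something", "value" => "something else" )
);' );

// include() evaluates to the file's return value.
$data = include( "array-return.php" );
$retrieve = $data[2];
print $retrieve["value"];

unlink( "array-return.php" );
```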
CSV Test
Here are the scripts for the CSV test:
values.csv:
key,value
something,something else
something,something else
something,something else
something,something else
something,something else
csv.php:
<?php
$start = microtime( TRUE );
for( $i = 0; $i < 10000; ++$i ) {
    $fp = fopen( "values.csv", "r" );
    $data = array();
    while(( $line = fgetcsv( $fp )) !== FALSE ) {
        $data[] = $line;
    }
    fclose( $fp ); // close the handle each iteration to avoid leaking resources
    $retrieve = $data[2];
}
$finish = microtime( TRUE );
print $finish - $start;
MySQL Test
And here is the script for the MySQL test (the table has id, key, and value columns, with the same five rows and values as the two tests above):
mysql.php:
<?php
// Assumes $pdo already holds an open PDO connection to the database.
$start = microtime( TRUE );
for( $i = 0; $i < 10000; ++$i ) {
    $query = "SELECT * FROM `values` WHERE id = :id";
    $result = $pdo->prepare( $query );
    $result->execute( array( ":id" => 2 ));
    $retrieve = $result->fetch( PDO::FETCH_ASSOC );
}
$finish = microtime( TRUE );
print $finish - $start;
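Note that the script above re-prepares the statement on every iteration; a fairer comparison prepares once and executes many times. A sketch of that variant (it uses an in-memory SQLite database purely so the snippet is self-contained; swap the DSN and credentials for your MySQL server in practice):

```php
<?php
// Set up a throwaway in-memory database with the same shape as the
// MySQL table (SQLite here only to make the snippet self-contained).
$pdo = new PDO( "sqlite::memory:" );
$pdo->exec( "CREATE TABLE `values` (id INTEGER PRIMARY KEY, `key` TEXT, `value` TEXT)" );
$insert = $pdo->prepare( "INSERT INTO `values` (`key`, `value`) VALUES (?, ?)" );
for( $i = 0; $i < 5; ++$i ) {
    $insert->execute( array( "something", "something else" ));
}

$start = microtime( TRUE );
// Prepare once, outside the loop; only execute/fetch are timed per iteration.
$result = $pdo->prepare( "SELECT * FROM `values` WHERE id = :id" );
for( $i = 0; $i < 10000; ++$i ) {
    $result->execute( array( ":id" => 2 ));
    $retrieve = $result->fetch( PDO::FETCH_ASSOC );
}
$finish = microtime( TRUE );
print $finish - $start;
```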
Each of these scripts accesses one element from the stored data and loops through the process 10,000 times so that the time can be measured more accurately. I ran each test three times.
It seems that CSV is the clear loser, while the array and MySQL approaches are very close, with the array perhaps taking a slight lead in performance. Feel free to tweak the code above and rerun the tests to get results more representative of your own environment.