I use PHP to process latitude/longitude points in order to generate JS and display the points on an OSM map. I would like to start a new track on the map whenever there is a pause of 10 minutes or more in the recording.
My dataset currently has about 30,000 records spread over about 10 different tracks (some tracks have about 300 points, others have thousands).
I am running into a performance issue with PHP. When the loop aggregates a few hundred points, the data is processed quickly, but when a track has thousands of points, performance drops dramatically.
Here is the processing time for each track, normalized to 10,000 points:
+-----------------+-----------------------------------+
| Points on Track | Time to Process 10,000 Points (s) |
+-----------------+-----------------------------------+
|              21 |                              0.75 |
|           18865 |                             14.52 |
|             539 |                              0.79 |
|             395 |                              0.71 |
|             827 |                              0.79 |
|             400 |                              0.74 |
|             674 |                              0.78 |
|            2060 |                              1.01 |
|            2056 |                              0.99 |
|             477 |                              0.73 |
|             628 |                              0.77 |
|             472 |                              0.73 |
+-----------------+-----------------------------------+
We can see that when a track has many points, performance drops dramatically. In this particular case, processing all the points takes about 30 seconds. If I limit each track to 500 points, performance is pretty good (about 2.5 seconds to process my whole dataset).
I use my Synology DS415play as the web server.
Here is my code:
$dataTab = array();
if ($result = $mysqli->query($sql))
{
    // First row: start the first track
    $row = $result->fetch_array();
    $data = "[" . $row['latitude'] . "," . $row['longitude'] . "],";
    $date = new DateTime($row['time']);
    while ($row = $result->fetch_array())
    {
        $newDate = new DateTime($row['time']);
        // Gap of more than 10 minutes: close the current track
        if (($newDate->getTimestamp() - $date->getTimestamp()) > 600)
        {
            array_push($dataTab, $data);
            $data = "";
        }
        $data = $data . "[" . $row['latitude'] . "," . $row['longitude'] . "],";
        $date = $newDate;
    }
    array_push($dataTab, $data);
}
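A likely culprit (an assumption on my part, not something I have profiled on your Synology) is the `$data = $data . "…"` line: it rebuilds the whole accumulated string on every iteration, so the cost of each append grows with the track length, which matches the superlinear times in the table. A sketch of an alternative, using illustrative hard-coded rows in place of the mysqli result set: collect the point fragments in an array and `implode()` once per track, and use `strtotime()` instead of constructing a `DateTime` per row.

```php
<?php
// Sketch: accumulate fragments in an array and join once per track,
// instead of rebuilding one ever-growing string on every iteration.
// $rows stands in for the mysqli result set; the values are illustrative.
$rows = [
    ['time' => '2015-09-01 10:00:00', 'latitude' => '48.85', 'longitude' => '2.35'],
    ['time' => '2015-09-01 10:05:00', 'latitude' => '48.86', 'longitude' => '2.36'],
    ['time' => '2015-09-01 10:20:00', 'latitude' => '48.87', 'longitude' => '2.37'], // 15 min gap -> new track
];

$dataTab = [];
$points  = [];
$prevTs  = null;

foreach ($rows as $row) {
    $ts = strtotime($row['time']); // cheaper than new DateTime per row
    if ($prevTs !== null && ($ts - $prevTs) > 600) {
        $dataTab[] = implode(',', $points) . ',';
        $points = [];
    }
    $points[] = '[' . $row['latitude'] . ',' . $row['longitude'] . ']';
    $prevTs = $ts;
}
if ($points) {
    $dataTab[] = implode(',', $points) . ',';
}

var_export($dataTab);
// array ( 0 => '[48.85,2.35],[48.86,2.36],', 1 => '[48.87,2.37],', )
```

Each append is then O(1) amortized, so the per-point cost should stay flat regardless of track length.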
If I limit each track to 500 points, like this, performance is pretty good:
$dataTab = array();
if ($result = $mysqli->query($sql))
{
    $count = 0;
    // First row: start the first track
    $row = $result->fetch_array();
    $data = "[" . $row['latitude'] . "," . $row['longitude'] . "],";
    $date = new DateTime($row['time']);
    while ($row = $result->fetch_array())
    {
        $count++;
        $newDate = new DateTime($row['time']);
        // Close the track on a 10-minute gap or after 500 points
        if (($newDate->getTimestamp() - $date->getTimestamp()) > 600
            || $count > 500)
        {
            array_push($dataTab, $data);
            $data = "";
            $count = 0;
        }
        $data = $data . "[" . $row['latitude'] . "," . $row['longitude'] . "],";
        $date = $newDate;
    }
    array_push($dataTab, $data);
}
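The 500-point cap helps because it also caps the length of the string being copied on each append. Another way to avoid long string builds entirely (assuming the JS side can consume JSON, which is not confirmed by the question) is to collect nested arrays and let `json_encode()` produce the literal in one pass. A sketch with illustrative hard-coded rows in place of the mysqli result set:

```php
<?php
// Sketch (assumption: the front end can take a JSON array of tracks,
// each track an array of [lat, lng] pairs). Row values are illustrative.
$rows = [
    ['time' => '2015-09-01 10:00:00', 'latitude' => '48.85', 'longitude' => '2.35'],
    ['time' => '2015-09-01 10:05:00', 'latitude' => '48.86', 'longitude' => '2.36'],
    ['time' => '2015-09-01 10:20:00', 'latitude' => '48.87', 'longitude' => '2.37'], // 15 min gap -> new track
];

$tracks = [];
$track  = [];
$prevTs = null;

foreach ($rows as $row) {
    $ts = strtotime($row['time']);
    if ($prevTs !== null && ($ts - $prevTs) > 600) {
        $tracks[] = $track; // close current track on a >10 min gap
        $track = [];
    }
    $track[] = [(float)$row['latitude'], (float)$row['longitude']];
    $prevTs = $ts;
}
if ($track) {
    $tracks[] = $track;
}

echo json_encode($tracks);
```

The output can be dropped straight into the generated JS as the array of polylines.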
Thanks
EDIT: I provide a sample of the data here: http://109.190.92.126/tracker/gpsData.sql
Slow script: http://109.190.92.126/tracker/map.php
Normal execution speed by splitting each track (500 pts max): http://109.190.92.126/tracker/map_split.php
If you are getting 18,000 records from the database in your worst-case scenario, you could move the timestamp check into the query and cut the time considerably. All you are doing is checking for a ten-minute gap and then pushing to an array, which could be done at the MySQL level; that way you wouldn't be fetching 18,000 rows from the database every time, just the ones you need.
If you post your mysql queries, we can take a look at putting that in there.
Edit: try changing the query to this:
SELECT time, latitude, longitude
FROM gpsData
WHERE time >= '2015-09-01'
AND provider = 'gps'
ORDER BY time DESC
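One way to push the gap detection into the query (a sketch only: it assumes MySQL 8+ for window functions, reuses the column names from the query above, and sorts ascending so the running track number increases with time):

```sql
-- Tag each row with a gap flag, then a running track number, so PHP
-- only has to group rows by track_no instead of re-deriving the gaps.
SELECT time, latitude, longitude,
       SUM(new_track) OVER (ORDER BY time) AS track_no
FROM (
    SELECT time, latitude, longitude,
           CASE WHEN TIMESTAMPDIFF(SECOND,
                        LAG(time) OVER (ORDER BY time), time) > 600
                THEN 1 ELSE 0 END AS new_track
    FROM gpsData
    WHERE time >= '2015-09-01'
      AND provider = 'gps'
) t
ORDER BY time;
```

The PHP loop then reduces to appending each row to `$dataTab[$row['track_no']]`, with no per-row date parsing at all.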