What is the easiest/fastest way to add 100 points to a database? Please assume that not all writes will succeed, due to duplicates, bad data, etc.
I'm trying to update a database with exactly 100 values.
Once I have a good piece of data, I need to add it to the database, and I use a function called updateDB to do that.
This function just writes a lat/lng coordinate to the database. If there is a duplicate or the write fails, I send "error" from PHP, and the loop should continue collecting data until exactly 100 points have been written to the database. Here's the function I'm using.
cct is used for CSRF prevention; please ignore it, it works fine.
////more above this
if (100 - completed > dispatched)
{
    dispatched++;
    updateDB(lat, lng);
}
/// more junk and then this function
function updateDB(lat, lng)
{
    var cct = $("input[name=csrf_mysite]").val();
    $.ajax({
        type: "POST",
        url: "/form",
        data: {
            'lat': lat,
            'lng': lng,
            'id_set': id_set,
            'csrf_complexity': cct
        },
        success: function(result) {
            var obj = jQuery.parseJSON(result);
            if (obj.status === "OK")
            {
                // the write succeeded: count it and drop a marker on the map
                completed++;
                var marker = new google.maps.Marker({
                    icon: markerIcon,
                    position: new google.maps.LatLng(lat, lng),
                    map: map
                });
                $("#progressbar").progressbar("option", {
                    value: completed,
                    max: 100
                });
                $("#amount").text("Getting image " + completed + " of 100");
            }
        },
        error: function(data) {
            //alert(data.responseText);
        },
        complete: function(data) {
            // complete fires after both success and error, so the in-flight
            // count should always come back down here
            if (completed == 100)
                window.location = "/start/curate";
            dispatched--;
        }
    });
}
This function does not work as intended. Any idea why?
The idea is simple: keep calling updateDB until 100 values have been added, and only call updateDB when there is no possibility of extra calls being dispatched. But dispatched does not decrement properly, so I'm assuming complete isn't being called on every request.
Ideas? Any other way to do this would also be awesome.
Why don't you have the server code deal with the counting and send only one (or a few) queries? E.g. construct an array of 100 data points first, send the data to the server in a single query, and have it respond with how many more it needs, then send back that many, and do it again until it's got 100.
If the overhead of obtaining each piece of data on the client is very low, and it's OK to get data you don't need, then just send (say) 110 at first. With some knowledge of the failure rate you should be able to optimize this easily.
Browsers also cap the number of simultaneous requests to the same host (historically two, around six in modern browsers), so async or not, firing 100 separate requests is going to take a long time. I can't think of any reason not to group the data as much as possible and cut the number of requests down to one or a handful. Even if you still run 100 database queries on the server, the time to do that is inconsequential compared to the overhead of 100 HTTP post/response round trips.
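For instance, here is a minimal sketch of sending a whole array of points in one request. The /form_batch endpoint and the shape of the JSON reply are assumptions for illustration, not part of the original code:

function sendBatch(points, onReply) {
    // reuse the same CSRF token the single-point version reads
    var cct = $("input[name=csrf_mysite]").val();
    $.ajax({
        type: "POST",
        url: "/form_batch",                      // hypothetical batch endpoint
        data: {
            'points': JSON.stringify(points),    // e.g. [{lat: 1.23, lng: 4.56}, ...]
            'id_set': id_set,
            'csrf_complexity': cct
        },
        success: function(result) {
            var obj = jQuery.parseJSON(result);
            onReply(obj);                        // e.g. {status: "OK", need: 20}
        }
    });
}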
Write your server-side script to respond with how many more points it still needs. Then, in the first request, send 100 data points:
Browser                      Server
   |                            |
   |  --->  100 data  --->      |
   |  <---  send 20 more  <---  |
   |                            |
   |  --->  20 data  --->       |
   |  <---  send 1 more  <---   |
   |                            |
   |  --->  1 data  --->        |
   |  <---  send 0 more  <---   |
   |                            |
   v                            v
That way you avoid the kind of synchronization trouble that comes from counting on the client how much information has been processed on the server: the server processes the data, excludes the duplicates, the ill-formed points, etc., and the server itself counts how many it still needs.
All the client has to do is send all it can and poll the server for how much more is needed. You want to send multiple points at once because of the overhead of each Ajax request, but you don't want to send much more than is actually needed, because that too is wasteful.
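Sketched on the client using a batch-sending helper like the one above (again, the {need: n} reply format is assumed, and getPoint() is a hypothetical stand-in for however the client collects one lat/lng pair):

function fillDatabase(count) {
    var points = [];
    for (var i = 0; i < count; i++) {
        points.push(getPoint());                 // hypothetical: returns {lat: ..., lng: ...}
    }
    sendBatch(points, function(reply) {
        if (reply.need > 0) {
            fillDatabase(reply.need);            // server still needs more, send another batch
        } else {
            window.location = "/start/curate";   // server has all 100 points
        }
    });
}

fillDatabase(100);                               // first request: send 100 points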
Hope this helps.