I used the AJAX call below to retrieve data from the database and display it on my page.
$.ajax({
    type: "POST", url: "MyPage.aspx/LoadGrid",
    data: "{idyear:'2020'}",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    success: function (response) {
        $(".gridBody").html(response.d);
    },
    error: function (jqXHR) { // jQuery's option is `error`, not `failure`
        alert("request failed");
    }
});
Currently this operation returns 1026 records and takes around 12 seconds.
Since this process is time-consuming and there will be more records in the future, I have to find a faster way to load the data.
So I tried another approach. I decided to get the total number of records first. For example, I currently have 1026 records; if I want to load the data in bundles of 100 records, I need to make 11 AJAX calls simultaneously and combine the results once they have all finished. I thought that by starting all the calls together, I wouldn't have to wait for one call to finish before starting the next.
var pagesize = 100;
getQty(function () {
    var pagesqty = Math.ceil(qty / pagesize); // round up for a partial last page
    var results = []; // one HTML fragment per page, stored in page order
    var control = 0;
    for (let i = 0; i < pagesqty; i++) { // `let` keeps the right `i` in each callback
        $.ajax({
            type: "POST", url: "MyPage.aspx/LoadGridMtoN",
            data: "{idyear:'2020' , page:'" + i + "' , pagesize:'" + pagesize + "'}",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (response) {
                results[i] = response.d; // store by index instead of eval'd variables
                control += 1;
                if (control == pagesqty) {
                    $(".gridBody").html(results.join(""));
                }
            },
            error: function (jqXHR) { // jQuery's option is `error`, not `failure`
                alert("request " + i + " failed");
            }
        });
    }
});
But now I am getting a timeout error while executing the AJAX calls. Does anyone know a better way?
P.S.: I wanted to try a web worker, but it seems I can't use jQuery in a web worker.
P.S.: I don't want to use paging. I have to load all the data at once.
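The parallel approach above can also be written with `Promise.all`, which preserves page order and removes the manual counter and `eval`. A minimal sketch, assuming a hypothetical `fetchPage(i)` helper that returns a Promise resolving to the HTML fragment for page `i` (e.g. a thin wrapper around `$.ajax`):

```javascript
// Combine parallel page requests with Promise.all instead of a manual counter.
// fetchPage is an assumed helper: (pageIndex) => Promise<string>.
function loadAllPages(fetchPage, totalRecords, pageSize) {
    const pageCount = Math.ceil(totalRecords / pageSize);
    const requests = [];
    for (let i = 0; i < pageCount; i++) {
        requests.push(fetchPage(i)); // all requests start immediately
    }
    // Promise.all resolves with results in request order, so the fragments
    // concatenate correctly even if responses arrive out of order.
    return Promise.all(requests).then(fragments => fragments.join(""));
}
```

With jQuery this could be used as `loadAllPages(i => $.ajax({...}).then(r => r.d), 1026, 100).then(html => $(".gridBody").html(html))`, though it does nothing to reduce the server-side load discussed below.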
Please note that simultaneously calling endpoints from the client side (instead of making a single call) has more overhead and is not a good solution. For example, your server has to handle more connections from the client, your database has to handle more connections from your ORM on the back end, and your table, and correspondingly your disk and I/O, are put under more load.
That said, assuming all parts of your system are well designed, loading the data incrementally is a good solution from both a UX and a technical point of view: load the first 100 records, and while the user is working with them, load the next 100 and append them to the end of the list (or load them when the user scrolls down).
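The incremental idea can be sketched as a promise chain that renders each page as it arrives, so the user sees data after the first page instead of waiting for all of them. `fetchPage` and `render` are hypothetical helpers, not part of the original code:

```javascript
// Incremental loading sketch: fetch pages one after another, appending each
// fragment to the page as soon as it arrives.
// fetchPage: (pageIndex) => Promise<fragment>; render: (fragment) => void.
function loadIncrementally(fetchPage, render, pageCount) {
    let chain = Promise.resolve();
    for (let i = 0; i < pageCount; i++) {
        chain = chain
            .then(() => fetchPage(i))
            .then(fragment => render(fragment)); // user sees data after page 0
    }
    return chain; // resolves when every page has been rendered
}
```

Because each request starts only after the previous one renders, the server sees one connection at a time instead of eleven at once.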
Eventually, however, you have to consider pagination! You can't load 1M records on a web page, and no one checks or uses all the records on a page anyway. So limit the number of records fetched and use server-side pagination, or offer the users another approach: let them submit a request, then process it on the server, write the report to a raw text file or an Excel file, and email them the download link.
The answer you don't want to hear is that you need to fix your server side. 1026 records really should not take 12 seconds, especially for data being retrieved for the client. Consider reducing the number of JOINs in your SQL statement and aggregating the data in server-side logic instead, and try running EXPLAIN against your queries and adding indexes where appropriate.
To answer your question about splitting AJAX calls...
It looks like you have already implemented pagination, so perhaps create an asynchronous recursive function that obtains 5-10 records at a time, incrementing the page number and recursing after each promise resolves. You can populate the page as you go, so the user sees data without waiting so long. Understand, however, that this increases the load on your server and will probably end up taking longer to obtain all of the data. I consider the way you are trying to accomplish this to be bad practice.
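The recursive approach described above might look like the sketch below, where a short or empty page signals the end of the data. `fetchPage` and `append` are assumed helpers, and the page size of 10 is illustrative:

```javascript
// Recursive pagination sketch: fetch one small page, append it, then recurse
// with the next page number until an empty page comes back.
// fetchPage: (pageNum, pageSize) => Promise<recordsArray>; append: (records) => void.
function loadRecursively(fetchPage, append, pageNum = 0, pageSize = 10) {
    return fetchPage(pageNum, pageSize).then(records => {
        if (records.length === 0) return; // no more data, stop recursing
        append(records);                  // user sees this batch immediately
        return loadRecursively(fetchPage, append, pageNum + 1, pageSize);
    });
}
```

Each request is issued only after the previous batch has been appended, which keeps the number of concurrent server connections at one.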
Assuming you can make changes to the server side,