Can we split big ajax calls into multiple smaller calls to load data faster?

I used the AJAX call below to retrieve data from the database and show it on my page.

$.ajax({
    type: "POST", url: "MyPage.aspx/LoadGrid",
    data: "{idyear:'2020'}",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    success: function (response) {
        $(".gridBody").html(response.d);
    },
    error: function (xhr, status) { // jQuery's option is "error", not "failure"
        alert(status);
    }
});

Currently this operation returns 1026 records and takes around 12 seconds.

Since this process is time-consuming and there will be more records in the future, I have to find an alternative way to load the data faster.

So, I tried another approach. I decided to get the total number of records first. For example, I currently have 1026 records, and if I want to load my data in bundles of 100 records, I need to make 11 AJAX calls simultaneously and combine the results at the end of all the calls. I thought that by applying this method I could start all the calls together, rather than having to wait for one call to finish before starting the next.

var pagesize = 100;
getQty(function () {
    var pagesqty = Math.ceil(qty / pagesize);
    var results = [];
    var control = 0;
    for (let i = 0; i < pagesqty; i++) { // "let" keeps each callback's own i
        $.ajax({
            type: "POST", url: "MyPage.aspx/LoadGridMtoN",
            data: "{idyear:'2020' , page:'" + i + "' , pagesize:'" + pagesize + "'}",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (response) {
                results[i] = response.d; // store in request order, no eval needed
                control += 1;
                if (control == pagesqty) {
                    $(".gridBody").html(results.join(""));
                }
            },
            error: function (xhr, status) { // jQuery's option is "error", not "failure"
                alert(status);
            }
        });
    }
});

But now I am getting a timeout error while executing the AJAX calls. Does anyone know a better way?

P.S.: I wanted to try a web worker, but it seems that I can't use jQuery in a web worker.

P.S.: I don't want to use paging. I need to load all the data together.
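(For reference, a cleaner sketch of the parallel idea I'm trying: `Promise.all` keeps the results in request order without a manual counter or `eval`. `fetchPage(pageNum, pageSize)` here is a hypothetical promise-returning wrapper around the `$.ajax` call to `LoadGridMtoN`, not code I actually have.)

```javascript
// Fire all page requests at once; Promise.all collects the resolved
// fragments in request order, regardless of which call finishes first.
// fetchPage(pageNum, pageSize) is an assumed promise-returning wrapper
// around the real AJAX call.
function loadInParallel(fetchPage, totalRecords, pageSize) {
  const pageCount = Math.ceil(totalRecords / pageSize);
  const requests = [];
  for (let i = 0; i < pageCount; i++) {
    requests.push(fetchPage(i, pageSize));
  }
  return Promise.all(requests).then(chunks => chunks.join(""));
}
```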

Behnam asked Mar 21 '20 11:03

3 Answers

Please note that simultaneously calling endpoints from the client side (instead of making a single call) has more overhead and is not a good solution. For example, your server has to handle more connections from the client, your database has to handle more connections from your ORM in the back-end, and your table, and correspondingly your disk and I/O, come under more load.

That said, assuming all parts of your system are well designed, incrementally loading the data is a good solution from both a UX and a technical point of view. I mean: load the first 100 records, and while the user is looking at them, load the next 100 and append them to the end of the list (or load them when the user scrolls down).

However, in the end you have to consider pagination! You can't load 1M records onto your webpage, and no one checks and uses all the records on a page. So you have to limit the number of records fetched and use server-side pagination, or offer the users another approach: they submit a request, you process it, write the report to a raw text file or an Excel file, and email them the download link.
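A minimal sketch of the incremental idea above, assuming a hypothetical `fetchPage(pageNum, pageSize)` helper that wraps the AJAX call and resolves to one chunk of records (an empty array when there are no more):

```javascript
// Load records chunk by chunk: the first chunk is rendered immediately,
// and each following chunk is appended while the user reads the page.
// fetchPage is an assumed helper, not part of the original code.
async function loadIncrementally(fetchPage, pageSize, render) {
  let pageNum = 0;
  while (true) {
    const rows = await fetchPage(pageNum, pageSize);
    if (rows.length === 0) break;   // server has no more records
    render(rows);                   // append this chunk to the list
    pageNum += 1;
  }
}
```

In a real page, `render` would append rows to the grid, so the user sees the first 100 records almost immediately instead of waiting for all of them.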

Alireza Mahmoudi answered Sep 30 '22 05:09

The answer you don't want to hear is that you need to fix your server side. 1026 records really should not take 12 seconds, especially for data being retrieved for the client. Consider reducing the number of JOINs in your SQL statement and aggregating the data in server-side logic instead, and try running EXPLAIN against your queries and consider indices where appropriate.


To answer your question about splitting AJAX calls...

It looks like you have implemented pagination on the server, so perhaps create an asynchronous recursive function that obtains 5-10 records at a time, incrementing the page number and recursing after each promise resolves. You can populate the page as chunks arrive, and the user will see data without waiting so long. However, you must understand that this increases the volume of requests to your server, and it will probably end up taking longer to obtain all of the data. I feel that the way you are trying to accomplish this goal is bad practice.
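Such a recursive loader might look like the following sketch. `fetchPage(pageNum, pageSize)` is a hypothetical stand-in for the `$.ajax` call to a paged endpoint like the question's `LoadGridMtoN`, returning a promise that resolves to one chunk of records (an empty array once exhausted):

```javascript
// Fetch one small page, render it, then recurse for the next page once
// the promise resolves, so chunks arrive strictly one after another.
// fetchPage(pageNum, pageSize) is an assumed promise-returning helper.
function loadPagesRecursively(fetchPage, pageSize, render, pageNum) {
  pageNum = pageNum || 0;
  return fetchPage(pageNum, pageSize).then(function (rows) {
    if (rows.length === 0) return;          // no more records: stop
    render(rows);                           // show this chunk immediately
    return loadPagesRecursively(fetchPage, pageSize, render, pageNum + 1);
  });
}
```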

buttface64 answered Sep 30 '22 05:09


Assuming you can make changes to the server side:

  1. Create a new table with all the fields you are going to need on the front end.
  2. Write a stored procedure to update this table on a regular basis.
  3. Use this table in your AJAX call to fetch the records.
  4. Use pagination. No one is going to use 1000+ records at a time.
  5. Give a search option at the top, in case you feel the user must have access to all the records.
  6. As suggested in the other answers, don't create multiple AJAX calls. You will only end up regretting it and creating a bottleneck for yourself in later stages.
Irfan Harun answered Sep 30 '22 03:09