
Loading large datasets with AngularJS

Tags:

angularjs

I'm trying to devise a way to load a large amount of data (upwards of 1000 rows) into a page without pagination. The first hurdle was to query the DB in parallel, bite-sized chunks, which I've done with the help of the solution at How to make sequentially Rest webservices calls with AngularJS?

However, I'm running into two problems with what I've implemented:

  1. Each returned object is pushed into an array, which is then itself returned as the array that Angular binds to, i.e. [[{key:value, key:value, key:value}, {key:value, key:value, key:value}], [{key:value, key:value, key:value}, {key:value, key:value, key:value}]]. As a result I can't use ng-repeat="item in data" because data is an array of arrays; "item in data[0]" does make item available. Concatenation seems to be the answer, but I haven't been able to sort out a way that makes it work (see the sketch after this list for what I mean).

  2. I'm making multiple requests to the database and each request gets returned correctly but the page doesn't render until all the requests have completed -- which completely negates the point of doing multiple requests in the first place.
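
To show what I mean by concatenation, this is roughly the flattening I've been aiming for (just a sketch using my service below; it still waits for every request before anything renders, so it doesn't touch the second problem):

ScanService.fetchAll(10).then(function(chunks) {
  // chunks is an array of arrays (one inner array per request);
  // flatten it into a single array so ng-repeat="item in data" works directly
  $scope.scans = chunks.reduce(function(flat, chunk) {
    return flat.concat(chunk);
  }, []);
});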

So looking over my code, how can I rewrite it to solve these two issues, so that the data is returned as one array and is rendered each time a query completes?

app.factory('ScanService', function($http, $q) {
  // Fetch a single chunk of scans for the given step.
  function fetchOne(stepCount) {
    return $http({
      method: 'GET',
      url: '/index.php/scans',
      params: {step: stepCount}
    })
    .then(function onSuccess(response) {
      return response.data;
    });
  }

  return {
    // Fires all requests in parallel; the returned promise resolves
    // to an array of arrays (one inner array per request).
    fetchAll: function(steps) {
      var scans = [];
      for (var i = 1; i <= steps; i++) {
        scans.push(fetchOne(i));
      }
      return $q.all(scans);
    }
  };
});

app.controller('ScanCtrl', function($scope, $q, ScanService) {
  $scope.scans = ScanService.fetchAll(10);
});

Follow up

I should add that I did manage to get this working based on the solution below and an angular.forEach(). I can't suggest that anyone working with "big data" go this route, though: at around 1000 rows the browser was overwhelmed and began slowing down considerably, and filtering with angular.filter also showed a significant delay until the results were narrowed down. On the other hand, a few hundred rows worked respectably well and allowed native filtering, which was a key goal for my implementation.

iamsar asked Jul 05 '13 21:07


1 Answer

You can't really $q.all the promises together (that combines them into one big promise that succeeds or fails as a unit) if you want to treat each one individually and display each result as it arrives.

I would push the things you get back into the scope as soon as you get them. Below is an example:

    function MyCtrl($scope, $timeout, $q) {
        // Returns a number in [0, 1); used for both the fake data and the random delay.
        var random = Math.random;

        // Stand-in for a server call: resolves with a small batch of items after a random delay.
        var fetchOne = function() {
            var deferred = $q.defer();
            $timeout(function() {
                deferred.resolve([random(), random() + 100, random() + 200]);
            }, random() * 5000);
            return deferred.promise;
        };

        $scope.scans = [];
        for (var i = 0; i < 2; i++) {
            fetchOne().then(function(items) {
                angular.forEach(items, function(item) {
                    $scope.scans.push(item);
                });
            });
        }
    }

Here's a fiddle showing it in action: http://jsfiddle.net/wWcvx/1/

There's an issue here where the order of the items is based on when they were returned, not on your original request order. I'll let you figure that one out yourself.
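
For example, one way to keep the original order (just a sketch along the same lines as the snippet above, not what the fiddle does) is to give each request a slot based on its index and re-flatten whatever has arrived so far:

    $scope.scans = [];
    var chunks = [];                           // one slot per request, in request order

    var rebuild = function() {
        // Flatten only the chunks that have arrived so far, keeping request order.
        $scope.scans = [].concat.apply([], chunks.filter(angular.isDefined));
    };

    for (var i = 0; i < 2; i++) {
        (function(index) {                     // capture the loop index for the callback
            fetchOne().then(function(items) {
                chunks[index] = items;         // slot by original request number
                rebuild();
            });
        })(i);
    }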

John Tseng answered Nov 15 '22 09:11