 

Bootstrap table export works for 5,000 rows but fails with "network failure" for 16,000 rows

Below is the HTML, which loads 5,000 records. The export works perfectly fine. However, when the record count is increased to 16,000, every export fails with "network failure". No error appears in the console, and I am not sure of the reason. Tested in Chrome.

<html>

<head>
  <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" rel="stylesheet" />
  <link href="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.css" rel="stylesheet" />

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/extensions/export/bootstrap-table-export.min.js"></script>
</head>

<body>
  <table data-toggle="table" data-search="true" data-show-refresh="true" data-show-toggle="true" data-show-columns="true" data-show-export="true" data-minimum-count-columns="2" data-show-pagination-switch="true" data-pagination="true" data-id-field="id"
    data-page-list="[10, 25, 50, 100, ALL]" data-show-footer="false" data-side-pagination="client" data-url="https://jsonplaceholder.typicode.com/photos">
    <thead>
      <tr>
        <th data-field="id">Id</th>
        <th data-field="title">Title</th>
        <th data-field="url">URL</th>
        <th data-field="thumbnailUrl">Thumbnail URL</th>
      </tr>
    </thead>
  </table>
</body>

</html>

With more than 15,000 records:

<html>

<head>
  <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" rel="stylesheet" />
  <link href="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.css" rel="stylesheet" />

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/extensions/export/bootstrap-table-export.min.js"></script>
</head>

<body>
  <table data-toggle="table" data-search="true" data-show-refresh="true" data-show-toggle="true" data-show-columns="true" data-show-export="true" data-minimum-count-columns="2" data-show-pagination-switch="true" data-pagination="true" data-id-field="id"
    data-page-list="[10, 25, 50, 100, ALL]" data-show-footer="false" data-side-pagination="client" data-url="https://fd-files-production.s3.amazonaws.com/226483/16h4Vwxe1Wz9PZ5Gublomg?X-Amz-Expires=300&X-Amz-Date=20170906T130107Z&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIA2QBI5WP5HA3ZEA/20170906/us-east-1/s3/aws4_request&X-Amz-SignedHeaders=host&X-Amz-Signature=5d705bfd19579c8a93ff81ee076363b2f36d1f5e4540b85f7c86de7643c17055">
    <thead>
      <tr>
        <th data-field="id">Id</th>
        <th data-field="title">Title</th>
        <th data-field="url">URL</th>
        <th data-field="thumbnailUrl">Thumbnail URL</th>
      </tr>
    </thead>
  </table>
</body>

</html>
Kathir asked Aug 31 '17


2 Answers

Try the following:

1.) Download the library files instead of using a CDN.

2.) Increase the page time-out on your AWS server. It's possible that there isn't enough time to process all those records.

3.) You may be hitting an undocumented client-side restriction, such as a browser script memory limit (for example, Firefox's javascript.options.mem.max defaults to 128 MB, and 16,000 records could approach that).

4.) Try another server. There may be restrictions on AWS that you can't control (e.g. memory or "time-to-live" for your connection); setting up your own dedicated server for testing would rule that out.

5.) Disable the "ALL" option. Do you really want people to pull 16,000 records at once?

6.) As a last resort, implement server-side pagination.
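
For the last point, bootstrap-table has built-in support: setting data-side-pagination="server" makes the table fetch one page at a time instead of holding all 16,000 rows client-side. This is only a sketch; the endpoint /api/photos is hypothetical, and it would need to accept the limit/offset query parameters bootstrap-table sends by default and respond with an object of the form {"total": n, "rows": [...]}.

```html
<!-- Sketch: server-side pagination. /api/photos is a hypothetical endpoint
     that accepts ?limit=...&offset=... and returns {"total": n, "rows": [...]}. -->
<table data-toggle="table"
       data-pagination="true"
       data-side-pagination="server"
       data-page-list="[10, 25, 50, 100]"
       data-url="/api/photos">
  <thead>
    <tr>
      <th data-field="id">Id</th>
      <th data-field="title">Title</th>
      <th data-field="url">URL</th>
      <th data-field="thumbnailUrl">Thumbnail URL</th>
    </tr>
  </thead>
</table>
```

Note that the export extension then only sees the current page, so a full export would also have to move server-side.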

Ryan Battistone answered Nov 12 '22


This looks like a problem with the S3 request expiration; requesting the data URL returns:

<?xml version="1.0" encoding="UTF-8"?>
<Error>
   <Code>AccessDenied</Code>
   <Message>Request has expired</Message>
   <X-Amz-Expires>300</X-Amz-Expires>
   <Expires>2017-09-06T13:06:07Z</Expires>
   <ServerTime>2018-06-02T00:00:15Z</ServerTime>
   <RequestId>396C37F87B33C933</RequestId>
   <HostId>pg4uY75WW5p07yvAtqhEFvvKi0FreyHlNo/gJ329aRYHP9/KgzkVxRVkH4lZkwPtw7bLET+HPl8=</HostId>
</Error>
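
The diagnosis can be checked from the URL itself: it embeds X-Amz-Date (the signing time) and X-Amz-Expires (the lifetime in seconds), so the link was only valid for 300 seconds after signing. A minimal sketch, assuming those two parameters are present (the helper name presignedUrlExpired is hypothetical):

```javascript
// Sketch: decide whether an S3 presigned URL has expired, from its
// X-Amz-Date and X-Amz-Expires query parameters. Helper name is hypothetical.
function presignedUrlExpired(url, now = new Date()) {
  const params = new URL(url).searchParams;
  const d = params.get("X-Amz-Date");                   // e.g. "20170906T130107Z"
  const lifetime = Number(params.get("X-Amz-Expires")); // e.g. 300 (seconds)
  // Expand the compact timestamp into a form Date can parse.
  const signedAt = new Date(
    `${d.slice(0, 4)}-${d.slice(4, 6)}-${d.slice(6, 8)}T` +
    `${d.slice(9, 11)}:${d.slice(11, 13)}:${d.slice(13, 15)}Z`
  );
  return now.getTime() > signedAt.getTime() + lifetime * 1000;
}
```

Against the ServerTime in the error above (2018-06-02), a URL signed on 2017-09-06 with a 300-second lifetime is long expired. The fix is to generate a fresh presigned URL, or sign with a larger X-Amz-Expires (S3 allows up to 7 days for SigV4 presigned URLs).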
ic3b3rg answered Nov 12 '22