
How to read very big .csv files with the FileReader API (JavaScript, HTML/jQuery only)

I'm trying to read a .csv file over 40 MB (more than 20,000 lines) and display it as a table in HTML. The system I am designing will be pure HTML + jQuery only. My .csv worksheet format is:

=======================================================================
| rank ||  place           || population ||    lat    ||      lon     |
=======================================================================
|  1   || New York city    ||  8175133   || 40.71455  ||  -74.007124  |
-----------------------------------------------------------------------
|  2   || Los Angeles city ||  3792621   || 34.05349  ||  -118.245323 |
-----------------------------------------------------------------------
..........................Thousands of lines..........................

I have a textbox filter that shows only the line whose first column matches the number entered in the field. But when I upload the file, it freezes the browser. :( Here is my code:

var fileInput = document.getElementById('fileInput');
var fileDisplayArea = document.getElementById('fileDisplayArea');

fileInput.addEventListener('change', function (e) {
    var file = fileInput.files[0];
    var reader = new FileReader();

    reader.onload = function (e) {
        // Convert the .csv text into an HTML table
        var data = reader.result;
        var rank = $("#rank").val();

        if (rank === "") {
            alert("digit rank number");
            fileInput.value = "";
        } else {
            // Split into rows on newlines instead of regex marker hacks
            var lines = data.split(/\r?\n/),
                output = [],
                i;
            for (i = 0; i < lines.length; i++) {
                output.push("<tr><td>" + lines[i].split(",").join("</td><td>") + "</td></tr>");
            }
            output = "<table>" + output.join("") + "</table>";

            // Keep only the row whose first cell matches the rank filter
            var valuefinal = $(output).find('tr').filter(function () {
                return $(this).children('td').eq(0).text() == rank;
            });

            $('#fileDisplayArea').append(valuefinal);
        }
    };

    // readAsText is preferred over the deprecated readAsBinaryString
    reader.readAsText(file);
});
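One thing worth noting about the code above: it builds an HTML string for all 20,000+ rows and only then filters, which is most of the freeze. A minimal sketch of the alternative, scanning the raw text line by line and rendering only the matching row (the helper name findRowByRank is my own, not from the question):

```javascript
// Scan CSV text line by line and return the cells of the first row
// whose first column equals `rank`, or null if no row matches.
// This avoids building a 40 MB HTML string just to filter it afterwards.
function findRowByRank(csvText, rank) {
    var lines = csvText.split(/\r?\n/);
    for (var i = 0; i < lines.length; i++) {
        var cells = lines[i].split(",");
        if (cells[0] === String(rank)) {
            return cells;
        }
    }
    return null;
}

// Inside reader.onload you would then render just the one row:
//   var row = findRowByRank(reader.result, $("#rank").val());
//   if (row) {
//       $('#fileDisplayArea').append(
//           "<table><tr><td>" + row.join("</td><td>") + "</td></tr></table>");
//   }
```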

Is there anything I can do to optimize my code so the browser doesn't freeze while the file is being read? I tried splitting the file into pieces and then reassembling it (Blob), but that kept crashing my browser too. :(
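On the Blob idea: slicing can work if each slice carries over its possibly incomplete last line to the next read. A sketch under the assumption that rows are comma-separated with the rank in the first column (the helper names scanChunk and readInChunks are mine):

```javascript
// Scan one chunk of text, carrying over a possibly incomplete last line.
// Returns { leftover: <partial trailing line>, match: <matching row cells or null> }.
function scanChunk(leftover, chunkText, rank) {
    var lines = (leftover + chunkText).split(/\r?\n/);
    var carry = lines.pop(); // last line may be cut off mid-row
    for (var i = 0; i < lines.length; i++) {
        var cells = lines[i].split(",");
        if (cells[0] === String(rank)) {
            return { leftover: carry, match: cells };
        }
    }
    return { leftover: carry, match: null };
}

// Browser wiring (only runs in a browser): read 1 MB slices with
// FileReader so no single read has to hold the whole 40 MB string.
function readInChunks(file, rank, onMatch) {
    var CHUNK = 1024 * 1024;
    var offset = 0;
    var leftover = "";
    var reader = new FileReader();

    reader.onload = function () {
        var result = scanChunk(leftover, reader.result, rank);
        leftover = result.leftover;
        if (result.match) {
            onMatch(result.match);   // found the row, stop reading
        } else if (offset < file.size) {
            readNext();              // keep going with the next slice
        }
    };

    function readNext() {
        var slice = file.slice(offset, offset + CHUNK);
        offset += CHUNK;
        reader.readAsText(slice);
    }
    readNext();
}
```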

(Excuse me if anything in my question was not clear; my English is bad.) DEMO CODE: JSFiddle

asked Dec 05 '25 by user3743128

1 Answer

Since I found this thread while researching how to parse a very big CSV file in the browser, I wanted to add that my final solution was to use Papa Parse:

http://papaparse.com/
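For completeness, a sketch of how Papa Parse can stream a large File row by row instead of loading it into one string. The option names (`step`, and `parser.abort()`) reflect my reading of the Papa Parse docs; treat them and the helper names as assumptions to verify against papaparse.com/docs:

```javascript
// Pure helper: does a parsed row's first field equal the requested rank?
function rankMatches(row, rank) {
    return String(row[0]) === String(rank);
}

// Browser wiring: Papa Parse calls `step` once per parsed row, so the
// 40 MB file never has to exist as a single string in memory.
function findRankWithPapa(file, rank, onMatch) {
    Papa.parse(file, {
        step: function (results, parser) {
            if (rankMatches(results.data, rank)) {
                parser.abort();        // stop parsing once the row is found
                onMatch(results.data);
            }
        }
    });
}
```

The docs also describe a `worker: true` option to move parsing off the UI thread entirely, which may help for files this size.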

answered Dec 07 '25 by Nukey