I am currently trying to send a very long CSV file to be processed in the browser.
I would like to stream it to the client, as reading it whole would exceed the string size limit and take up too much memory on the server.
I have tried:
app.get('/test', (req, res) => {
    let csvStream = byline(fs.createReadStream('./resources/onescsv.csv'));
    csvStream.on('data', (line) => {
        csvStream.pipe(res);
    });
    csvStream.on('end', () => {
        res.render('./test/test', {
            css: ['test/test.css'],
            js: ['test/test.js']
        });
    });
});
When I do the above, the stream is sent to the client but the raw CSV is rendered to the page, which is not what I want. I would like the client-side JavaScript to receive the stream chunk by chunk and process it as it arrives, e.g. to put the rows into a table. How can I do this?
Well firstly, you don't want to call render in the same request you're using to pipe data into the response. You'd want to split these into two routes.
To render the page, just have your default route send down the page HTML:
app.get('/', (req, res) => {
    res.render('./test/test', {
        css: ['test/test.css'],
        js: ['test/test.js']
    });
});
Then, to stream the CSV, add a separate API route on the server side:
app.get('/api/csv', (req, res) => {
    res.set('Content-Type', 'text/csv');
    // Pipe the file straight into the response. pipe() ends the
    // response automatically when the file stream ends, so no
    // explicit res.end() call is needed. (Piping byline's output
    // here would run the lines together, since byline strips the
    // line endings from each chunk it emits.)
    fs.createReadStream('./resources/onescsv.csv').pipe(res);
});
Then on your client, in your default HTML page, either on load (or hooked up to a button press), fire an AJAX request to pull down the CSV data, e.g. using jQuery:
$.get('/api/csv', data => {
// do something with CSV data
});
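Note that $.get only fires its callback once the whole response has arrived, so it doesn't give you the chunk-by-chunk processing you asked about. If you want to handle the data as it streams in, you can read the response body incrementally with the Fetch API's ReadableStream reader. Below is a sketch: makeLineSplitter and addRowToTable are hypothetical helpers (not from any library), and it assumes the /api/csv route above.

```javascript
// Hypothetical helper: accumulates incoming text chunks and returns
// complete lines, buffering any trailing partial line until the next
// chunk (a chunk boundary can fall in the middle of a CSV row).
function makeLineSplitter() {
    let buffer = '';
    return {
        push(chunk) {
            buffer += chunk;
            const lines = buffer.split('\n');
            buffer = lines.pop(); // keep the trailing partial line
            return lines;
        },
        flush() {
            const rest = buffer;
            buffer = '';
            return rest ? [rest] : [];
        }
    };
}

// Browser-side usage (sketch): read the response body chunk by chunk.
async function streamCsv() {
    const res = await fetch('/api/csv');
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    const splitter = makeLineSplitter();
    for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        for (const line of splitter.push(decoder.decode(value, { stream: true }))) {
            addRowToTable(line); // e.g. append a <tr> per CSV row
        }
    }
    splitter.flush().forEach(addRowToTable); // handle a final unterminated line
}
```

This way each row can be appended to the table as soon as it arrives, rather than waiting for the whole file to download.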