I'm getting an error when downloading multiple files from an SFTP site using the ssh2-sftp-client
library. The error suggests that the Node stream is not being cleaned up after each download completes, which is causing a memory leak in my app. In production I need to download thousands of files, so this leak is substantial. How can I close the stream so that its memory is released after each file is downloaded?
Code:
const Client = require('ssh2-sftp-client');
const sftp = new Client();
sftp.connect({
  host: '195.144.107.198',
  port: 22,
  username: 'demo',
  password: 'password'
}).then(async () => {
  const fileNames = ['readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt'];
  // Loop through the filenames
  for (let i = 0; i < fileNames.length; i++) {
    // Download the files sequentially (one at a time)
    const fileName = fileNames[i];
    await new Promise((resolve, reject) => { // <-- note the await
      sftp.get(fileName, true, 'utf8').then((stream) => {
        let text = '';
        stream
          .on('data', (d) => { text += d; })
          .on('end', () => {
            console.log('Successfully downloaded file', i);
            resolve(text);
          });
      }).catch((err) => {
        console.log('Error downloading file', err);
        reject(err.message);
      });
    });
  }
  sftp.end();
});
Note: this code uses a public SFTP test server, so the credentials are not sensitive and you can run it yourself. Found here: https://www.sftp.net/public-online-sftp-servers
Error (occurs after file #9 is downloaded):
(node:44580) MaxListenersExceededWarning: Possible EventEmitter memory leak detected.
11 error listeners added. Use emitter.setMaxListeners() to increase limit
Answer: You said you're attempting to download thousands of files in production, but each sftp.get call adds another error listener that is never removed (the warning itself reports "11 error listeners added"). Node doesn't hard-limit listeners; by default an EventEmitter just prints this warning once more than 10 listeners are attached for a single event, and that threshold (events.defaultMaxListeners) is 10.
See:
https://nodejs.org/dist/latest-v8.x/docs/api/events.html#events_eventemitter_defaultmaxlisteners
https://github.com/nodejs/help/issues/1051
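If you only need to silence the warning, the standard events API lets you raise the threshold. A minimal sketch (someEmitter is a stand-in for whatever emitter is accumulating listeners; this is a workaround only, since the listeners still pile up):

const { EventEmitter } = require('events');

// Raise the process-wide default threshold for the warning
EventEmitter.defaultMaxListeners = 20;

// Or raise it on one specific emitter, if you have a reference to it:
// someEmitter.setMaxListeners(20);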
If you want to fix this properly, I'd recommend implementing a queue and downloading no more than 10 files at a time.
Something like:
const Client = require('ssh2-sftp-client');
const sftp = new Client();
sftp.connect({
  host: '195.144.107.198',
  port: 22,
  username: 'demo',
  password: 'password'
}).then(async () => {
  // Treat the files array as a queue instead of a plain array
  const fileQueue = ['readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt', 'readme.txt'];
  // Use this function to pull files off the queue
  const downloadFilesFromQueue = (fileName) =>
    new Promise((resolve, reject) => {
      // Sanity check: an empty queue hands us undefined, so stop here
      if (!fileName) {
        return resolve();
      }
      sftp.get(fileName, true, 'utf8').then((stream) => {
        let text = '';
        stream
          .on('data', (d) => { text += d; })
          .on('end', () => {
            console.log('Successfully downloaded file', fileName);
            resolve(text);
          });
      }).catch((err) => {
        console.log('Error downloading file', err);
        reject(err.message);
      });
    })
      // Handle errors so one failed file doesn't stop the whole queue
      .catch((err) => console.log(err.message))
      // Get the next file from the queue
      .then(() => {
        // If there are no more items in the queue, this chain is done
        if (!fileQueue.length) {
          return;
        }
        // Return the recursive call so Promise.all below actually waits
        // for the whole queue to drain before sftp.end() is called
        return downloadFilesFromQueue(fileQueue.shift());
      });
  // Track all unresolved promises
  const unresolvedPromises = [];
  // Request no more than 10 files at a time
  for (let i = 0; i < 10; i++) {
    // Use the file at the front of the queue
    const fileName = fileQueue.shift();
    unresolvedPromises.push(downloadFilesFromQueue(fileName));
  }
  // Wait until the queue is emptied and all file-retrieval promises
  // have resolved
  await Promise.all(unresolvedPromises);
  // done
  sftp.end();
});
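If you need this in more than one place, the same idea can be factored into a small generic promise pool. A minimal sketch, assuming each worker call returns a promise (promisePool and downloadOne are illustrative names, not part of ssh2-sftp-client):

// Run `worker` over `items` with at most `limit` calls in flight at once
const promisePool = (items, limit, worker) => {
  const queue = [...items];
  // Each chain pulls its next item only after the current one finishes
  const next = () =>
    queue.length
      ? Promise.resolve(worker(queue.shift())).then(next)
      : Promise.resolve();
  return Promise.all(Array.from({ length: Math.min(limit, queue.length) }, next));
};

// Usage: promisePool(fileQueue, 10, (fileName) => downloadOne(fileName));

Because each chain only starts a new download once its previous one resolves, at most `limit` transfers (and their listeners) exist at any moment.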