I'm currently experimenting with an idea for a project. For this, I had to write a script that recursively downloads a directory to my hard drive over FTP.
I soon found out that when dealing with FTP and Node.js, jsftp is the way to go, as long as you don't need to support SFTP.
Recursively downloading an FTP directory
Here are the steps I had in mind when writing my script:
1. Authenticate via FTP
2. Walk remote directory and save files/directories to an array
3. Iterate through array and download files
Points 1 and 2 worked intuitively: I wrote a recursive function that uses jsftp's `ftp.ls` function to walk the remote directory.
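For reference, here's a minimal sketch of such a walk. The host, credentials, and the remote path are placeholders of my own; it assumes jsftp's `ls` calls back with entries that each have a `name` and a `type`, where a type of 1 marks a directory:

```js
var JSFtp = require('jsftp')

var ftp = new JSFtp({
  host: 'ftp.example.com', // placeholder host and credentials
  user: 'username',
  pass: 'password'
})

// Recursively list `remotePath`, pushing every file it finds
// onto `remoteFiles`, then call back with the filled array.
function walk (remotePath, remoteFiles, cb) {
  ftp.ls(remotePath, function (err, entries) {
    if (err) return cb(err)

    // Handle the entries one at a time so the recursion into
    // subdirectories finishes before we call back.
    function next (i) {
      if (i === entries.length) return cb(null, remoteFiles)
      var entry = entries[i]
      var entryPath = remotePath + '/' + entry.name
      if (entry.type === 1) {
        // A directory: walk into it before moving on.
        walk(entryPath, remoteFiles, function (err) {
          if (err) return cb(err)
          next(i + 1)
        })
      } else {
        remoteFiles.push({ path: entryPath })
        next(i + 1)
      }
    }
    next(0)
  })
}

walk('/some/remote/dir', [], function (err, remoteFiles) {
  if (err) return console.log(err)
  console.log('found ' + remoteFiles.length + ' files')
})
```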
I thought I could now just iterate through the directory listing I had saved to an array and download each file. In code, it looked something like this:
```js
remoteFiles.forEach(function (file) {
  ftp.get(file.path, targetPath, function (err) {
    if (err) return cb(err)
    console.log('downloaded ' + file.path)
    cb()
  })
})
```
FTP error: ECONNREFUSED
However, nothing really happened. All I got was a big, confusing (to me at least) error called ECONNREFUSED. I did a bit of Googling and read some stuff about firewalls, but that wasn't the issue.
The issue was that I was creating too many simultaneous connections to the FTP server: forEach fires off all the downloads at once, each download opens its own data connection, and at some point the server starts refusing new ones.
Limiting FTP connections with jsftp
jsftp doesn't have built-in functionality for limiting the number of concurrent connections, but it's easy to add with the async library. async has a method called eachLimit that does just what we want: it iterates over a collection while keeping at most a given number of asynchronous operations in flight. Everything works fine if we do something like this:
```js
var async = require('async')

async.eachLimit(remoteFiles, 1, function (file, cb) {
  ftp.get(file.path, targetPath, function (err) {
    if (err) return cb(err)
    console.log('downloaded ' + file.path)
    cb()
  })
}, function (err) {
  if (err) return console.log(err)
  console.log('all files were downloaded')
})
```
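The second argument is the concurrency limit. A limit of 1 downloads the files strictly one after another; if your server allows a handful of simultaneous connections, raising it to something like 3 or 5 should speed things up without triggering the error again.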
Of course, there's some more logic required for dealing with "downloading" directories, but the eachLimit approach should solve your problems with the ECONNREFUSED error in this case.
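For completeness, here's one way the directory part could look. This is a sketch under my own assumptions: the walk recorded directories as well as files (each entry carrying the `type` flag) in a hypothetical `remoteItems` array, `localRoot` is a hypothetical local target directory, and parents were visited before their children, so a plain `fs.mkdirSync` is enough:

```js
var fs = require('fs')
var path = require('path')

// Split the walked entries into directories and files.
var dirs = remoteItems.filter(function (item) { return item.type === 1 })
var files = remoteItems.filter(function (item) { return item.type !== 1 })

// Create the local directory tree first. Parents were listed before
// their children during the walk, so mkdirSync never hits a missing parent.
dirs.forEach(function (dir) {
  var localDir = path.join(localRoot, dir.path) // localRoot is hypothetical
  if (!fs.existsSync(localDir)) fs.mkdirSync(localDir)
})

// Then run the eachLimit download from above over `files`
// instead of the full listing.
```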