Since Arvixe offers free website transfer to all new clients, we often face the dilemma of recursively moving large numbers of files from a remote FTP server (the client's old hosting provider). On Linux machines, the command line FTP client doesn't do a good job of providing a way to do this. The mget command can fetch multiple files, but only those inside a single directory; it does not descend into subdirectories.
According to some users, on some flavors of the FTP client the following may work:
ftp> recursive mget *
However, none of our servers (all running CentOS or RHEL) support the "recursive" call. Another option we have found that works well is this:
wget -r ftp://user:pass@domain…
The only issue with this method is that if wget is interrupted or the server goes down, the transfer starts over from the beginning by default. However, wget can be told to skip files that have already been downloaded and move on, which effectively lets a re-run resume the job.
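A sketch of a restartable transfer using wget's `--no-clobber` option, which skips files that already exist locally (the username, password, and host below are placeholders, not a real account):

```shell
# Recursively download the old site over FTP.
#   -r              recurse into subdirectories
#   -nc             (--no-clobber) skip files already present locally,
#                   so re-running the same command after an interruption
#                   picks up roughly where it left off
#   --ftp-user /    credentials for the old host (placeholders here)
#   --ftp-password
wget -r -nc --ftp-user=user --ftp-password=pass ftp://ftp.example.com/
```

Passing the credentials as options also avoids putting the password in the URL. Alternatively, `-c` (`--continue`) resumes partially downloaded files, though wget will not accept `-c` and `-nc` in the same invocation, so pick one depending on whether interrupted files or already-finished files are the bigger concern.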