Recursively get files from an FTP server

Since Arvixe offers free website transfers to all new clients, we often face the dilemma of moving large numbers of files recursively from a remote FTP server (the client’s old hosting provider). On Linux machines, the command-line FTP client doesn’t provide a good way to do this: the mget command can fetch multiple files, but only those inside a single directory; it does not descend into subdirectories.
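The non-recursive behavior comes from how mget matches names: its * is a glob that expands one directory level only, just like a shell glob. A quick local sketch of that matching behavior (demo/ is a throwaway example directory):

```shell
# '*' expands names in one directory only, like mget's remote glob;
# nothing under demo/sub/ is matched.
mkdir -p demo/sub
touch demo/a.txt demo/sub/b.txt
MATCHED=$(echo demo/*)   # glob stops at the first level
echo "$MATCHED"          # prints: demo/a.txt demo/sub
rm -r demo
```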

According to some users, on some flavors of the FTP client, the following may work:

ftp> recursive mget *

However, none of our servers (all running CentOS or RHEL) support the “recursive” command. Another option we have found that works well is this:

wget -r ftp://user:pass@domain…

The only issue with this method is that if wget is interrupted or the server goes down, the transfer does not resume on its own. However, wget’s -c flag continues partially downloaded files, and -nc (no-clobber) skips files that are already present locally, so a re-run can pick up roughly where it left off.

Relevant Link: http://forums.devshed.com/ftp-help-113/recursive-mget-with-command-line-ftp-37472.html

Posted under Linux Server Admin

4 Comments on Recursively get files from an FTP server

  1. If you need to use a username with an ‘@’ in it, replace the @ with %40 and the login will go through just fine.
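Building on the tip above, here is a small sketch that percent-encodes the ‘@’ in a username before handing the URL to wget. The username, password, and host are placeholders; the command is left commented out so nothing is fetched by accident:

```shell
# In an FTP URL, '@' separates the credentials from the host, so a
# literal '@' inside the username must be encoded as %40.
FTP_USER='backup@olddomain.com'    # placeholder username containing '@'
FTP_PASS='secret'                  # placeholder password
FTP_HOST='ftp.oldhost.example'     # placeholder host

# Replace every '@' in the username with %40
ENCODED_USER=$(printf '%s' "$FTP_USER" | sed 's/@/%40/g')

URL="ftp://${ENCODED_USER}:${FTP_PASS}@${FTP_HOST}/"
echo "$URL"
# wget -r "$URL"   # uncomment to start the actual transfer
```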

  2. Use the flag -nH to stop wget from creating a directory named after the host before fetching the files.
    Use the flag -c to avoid re-downloading files that are already complete (and to continue any that are half done).
    Use the flag -l 0 to make wget descend into every directory all the way. By default, wget only goes 5 levels deep.
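Putting the flags above together, a full invocation might look like the sketch below. The credentials and host are placeholders, and the command is echoed rather than executed so you can check it first:

```shell
# -r    recurse into directories
# -c    resume partially downloaded files on a re-run
# -nH   don't create a top-level directory named after the host
# -l 0  no depth limit (wget normally stops 5 levels down)
CMD='wget -r -c -nH -l 0 ftp://user:pass@ftp.oldhost.example/'
echo "$CMD"
# eval "$CMD"   # uncomment to start the transfer
```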

  3. Sandro says:

    Thanks very much!
    It works fine.

  4. Hello Arvand Sabetian,
    Thank you very much for this post. I was having no end of trouble getting data from my old host over to my new VPS. I tried many things, including the commands below, but nothing worked:
    # ftp host
    # ftp> mget *
    The problem was the sheer number of directories and files (almost 500,000+) on my old server. The root-level download completed without a problem, but once the transfer went into the subdirectories it terminated with a “no such folder” message.

    I am very thankful for your post, where I found the method of using wget for a recursive FTP transfer. It saved me days of work, and the server-side transfer completed in less than 6 hours.

    I will keep you in mind whenever I have some spare time.
    Do you have any experience doing a recursive download from the ftp> prompt itself (without creating the folders for mget * by hand)?

    Please let me know at the email posted with this message. :)

    Thanks
    Farhan Islam
