The Linux Page

wput, a really bad tool?

The other day, I set up a small script to send files to an FTP server. It had been a long time since I had done anything like that and I was not really thrilled, but I'm working for a client.

The FTP transfer worked pretty well at first. The upload ran for a while. According to the wput documentation, if the connection is lost, it automatically retries and resumes the transfer. Instead, it got stuck on me 3 times. For hours. No retry. Zilch.

So I updated the files on my end and started the transfer again. I had to wait forever; the slow transfer is not directly wput's fault, but when it gets stuck mid-way, it's bad.

The transfer finally ended... and the website was completely hosed.

I looked into it and had to re-upload many files. I was starting to wonder: what the heck is wput doing?! Or are FTP transfers to GoDaddy really that bad?! (That would be surprising, wouldn't it?)

So... I re-transferred 3 modules for the site and it started to work again.

What had happened?

I tried a new set of transfers today because I was working on a module and wanted to see what had happened the other day. First transfer, sure enough, it failed. Big error! I checked into it, and the fact is that only the end of each file had been uploaded! I used --reupload, but that option only kicks in when the local and remote files are the same size. Otherwise, wput stays in "finish that transfer" mode and appends to whatever is already on the server.
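For reference, this is roughly how I was invoking it (the host, credentials, and paths here are placeholders, not my client's actual setup):

```shell
# Upload a local tree to an FTP server with wput.
# --reupload: re-send a file instead of resuming it, but note that
# wput only applies this when local and remote sizes match; a
# partially-uploaded (different-size) file still gets "resumed".
wput --reupload -R local_dir/ ftp://user:password@ftp.example.com/public_html/
```

That size-matching caveat is exactly what bit me: the broken files on the server were smaller than my local copies, so wput "resumed" them instead of re-uploading.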

So I had to transfer everything "twice" (not really twice, since the first time only the end part of each file was sent).

Then I switched to ncftpput, since that tool actually works. It is really like a graphical FTP client, only you use it on the command line or in a script.
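A minimal ncftpput invocation looks like this (again, the host, user, and paths are made-up examples):

```shell
# ncftpput <options> <remote-host> <remote-dir> <local-files...>
# -u/-p: credentials, -R: recurse into directories, -m: create the
# remote directory if it does not exist yet.
ncftpput -u user -p password -m -R ftp.example.com /public_html local_dir/*
```

Unlike my wput experience, a failed ncftpput exits with a non-zero status, so a script can actually detect the failure and retry.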

You can also use ncftpget and ncftp. Or even the aftp library if you're writing code and don't mind an old Unix library... (it uses Unix pipes!)

If you need and/or want a secure transfer over SSL/TLS, then look into curl.
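With curl, an upload over FTP with TLS required can be sketched like this (hypothetical host and credentials):

```shell
# -T: upload the given local file to the URL
# --ssl-reqd: require the control and data connections to be
#             upgraded to TLS, or fail instead of falling back
#             to plain FTP
curl --ssl-reqd -T somefile.tar.gz \
     --user user:password \
     ftp://ftp.example.com/public_html/
```

If the server only speaks implicit FTPS, you would use an ftps:// URL instead.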

Although you could also use a shell script with good ol' ftp and a <<EOF heredoc feeding commands to it. Good luck with that! (It works, but it really isn't practical!)
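For the curious, that heredoc trick looks something like this (placeholder host and credentials; note that plain ftp sends your password in clear text):

```shell
#!/bin/sh
# -n disables auto-login so we can supply credentials
# ourselves with the "user" command inside the heredoc.
ftp -n ftp.example.com <<EOF
user myuser mypassword
binary
cd /public_html
put somefile.tar.gz
bye
EOF
```

The impractical part: ftp gives you no usable exit status for individual transfers, so your script cannot easily tell whether the put actually succeeded.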