0 votes
8.7k views
in .NET FTP by (360 points)
We bundle edtFTPnet/PRO into our automated deployment product, Octopus Deploy. A number of customers have reported problems with our FTP sync feature when using Windows Azure websites with FTPS.

I've posted a gist containing:

1) The code to reproduce (just enter a license key and change the sync directory) - I even included credentials for an active Azure website so you can fully reproduce
2) The output I get (with logging enabled)

https://gist.github.com/PaulStovell/22a ... 9d64ca24a5

It seems that when issuing LIST commands, connections eventually time out. Sometimes they work, but if I terminate the app part way through the connection and then run it again, I'm able to reproduce it 9 times out of 10.

Now I don't think this is limited to edtFTPnet/PRO, as FileZilla encounters similar sporadic errors when uploading files and sometimes freezes when listing.

Response: 550 The supplied message is incomplete. The signature was not verified.
Error: File transfer failed after transferring 2,076,672 bytes in 26 seconds

Using FTP instead of FTPS also seems to be fine, so I'm assuming this is a problem with the FTP server that Windows Azure uses for website publishing. But I wanted to share it with you in case there's a magical setting that will make it "just work" :)

Some more info on what errors people are seeing is here:

https://octopus-deploy.tenderapp.com/di ... o-redeploy

And a question: we currently set an 8-hour Timeout on the FTP connection's sockets, on the assumption that if people were uploading large files we'd want to give them plenty of time. But this means that failures like the ones above result in our app appearing to "hang" rather than time out. Is it safe to set the Timeout property to a small value, say 10 minutes, without it preventing file transfers that might take an hour to complete?

12 Answers

0 votes
by (51.1k points)
Timeouts on listings (and file-transfers) in FTPS are often due to incorrectly configured firewalls. How are you using Azure? Are you renting a VM on which you run an FTP server that you control, or are you using some inbuilt FTPS service that you don't control? If it's the former, then perhaps you've made a mistake in your port-forwarding configuration of the router. If it's the latter then it looks like the mistake is MS's.

- Hans (EnterpriseDT)
0 votes
by (360 points)
Hi Hans,

Windows Azure web sites have a built-in FTPS server that Microsoft manages, which you can publish to. I don't get any control over the firewall.

Paul
0 votes
by (360 points)
Also, can you comment on this:

> And a question: we currently set an 8-hour Timeout on the FTP connection's sockets, on the assumption that if people were uploading large files we'd want to give them plenty of time. But this means that failures like the ones above result in our app appearing to "hang" rather than time out. Is it safe to set the Timeout property to a small value, say 10 minutes, without it preventing file transfers that might take an hour to complete?

Paul
0 votes
by (51.1k points)
OK. How have you found their support? Are they responsive? I guess they're not since you're contacting us :). If I were them I'd be analyzing the distribution of port-forwarding settings in my routing infrastructure.

Apart from that, the first thing I'd suggest is to use SFTP instead of FTPS if that's possible. SFTP doesn't suffer from these sorts of problems since it does everything on a single connection.

Failing that, you could try setting the SecureFTPConnection.UseUnencryptedCommands flag. This causes all FTP commands (including their arguments) to be sent in plain text, while all transfers and listings are still done over SSL. This is clearly less secure than standard FTPS, but more secure than plain FTP since transfers are encrypted. Note that not all FTPS servers support this mode.
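
Setting up either option would look roughly like this. Treat it as a minimal sketch only: the host name and credentials are placeholders, and you should check the member names (ServerAddress, Protocol, FileTransferProtocol, UseUnencryptedCommands) against the documentation for the version you're using.

```csharp
using EnterpriseDT.Net.Ftp;

// Sketch of the two options above, assuming edtFTPnet/PRO member names.
var ftp = new SecureFTPConnection
{
    ServerAddress = "example.ftp.azurewebsites.windows.net", // hypothetical host
    UserName = "mysite\\deployuser",                         // hypothetical credentials
    Password = "********"
};

// Option 1: SFTP, if the server supports it - everything runs over a single connection.
ftp.Protocol = FileTransferProtocol.SFTP;

// Option 2: FTPS, but with commands in clear text while transfers/listings stay encrypted.
// ftp.Protocol = FileTransferProtocol.FTPSExplicit;
// ftp.UseUnencryptedCommands = true;

ftp.Connect();
```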

By the way, was it intentional that you included your credentials in the gist?

- Hans (EnterpriseDT)
0 votes
by (360 points)
Looks like unencrypted commands aren't supported by Azure's FTPS server.

I've posted the same issue here:

http://social.msdn.microsoft.com/Forums ... 1acde94c5f

Including the credentials was intentional; I wanted you to be able to run it :) I'll delete that account in a few days.

Any comment on the socket timeout setting and whether it will break big/slow uploads?
0 votes
by (51.1k points)
A timeout of 10 minutes seems a bit long. Have you considered setting the timeout to a few seconds, catching the timeout and then retrying? Since the problem is intermittent, you may have success after retrying a few times.
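
Something along these lines, perhaps. It's only a rough sketch: the Timeout value is in milliseconds, the file paths are placeholders, and catching Exception stands in for whatever specific timeout exception your logging shows.

```csharp
// Rough sketch: a short socket timeout plus a catch-and-retry loop around an upload.
// ftp is a configured SecureFTPConnection (see the earlier sketch).
ftp.Timeout = 10000; // a few seconds instead of 8 hours (milliseconds)

for (int attempt = 1; ; attempt++)
{
    try
    {
        ftp.UploadFile(@"C:\package\site.zip", "site.zip"); // hypothetical paths
        break;
    }
    catch (Exception) // placeholder for the specific timeout exception type
    {
        if (attempt >= 3) throw;             // give up after a few attempts
        if (!ftp.IsConnected) ftp.Connect(); // reconnect before retrying
    }
}
```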

- Hans (EnterpriseDT)
0 votes
by (360 points)
At first Microsoft suggested it was an issue with the service; now they are pointing it back to being a client issue:

http://social.msdn.microsoft.com/Forums ... acde94c5f/
0 votes
by (51.1k points)
OK. I'm running your example at the moment. I should expect to see it hang, right? I've run it about 10 times without seeing a hang. How many times can you run it without a hang?

I just realized that I misunderstood your question regarding timeouts. The timeout that you can set is not related to the time a transfer takes. It signifies the length of time the socket should wait while expecting a response. Generally, if a socket doesn't respond within a few seconds then it probably isn't going to. SecureFTPConnection will automatically reconnect and resume transfers, so even if it does prematurely terminate a transfer because of an unusually slow server response, it should be able to reconnect and resume the transfer automatically. Of course you have to have the retry feature enabled for this to take place.
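
Enabling that looks roughly like the following sketch; check the exact property names (RetryCount and RetryDelay) against the API reference for the version you're using.

```csharp
// Sketch: fail fast on an unresponsive socket and let the connection retry/resume transfers.
// ftp is a configured SecureFTPConnection; RetryCount/RetryDelay are assumed property names.
ftp.Timeout = 20000;    // socket timeout in milliseconds
ftp.RetryCount = 3;     // reconnect and resume a broken transfer up to 3 times
ftp.RetryDelay = 5000;  // wait 5 seconds between retry attempts
```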

- Hans
0 votes
by (51.1k points)
It just occurred to me that if Codeplex works, it may in fact be because it automatically retries silently when a timeout occurs. So the fact that one client works doesn't necessarily mean that MS doesn't have a problem. As I mentioned previously, you can configure SecureFTPConnection to automatically reconnect/retry/resume transfers when a connection breaks. It won't automatically retry listings unfortunately, but it wouldn't be too much work to do that manually by putting a loop and a try-block around the call to GetFileInfos.
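
Something like this, as a sketch of that manual retry. The retry count is arbitrary, and Exception stands in for the specific exception type your logging reports.

```csharp
// Sketch of the manual retry described above: wrap GetFileInfos in a loop,
// reconnecting on failure. ftp is a configured SecureFTPConnection.
FTPFile[] files = null;
for (int attempt = 1; files == null; attempt++)
{
    try
    {
        files = ftp.GetFileInfos();
    }
    catch (Exception) // placeholder for the specific timeout exception type
    {
        if (attempt >= 3) throw;             // give up after a few attempts
        if (!ftp.IsConnected) ftp.Connect(); // re-establish the session and try again
    }
}
```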

- Hans (EnterpriseDT)
0 votes
by (360 points)
Interesting. I've just run it three times, and it hung all three times for me (usually on the third LIST command).

Looking at NETSTAT I can see the connection is ESTABLISHED while hanging. What else can I look at to diagnose the problem?
