That does seem to stop at a random point. I tried it again and the downloaded file was 121,397,248 bytes; another try gave 121,896,960 bytes. (wget and other FTP client software consistently download the whole file.)
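(In case it helps, this is roughly the comparison I'm doing - credentials and file paths genericised here, not our real ones:

    iftp "ftp://user:password@testserver"
    copy ftp:/bigfile.bin D:\temp\bigfile_tcc.bin
    wget ftp://user:password@testserver/bigfile.bin -O D:\temp\bigfile_wget.bin
    dir D:\temp\bigfile_*.bin

The wget copy is always the full size; the TCC copy is short by a varying amount.)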
Note that 'testserver' is on my local LAN with no firewall between it and my PC (I'm using it for unit testing our own software), so /P1 shouldn't be necessary.
It's just a standard VSFTPD FTP server running on Ubuntu.
Using IFTP /V shows it reporting the correct file size and no errors - but the resulting file is still too small.
See screenshot below
Oddly, I've just set up a test FTP server at Digital Ocean using an *identical* setup to our internal test server (same Linux version, same vsftpd version, same config, etc.). That one seems to work fine. The only significant difference I can think of is that our internal test server is on our internal gigabit network, whereas the connection to Digital Ocean is obviously slower.
So, I tried setting bandwidth throttling to 5 Mbps on the VM where the local test server is running - hey presto, it worked fine (but obviously a lot slower): the whole file downloaded and the SHA1 checksum matches, so the download is correct.
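(For anyone trying to reproduce this: the throttle could equally be applied on the Ubuntu box itself with tc instead of in the hypervisor, and the checksums compared with sha1sum on the server and certutil on the Windows side. The interface name and file paths below are just placeholders:

    sudo tc qdisc add dev eth0 root tbf rate 5mbit burst 32kbit latency 400ms
    sha1sum /srv/ftp/bigfile.bin
    certutil -hashfile D:\temp\bigfile_tcc.bin SHA1

"sudo tc qdisc del dev eth0 root" removes the throttle again.)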
So, it looks like it may be a timing issue somewhere.
Odd. Here, I can download at 1 Gbps using wget, command-line FTP, FileZilla, etc., but if I use TCC, the file is truncated at a random point unless I apply severe bandwidth throttling, in which case it works fine.
(FWIW, I've tried changing our unit tests to download the large file using IPWorks ftp::download(), and that works fine for us at 1 Gbps, so if TCC is using IPWorks for that as well, something odd is going on...)