How to? FTP server

Is there a catalog file available which we could compare against a saved copy to know what is new? I mean something similar to the http://jpsoft.com/downloads/v##/tcmd###.aiu files, but instead of two separate files for each current version, a single all-encompassing file, so the search could be automated. For example, I have effectively replaced one of my machines with a 64-bit machine, but never downloaded the older 64-bit installers...
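
For illustration, here is a minimal sketch of the automated check I have in mind, assuming only that the .aiu file is plain text that changes when a new build is posted. The version numbers in the URL and the cache file name are hypothetical placeholders of my own, not actual JP Software paths:

```python
# Sketch: detect a new release by comparing the published .aiu
# update-information file against a locally saved copy.
# The URL is the pattern quoted above with hypothetical version
# numbers filled in; adjust it for the version you track.
import urllib.request
from pathlib import Path

AIU_URL = "http://jpsoft.com/downloads/v28/tcmd.aiu"  # hypothetical example
CACHE = Path("tcmd.aiu.saved")                        # my own file name

with urllib.request.urlopen(AIU_URL) as resp:
    current = resp.read()

if not CACHE.exists() or CACHE.read_bytes() != current:
    print("Update information changed - a new build may be available.")
    CACHE.write_bytes(current)  # remember this version for the next check
else:
    print("No change since the last check.")
```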
 
Moving away from automatable procedures... many years ago, JP Software support published a batch file to create such a catalog.
 
Many years ago, JP Software distributed software on its FTP site. But that was many years ago; FTP is largely obsolete (and wholly dangerous), and Take Command has had a built-in updater for many years, which renders the point entirely moot.
 
Is FTP deprecated by the W3C, and if so, what is the suggested replacement? And why is the File Transfer Protocol more dangerous for file transfers than the Hypertext Transfer Protocol? Does HTTP provide a method to compare a downloaded file with the original, as is possible with FTP? Or is FTP more dangerous than HTTP because an FTP server's security against unauthorized intrusions that modify its downloadable files is inherently weaker than an HTTP server's against an analogous attack?

TC's built-in updater is fine for those who have an unmetered broadband internet connection, and for those who only have a single-copy license; but if you have copies on multiple machines and a slow or metered internet connection, it requires multiple downloads of the same file, an obvious waste of bandwidth and download volume. Thanks to the update information (.aiu) files, I can minimize the download volume even with HTTP, but I must depend on the hash codes in the .aiu files as the sole means to verify download accuracy; I cannot compare with the originals.
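
That hash check is at least easy to script. Here is a minimal sketch, assuming the .aiu file is INI-style text with an MD5 entry per update section (my reading of the Advanced Installer update format -- verify against an actual copy); the file names are hypothetical:

```python
# Sketch: verify a downloaded installer against the MD5 hash listed
# in a saved .aiu update-information file. Assumes the .aiu is
# INI-style text with an "MD5" key, which should be checked against
# a real copy of the file before relying on this.
import configparser
import hashlib
from pathlib import Path

def file_md5(path: Path) -> str:
    """Hash the file in chunks so a large installer is not read into RAM at once."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

aiu = configparser.ConfigParser()
aiu.read("tcmd.aiu")                  # previously saved update-information file
section = aiu.sections()[0]           # hypothetical: take the first update entry
expected = aiu[section]["MD5"].lower()

actual = file_md5(Path("tcmd_setup.exe"))  # hypothetical installer name
print("OK" if actual == expected else f"MISMATCH: {actual} != {expected}")
```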
 
My mistake - I referenced the W3C (World Wide Web Consortium), which issues some of the Web standards, but the FTP standards were actually issued by the IETF (Internet Engineering Task Force) as RFC 959, also known as STD 9, and many others. The latest modification (a registry for FTP extensions) is RFC 5797 of March 2010. FTP is alive and well! There is also FTPS, per RFC 2228, which is already supported by TCMD. SCP could also work; I do not recall ever using it.

My wild guess is that the real issue with FTP is that a server can handle a virtually unlimited number of concurrent HTTP connections, and thus concurrently deliver downloads to all requestors (albeit more slowly), while the number of concurrent FTP connections is very limited, so a concentrated denial-of-service attack can totally prevent legitimate users from accessing the FTP server.
 
If you have an extra 40-50 hours a week (more like 80-100 once the hackers find you), I suggest you actually run a public FTP server for a while. You'll quickly learn some painful truths about the holes in FTP security.

I could write an additional version of Take Command every year with the time I have spent supporting < 10 (mostly non-paying) users on FTP. It is a ridiculously extravagant waste of my time for an infinitesimal benefit -- the public JP Software FTP server is gone, and it is never going to return. End of discussion.
 
