Author Archive

What Is Your High Availability Plan for Your SFTP Server?

Posted by Bob Luebbe on Wednesday, 2 January, 2013

As your organization and its trading partners become increasingly integrated, it is critical that file transfers are performed without delays or disruptions.

For instance, a document containing a batch of transactions could traditionally be delivered within a window of several hours without causing any issues. But today, in an effort to make business processes as efficient as possible, that same document must now be delivered within seconds.

Organizations are therefore taking a closer look at how they can provide the best high availability for their systems to minimize any potential disruptions to their file transfers.

Comparing High Availability Strategies

Many of the secure file transfers from your trading partners are probably going through an SFTP server in your organization’s network.  If that SFTP server were to go down (for example, due to a CPU or drive failure), then you would need to fail over to a redundant backup system to continue to service your trading partners, thus maintaining high availability.

Two common approaches for providing high availability for SFTP servers and most other applications are Active-Passive or Active-Active.

Active-Passive

With an Active-Passive approach, only one SFTP server will be active at a time to service your trading partners. A backup copy of the SFTP server would exist on your network as a “passive” system, meaning that it is installed and configured, but it is not actively running.

To prepare for a failure of the active SFTP server, it is important that you frequently replicate all settings and configuration files from the active SFTP server to the passive system. If the active SFTP server fails, the passive SFTP server can then be launched and your network reconfigured to point to this new system.
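
For example, a scheduled job could push the active server's settings to the passive host. Below is a minimal sketch in Python that shells out to rsync over SSH; the configuration path, host name, and schedule are hypothetical and would depend on where your particular SFTP server stores its settings.

    import subprocess

    # Hypothetical locations; adjust to wherever your SFTP server
    # actually stores its settings and trading partner accounts.
    CONFIG_DIR = "/opt/sftpserver/config/"
    PASSIVE_HOST = "passive-sftp.example.com"

    def replicate_config():
        """Mirror the active server's configuration to the passive host."""
        subprocess.run(
            ["rsync", "-az", "--delete",
             CONFIG_DIR,
             f"{PASSIVE_HOST}:{CONFIG_DIR}"],
            check=True,  # raise if the copy fails so the job can alert someone
        )

    if __name__ == "__main__":
        replicate_config()  # typically run from cron every few minutes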

In an Active-Passive configuration, the downtime for your trading partners (when a failure occurs) can be a few seconds or several hours depending on how the passive system is started.

The least efficient, and often slowest, implementation of an Active-Passive approach is to rely on human intervention to detect the failure and then manually start up the passive system. This could take several hours depending on when the outage is reported, the process required to start the passive system, and the complexity of configuring the network to route traffic to the new system.

A much better approach is to use a third-party system monitoring tool that immediately detects when the SFTP server fails and then automatically starts up the passive system. The result should be a much shorter disruption for trading partners of only a few seconds.
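
As a rough illustration of what such a monitoring tool does internally, the following Python sketch polls the SFTP port and starts the passive system when the active server stops answering. The host name and service name are hypothetical, and a real monitoring product would also handle rerouting traffic and alerting staff.

    import socket
    import subprocess
    import time

    ACTIVE_HOST = "sftp.example.com"   # hypothetical active server
    SFTP_PORT = 22
    CHECK_INTERVAL = 10                # seconds between health checks

    def sftp_is_up(host, port, timeout=5):
        """Return True if a TCP connection to the SFTP port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    while True:
        if not sftp_is_up(ACTIVE_HOST, SFTP_PORT):
            # Start the passive instance (hypothetical service name);
            # rerouting traffic to it is a separate, site-specific step.
            subprocess.run(["systemctl", "start", "sftp-passive"], check=True)
            break
        time.sleep(CHECK_INTERVAL)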

Active-Active, or Clustering

The next level in high availability is to use an Active-Active approach, also referred to as “clustering.”

With Active-Active, two or more installations of the SFTP server run concurrently, sharing the same set of configurations and trading partner accounts. The SFTP servers in the cluster are in constant communication with each other, so if one of the SFTP servers were to fail, the remaining systems in the cluster will continue to service the trading partners. This configuration provides the maximum high availability, since it is not dependent on human intervention or third-party tools to start up other systems.

If you need maximum up-time for your SFTP server, GoAnywhere Services™ now offers clustering.

Another advantage of an Active-Active configuration is that you can load balance the traffic over multiple systems, which is important when you need to service a large number of trading partners. This requires installing a load balancer like GoAnywhere Gateway™ in front of the cluster. Typically this load balancer will sit in your DMZ and will be your trading partners’ initial point of contact.

The Bottom Line

Both Active-Passive and Active-Active methods provide high availability for your SFTP server environment if configured properly. However, Active-Active will provide the maximum up-time because it keeps multiple SFTP servers running concurrently in a cluster, with the added benefit of load balancing.

How critical up-time is to your bottom line will be the best guide for determining which high availability approach best fits your organization.

 

Bob Luebbe

Bob Luebbe has worked in the IT field since 1985. During his career, he has worked in a wide variety of roles including software development, project management, consulting, and architecting large-scale applications. Bob has been with Linoma Software since 1994 and is currently serving as its Chief Architect. His main focus for the last several years has been developing technologies that help organizations automate and secure their file transfers, as well as protect data at rest through encryption and key management.


The Problem with FTP

Posted by Bob Luebbe on Thursday, 8 December, 2011

Moving files from one computer to another has been essential since the start of computing. Once we were able to connect computers via network protocols and cables, we worked to find ways to send data and files back and forth across those connections. The File Transfer Protocol (FTP) was developed specifically for this purpose. FTP also allowed programmers and system administrators to begin writing scripts to automate transfers based on events occurring in their software applications.

Of course, once the Internet came along, businesses found more reasons to exchange files with their business partners and an FTP solution became an everyday necessity.

The Problem

Early users of the Internet were universities sharing information that was freely available to the public.  But as businesses began using the Internet, the culture of data security changed.

Sending files containing confidential transactions and detailed customer information to trading partners was becoming a daily event. In response, hackers expanded from breaking into computers connected to the Internet to plucking confidential data from the streams now traveling across the public network.

Network firewalls were developed to block hackers from accessing individual networks, but the FTP protocol and its problems remained essentially the same.

The Solution

The flaws with standard FTP soon became obvious. To better secure file transfers, more secure protocols such as SFTP, FTPS, HTTPS, and PGP were developed for Internet file transfers. While these protocols allow greater protection for confidential data, many businesses and organizations still lack the understanding and the expertise to properly manage all of the processes that each of these methods involves. Some have turned to free PC-based tools such as FileZilla and Core FTP, but most organizations that do regular file transfers need a much more robust way to manage these critical processes.

This need triggered the development of what are called “managed file transfer” (MFT) solutions, which help IT staffs confidently manage and secure the file transfer process. In particular, these MFT products make it easy to set up and manage FTP workflows that can be scheduled, automated, and logged, with alerts for successful and/or failed connections and transmissions. A flexible MFT solution, like GoAnywhere, will work with most databases and run on multiple platforms.

So from the early days of sending files through rudimentary network connections to the fast-paced, Internet-driven business processes of today, finding the right FTP solution for your organization is more critical than ever. Your customers, trading partners, and compliance auditors are depending on you.


SFTP versus FTPS – What is the best protocol for secure FTP?

Posted by Bob Luebbe on Thursday, 20 October, 2011

An increasing number of our customers are looking to move away from standard FTP for transferring data, so we are often asked which secure FTP protocol we recommend. In the next few paragraphs, I will explain what options are available and their main differences.

The two mainstream protocols available for secure FTP transfers are SFTP (FTP over SSH) and FTPS (FTP over SSL). Both SFTP and FTPS offer a high level of protection, since they implement strong algorithms such as AES and Triple DES to encrypt any data transferred. Both options also support a wide variety of functionality, with a broad command set for transferring and working with files. The most notable differences between SFTP and FTPS are how connections are authenticated and managed.

With SFTP (FTP over SSH), a connection can be authenticated using a couple of different techniques. For basic authentication, you (or your trading partner) may just require a user id and password to connect to the SFTP server. It’s important to note that any user ids and passwords supplied over the SFTP connection will be encrypted, which is a big advantage over standard FTP.
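
In practice, a password-authenticated SFTP session might look like the sketch below, written with the widely used Paramiko library for Python. The host name, credentials, and file names are placeholders.

    import paramiko

    client = paramiko.SSHClient()
    # Load known host keys so the server's identity is verified;
    # in production, do not blindly auto-accept unknown hosts.
    client.load_system_host_keys()

    client.connect(
        "sftp.partner.example.com",  # hypothetical trading partner host
        port=22,
        username="acme_user",
        password="s3cret",  # sent over the encrypted SSH channel
    )

    sftp = client.open_sftp()
    sftp.put("invoice.edi", "/inbound/invoice.edi")  # upload a file
    sftp.close()
    client.close()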

SSH keys can also be used to authenticate SFTP connections in addition to, or instead of, passwords. With key-based authentication, you will first need to generate an SSH key pair (a private key and a public key) beforehand. If you need to connect to a trading partner’s SFTP server, you would send your SSH public key to them, which they will load onto their server and associate with your account. When you connect to their SFTP server, your client software uses your private key to sign the authentication request, and the server verifies that signature against your public key. If the signature verifies (along with any user id/password supplied), the authentication will succeed.
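
Key-based authentication only changes how the session is opened. Here is a minimal sketch, again assuming Paramiko and a placeholder key path:

    import paramiko

    # Load the private key generated earlier (e.g., with ssh-keygen);
    # the matching public key was already sent to the trading partner.
    key = paramiko.RSAKey.from_private_key_file("/home/acme/.ssh/id_rsa")

    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.connect(
        "sftp.partner.example.com",
        port=22,
        username="acme_user",
        pkey=key,  # authenticate with the key instead of (or alongside) a password
    )

    sftp = client.open_sftp()
    print(sftp.listdir("/outbound"))  # list files waiting for pickup
    client.close()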

With FTPS (FTP over SSL), a connection is authenticated using a user id, password and certificate(s). Like SFTP, the user ids and passwords for FTPS connections will also be encrypted. When connecting to a trading partner’s FTPS server, your FTPS client will first check whether the server’s certificate is trusted. The certificate is considered trusted if either it was signed by a known certificate authority (CA), like Verisign, or if it was self-signed (by your partner) and you have a copy of their public certificate in your trusted key store.

Your partner may also require that you supply a certificate when you connect to them. Your certificate may be signed by a third-party CA, or your partner may allow you to simply self-sign your certificate, as long as you send them the public portion of your certificate beforehand (which they will load into their trusted key store).
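
For comparison, an FTPS session can be sketched with Python’s standard ftplib module, as shown below. The default SSL context performs the certificate trust check described above; the host name and credentials are placeholders.

    import ssl
    from ftplib import FTP_TLS

    # The default context validates the server's certificate against
    # known CAs; a partner's self-signed certificate would have to be
    # added to the trust store first.
    ctx = ssl.create_default_context()

    ftps = FTP_TLS(context=ctx)
    ftps.connect("ftps.partner.example.com", 21)  # control channel (port 21)
    ftps.login("acme_user", "s3cret")  # credentials sent over TLS
    ftps.prot_p()  # encrypt the data channel as well

    ftps.retrlines("LIST")  # a directory listing opens a second (data) port
    ftps.quit()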

In regard to how easy each of the secure FTP protocols is to implement, SFTP is the clear winner, since it is very firewall friendly. SFTP needs only a single port number (default of 22) to be opened through the firewall. This port will be used for all SFTP communications, including the initial authentication, any commands issued, and any data transferred.

On the other hand, FTPS can be very difficult to pass through a tightly secured firewall, since FTPS uses multiple port numbers. The initial port number (default of 21) is used for authentication and passing any commands. However, every time a file transfer request (get, put) or directory listing request is made, another port number needs to be opened. You and your trading partners will therefore have to open a range of ports in your firewalls to allow for FTPS connections, which can be a security risk for your network.

In summary, SFTP and FTPS are both very secure, with strong authentication options. However, since SFTP is much easier to pass through firewalls, and since we are seeing an increasing percentage of trading partners adopting SFTP, I believe SFTP is the clear winner for your secure FTP needs.


Data Breach: Are You Next (or Again)?

Posted by Bob Luebbe on Monday, 25 April, 2011

A data breach is closer than you think. As the number of data breaches increases, so does the risk of organizations losing your sensitive data. No one wants to receive the news that some or all of their personally identifiable information (PII) was stolen. Some people fall victim to phishing scams, but it is more likely that your information will be leaked or stolen from an organization that holds it.

The health care industry is currently in the spotlight as it moves to mandated Electronic Health Records (EHR), and the American National Standards Institute (ANSI) is investigating the two main health care data privacy concerns today: how to protect patient information, and what the financial harm or cost per record is if that information is stolen.

The numbers are staggering. According to the Privacy Rights Clearinghouse (www.privacyrights.org), there have already been 47 reported leaks or breaches in the health care realm this year. That is about one every other day (102 additional reported breaches if counting business and government).

In the world of data security, breaches are no longer thought of in terms of “if,” but “when.” Fortunately, there are easy steps companies and health care organizations can take to protect the PII they maintain from direct hacking attempts. The safeguards data security companies recommend begin with the following:

  • Require strong passwords
  • Use encryption to protect files in motion and at rest (see the sketch after this list)
  • Reduce the number of computers that process sensitive information
  • Audit every transaction
  • Limit the number of accounts that can access the critical data
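
To make the encryption step above concrete, here is a minimal sketch of encrypting a file at rest with the Fernet recipe (AES-based) from the Python cryptography package. The file names are placeholders, and key storage and management, the harder problem in practice, is out of scope here.

    from cryptography.fernet import Fernet

    # Generate a key once and store it in a proper key management
    # system -- never alongside the encrypted data.
    key = Fernet.generate_key()
    f = Fernet(key)

    with open("patients.csv", "rb") as src:
        ciphertext = f.encrypt(src.read())

    with open("patients.csv.enc", "wb") as dst:
        dst.write(ciphertext)

    # Decryption later, given the same key:
    # plaintext = Fernet(key).decrypt(ciphertext)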

The organization you own or work for doesn’t have to be the next headline. Start researching different options to protect your customers’ sensitive data and keep your organization from a possible breach. The fines and surcharges are exponentially higher than the cost of a secure managed file transfer solution or a database encryption tool. Not sure where to start? Read the Top 10 Managed File Transfer Considerations.


Message Queues and Network Shares Added to Managed File Transfer Solution

Posted by Bob Luebbe on Monday, 28 February, 2011

The new 3.5 release of GoAnywhere Director is now available with more features to help organizations automate, secure and manage file transfers.

In this new release, GoAnywhere Director provides simpler access to files and folders on Network Shares. It can also connect to enterprise Message Queues (e.g. WebSphere MQ) for better integration with customer applications. The new version also includes “File Monitors,” which can be used to easily scan for new, modified, and/or deleted files in targeted folders. Additionally, this release includes the ability to auto-resume file transfers if FTP and secure FTP connections are broken.
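
To illustrate the general idea behind a file monitor (this is just a conceptual sketch in Python, not GoAnywhere’s implementation), the following polls a folder and reports new, modified, and deleted files by comparing snapshots:

    import os
    import time

    WATCH_DIR = "/data/inbound"   # hypothetical folder to monitor
    POLL_SECONDS = 30

    def snapshot(path):
        """Map each file name to its last-modified time."""
        return {name: os.path.getmtime(os.path.join(path, name))
                for name in os.listdir(path)}

    previous = snapshot(WATCH_DIR)
    while True:
        time.sleep(POLL_SECONDS)
        current = snapshot(WATCH_DIR)
        new = current.keys() - previous.keys()
        deleted = previous.keys() - current.keys()
        modified = {n for n in current.keys() & previous.keys()
                    if current[n] != previous[n]}
        if new or deleted or modified:
            print("new:", new, "deleted:", deleted, "modified:", modified)
            # here an MFT product would trigger a transfer workflow
        previous = current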

In addition, better High Availability (HA) capabilities allow GoAnywhere Director to store configurations in customer database systems including SQL Server, MySQL, and DB2 for IBM i (iSeries). This allows customers to manage and replicate this data using in-house database and HA tools.

“I’ll say it again: of all the tools I have purchased over 28 years in I.T., GoAnywhere Director is my favorite!” ~ Don McIntyre, Kansas City, Missouri School District

Read the press release >>
