In our last post, we outlined the most common regulations and requirements that affect file server data security. This follow-up post addresses the steps you’ll typically need to take to ensure compliance.

For this post, we are focusing on internal system and application requirements, rather than processes related to handling consumer data (such as privacy policies, data breach notifications, etc.).

Regulations and industry guidance for file server data security practices fall into six areas: 

1. Security by design

Security by design means that, when you build and provision your file server, you do so in a way that ensures it is as protected as possible from outside access or attack by default. The techniques below can help you implement security by design on your file server.

  • Encrypting storage volumes: While it can slow file access and add processing overhead, encrypting your file server using a tool like PGP is one of the best ways to ensure that anyone who does gain access to your server cannot read any of your data. 
    • On a related note, your encryption practices are only as good as your key management solutions. Be sure that you keep your private keys secret, and handle key distribution safely.
  • File integrity monitoring: Tracking any changes to your file server using methods such as checksumming, hashing or time-stamping can help detect changes that might indicate a breach (see the hashing sketch after this list). 
  • Implementing Data Loss Prevention (DLP): DLP tools can be installed on your file server to prevent sensitive data from leaving your network. These tools can be configured to look for matches to sensitive data and shut down any changes to that data, or use machine learning to look for unusual data access and alert your administrator.
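
To make file integrity monitoring concrete, here is a minimal sketch that builds a baseline of SHA-256 hashes for a directory and flags anything that changes between runs. The watched directory and baseline file paths are hypothetical placeholders; a production deployment would typically rely on a dedicated FIM tool rather than a script like this.

```python
import hashlib
import json
from pathlib import Path

BASELINE = Path("fim_baseline.json")  # hypothetical baseline location
WATCHED_DIR = Path("/srv/files")      # hypothetical file server directory

def hash_file(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Snapshot the current state of the watched directory.
current = {str(p): hash_file(p) for p in WATCHED_DIR.rglob("*") if p.is_file()}

# Compare against the previous snapshot, if one exists.
if BASELINE.exists():
    baseline = json.loads(BASELINE.read_text())
    for path, digest in current.items():
        if baseline.get(path) != digest:
            print(f"ALERT: {path} was added or modified")
    for path in baseline.keys() - current.keys():
        print(f"ALERT: {path} was removed")

BASELINE.write_text(json.dumps(current))  # refresh the baseline for the next run
```

Run on a schedule (via cron, for example), unexpected alerts from a check like this are exactly the kind of signal worth feeding into the auditing and logging practices below.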

2. Auditing and logging

Log auditing benefits both your organization and the regulators who may ask to review those logs. By maintaining detailed logs of data access, modifications, user activity and administrator actions, you will be able to build a pattern of behavior, identify unusual activity that might raise concern and review logs for any potential security issues (ideally before they escalate into incidents). 

It is good practice to set a regular log review schedule for your file server, while also spot- and cross-checking logs at other intervals based on your security posture. 

Regulators may also request to review logs for any necessary forensic investigations.
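
To give one concrete shape to that review schedule, the sketch below scans a log file for repeated failed logins from the same address. The log format, file path and threshold are all assumptions for illustration; adapt them to whatever your file server actually emits.

```python
import re
from collections import Counter

LOG_FILE = "server.log"  # hypothetical log location
THRESHOLD = 5            # assumed cutoff for flagging unusual activity

# Assumed line format: "2024-01-01 12:00:00 LOGIN_FAILED user=alice ip=203.0.113.7"
FAILED = re.compile(r"LOGIN_FAILED .*ip=(\S+)")

failures = Counter()
with open(LOG_FILE) as log:
    for line in log:
        match = FAILED.search(line)
        if match:
            failures[match.group(1)] += 1

for ip, count in failures.most_common():
    if count >= THRESHOLD:
        print(f"Review: {count} failed logins from {ip}")
```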

The auditing and logging features in Cerberus FTP Server by Redwood provide a great tool for analyzing any traffic that leaves your file server through your secure file transfer application. 

3. Access controls 

Controlling access is one of the most critical security elements of managing a file server, and often the most challenging. Most regulations require “strong” file server access controls while giving administrators leeway to define what “strong” means. Implementing the following on your file server should help you meet this requirement.

  • Strong password policies: Your first line of access control defense is ensuring that your users’ passwords are resistant to brute force attacks by requiring longer passwords with more complex character combinations.  
  • Lockout policies: In the event of multiple failed access attempts, which can indicate a brute force attempt to access your file server, you’ll want to configure a lockout policy that prevents user access and alerts your administrator to a potential threat (a minimal sketch of this logic follows this list). 
  • Role-based access control (RBAC): RBAC can make your data security posture easier to manage by confining certain users to certain directories. Then, as an administrator, you can review a user’s roles to ensure that your access controls are in place. Cerberus FTP Server, for example, integrates with Active Directory security groups to help implement RBAC.  
  • Multi-factor authentication (MFA): MFA decreases the likelihood that a compromised password will allow a bad actor to access your system by requiring additional verification of the client’s identity. Given how frequently passwords are compromised, it should be a default requirement on your file server.
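
As an illustration of the lockout policy mentioned above, the sketch below counts failed attempts per account inside a sliding window and locks the account once a threshold is reached. The five-attempt limit, 15-minute window and alert hook are illustrative assumptions, not any particular product’s defaults.

```python
import time
from collections import defaultdict

MAX_ATTEMPTS = 5          # assumed policy: lock after 5 failures...
WINDOW_SECONDS = 15 * 60  # ...within a 15-minute window

failed_attempts: dict[str, list[float]] = defaultdict(list)
locked_accounts: set[str] = set()

def notify_admin(username: str, count: int) -> None:
    """Hypothetical alert hook; wire this up to email, a SIEM, etc."""
    print(f"ALERT: {username} locked after {count} failed attempts")

def record_failed_login(username: str) -> None:
    """Track a failed login and lock the account if the policy is exceeded."""
    now = time.time()
    attempts = failed_attempts[username]
    attempts[:] = [t for t in attempts if now - t < WINDOW_SECONDS]  # drop old failures
    attempts.append(now)
    if len(attempts) >= MAX_ATTEMPTS and username not in locked_accounts:
        locked_accounts.add(username)
        notify_admin(username, len(attempts))
```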

4. Encryption

Encrypting your file server’s data is a core component of security by design and should be enforced in every data state: at rest, in motion and in use. Your encryption requirements will vary depending on your compliance requirements, but generally, you’ll be asked to use an industry-standard encryption algorithm that meets a minimum key length. We’ve outlined several of these practices in our posts answering common questions about encrypted file transfer and FIPS compliance.
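
As a small example of encryption at rest, the sketch below encrypts a file with AES-256-GCM using the widely used Python cryptography package. Generating the key inline, as shown, is a simplification for the sketch; in practice the key would come from the key management practices discussed under security by design, and the file names here are hypothetical.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def encrypt_file(path: str, key: bytes) -> None:
    """Encrypt a file with AES-256-GCM, writing the nonce ahead of the ciphertext."""
    nonce = os.urandom(12)  # standard GCM nonce size
    with open(path, "rb") as f:
        plaintext = f.read()
    with open(path + ".enc", "wb") as f:  # hypothetical output naming convention
        f.write(nonce + AESGCM(key).encrypt(nonce, plaintext, None))

def decrypt_file(path: str, key: bytes) -> bytes:
    """Reverse encrypt_file: split off the nonce, then authenticate and decrypt."""
    with open(path, "rb") as f:
        blob = f.read()
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)

key = AESGCM.generate_key(bit_length=256)  # in practice, load this from your KMS
encrypt_file("report.csv", key)            # hypothetical file
print(decrypt_file("report.csv.enc", key)[:80])
```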

5. Data management

Data management practices ensure that your organization handles sensitive data correctly and has a plan in case of disruption or breach. The most common data management requirements for file servers are:

  • Data retention policies: To reduce risk and improve privacy, many organizations are required to delete data on a regular basis. Your data retention policy will govern this area and can be enforced with tools like Cerberus FTP Server’s Folder Monitor (a bare-bones sketch of the idea follows this list).
  • Recovery and backup: Your organization has no doubt built redundancy and failover support into its systems in the event of a system issue. It’s good practice to document these procedures and test them regularly to ensure that your file server backups work as designed. 
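
To make the retention idea concrete, here is a bare-bones sketch that deletes files older than a retention period. The 90-day period and the directory are placeholder assumptions; a script like this illustrates the mechanism rather than replacing a purpose-built feature such as Folder Monitor.

```python
import time
from pathlib import Path

RETENTION_DAYS = 90                # assumed policy; set per your regulation
ROOT = Path("/srv/files/archive")  # hypothetical directory under retention

cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60
for path in ROOT.rglob("*"):
    if path.is_file() and path.stat().st_mtime < cutoff:
        print(f"Deleting {path} (past {RETENTION_DAYS}-day retention)")
        path.unlink()
```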

6. System analysis

As you focus more on meeting ISO standards, you’ll be asked to more thoroughly vet your processes and systems on an ongoing basis and demonstrate that you’ve done so. These requirements will typically ask you to:

  • Document your incident response plans: A security incident can be catastrophic, so it’s important to be prepared to respond as required if one does occur. Most process-focused requirements will ask that your team be familiar with the steps to identify and shut down an incident, and with how to respond as the severity of the incident increases. 
  • Conduct periodic risk assessments and vulnerability scans: This process involves identifying potential threats and scanning your file server and other infrastructure for vulnerabilities (see the sketch after this list). Your goal is to identify and resolve weak points in your systems before attackers can, maintaining a proactive security posture. Cerberus FTP Server’s Enterprise Plus edition comes with automated network scanning and rogue transfer detection to support these requirements. 
  • Establish third-party vendor management processes: Vendors who access and work with your file server often fall within the scope of the regulations we’ve discussed. To ensure that their security posture matches your own, and that you are not violating any compliance requirements, it’s important to establish processes that define and verify your vendors’ data handling requirements.
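
As a tiny illustration of the vulnerability scanning step, the sketch below checks a host for open TCP ports that fall outside an expected set. The hostname and expected ports are placeholders, and a real assessment would use a dedicated scanner; this only shows the shape of the check.

```python
import socket

HOST = "fileserver.example.com"  # hypothetical target
EXPECTED = {22, 443, 990}        # assumed allowed services: SFTP, HTTPS, FTPS

open_ports = set()
for port in range(1, 1025):      # scan the well-known port range
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.3)
        if s.connect_ex((HOST, port)) == 0:  # 0 means the connection succeeded
            open_ports.add(port)

for port in sorted(open_ports - EXPECTED):
    print(f"Investigate: unexpected open port {port} on {HOST}")
```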