Transferring Data

Movement of data is difficult to avoid in today's legal world. In the discovery stage of litigation, for example, a common scenario goes something like this: A client collects data via their in-house IT team or through a third-party forensic collections vendor. The collected data is delivered to a law firm. The law firm outsources the data processing to a third-party vendor or processes the data in-house utilizing data processing software. The processed data is then moved into a review platform, either in-house or with a third-party vendor via their public or private cloud. Then, after the data has been initially culled, analyzed and organized, it is temporarily moved out of the review platform into yet another software program to leverage advanced analytics, data visualization, communications graphing, artificial intelligence and tools like technology-assisted review. Upon completion of the analysis, the remaining data is then moved back into the review platform for final review and eventual production. Once discovery has narrowed the original data set to the most relevant documents, those materials must be shared among lawyers and parties to litigation, and that may involve downloading files from file shares or transferring files via email, incurring additional risk.

With every movement (to or from a hard drive or thumb drive, or over a network between any number of machines or devices), the data can get lost or stolen, end up in the wrong hands (inside or outside the organization), get hacked by parties with malicious intent, or otherwise become compromised or corrupted. As law firms incorporate more technology into their workflows to increase productivity and efficiency, and the number of applications and service providers embedded in their processes increases, the frequency with which data must be moved from one location or application to another also increases. Data movements are inherently risky—particularly when the data is from a client.

Because data security is typically considered an IT matter in law firms and is an area in which firm personnel are unlikely to have real expertise, many firms lack the advanced security tools and rigorous protocols that data in transit requires to properly mitigate risk. What should they be thinking about, and what questions should they be asking?


Encryption for Physical Data Movement

While the details of data encryption may seem obscure to some lawyers, encryption is an essential tool in protecting the confidentiality of client data. Firms should be familiar with the Federal Information Processing Standards (FIPS) used to validate cryptographic modules, and specifically they should ensure that the tools they rely on meet and maintain the requirements of FIPS 140-2 certification. When firms are transporting or receiving data via physical means, they should ensure the data is stored on FIPS 140-2 validated, hardware-encrypted drives.

Among other advantages, FIPS mandates better logging, which provides greater visibility into who has accessed the data. Also, drive passwords should never be included in an email or letter; always communicate passwords through a phone call, and ensure that the drives you are using support automatic erasure of data after multiple incorrect password attempts.
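Encryption protects the confidentiality of data on a shipped drive, but it does not by itself confirm that the data arrived intact. As a complementary illustration only, the following minimal Python sketch verifies files received on a drive against a SHA-256 manifest prepared by the sender; the drive path and the manifest format (one "hash  relative/path" pair per line) are assumptions made for the example.

    # Minimal sketch: verify files received on a shipped drive against a
    # SHA-256 manifest created by the sender. Paths and the manifest format
    # are illustrative assumptions, not a prescribed tool or layout.
    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_manifest(drive_root: Path, manifest: Path) -> bool:
        intact = True
        for line in manifest.read_text().splitlines():
            if not line.strip():
                continue
            expected, rel_path = line.split(maxsplit=1)
            if sha256_of(drive_root / rel_path) != expected:
                print(f"MISMATCH: {rel_path}")
                intact = False
        return intact

    if __name__ == "__main__":
        # Hypothetical mount point and manifest file name.
        print(verify_manifest(Path("/mnt/evidence_drive"), Path("manifest.sha256")))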


Secure File Transfer Protocols and VPNs for Virtual Data Movement

When data you control is moved virtually (between applications, between data centers, or between other network locations), you should be using a secure file transfer protocol such as SFTP.

The SFTP server should be configured to communicate only with designated IP addresses, and data should be sent through secured virtual private network (VPN) tunnels, which route traffic over an encrypted private channel and ensure proper encapsulation and encryption of the data.
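To make the client side of such a transfer concrete, here is a minimal sketch using the Paramiko library (an assumption for this example rather than a specific recommendation); the host name, account and key paths are placeholders, and the IP restrictions and VPN tunneling described above are enforced at the server and network layer rather than in this script.

    # Minimal SFTP upload sketch using Paramiko (a third-party library:
    # pip install paramiko). Host, account, and file paths are placeholders.
    import paramiko

    def sftp_upload(host: str, username: str, local_path: str, remote_path: str) -> None:
        client = paramiko.SSHClient()
        # Trust only host keys already recorded in known_hosts; reject anything else.
        client.load_host_keys("/home/transfer/.ssh/known_hosts")
        client.set_missing_host_key_policy(paramiko.RejectPolicy())
        # Key-based authentication only; no password travels over the wire.
        client.connect(host, username=username,
                       key_filename="/home/transfer/.ssh/id_ed25519")
        try:
            sftp = client.open_sftp()
            sftp.put(local_path, remote_path)
            sftp.close()
        finally:
            client.close()

    if __name__ == "__main__":
        sftp_upload("sftp.example-vendor.com", "litigation_transfer",
                    "production_set.zip", "/inbound/production_set.zip")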

The FIPS standard is also important in this context. Any servers you are using for data movement across networks should run in FIPS mode, because FIPS mandates robust cryptographic controls that protect your data from being intercepted and read in transit.
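On Linux servers, FIPS mode is exposed as a kernel flag, so a deployment check can be as simple as the sketch below; the flag's location is standard on most modern distributions, but confirm it for your environment.

    # Minimal sketch: confirm a Linux transfer server is running in FIPS mode
    # by reading the kernel flag at /proc/sys/crypto/fips_enabled.
    from pathlib import Path

    def fips_mode_enabled() -> bool:
        flag = Path("/proc/sys/crypto/fips_enabled")
        return flag.exists() and flag.read_text().strip() == "1"

    if __name__ == "__main__":
        if fips_mode_enabled():
            print("FIPS mode is enabled on this host.")
        else:
            print("WARNING: FIPS mode is not enabled; review your data-in-transit controls.")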


Access Control

Many law firms have multiple divisions and locations from which a diverse range of employees access a variety of applications within a complex IT infrastructure. Who is authorized to use specific applications, and who is authorized to see, "touch" and move specific sets of data while using those applications? Controlling access to data across large organizations is fraught with complexity, but it's a vital piece of the data security puzzle. There should be tiered permissions, user logs and user activity tracking. (The FIPS 140-2 standard also addresses some of these issues.)

The goal of access control is to prevent unnecessary or duplicative access to sensitive data. In general, access should be controlled using a strict, rules-based system. Access privileges should be granted only on an as-needed/on-demand basis to the specific users who have a legitimate need to work with the data, and should be read-only whenever possible. Access rights can be configured by team, by job role or even on an individual basis. Another important principle in access control is to keep the number of endpoints with internet access to an absolute minimum.
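A strict, rules-based system of this kind can be sketched in a few lines; the grant structure and the user and matter names below are illustrative assumptions, with every grant time-limited, read-only by default and logged.

    # Minimal sketch of a rules-based access check: grants are explicit,
    # time-limited, read-only by default, and every decision is logged.
    # The Grant structure and the user/matter names are illustrative assumptions.
    from dataclasses import dataclass
    from datetime import datetime
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("access_control")

    @dataclass
    class Grant:
        user: str
        dataset: str       # e.g., a matter or review database
        can_write: bool    # read-only unless explicitly granted
        expires: datetime  # as-needed/on-demand, never open-ended

    GRANTS = [
        Grant("a.reviewer", "matter_1234_review", can_write=False,
              expires=datetime(2025, 7, 1)),
    ]

    def is_allowed(user: str, dataset: str, write: bool) -> bool:
        now = datetime.now()
        for g in GRANTS:
            if g.user == user and g.dataset == dataset and now < g.expires:
                allowed = g.can_write or not write
                log.info("access %s: user=%s dataset=%s write=%s",
                         "granted" if allowed else "denied", user, dataset, write)
                return allowed
        log.info("access denied (no grant): user=%s dataset=%s write=%s",
                 user, dataset, write)
        return False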

Firms require exceptional process rigor and high levels of IT expertise to get access control right. If you can't reliably meet these requirements, look for a vendor that follows the best practices presented in this article to manage access control.


Data Security in the Cloud and in Dev Ops

Organizations that store, access and manipulate data in a cloud computing environment require, first of all, a good front-end web application firewall (WAF) with multi-factor authentication. Also essential: a strict access-rules matrix with audit logging, a proper security information and event management (SIEM) solution, and a rigorous vulnerability assessment and penetration testing (VAPT) program that identifies and classifies system vulnerabilities and conducts "ethical hacking" tests to verify whether specific vulnerabilities actually exist.
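A SIEM is only as useful as the events fed into it, and audit logging is the thread that ties these controls together. As a rough illustration, the sketch below emits structured audit events in a form a SIEM collector could ingest; the field names and log destination are assumptions made for the example.

    # Minimal sketch: emit structured, append-only audit events that a SIEM
    # can ingest. Field names and the local log destination are assumptions;
    # in practice events would be shipped to your SIEM's collector.
    import json
    import logging
    from datetime import datetime, timezone

    audit = logging.getLogger("audit")
    audit.addHandler(logging.FileHandler("audit_events.jsonl"))
    audit.setLevel(logging.INFO)

    def audit_event(actor: str, action: str, resource: str, outcome: str) -> None:
        event = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "actor": actor,        # who acted
            "action": action,      # what they did (login, download, delete, ...)
            "resource": resource,  # which dataset or application
            "outcome": outcome,    # success / failure / denied
        }
        audit.info(json.dumps(event))

    # Example: record a denied download attempt for later SIEM correlation.
    audit_event("j.doe", "download", "matter_1234_review", "denied")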

Firms engaged in developing their own software and applications must secure their development operations. Ideally, your security and development teams will work closely together to complete static code testing and VAPT before any code goes into production. Reputable vendors with development operations will generally conduct such testing routinely, but firms should always ask, and make sure the companies they work with to develop custom applications are mature and rigorous in their security protocols.
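As a rough sketch of what a pre-production gate might look like, the example below runs a static analysis scan and blocks a release while high-severity findings remain; Bandit is assumed here purely as an example scanner, and firms should substitute whatever static analysis and VAPT tooling their teams actually use.

    # Minimal sketch of a pre-production gate: run a static analysis scan and
    # block the release if any high-severity findings remain. Bandit is an
    # assumed example scanner (pip install bandit); the source directory and
    # severity threshold are illustrative choices.
    import json
    import subprocess
    import sys

    def high_severity_findings(source_dir: str) -> int:
        result = subprocess.run(
            ["bandit", "-r", source_dir, "-f", "json"],
            capture_output=True, text=True,
        )
        report = json.loads(result.stdout or "{}")
        return sum(1 for issue in report.get("results", [])
                   if issue.get("issue_severity") == "HIGH")

    if __name__ == "__main__":
        count = high_severity_findings("src")
        if count:
            print(f"Blocking release: {count} high-severity finding(s).")
            sys.exit(1)
        print("Static analysis gate passed.")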


Vendor Vetting

Does your firm have a formal vendor assessment process? The data-in-transit protocols outlined above should also be applied to each vendor you use. Firms should have clearly defined processes in place that require vendors to verify their security certifications and disclose the specific processes and tools they deploy in handling data. Do they have their own data centers, or do they rely on the data centers of other vendors? What are the specific security and control measures for the data centers of these "fourth parties"? How do your vendor and the vendors they use manage data-handling processes like shipping physical media?

Wherever practical, your goal should be to minimize the number of third and fourth parties that touch your data. When data does move from your firm to a vendor, make sure you are auditing all access. Audit logs should be checked every day. You should also have automated processes in place that identify and monitor inactive accounts and automatically "expire" or eliminate those accounts after a designated period of inactivity.
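The account-expiration step, in particular, lends itself to automation. The sketch below flags and disables accounts that have been inactive beyond a designated window; the account records and the disable_account() hook are illustrative assumptions standing in for calls to your identity provider or directory service.

    # Minimal sketch: disable accounts inactive beyond a designated window.
    # The in-memory account list and disable_account() stand in for calls to
    # an identity provider or directory service.
    from datetime import datetime, timedelta

    INACTIVITY_LIMIT = timedelta(days=30)

    accounts = [
        {"user": "vendor_reviewer_01", "last_login": datetime(2025, 1, 5)},
        {"user": "vendor_reviewer_02", "last_login": datetime.now()},
    ]

    def disable_account(user: str) -> None:
        # Placeholder for the real deprovisioning call.
        print(f"Disabling inactive account: {user}")

    def expire_inactive(account_list, now=None):
        now = now or datetime.now()
        for account in account_list:
            if now - account["last_login"] > INACTIVITY_LIMIT:
                disable_account(account["user"])

    expire_inactive(accounts)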


Consolidation of Technology

We've already highlighted the inherent risk that results when firms deploy increasing numbers of onsite applications and rely on vendors whose systems must connect to one another and exchange data frequently. In response, some firms are moving toward consolidating all, or many, of their processes and applications within a single platform where all data resides in a single cloud. In addition to mitigating data security concerns, this approach has several other advantages. It can minimize the very real problem of fragmented workflows, reduce the burden on employees of learning to use multiple tools, and enable the application of advanced technologies like data analytics and artificial intelligence to a more comprehensive range of litigation- and operations-related activities within the firm.

Firms that are not yet prepared to consolidate on a single platform need to remain vigilant about their vendors and their own operations whenever data is transferred. Is there a formal cybersecurity program in place? Does the program adequately address each of the key areas of concern outlined in this article? Is it being followed internally and by third and fourth parties? The only way to know is to ask.

 

Sundhar Rajan is Chief Information Security Officer for Casepoint. He oversees and is responsible for information security, security compliance, global infrastructure, all cloud initiatives, and proactive compliance security monitoring. Prior to joining Casepoint, Sundhar spent more than nine years at the Am Law 100 firm Crowell & Moring LLP, where he was Manager of Network Operations. Sundhar brings over 18 years of experience in information security, leading network security teams, and building highly scalable application infrastructure.