In an era of escalating cyber threats, data privacy and security are critical concerns for law firms. The legal industry handles extensive amounts of sensitive information, making it an inviting target for cybercriminals. In 2024, the rising number of breaches spotlights the need for firms of all sizes to adopt strong cybersecurity measures to protect client confidentiality and maintain trust.
According to IBM’s Cost of a Data Breach Report, the global average cost of a data breach in 2024 reached an all-time high of $4.88 million, reflecting a 10% increase over the previous year. A breach’s financial repercussions can be even more painful for law firms. Two recent examples highlight the growing risks:
Gunster Law Firm agreed to an $8.5 million settlement over a 2022 data breach that exposed thousands of individuals’ personal and health information.
Orrick, Herrington & Sutcliffe reached an $8 million settlement in April 2024 after a breach compromised the personal data of over 600,000 people.
Beyond the financial costs and the potential damage to a firm's reputation and brand, data breaches affect firms on several other fronts. Phil Favro, a court-appointed Special Master and expert advisor, highlights several considerations.
Litigation. Firms are in the business of generating income by representing client interests. When a firm shifts to a defensive posture to address harm arising from a data breach, income-generating resources are redirected to protecting the firm itself. The firm may have to (among other things) address the concerns of government regulators and litigate against impacted parties, all of which drains firm resources and revenue.
Ethics. It’s not a stretch to suggest that data breaches could result in lawyers being disciplined under certain circumstances. This is particularly the case where firms have not taken reasonable steps to prevent inadvertent disclosures of client information. See Mod. R. Prof. Cond. 1.6(c). Professional discipline can severely impact a lawyer’s reputation, ability to practice law, and future earnings.
Law firms hold valuable data, from confidential client communications to personally identifiable information (PII) and health records. Cybercriminals target firms with outdated security measures, making them vulnerable to attacks. By adopting several defensive and offensive measures, firms can reduce risk and better protect firm and client data.
Multifactor Authentication (MFA): The American Bar Association’s 2023 Cybersecurity TechReport survey found that only 54% of attorneys had MFA available, despite Microsoft reporting that MFA can block 99.9% of credential-based attacks. Enforcing MFA across all platforms is a fundamental step toward improved security; a brief sketch of how a time-based MFA code works follows these measures.
Software Updates and Patch Management: Another American Bar Association study found that 42% of law firms with 100 or more employees were using outdated software. Regular updates and patching are essential to closing security gaps.
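To make the MFA point concrete, the sketch below shows how a time-based one-time password (TOTP), the kind of six-digit code an authenticator app generates, is provisioned and verified. It is a minimal illustration using the open-source pyotp library; the account and firm names are hypothetical, and a real deployment would enroll users through the firm's identity provider rather than a standalone script.

```python
# Minimal sketch of a TOTP second factor, using the open-source pyotp library
# (pip install pyotp). Names below are illustrative only.
import pyotp

# Each user receives a shared secret at enrollment, usually presented as a QR
# code scanned by an authenticator app on the user's phone.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

print("Provisioning URI for the authenticator app:",
      totp.provisioning_uri(name="associate@examplefirm.com",
                            issuer_name="Example Law Firm"))

# At login, the user supplies a password *and* the current six-digit code.
submitted_code = totp.now()  # in practice, typed by the user from their app

# verify() checks the code against the current 30-second window, so a stolen
# password alone is not enough to authenticate.
print("Second factor accepted:", totp.verify(submitted_code))
```

The point of the sketch is that the code changes every 30 seconds and is derived from a secret that never travels with the password, which is why credential theft alone fails against MFA-protected accounts.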
Jason Brandes, an executive with 25+ years of experience in data management, offers a proactive perspective, noting that Data Compliance and Data Management are crucial for data privacy and security because they ensure that organizations handle data responsibly, legally, and efficiently.
Data Compliance: Ensures adherence to regulations (e.g., GDPR, CCPA) to protect sensitive information, reduce legal risks, and maintain customer trust.
Data Management: Enforces policies, controls, and best practices for data storage, access, and lifecycle management, minimizing risks of breaches and unauthorized access.
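As an illustration of the data-management point, the short sketch below flags files in an archive folder that have aged past a retention window. The folder path and seven-year window are assumptions made for the example; an actual program would follow the firm's documented retention schedule and respect legal holds before anything is deleted.

```python
# Minimal sketch: flagging files that have outlived a hypothetical retention policy.
# The path and retention period are illustrative assumptions, not firm policy.
import time
from pathlib import Path

RETENTION_YEARS = 7
ARCHIVE_DIR = Path("/data/closed-matters")  # hypothetical archive location

cutoff = time.time() - RETENTION_YEARS * 365 * 24 * 60 * 60

if ARCHIVE_DIR.exists():
    for file in ARCHIVE_DIR.rglob("*"):
        if file.is_file() and file.stat().st_mtime < cutoff:
            # Deletion would normally require sign-off and a legal-hold check,
            # so this sketch only reports candidates for review.
            print(f"Past retention period, flagged for review: {file}")
```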
Law firms are increasingly turning to generative AI (GenAI) to enhance efficiency and improve legal research. However, these tools pose data privacy risks that must be managed. Cathy Fetgatter, Senior Vice President of Analytics & Managed Review at Innovative Driven, and Wayland Radin, VP of Analytics, weigh in with their team’s perspective.
GenAI models are initially trained on vast amounts of publicly available and user-provided data—sometimes “memorizing” sensitive inputs. This raises concerns about inadvertent disclosure and the improper use of confidential client information. The release of ChatGPT in late 2022 marked a turning point in GenAI adoption, making advanced language models widely accessible for the first time. However, many users fail to realize that these tools often still learn from their interactions. The familiar thumbs-up/thumbs-down feedback buttons on platforms like ChatGPT serve as a reminder that user interactions continue to refine these models. In some cases, even without explicit feedback, AI providers may collect data to improve their systems – reinforcing the adage used by Google, Facebook, and other tech companies, “If you’re not paying for the product, you are the product.”
GenAI providers, particularly consumer-facing ones, rely on user inputs to train their models, and some monetize data by selling insights or sharing information with third parties. If an AI tool does not explicitly guarantee confidentiality, anything entered into it, whether privileged communications, personally identifiable information (PII), or sensitive health data, may become part of a broader dataset and be exposed in unintended ways. To mitigate these risks, law firms should adopt a cautious approach:
Review Terms of Service: Before integrating any GenAI tool, firms must understand its privacy policies and security commitments.
Control Prompt Data: Users should avoid entering sensitive or privileged data unless they are certain the tool enforces strict confidentiality measures. Many firms bring these tools wholly in-house for this reason. A minimal redaction sketch appears after this list.
Manage Data Retention: Many AI tools store prompt histories by default. Firms should consider disabling retention settings or ensuring data deletion after each session.
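To illustrate the prompt-control point, the sketch below scrubs a few obvious categories of PII from a prompt before it leaves the firm. The regex patterns, placeholder labels, and sample prompt are illustrative assumptions; a production filter would need far broader coverage, human review, and integration with whatever GenAI gateway the firm has approved.

```python
# Minimal sketch: scrubbing obvious PII from a prompt before it is sent to a GenAI tool.
# The patterns below catch only simple U.S.-style identifiers and are illustrative only.
import re

REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),                        # Social Security numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED-EMAIL]"),                # email addresses
    (re.compile(r"\b\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"), "[REDACTED-PHONE]"),      # phone numbers
]

def scrub_prompt(prompt: str) -> str:
    """Return the prompt with known PII patterns replaced by placeholders."""
    for pattern, placeholder in REDACTIONS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

raw = "Client Jane Roe (SSN 123-45-6789, jane.roe@example.com) asked about settlement terms."
print(scrub_prompt(raw))
# -> Client Jane Roe (SSN [REDACTED-SSN], [REDACTED-EMAIL]) asked about settlement terms.
```

A filter like this does not replace the review-the-terms-of-service and retention steps above; it simply reduces what a consumer-facing tool can memorize if sensitive text slips into a prompt.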
The convenience and power of GenAI come with significant responsibilities, particularly in the legal industry, where confidentiality is paramount. While these tools can enhance efficiency, law firms must remain vigilant, ensuring that the benefits of AI do not come at the cost of ethical obligations or client trust. By establishing clear protocols and carefully vetting AI tools, firms can harness AI’s potential while safeguarding the integrity of their data.
Conclusion
With cyber threats showing no signs of slowing, law firms must take decisive action to protect client data. By implementing defensive and offensive measures, ensuring strong information governance, and addressing the unique challenges of GenAI, firms can mitigate risks and uphold their commitment to client confidentiality. In today’s evolving technology landscape, proactive cybersecurity is essential.