Is Your Contract Technology Secure? A Guide to Security and Privacy Protections

Written by: David Curle

March 18, 2021
Artificial Intelligence

3 minute read

Contracts contain some of the most critical and sensitive information an organization wants to protect: information about strategies, relationships, prices, timetables, and intentions.

Today’s technology has expanded the ways contracts are managed and analyzed, and it is a powerful tool for extracting important insights and value from them. Like any technology, however, the manipulation of contracts in software entails certain security and privacy risks.

The use of machine learning to analyze documents, in particular, has given rise both to new kinds of potential threats to the confidentiality of contracts and to effective new techniques for protecting that confidentiality.

A new guide from Kira, authored by Dr. Joey Coleman, Kira Fellow, and Dr. Sam Fletcher, Applied Research Scientist, sets out a number of potential security and privacy threats, and the state-of-the-art protections available against those threats. It’s a handy reference for anyone considering the security and privacy measures that a particular contract technology vendor offers its customers.

Security protections that buyers should look for include techniques that limit insiders’ and third-party vendors’ access to customer data, and that defend against viruses, malware, and other outside malicious attacks. This kind of protection is about keeping malicious actors out of the machinery of vendor software and networks, and the guide outlines the techniques that a secure solution should offer.

In addition to those security measures, there are privacy protections focused on the data in the underlying documents, ensuring that sensitive contract data is not available to third parties, even when models trained on that data are shared with them. Outsiders can attempt to infer confidential information indirectly, either from quasi-identifiers in the underlying contracts or from the behavior of machine learning models that have been trained on contract data.

Differential Privacy is an important technique that protects the data used to create models by inserting “noise” into the data or the model. This prevents a model from being reverse engineered to identify confidential information in the underlying data.
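To make the idea concrete, here is a minimal sketch of the Laplace mechanism, the textbook differential-privacy technique for releasing numeric statistics. It is illustrative only, not Kira’s patent-pending algorithm, and the corpus, clause type, and parameter values below are hypothetical:

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value with epsilon-differential privacy.

    Noise is drawn from a Laplace distribution with scale
    sensitivity / epsilon: the smaller epsilon is, the more noise
    is added and the stronger the privacy guarantee.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# Hypothetical example: privately report how many contracts in a corpus
# contain a change-of-control clause. Adding or removing one contract
# changes the count by at most 1, so the query's sensitivity is 1.
true_count = 842  # hypothetical count, for illustration only
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(round(private_count))
```

Running the query twice gives two different answers, and the privacy parameter epsilon controls the trade-off: lower values add more noise and yield a stronger guarantee about what the output can reveal about any single contract.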

The development of Differential Privacy is an example of how security and privacy measures need to keep up with the changing nature of the tools that lawyers use in their work. Fletcher, the research scientist who developed Kira’s patent-pending Differential Privacy algorithm, said, “With more and more data available at everyone’s fingertips, it’s becoming increasingly easy for malicious users to piece together data from separate sources to deduce sensitive information. For the first time in computer science, Differential Privacy equips us with a guaranteeable, quantifiable method of protection, even in the most catastrophic of scenarios.”
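For readers who want to see where “quantifiable” comes from, the standard definition from the computer-science literature (not specific to Kira’s implementation) says a randomized mechanism M is ε-differentially private if, for any two datasets D and D′ that differ in a single record (say, one contract) and any set of possible outputs S:

```latex
\Pr[\, M(D) \in S \,] \;\le\; e^{\varepsilon} \cdot \Pr[\, M(D') \in S \,]
```

The parameter ε is the privacy budget: no single contract can change the probability of any observable model behavior by more than a factor of e^ε, regardless of what background knowledge an attacker brings.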

This guide presents all of these techniques and more, serving as a resource for buyers evaluating contract technology providers.

Download our guide on Security and Privacy Questions to Ask Your Contract Technology Vendor by clicking here.
