Keyword searches are the easiest way to find contract provisions. But how well do they work on unfamiliar agreements? In the previous post in the Contract Review Software Buyer’s Guide, I covered manual rule-based contract provision extraction. As discussed in much greater detail there, manual rule-based automated contract abstraction systems (aka keyword search systems) (1) are relatively easy to build, (2) work well on known documents and consistent contract provisions, and (3) are likely to underperform on unfamiliar documents. This post further explores manual rule-based automated agreement review systems by focusing on Mumboe, a now-defunct web-based contract management company.
Mumboe offered a fully-featured web-based contract management system. Their system included “a secure contract repository, an agreement template library, alerts and reminders, and a full audit trail of agreement changes and revisions.” They garnered thousands of users before shutting down. One of their features was automated contract provision extraction:
> Mumboe’s On-Demand Contract Intelligence capability lets users automatically extract key dates, deadlines and other details from your uploaded contracts.
Mumboe’s provision extraction system was rules-based and covered over 100 data fields. A graduate of a top university’s linguistics Ph.D. program (with previous experience in information extraction) led their natural language processing efforts. Beyond having a very well-qualified manual rule writer, Mumboe had $4.5 million in seed financing. One problem: according to someone who would know, their manual rule-based provision models worked well on known contracts but didn’t scale well to new ones.
Mumboe had a real expert writing rules, lots of money, and plenty of users to learn from. Yet manual rules didn’t work well on unfamiliar agreements for them. This is unsurprising given the research on keyword search. Perhaps today’s vendors that use manual rules for automated contract metadata extraction have solved the “accuracy on unfamiliar agreements” issue that Mumboe struggled with. Perhaps.
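The brittleness described above can be illustrated with a minimal sketch. The regex and sample sentences below are hypothetical illustrations of how a manual extraction rule might work in general, not Mumboe’s actual rules: the pattern fires on the phrasing it was written against, but silently misses a provision that says the same thing in different words.

```python
import re

# Hypothetical manual rule for extracting a termination notice period.
# It encodes one common phrasing, e.g. "terminate this Agreement upon
# thirty (30) days' prior written notice".
NOTICE_RULE = re.compile(
    r"terminate\s+this\s+Agreement\s+upon\s+(\w+)\s*"
    r"\((\d+)\)\s*days'?\s+(?:prior\s+)?written\s+notice",
    re.IGNORECASE,
)

def extract_notice_period(text):
    """Return the notice period in days if the rule matches, else None."""
    match = NOTICE_RULE.search(text)
    return int(match.group(2)) if match else None

# Familiar phrasing: the rule fires.
known = ("Either party may terminate this Agreement upon "
         "thirty (30) days' prior written notice.")
# Unfamiliar phrasing with the same meaning: the rule misses it entirely.
novel = ("This Agreement may be ended by either party on "
         "30 days' notice in writing.")

print(extract_notice_period(known))  # 30
print(extract_notice_period(novel))  # None
```

Covering the second phrasing means writing another rule, and there is effectively no end to the phrasings lawyers can produce, which is why rule libraries that perform well on a known document set tend to degrade on agreements drafted by unfamiliar parties.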
The next post in the Contract Review Software Buyer’s Guide series will cover comparison-based approaches to contract provision extraction, and also discuss whether header detection is a sufficient solution. We will then cover machine learning approaches to automated contract review.
Contract Review Software Buyer’s Guide Series:
- Part 1 - An Introduction to the Contract Review Software Buyer’s Guide
- Part 2 - What is Contract Review Software & Why Does it Exist?
- Part 3 - How Automated Contract Provision Extraction Systems Find Relevant Provisions, And Why “How” Matters
- Part 4 - No Rules: Problems With Rules-Based Contract Provision Extraction
- Part 5 - Manual Rule-Based Automated Provision Extraction Software Case Study: Mumboe
- Part 6 - Comparison- and Header Detection-Based Automated Contract Provision Extraction
- Part 7 - Foundations of Machine Learning-Based Contract Review Software
- Part 8 - Machine Learning Based Automated Contract Provision Extraction
- Part 9 - Machine Learning-Based Contract Provision Extraction on Poor Quality Scans
- Part 10 - Garbage In, Garbage Out: Why Who Instructs An Automated Contract Provision Extraction System Matters
- Part 11 - Further Information on Why Who Instructs An Automated Contract Provision Extraction System Matters
- Part 12 - How to Build an Automated Contract Provision Extraction System
- Part 13 - How to Add Non-Standard Clause Detection to Your Contract Metadata Extraction System
- Part 14 - Non-Standard Contract Clause Detection is Easy to Build, Hard to Get Right
- Part 15 - What Is the Difference Between Contract Analysis and eDiscovery Software?