AI Glossary for Lawyers — 25 Terms Explained
AI is entering legal practice whether firms are ready or not. Here’s a plain-English glossary focused on what matters for lawyers.
A
AI (Artificial Intelligence) — Software that performs tasks requiring human-like reasoning. In legal practice: research, document review, contract analysis, and drafting.
AI Hallucination — When AI generates false information with confidence. In legal contexts, this means fabricated case citations, invented statutes, or incorrect holdings. Multiple attorneys have been sanctioned for filing briefs with hallucinated citations. Verification is mandatory.
C
ChatGPT — OpenAI’s AI assistant. Useful for drafting client emails, brainstorming arguments, and administrative tasks. Not suitable for legal research without independent verification. Free or $20/month.
Claude — Anthropic’s AI assistant. Known for better writing quality and longer context windows. Good for drafting memos and analyzing long documents. Free or $20/month.
CoCounsel — Thomson Reuters’ AI legal assistant built on GPT-4. Searches actual Westlaw databases, cites real cases, and flags confidence levels. Designed to minimize hallucination risk.
Confidentiality Risk — The risk of disclosing client information when using AI tools. Consumer AI tools (ChatGPT free tier) may use inputs for training. Enterprise tools (CoCounsel, Harvey) offer data protection agreements. See Rule 1.6.
Context Window — How much text AI can process in one conversation. Important for lawyers because it determines whether you can paste an entire contract or brief for analysis. Claude has one of the largest context windows.
D
Document Review (AI-assisted) — Using AI to review large volumes of documents for relevant information, key provisions, or privileged content. Dramatically faster than manual review for discovery and due diligence.
E
e-Discovery — The process of identifying and producing electronic documents in litigation. AI-powered e-discovery tools use machine learning to classify documents as responsive, non-responsive, or privileged.
F
FERPA / HIPAA Compliance — AI tools processing protected information must comply with relevant regulations. Most consumer AI tools are not compliant. Enterprise legal AI tools typically offer compliance certifications.
Fine-tuning — Training an AI model on specific data. Harvey and CoCounsel are fine-tuned on legal texts. This makes them more accurate for legal tasks than general-purpose AI.
G
Generative AI — AI that creates new content. When AI drafts a memo or contract clause, that’s generative AI. When it searches case law, that’s retrieval — different capability.
H
Harvey AI — An AI platform built for law firms. Handles contract analysis, due diligence, and memo drafting. Used by major firms including Allen & Overy. Enterprise pricing.
L
Large Language Model (LLM) — The technology behind ChatGPT, Claude, and legal AI tools. Trained on massive text datasets. LLMs generate statistically likely text — they don’t reason about law. This is why verification is essential.
Legal AI Ethics — The intersection of professional responsibility rules and AI use. Key areas: competence (Rule 1.1), confidentiality (Rule 1.6), supervision (Rules 5.1/5.3), and candor to the tribunal (Rule 3.3).
M
Machine Learning — AI that improves with more data. e-Discovery tools use machine learning: as you mark documents as relevant or irrelevant, the system gets better at predicting which remaining documents matter.
Mata v. Avianca — The 2023 case in which attorneys were sanctioned for filing a brief containing fabricated case citations generated by ChatGPT. Now the standard cautionary example for AI use in legal practice.
N
Natural Language Processing (NLP) — AI understanding and generating human language. Enables tools like CoCounsel to accept research questions in plain English rather than Boolean search strings.
P
Predictive Coding — AI classifying documents in e-discovery based on examples you provide. You review a sample, the AI learns your criteria, and applies them to the full document set.
Prompt — The instruction given to an AI tool. In legal practice: “Summarize the key holdings of [case]” or “Draft an indemnification clause favoring the buyer.” Prompt quality directly affects output quality.
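For readers curious about the mechanics, the learn-from-examples loop behind predictive coding can be sketched in a few lines. This is a deliberately simplified toy (plain word counts, invented sample documents), not how any commercial e-discovery tool actually works:

```python
# Toy predictive-coding sketch: learn word weights from a reviewed sample,
# then score unreviewed documents. Real e-discovery tools use far more
# sophisticated models; this only illustrates the learn-from-examples loop.
from collections import Counter

def train(reviewed: list[tuple[str, bool]]) -> tuple[Counter, Counter]:
    """Count word frequencies in relevant vs. irrelevant documents."""
    relevant, irrelevant = Counter(), Counter()
    for text, is_relevant in reviewed:
        (relevant if is_relevant else irrelevant).update(text.lower().split())
    return relevant, irrelevant

def score(text: str, relevant: Counter, irrelevant: Counter) -> int:
    """Positive score suggests relevance, negative suggests not."""
    words = text.lower().split()
    return sum(relevant[w] - irrelevant[w] for w in words)

# Hypothetical reviewed sample: (document text, marked relevant?)
sample = [
    ("merger agreement signed by the board", True),
    ("lunch menu for the office party", False),
]
rel, irr = train(sample)
# score("draft merger agreement", rel, irr) > 0 -> flag for attorney review
```

The point is the workflow, not the math: attorney judgments on a sample become the training signal for the rest of the document set.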
R
RAG (Retrieval-Augmented Generation) — AI that searches a verified database before generating a response. CoCounsel uses RAG — it searches Westlaw first, then generates a summary based on actual cases. This dramatically reduces hallucination.
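The retrieve-then-generate pattern can be illustrated with a minimal sketch. Everything here is a stand-in — the `CASE_DATABASE`, keyword matching, and canned response are hypothetical and bear no resemblance to CoCounsel's actual pipeline, which searches a verified case-law index:

```python
# Minimal retrieval-augmented generation sketch (illustrative only).
# A real legal RAG system searches a verified database such as Westlaw;
# the dictionary and keyword overlap below are toy stand-ins.

CASE_DATABASE = {  # hypothetical verified source
    "Mata v. Avianca": "Sanctions for filing AI-fabricated citations.",
    "Smith v. Jones": "Contract dispute over an indemnification clause.",
}

def retrieve(query: str) -> list[str]:
    """Return database entries sharing any keyword with the query."""
    terms = set(query.lower().split())
    return [
        f"{name}: {summary}"
        for name, summary in CASE_DATABASE.items()
        if terms & set(summary.lower().split())
    ]

def answer(query: str) -> str:
    """Generate only from retrieved text, so every claim has a source."""
    sources = retrieve(query)
    if not sources:
        return "No supporting authority found."  # refuse rather than invent
    return "Based on: " + " | ".join(sources)
```

The key design choice is the empty-retrieval branch: a RAG system that finds nothing says so, instead of generating a plausible-sounding citation from nowhere.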
S
SOC 2 Compliance — A security certification for service providers. AI tools handling confidential legal data should be SOC 2 Type II certified. Check before uploading any client information.
Spellbook — An AI contract drafting tool that works inside Microsoft Word. Suggests clauses, flags unusual terms, and generates language based on descriptions. Trained on legal contracts.
T
Token — The unit AI uses to process text (~¾ of a word). Token limits affect how long a document you can analyze in one request. Relevant when pasting lengthy contracts or briefs.
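The ¾-of-a-word figure gives a quick back-of-the-envelope check before pasting a long document. The sketch below uses that rule of thumb only; actual token counts vary by model and tokenizer:

```python
# Rough token estimate: ~0.75 words per token is a common rule of thumb,
# so tokens ≈ word count / 0.75. Real tokenizers differ by model; treat
# this as a ballpark, not an exact count.

def estimate_tokens(text: str) -> int:
    """Estimate token count from word count (1 token ~ 0.75 words)."""
    words = len(text.split())
    return round(words / 0.75)

# A 10,000-word contract is roughly 13,000 tokens -- worth checking
# against a tool's limit before pasting the whole document.
```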
Training Data — What the AI learned from. General AI tools were trained on internet text. Legal AI tools were additionally trained on case law, statutes, and legal documents. Neither was trained on your client’s confidential information (unless you provide it).
Bookmark this page. We update it as AI legal technology and ethical guidance evolve.