For the Legal Sector

Protecting Legal Judgement When Using AI Research and Drafting Tools

When Westlaw Edge or Harvey AI finds a case in seconds, your junior lawyers stop building the research discipline that teaches them to spot weak authority. When ChatGPT drafts a contract clause in minutes, your team stops learning why certain provisions matter in your client's industry. The speed of these tools can erode the very judgement that distinguishes legal advice from information lookup.

These are suggestions. Your situation will differ. Use what is useful.

AI outputs must pass your professional conduct test, not your time budget

Lexis+ AI and Casetext can hallucinate citations that read like real cases. A junior lawyer under time pressure may cite a case the AI generated, exposing you to disciplinary action and malpractice liability. Before any AI-drafted research memo or contract language reaches a client file, a lawyer with subject matter knowledge must verify every citation, statutory reference, and factual assertion against primary sources. Speed is the enemy of this step. If your firm adopts AI to cut review time, you have inverted the risk management equation.

Preserve junior lawyer development by assigning foundational research as human work

The research process teaches judgement. A junior lawyer who spends two hours searching for unfavourable authority and finding it learns to anticipate opposing counsel. A junior lawyer who asks Harvey AI for the five best cases and moves straight to drafting never builds that instinct. If your firm uses Casetext or Westlaw Edge to eliminate the research phase for junior staff, you are eliminating the stage where they learn to distinguish strong from weak legal reasoning. The cost appears on your balance sheet later, when mid-level lawyers cannot spot a faulty analogy or a controlling circuit split.

Use AI as a check on your reasoning, not as the source of your reasoning

The risk of commoditised client advice grows when your firm uses ChatGPT or Lexis+ AI as the first step in legal analysis. Your client hired you for reasoning that reflects your firm's experience and the specific facts of their matter. If your analysis flows from what the AI suggested rather than from your independent assessment of risk and opportunity, you have handed your competitive advantage to a product that your competitors license on identical terms. Start with your own analysis. Use AI to find gaps, stress-test arguments, or surface cases you missed. This reverses the workflow and keeps the reasoning layer yours.

Create transparency about where AI touched a work product

Your client agreement and engagement letter should disclose which tasks you will use AI tools for and which you will not. Do not assume that a general statement about using current technology covers the use of generative AI in legal reasoning or document generation. If your firm uses Harvey AI for first-draft contract language or Lexis+ AI for case analysis, your client should know this, partly for informed consent and partly because it affects how they should review the work product. Some clients will accept AI-assisted drafting for routine contract schedules but require human-only work for risk allocation clauses. Your engagement terms should be specific enough to allow this conversation.

Defend the work that builds judgement, even when it costs more time

Your firm's profitability depends on your lawyers being able to solve problems that junior staff cannot. If every task that AI can do faster gets handed to the AI, your mid-level and senior lawyers have fewer clients, fewer complex matters, and less leverage for partner compensation. The hours a junior spends learning to draft a release or spot an unfavourable statute are hours that generate lower billing rates today but higher leverage and specialisation tomorrow. Defend the human research and reasoning work by building it into your rates and project scope, not by treating AI speed as permission to reduce the hours your team spends on thinking. When you win a matter because your lawyer spotted a weakness in opposing counsel's contract, that advantage came from years of human practice, not from AI access.

Key principles

  1. Professional liability flows from errors in AI outputs your lawyers did not catch, so human verification of every citation and assertion is mandatory before it reaches a client file.
  2. The research and drafting work that teaches judgement is the same work that AI promises to eliminate, so you must deliberately preserve human-only projects for junior lawyer development.
  3. Your competitive advantage is your reasoning, not your access to the same AI tools your competitors license, so use AI to test your conclusions rather than to generate them.
  4. Your client's trust depends on clarity about where AI was used in their work product, so your engagement terms and matter records must track AI involvement specifically.
  5. The long-term value of your firm comes from lawyers who have mastered independent analysis, so the hours AI saves must be reinvested in judgement-building work, not in profit margin.
