Profits for Partners: Legal Management Update
March 2026
Monthly newsletter for law firm leaders

What Law Firm Leaders Should Be Watching

A briefing for managing partners and law firm leaders on current developments affecting how firms are governed, priced, staffed, and run.

This month at a glance

March reporting points to a more serious phase of legal AI adoption.[1][8] Firms are moving beyond casual experimentation and starting to build AI into the way they operate. That shift reaches well past technology purchasing. It affects supervision, pricing, quality control, staffing, vendor oversight, and leadership responsibility.

The practical issue is whether the firm’s management systems are keeping pace with changes in the economics and risk profile of legal work.

AI adoption is becoming operational

Legal management developments this month suggest a more mature phase of adoption. Firms are no longer just trying tools in isolated pockets. They are starting to assign executive responsibility and tie AI use to real workflows, with clearer ownership than before.

A notable example came from Herbert Smith Freehills Kramer, which described a model that puts AI oversight under a global chief AI officer and pairs lawyers with forward-deployed legal engineers.[6] Most Canadian firms will not copy that structure directly. The signal still matters. AI is moving closer to firm leadership.

The management question is shifting toward ownership and supervision, with a closer look at how AI fits the firm’s service model.

This matters most in firms where informal experimentation is already happening. Once tools influence drafting, research flow, client communication, or matter economics, informal use becomes a management issue. Leaders need a clear view of where AI is being used and whether that use fits the firm’s confidentiality and quality expectations.

Governance is moving onto the leadership agenda

Recent ABA commentary and Legalweek reporting point in the same direction.[2][3] Firms are expected to show that AI use is controlled and credible. They need approved tools, workable rules, proper review standards, and a clear process for bringing in new products.

In management terms, AI governance now sits beside quality control and risk management. A short policy by itself will not help much if lawyers do not know which tools are approved or when human review is mandatory.

For smaller and mid-sized firms, the answer may be simpler than in global firms, but it still needs structure. The policy needs an owner. Product evaluation needs an owner. Exceptions need a decision-maker. Without that clarity, adoption becomes fragmented and the firm’s risk profile becomes harder to judge.

Pricing pressure is growing

As AI reduces time on routine and mid-level tasks, the link between hours worked and value delivered becomes harder to defend in some categories of work. Recent conference coverage suggests this will strengthen the case for hybrid pricing and tighter matter scoping.[4]

The risk includes discount pressure. It also includes pricing AI-assisted work as though the underlying cost structure has not changed. That can create client friction and erode margin even while lawyers feel extremely busy. The management challenge is to see where the pricing model is lagging behind the workflow.

For many firms, the next step is to review which work types are becoming more predictable and decide where pricing needs to move first.

Risk management still matters

Court sanctions remain a reminder that supervision still matters in AI-assisted work. Reuters reported a March appeals court decision imposing a $30,000 sanction after fake citations appeared in a filing.[5] For firm leaders, every AI-related mistake is also a process failure that exposes weaknesses in training and review.

That matters because the leadership response should not focus only on the individual lawyer involved. The larger question is whether the firm has given people clear expectations and effective review practices. If not, the risk is systemic rather than isolated.

The strongest management response is usually procedural. Define when AI assistance is allowed and what level of review is required for different forms of work product.

Cybersecurity and vendor oversight stay central

As firms adopt more AI-enabled workflows, information governance becomes more important. Practical Law commentary highlighted data sovereignty and security as priority concerns for 2026, while Legalweek reporting emphasized systems that are governed and defensible.[3][7]

The management burden is to govern a growing mix of tools and vendors. A firm may now rely on several connected systems, each touching sensitive information in different ways. Leadership needs a view across that stack.

That means vendor review and confidentiality controls become operational issues. In practice, firms that move fastest without creating unnecessary risk are often the firms that simplify tool choice and apply consistent rules across the approved environment.

What firm leaders should do now

Start by reviewing whether the firm has a real governance structure rather than a growing collection of tools. Then look at practice areas where automation is reducing time faster than pricing is adapting. Tighten verification standards for AI-assisted work and reassess vendor oversight across the firm’s technology environment. Training should focus on judgment and responsible use.

Talent strategy is shifting

Canadian hiring commentary points to continued demand for legal operations, compliance, technology, employment, and advisory roles. Robert Half also reports skill gaps in legal technology and workflow capability.[10] That suggests adaptability and judgment are becoming more valuable alongside technical legal skill.

For managing partners and executive directors, this has direct implications for recruitment and development. Hiring for technical legal excellence alone is no longer sufficient. Firms increasingly need people who can work well inside redesigned processes and exercise sound judgment in AI-assisted environments.

Education is moving from optional to expected

Reuters also reported a proposal that would require practice-based AI training in California-accredited law schools.[9] Even where that proposal does not apply, it reflects a broader direction of travel. AI competence is moving toward baseline capability.

That development matters because it changes the leadership agenda. Firms that wait for outside pressure before building internal capability may find themselves behind on productivity and client confidence.

Firms that respond well are likely to treat AI capability as part of professional development and management discipline, not as a side project for a few enthusiasts.

Source notes

This issue draws on March 2026 reporting and commentary selected for management relevance. Source notes are listed below:

  1. Wolters Kluwer, Future Ready Lawyer 2026.
  2. American Bar Association, Law Practice Magazine, Why You Need an AI Policy Yesterday, and How to Write It.
  3. Legal IT Insider / LegalTechnology, Legalweek 2026: The floor report and a sobering reminder of why we’re here.
  4. Reuters, Lawyers flood tech expo wondering: Is AI about to devalue their time?
  5. Reuters, US appeals court fines lawyers $30,000 in latest AI-related sanction.
  6. Business Insider, How a Big Law heavyweight is shaping its AI rollout around Palantir’s model.
  7. Practical Law / Thomson Reuters, 2026 commentary on legal AI implementation, data sovereignty, security, and use-case governance.
  8. Thomson Reuters, Highlights from the 2026 AI in Professional Services report and what it means for legal teams.
  9. Reuters, California could be first state to make law schools teach AI.
  10. Taylor Root Canada and Robert Half Canada 2026 hiring commentary referenced for talent-market direction.

This newsletter was created with the assistance of artificial intelligence.