The Accountability Wrapper
Published on Mar 10, 2026
Sequoia published Services: The New Software earlier this month. The core argument: winning AI companies won’t sell tools that help people do work - they’ll just do the work. Copilots become autopilots. Software budgets get replaced by labor budgets. For every $1 spent on software, $6 is spent on services. AI is collapsing that $6 into software-style margins.
The piece maps out the opportunity well. What it doesn’t spell out is the filter - which services actually survive this, and which get eaten.
The accountability wrapper
Here’s the test: can a sophisticated user get 80% of this outcome by spending 10 minutes with Claude?
For a lot of services, the honest answer is yes. Translation - paste your document, done. Market research - run a deep research query, close enough. First draft of almost anything - prompt it out. These services die. Not because AI is marginally better, but because the raw AI output IS the deliverable. There’s no gap between “what the AI produces” and “what the client wants.”
Some services survive this test cleanly. A chartered accountant (CA) filing your tax return. A law firm drafting a contract. A licensed insurance broker placing your policy. Not because AI can't do the intelligence work - it can, mostly. But because the client isn't just buying the output. They're buying the signature. The liability. The guarantee. Someone accountable if it's wrong.
That’s the accountability wrapper. It’s what I’d look for in any service business trying to figure out its exposure.
The distinction is structural, not qualitative. "Human review for accuracy" erodes as models improve - within two years, AI translation quality will be fine without review. "Licensed sign-off legally required" doesn't erode. A CA's signature on a tax return has legal standing that no AI output has, regardless of how good the underlying analysis gets.
What Lawhive actually did
The Sequoia piece references Lawhive as a case study. What’s interesting about Lawhive isn’t that they built better legal software. They became a law firm.
If they’d built a legal drafting tool, they’d be competing against Harvey, against ChatGPT plugins, against every law firm building their own tooling. Instead they acquired a law firm license, hired lawyers to work within their system, and sell directly to consumers. The AI does the work. A licensed solicitor reviews and signs. The client gets an affordable legal outcome, not a tool.
Every incumbent legal software vendor is competing on features. Lawhive is competing on outcomes.
The same pattern is showing up everywhere Sequoia flags: Rillet in accounting, WithCoverage in insurance, Crosby in legal. None of them are building tools for existing service providers. They ARE the service provider, with AI restructuring the internal cost.
What this means if you run a services business
I run CodeWithSense - we embed AI engineers into product teams. The honest version of the Sequoia thesis applied to my own work: the intelligence layer of what we do is increasingly automatable. Writing code to spec, integrating APIs, debugging known patterns - models are improving at this faster than most engineers admit.
What survives: the judgement layer. Knowing what a client actually needs versus what they asked for. Recognizing when a system design will cause problems six months out. Reading whether a project is going to ship or die based on organizational dynamics. The accountability - someone who owns delivery, not just output.
The near-term question for any services business isn’t “will AI replace us” but “where exactly is our accountability wrapper, and is it structural or just qualitative?” If your value is in the quality of your output, you’re on borrowed time. If your value is in owning the outcome - the client pays you because something needs to be right and you carry it if it isn’t - that holds.
The longer-term question is harder: how much of what looks like judgement is actually just intelligence work we haven't automated yet? Nobody knows the answer. But the Lawhive model gives a partial one: even in a world where AI does almost everything, someone still has to sign.
Related: Companies Made of Agents - the extreme case: what happens when you build the whole organization this way from the start.