The Hallucination Defense
Why logs make 'The AI Did It' the perfect excuse
“The AI hallucinated. I never asked it to do that.”
That’s the defense. And here’s the problem: it’s often hard to refute with confidence.
A financial analyst uses an AI agent to “summarize quarterly reports.” Three months later, forensics discovers the M&A target list in a competitor’s inbox. The agent accessed the files. The agent sent the email. But the prompt history? Deleted. The original instruction? The analyst’s word against the logs.
Without a durable cryptographic proof binding the human to a scoped delegation, “the AI did it” becomes a convenient defense. The agent can’t testify. It can’t remember. It can’t defend itself.
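As a rough illustration of what such a proof could look like (a minimal sketch, not any particular product's implementation), here is one way to bind a human to a scoped delegation in Python: the human signs a scope before the agent acts, using Ed25519 from the `cryptography` package, and an auditor later checks an agent action against that signed scope. The field names, the agent identifier, and the checking logic are all hypothetical.

```python
# Minimal sketch of a signed, scoped delegation. Requires the 'cryptography'
# package; every field name and identifier below is illustrative.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The human's signing key (in practice held in a hardware token or KMS).
analyst_key = Ed25519PrivateKey.generate()
analyst_pub = analyst_key.public_key()

# The delegation the analyst actually granted: read-only, one dataset, short-lived.
delegation = {
    "principal": "analyst@example.com",         # who is delegating (hypothetical)
    "agent": "report-summarizer-v1",            # which agent instance (hypothetical)
    "allowed_actions": ["read"],                # no 'send_email', no 'write'
    "resources": ["s3://reports/quarterly/*"],  # scope of data access
    "expires_at": int(time.time()) + 3600,      # one-hour validity window
}

# Canonical bytes plus signature form the durable proof; both go in an append-only log.
payload = json.dumps(delegation, sort_keys=True).encode()
signature = analyst_key.sign(payload)


def action_was_delegated(action: str, resource: str) -> bool:
    """Replay the log after the fact: was this action inside the signed scope?"""
    try:
        # Proof that the human signed exactly this scope, not some other one.
        analyst_pub.verify(signature, payload)
    except InvalidSignature:
        return False
    scope = json.loads(payload)
    return (
        action in scope["allowed_actions"]
        and any(resource.startswith(r.rstrip("*")) for r in scope["resources"])
        and time.time() < scope["expires_at"]
    )


print(action_was_delegated("read", "s3://reports/quarterly/q3.pdf"))        # True
print(action_was_delegated("send_email", "mailto:rival@competitor.com"))    # False
```

The specific schema isn't the point. The point is that the scope exists before the action, is signed by the human, and can be verified after the fact, even if the prompt history is gone.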