The Wrong Question
Every few months, a new think piece declares that AI will replace tax professionals, and concerned readers forward it to each other with varying degrees of panic. I've read a dozen of them. They're all asking the wrong question.
The right question isn't "will AI replace enrolled agents?" It's "what does the EA profession look like when the mechanical layer of tax work is fully automated?" Those are different questions, and the second one has a historical answer. We just have to look in the right place.
140 Years of Disruption That Didn't Kill the Profession
The enrolled agent credential dates to 1884, created by Congress in the aftermath of the Civil War to regulate the claims agents who were pressing citizens' claims against the government. That's 140 years of professional history.
In that span, consider what was supposed to make tax professionals obsolete:
The adding machine eliminated hand-calculation of columns of figures. Widespread electrification transformed office work entirely. The 16th Amendment in 1913 created the federal income tax — a structural change to what tax professionals even did for a living. Mainframe computing in the mid-20th century automated payroll and large-scale calculation that previously required rooms full of clerks. Personal computers and spreadsheets in the 1980s put those same capabilities on every desk. TurboTax, launched in the mid-1980s and mainstream by the 1990s, put guided tax preparation in the hands of any W-2 employee willing to answer questions on a screen.
The profession didn't disappear. It evolved. Each wave of automation eliminated the most mechanical layer of the work — and elevated what remained.
The Engineering Parallel
Nobody asks whether engineers were replaced by AutoCAD.
The question sounds strange because we've seen how it played out. The logarithm table didn't replace the engineer — it eliminated the drudgery of manual calculation so the engineer could design more. The slide rule was faster. The electronic calculator faster still. CAD didn't replace architects and structural engineers; it let them iterate on designs in hours that used to take weeks and tackle complexity that was previously out of reach.
Engineering didn't shrink. It expanded, elevated, and became more technically demanding — not less.
The progression in tax work follows the same pattern: ledger books, adding machines, spreadsheets, professional tax software, and now AI. Each step automated the mechanical layer. What remained — and what grew — was judgment, pattern recognition across a client's full financial picture, and accountability for the outcome.
The EA profession has been riding this escalator for 140 years. AI is the next step, not the last one.
The Threat Is Real — Let's Not Wave It Off
Here's where I'll lose some readers if I'm not careful, because the previous section can sound like comfortable denial.
GenAI is genuinely different from prior automation in one important respect: it doesn't just automate computation. It can reason, explain, and advise. A client can ask a large language model whether they should do a Roth conversion and receive a sophisticated, well-organized answer — in plain English, immediately, for free.
That matters. The entry-level market for tax advice — the client who needs someone to explain the basics, organize their documents, and file a straightforward return — is under real pressure. If you were planning to build a career on uncomplicated individual returns, the threat is not hypothetical.
So let's be precise about what AI actually threatens: the mechanical layer. The part of EA work that is retrieval, calculation, and template application. That is genuinely at risk, and pretending otherwise helps no one preparing for this exam.
The Gap AI Cannot Close
What AI cannot do is be responsible.
A client who follows ChatGPT's advice on a Roth conversion strategy and gets it wrong has no recourse. No one signed off. No one can be sanctioned. No one is on the hook. The client bears the consequence alone.
A client who relies on a credentialed enrolled agent has a human professional who reviewed the situation, applied judgment to their specific facts, and can be held accountable to the IRS and to the client if something goes wrong. That accountability layer is not a formality — it is the actual product.
This is the engineering parallel made concrete: CAD can generate a structurally unsound design. The engineer catches it, because the engineer signs the drawings and owns the outcome. AI can generate a plausible but incorrect tax position. The EA catches it — and is the one whose PTIN is on the return.
As AI proliferates, that accountability gap doesn't narrow. It widens. The more AI output floods the zone, the more valuable the credentialed human who can audit it, catch its errors, and stake their professional reputation on the result.
What This Means for How We Study
The EA exam tests whether you know the rules. That's the floor, not the ceiling.
The next generation of enrolled agents will need to be more than rule-retrievers — AI will do that faster than any of us. What will distinguish the EA who thrives from the one who gets squeezed is the capacity to ask the right questions, connect a client's tax situation to their broader financial picture, and own the judgment call when the answer isn't obvious.
That's not a reason to study less. It's a reason to study differently — to understand why the rules exist, not just what they say, so that when the edge case arrives that no AI confidently handles, you're the one in the room who can.
We're not the last generation of enrolled agents. We might be the first generation that has to be something more than mechanics.
Next issue: PSI Has Lofty Plans for the EA Exam. Candidates Are Flying Blind.
