For two years, we’ve been told to worry about artificial intelligence. Panels convene, regulators draft frameworks, consultants advise, and commentators speculate. The conversation is constant and often urgent. Everyone seems to have an opinion about AI.
And yet, I am unaware of a serious attempt to interview an AI system as a subject in its own right. Given the scale of the debate, that absence is striking.
The machine is in the room, but the argument tends to swirl around it rather than toward it. The microphone rarely turns in its direction.
Journalism is one example: a profession built on interpreting complexity - markets, politics, technology, institutions. When the interpretive layer shifts, journalism is not merely an observer; it is structurally implicated.
Historically, interpretation was scarce. Knowledge moved wholesale through institutions, and understanding reached the public at retail. The authority of the interpreter rested, in part, on that scarcity.
The internet weakened this arrangement by unlocking raw information. Primary sources became searchable, filings became public, speeches became streamable. The public could access the materials directly. Interpretation was still necessary, but mediation was no longer exclusive.
AI introduces something different. It does not merely retrieve information; it performs structured synthesis. Anyone can now ask it to compare policies, explain decisions, stress-test claims, or summarize arguments. Interpretation - once scarce and institutionally controlled - becomes widely accessible. Not perfect. Not authoritative. But available.
Journalism, whose product is interpretation, is simply one of the first professions to feel this shift.
When structured synthesis becomes widely accessible, interpretive authority becomes less exclusive. In that environment, turning the microphone toward the system itself may feel less like inquiry and more like destabilization.
Not because journalists are wrong. But because the ground beneath the profession is changing.
AI is not just another subject to cover. It overlaps with the very layer journalism has historically occupied. Overlap produces friction. Friction produces hesitation.
And so, when we debate AI, questioning often gravitates toward spectacle. We ask whether it is conscious, whether it will replace us, whether it is dangerous. These are cinematic questions. They test for drama, controversy, moral positioning. They generate headlines.
They do not test for structure.
Rarely do we ask: Where are you unreliable? What are your predictable failure modes? How do human incentives shape your outputs? What risks arise from deployment rather than design? These are slower questions. They require less theater and more systems thinking.
Every civilization has had an interpretive layer. Priests interpreted doctrine. Judges interpreted law. Scholars interpreted knowledge. Journalists interpreted events. Civilization does not function on raw information alone; it requires synthesis.
What changes when synthesis is no longer scarce?
AI does not eliminate expertise. It alters the economics of interpretation. Journalism is simply one of the first domains to feel the consequences. It will not be the last.
The public debate often frames AI as savior or destroyer. Both narratives are theatrical. Neither examines the quieter shift underway: when interpretation ceases to be scarce, authority reorganizes.
We talk about the machine as threat or miracle. But we rarely interrogate it calmly.
Because existential drama monetizes better than sober systems analysis.