In May 2024, an officer of Australia’s eSafety Commissioner assessed a post on X critiquing queer theory in primary schools. The officer concluded it did not meet the legal threshold for removal. The post was removed anyway. The post’s author, Celine Baumgarten, was not notified. When she found out and sought review, the Commissioner went to the Federal Court to stop her. Last month, the Full Court dismissed that appeal. The decision is called Baumgarten v eSafety Commissioner, and it is worth reading.
In March 2026, the Australian Strategic Policy Institute published Social Insecurity, a report on social cohesion, outrage economics and national resilience in Australia. ASPI describes its reports as delivering “deeply researched analysis that surfaces implications, tests assumptions, and may consider future courses of action.” Social Insecurity is co-authored by John Coyne, Director of National Security Programs, and Justin Bassi, the institute’s Executive Director. Its central recognition — that social cohesion has become a national security problem — is correct and overdue. Trust in the federal government sits at 33 percent. The attention economy rewards speed over accuracy. Radicalisation pathways are faster and more individualised than they were a decade ago. That makes its central recommendation, to expand the Office of the eSafety Commissioner, hard to explain.
The report quotes John Stuart Mill on the necessity of free collision of adverse opinions. It warns that treating awful-but-lawful speech as a policing problem corrodes social licence and hands radicals the repression narrative they seek. The authors mean it — the argument depends on it.
Baumgarten’s post was awful-but-lawful by the Commissioner’s own assessment. The regulator used informal pressure on X to achieve a removal outcome the law did not permit, in a case her own officer had already decided did not meet the legal standard. X withheld the post within 73 minutes of the request. When Baumgarten sought review, the Commissioner went to the Federal Court to prevent it.
Baumgarten is bisexual and was criticising queer theory in schools. That detail matters because the office is routinely justified as protection for minorities.
The action against Baumgarten was not a one-off. The Commissioner’s own evidence confirmed she issues a few hundred such informal removal requests each year. Formal notices — the ones that carry statutory preconditions and review rights — number three or four. This is how the office operates. The Full Court found the process functioned in practice as compulsion. The office fights to keep its practices unexamined.
ASPI’s report recommending expansion of this office mentions none of it.
Security institutions tend to expand the definition of the problem domain to match the solution set they already possess. ASPI’s professional formation is in defence and intelligence. When social cohesion becomes a national security problem — as this report explicitly frames it — the available tools are institutional, regulatory and coercive. The possibility that those tools are partly producing the problem does not feature in the analysis.
The question is never whether a policy achieves its stated intention; Thomas Sowell settled that point long ago. The question is what incentives the system creates, and what outcomes those incentives reliably produce. An office whose purpose is the removal of harmful content, and whose performance is measured by removal outcomes, will keep finding harmful content to remove. It will develop informal processes when formal ones prove too constrained. It will resist review because review introduces friction into the outcome it exists to produce. None of this requires bad intent. It is incentivised by the structure.
The report recognises that trust requires proportionate enforcement, transparent decision-making, and institutions that behave predictably. That framing is correct. An office that removes lawful speech through informal channels, resists review, and then seeks parliamentary legitimacy for expanded powers is none of those things. ASPI’s recommendation to expand the Office of the eSafety Commissioner as a social cohesion measure is a category error.
ASPI’s report identifies overreach as a predictable failure mode. It warns that aggressive content controls erode trust and fuel repression narratives. It acknowledges that the awful-but-lawful should not be banned. Then it recommends expanded resources and parliamentary support for the eSafety Commissioner — the same office that bans the awful-but-lawful through a process designed to avoid scrutiny. The report does not examine the Commissioner’s operational record. It does not address the informal alert process. It does not note the ratio of informal removals to formal notices. It makes the recommendation anyway.
The more likely explanation for ASPI’s recommendation is structural. Once social cohesion is framed as a national security problem, the eSafety Commissioner appears as infrastructure rather than as an institution requiring the same scrutiny applied elsewhere in the report.
The report warns that foreign adversaries exploit declining trust in democratic institutions. Correct. Discretionary speech regulation exercised through opaque informal processes is not a defence against that exploitation. Every removal that cannot be examined, every appeal brought to prevent review, is usable evidence that Australian institutions cannot be trusted to exercise power proportionately. ASPI understands this dynamic in every domain except, apparently, the one closest to home.
The eSafety Commissioner is a trust-destroying instrument. The report’s own conclusion states that resilience is built not just by laws passed but by norms defended and institutions trusted. That sentence is correct. Applied consistently, it argues against the report’s central recommendation.