March 13, 2026

Define the Harm First: UK Rejection of Social Media Ban Exposes the Gap in New Zealand’s Approach



MEDIA RELEASE


The Free Speech Union is calling on the Government to do the work that the Education and Workforce Committee did not: define the specific harms, identify the gaps in existing law, and develop targeted solutions, rather than reaching for a blanket social media ban that even UK child safety charities say will not work.

On 5 March, the Education and Workforce Committee released its final report after a nine-month inquiry into online harms facing young New Zealanders. The report makes 12 recommendations, headlined by restricting social media for under-16s and establishing a new online safety regulator. What it does not do is the foundational work that should come before any of that.

“The committee heard 400 submissions and 87 oral witnesses, and still couldn’t clearly define the harm it wants to regulate,” said Jillaine Heather, CEO of the Free Speech Union. “It acknowledged that ‘harm’ and ‘emotional wellbeing’ are subjective concepts, then built an entire regulatory framework on top of them anyway.”

“A regulator tasked with policing ‘subjective harm’ without a clear, measurable mandate effectively becomes a ‘Ministry of Truth’ or a ‘Censorship Bureau’ and invites massive regulatory overreach. It hands a state-appointed official the power to decide which legal speech is ‘uncomfortable’ enough to be suppressed.”

The committee’s report concedes that current legislation is “fragmented”, yet rather than mapping those failures and closing specific gaps, it leapfrogged to recommending a new regulator, age verification infrastructure, and platform liability for broadly defined harm.

The UK just showed a better path

Four days after the committee’s report was released, UK MPs voted 307 to 173 to reject a blanket social media ban for under-16s. More telling than the vote itself was why.

Forty-two leading UK child protection charities, including the NSPCC, the Molly Rose Foundation, and the 5Rights Foundation, issued a joint statement warning that a blanket ban would be the “wrong solution” with “an array of unintended consequences.” These are not free speech organisations. They are child safety organisations, many founded by bereaved families.

Ian Russell, whose daughter Molly took her own life after viewing harmful content online, cautioned that bans “risk unintended consequences that could leave children at greater risk of harm by treating the symptoms, not the problem.”

The UK charities called for an approach that is “both broader and more targeted”: enforce existing age restrictions for under-13s, stop platforms using addictive design features on teenagers, and compel tech companies to block harmful content at the source.

“That is what doing the work looks like,” said Heather. “You name the specific harms, you identify which laws already cover them, you find the gaps, and you target those gaps.”

What the committee should have done

The Free Speech Union supported the committee’s inquiry. Protecting children online matters. But the committee skipped the critical steps:

It did not define the specific harms or audit which are already covered by the Harmful Digital Communications Act, the Crimes Act, the Classification Act, the Privacy Act and the Oranga Tamariki Act.

It did not explain why existing regulators have failed to act, or establish that current tools are insufficient, before recommending new powers and a new regulator.

“Internal Affairs Minister Brooke van Velden has herself observed that illegal content is already policed and that concepts like ‘harm’ are inherently subjective. She is right. The Government’s response to this report should start where the committee stopped: with precision.

“New Zealand does not need another layer of vague regulation sitting on top of laws we haven’t bothered to enforce,” said Heather. “We need to hold platforms accountable for defined, measurable harms, not whatever a future regulator decides makes people uncomfortable.”

The UK’s 42 child safety charities have shown that you can take children’s safety seriously and still reject a blunt, performative ban. It is time New Zealand did the same.

ENDS

Notes to editor:

1. The Education and Workforce Committee’s final report, Inquiry into the harm young New Zealanders encounter online, was released 5 March 2026. The Government must respond by 3 June 2026.

2. On 9 March 2026, UK MPs voted 307-173 to reject Lord Nash’s amendment to the Children’s Wellbeing and Schools Bill that would have banned social media for under-16s.

3. Forty-two UK child protection charities issued a joint statement in January 2026 warning against a blanket ban and calling for a “broader and more targeted” approach.

4. The NSPCC proposed three targeted actions: enforce existing under-13 restrictions, stop addictive platform design for teenagers, and compel tech companies to block harmful content at the source.

FSU Media Contact: Jillaine Heather | [email protected]

www.fsu.nz