Inquiry recognises online harm but must build in free speech guardrails
11 December 2025
FOR IMMEDIATE RELEASE
The Free Speech Union welcomes the Education and Workforce Committee's Interim Report acknowledging that online harm is a genuine concern for young New Zealanders, but warns that any solutions must protect – not erode – the speech rights of all Kiwis.
"The Committee has undertaken serious work here, and the interim report reflects that," says Jillaine Heather, Chief Executive of the Free Speech Union. "But 'online harm' is a dangerously elastic concept, open to future misuse. Without clear definitions and robust guardrails, any new regulator or legislation risks becoming a tool for silencing unpopular opinions rather than protecting children."
The Committee’s interim report, released this week, “preliminarily” backs an under-16 social media ban. The report canvassed a wide range of potential interventions, including social media age restrictions, a national online safety regulator, increased platform liability, and algorithm transparency requirements. In practice, the Union warns, these measures mean intrusive age verification, more data collection on all users, or blunt infrastructure-level blocking.
International attempts to restrict social media access for teens have been beset with problems. The United Kingdom’s Online Safety Act resulted in millions of adults being restricted from accessing news and media content, and led to 70,000 adults’ personal information being leaked on Discord.
The FSU is particularly concerned about proposals for a national regulator with broad powers to determine what content is 'harmful'. The report's suggestion that such a body should be "sufficiently agile to address the development of new technologies" raises serious questions about accountability and scope creep.
"History shows that vague concepts of 'harm' get weaponised against minority viewpoints," says Ms Heather. "The same activists who complain about online abuse are often the first to demand that speech they disagree with be classified as harmful. Any regulatory framework must have clear, narrow definitions that can't be stretched to cover lawful but unpopular expression."
The FSU notes that there is no parliamentary consensus on whether restricting under-16s’ access to social media will be effective. The Green Party has stated that restricting under-16s' access to social media "would not address the concerns identified", while ACT's position is that nothing should be determined on the basis of this interim report until further analysis and evaluation have been conducted.
The FSU calls on the Committee to ensure its final report:
Defines 'online harm' narrowly, limited to content already illegal under existing law
Rejects any new regulator with powers to determine what lawful speech is 'harmful'
Focuses on parental tools and digital literacy rather than top-down bans
Requires any age verification to be privacy-preserving and not create surveillance infrastructure
Protects adults' access to platforms and content regardless of child safety measures
"Protecting children from genuine harm is something we all support," says Ms Heather. "But protecting children cannot become a pretext for controlling what adults can say and see online. The cure must not be worse than the disease."
ENDS