A recent judgment of the New South Wales Supreme Court (NSWSC) is creating pressure for law change because it exposes the operators of public digital platforms to liability for comments posted on those platforms by third parties.
Professor David Rolph, one of Australia’s leading defamation experts, has said that it changes the legal landscape for digital platform operators by going “further than any decision in the common law world holding intermediaries liable for defamation as publishers.”
The case
Dylan Voller – a former detainee at a youth detention centre and now a high-profile justice campaigner – sued three major media groups for defamatory statements made about him on their public Facebook pages by members of the public.
In Voller v Nationwide News Pty Ltd & Ors the NSWSC found that the media companies:
- were the publishers of the public comments, not “merely a conduit” for them
- were responsible for the comments because they could have “hidden” them until they had vetted the content, and
- used Facebook for commercial purposes to drive internet traffic to their news sites, and so had assumed responsibility for the comments.
The NSWSC suggested that administrators of public Facebook pages should be required to “hide” third party contributions pending a content check, compiling for that purpose a list of prohibited words broad enough to capture any sentence (the examples quoted by the Court were all pronouns, the definite and indefinite articles, and/or all conjunctions and prepositions).
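To see how blunt an instrument that would be, here is a minimal sketch, in Python, of a filter built on such a list. The word list and the `should_hide` helper are our own illustration of the Court's suggestion, not anything set out in the judgment.

```python
# A minimal sketch of the "prohibited words" filter the Court described.
# The word list is illustrative only, drawn from the categories the Court
# mentioned (pronouns, articles, conjunctions and prepositions).
import re

PROHIBITED_WORDS = {
    "i", "you", "he", "she", "it", "we", "they",   # pronouns
    "the", "a", "an",                              # articles
    "and", "or", "but",                            # conjunctions
    "of", "in", "on", "to", "at", "with",          # prepositions
}

def should_hide(comment: str) -> bool:
    """Hide a comment pending review if it contains any prohibited word."""
    words = re.findall(r"[a-z']+", comment.lower())
    return any(word in PROHIBITED_WORDS for word in words)

print(should_hide("I saw this on the news"))  # True - "i", "on", "the"
print(should_hide("Disgraceful!"))            # False - slips through
```

As the output suggests, a list like this hides virtually every ordinary comment pending review, while a one-word post can still slip through.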
The decision has been widely criticised. The Australian newspaper, one of the defendants in Voller, wrote a scathing editorial, saying the Court had shown “little practical understanding of how Facebook operates, the volume of material that works its way through public pages and the resources that would be required to undertake such prophylactic actions on a grand scale”.
The New Zealand position
The NSWSC in Voller reached the opposite conclusion to the New Zealand Court of Appeal in Murray v Wishart (2014), a case covering similar ground.
The background was that the father of the Kahui twins had been acquitted of their murder and had suggested in the course of his trial that their mother, Macsyna, was responsible. This theory gained some currency among the general population, prompting freelance journalist Ian Wishart to collaborate with Macsyna on a tell-all book to establish her innocence.
Ahead of the book’s release, Christopher Murray established a Facebook page, “Boycott the Macsyna King Book”. The page was active for only a couple of months but attracted a number of negative comments about Wishart, who filed defamation proceedings against Murray.
The Court found that Murray had not published the statements and should not be liable for them unless he had actual knowledge that they were defamatory and had failed to take them down. Any other conclusion, the Court warned, would place an undue burden on social media hosts and would represent an unwarranted intrusion on freedom of speech.
Since the Murray decision, Parliament has introduced a safe harbour for digital platform owners in section 24 of the Harmful Digital Communications Act 2015.
This gives them civil and criminal immunity for statements posted on their platforms by others, provided that, on receiving a complaint, they notify the statement’s author within 48 hours.

If the author does not respond within 48 hours of that notification, or consents to the material being removed, the host must remove it. If the author responds within the deadline and refuses consent, the material stays up and the host’s responsibility is met.
Importantly, the protection in section 24 applies to defamation.
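For readers who want the mechanics at a glance, the following is a rough sketch, again in Python, of the section 24 steps as summarised above. The function name and parameters are our own illustration; the Act itself, not this sketch, governs.

```python
from typing import Optional

def section24_outcome(notified_author_within_48h: bool,
                      response_hours: Optional[int],
                      author_consents_to_removal: bool) -> str:
    """Illustrative walk-through of the safe-harbour steps described above.

    response_hours is the number of hours after notification at which the
    author replied, or None if no reply was received.
    """
    if not notified_author_within_48h:
        return "safe harbour unavailable: author not notified within 48 hours"
    if response_hours is None or response_hours > 48:
        return "host must remove the content (no response within 48 hours)"
    if author_consents_to_removal:
        return "host must remove the content (author consents)"
    return "content stays up; the host's responsibility is met"
```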
Could Voller alter the New Zealand position?
Both Australia’s and New Zealand’s defamation laws are based on the common law. As a result, the Voller decision could influence the New Zealand position. Indeed, the door was left open to this in Murray, where the Court of Appeal recognised that decisions about the liability of internet hosts and platforms are highly fact-dependent.
What is more, in Voller the NSWSC discussed Murray v Wishart extensively.
The NSWSC suggested that our Court of Appeal had got it wrong but thought that, in any event, Murray was of limited relevance because of the distinction between Murray’s “private [individual] Facebook page”, which allowed no opportunity for pre-publication vetting, and the much more sophisticated and public Facebook pages run by the media organisations for commercial purposes.
That said, we do not think the Voller findings should or could apply in New Zealand because:
First, the Court was clear in Murray v Wishart that New Zealand was following the English approach which recognises that internet platforms have little real control over externally generated comments.
Second, unlike in Australia, freedom of speech is expressly protected by the New Zealand Bill of Rights Act 1990, which, as mentioned above, the Court relied on in Murray v Wishart.
Third, and perhaps most importantly, if Voller applied, it would undermine the safe harbour in section 24 of the Harmful Digital Communications Act. It would be illogical for digital platform owners to be liable for users’ comments up until a complaint is received, only to gain immunity by then complying with the required complaints process.
From here?
Expectations are that Voller will be appealed. But industry leaders in Australia, aware of the dangers it creates, are now engaged in a coordinated campaign to pressure the Morrison Government into law change to protect digital platforms.
If Australia changes its defamation law, New Zealand could follow suit. A number of possible solutions could be adopted in New Zealand to ensure operators of digital platforms are suitably protected, for example by:
- broadening the innocent dissemination defence in the Defamation Act to incorporate section 24 from the Harmful Digital Communications Act, and/or
- making the defence technologically neutral by specifying that it extends beyond “processors” and “distributors” to the operators of websites, and/or
- codifying the Murray v Wishart decision by specifying that liability will apply only where it can be demonstrated that the platform owner had knowledge that the site contained defamatory content and did nothing about it.
An amended version was published in LawNews on 12 July 2019.