By Michael Douglas, University of Western Australia
When you go online and write something nasty about a person, or even a small business, you risk being sued for defamation.
But if someone else goes online and writes something nasty about a person on your social media page, can you be held liable even though you didn’t write it? Depending on who you are: maybe.
A recent decision of the Supreme Court of New South Wales determined that media companies could be liable for defamatory comments made on news stories posted to their Facebook pages.
That is, media organisations could be held liable for the comments of random people on the internet. Journalists, the companies that employ them, and a bunch of people on Twitter are not happy.
Voller’s case
Dylan Voller is the young man whose treatment in custody inspired a Royal Commission. His case attracted significant press coverage, as well as ‘commentary’ which seems to pass itself off as news but is really something else.
Voller sued the publishers of The Sydney Morning Herald, The Australian, the Centralian Advocate, Sky News Australia and The Bolt Report. He sued them for defamation for content on their Facebook pages.
What makes this case unique is that Voller did not sue based on posts made by the media companies who were responsible for the pages. Rather, he sued based on comments made by members of the public on ten Facebook posts, arguing that the media companies behind the pages were responsible.
The media defendants argued that Voller’s case was based on an incorrect understanding of the law. Justice Rothman disagreed, holding that they were “publishers” of third-party comments on their public Facebook pages.
Anyone can be a ‘publisher’ of defamation
The case turns on the concept of ‘publication’.
To be liable for defamation, you must publish something that is defamatory. In defamation law, publication is the communication of defamatory ‘matter’ to a person other than the plaintiff.
This means that a publisher of defamatory content is not necessarily its author. For example, consider a defamatory letter to the editor: although the newspaper did not write the letter, it may still be treated as a publisher because it communicated the letter to its readers.
‘Publication’ does not even require a positive act: in certain cases, an omission may constitute a publication of defamation. More than 80 years ago, an English court determined that owners of a golf club could be liable for defamation posted on the club notice board which they did not author. The court reasoned that the owners knew of the defamation, and could have prevented it, but didn’t.
The common law adapts that old reasoning to the internet age. Before Voller’s case, a New Zealand court held that the host of a Facebook page could be liable for defamatory comments on their page if the host actually knew about the comments and failed to remove them within a reasonable time.
Providers of digital forums and platforms – from businesses with Facebook pages, to Google itself – could be liable for defamatory content authored by other people if they know about it and fail to act.
A landmark case?
In some ways, this case just adapts the old authorities on publication to a modern situation. It is also a fact-specific decision, made with reference to evidence of the particular moderation functionality available to the hosts of these particular pages on particular dates.
But the reasoning deployed in Voller’s case does have broader significance. The fact that the Facebook pages of the defendants allowed them to vet comments in advance meant that they had some control over those comments. The defendant companies could have dedicated staff to ensure any comments were not defamatory before making them visible, but failed to do so. Their control over the comments opened the door to their responsibility for the comments as ‘publishers’.
The court also considered the business model of the defendants. It should go without saying, but it is important to remember that the production of news and commentary is a business. Media companies depend on broad readership to make money. Arguably, social media platforms like Facebook have helped media companies build readership by linking to news websites. The public’s engagement with media companies’ social media content, via the comments sections of news posts, could be one of the factors keeping those companies alive.
The court heard evidence that the appearance of defamatory comments was a “thoroughly predictable” result of posting a relevant article onto a public Facebook page. Social media defamation risk is a moral hazard of the modern media business.
Here is the controversial gist of Voller’s case: by encouraging engagement, the media walked into this mess. In the judge’s words:
[a] defendant cannot escape the likely consequences of its action by turning a blind eye to it.
This means that media companies, and anyone who drums up social media engagement with controversy, are well advised to dedicate more resources to content moderation.
The sky has not fallen in
According to my friend and professor of media law David Rolph, the case “seems to go further than any decision in the common law world holding intermediaries liable for defamation as publishers”.
It is, however, a first-instance decision, which may be appealed. Justice Rothman’s decision is on the issue of publication, not liability.
Further, even if an ‘intermediary’ like a media company is held to be a publisher, it may still escape liability. In certain cases, would-be publishers will have an innocent dissemination defence for the publication of defamatory content they did not know about.
A NSW-led law reform process is considering bolstering that defence even further. The parts of Voller’s case that media companies do not like may be short-lived.
Until then: be wary of what people say on your social media pages.
This article is republished from The Conversation under a Creative Commons license. Read the original article.