
MediaWrites

By the Media, Entertainment & Sport group of Bird & Bird


Fork in the road: Australian internet defamation law reform options released

On 12 August 2022, the Meeting of Attorneys-General (MAG) released options for statutory reform of Australia’s defamation laws to deal with issues relating to internet intermediary liability for defamatory content authored by third parties.

If they become law, these reforms will significantly clarify the potential liability of the various participants in the internet ecosystem. Critically, there are two potential models for the implementation of a statutory safe harbour for certain internet intermediaries. The proposed changes can be summarised as follows:

  1. Two liability exemptions (for “mere conduits”, including search engine providers);
  2. Two alternative options for a defence for internet intermediaries;
  3. Carving defamation law out from s 235 of the Online Safety Act 2021 (Cth);
  4. Increasing the threshold for preliminary discovery in prospective defamation claims;
  5. Content blocking orders; and
  6. Changes to offers to make amends.

Stakeholders are asked to provide written submissions by 9 September 2022.

This article considers key aspects of the proposed reforms.

Background

The issues relating to the liability of internet intermediaries for third party content have been the subject of extensive consideration by Australian courts in the last decade, culminating in the recent decision of the High Court in Google LLC v Defteros [2022] HCA 27.

In the Defteros decision, a majority of the High Court found that Google was not a publisher of material made available via hyperlinks in certain search results provided through its search engine, and therefore not liable for that material in defamation. This is the latest in a line of cases considering Google’s liability in respect of material generated by various search functionalities, including allegedly defamatory search result snippets and autocomplete search suggestions.

Controversies and uncertainties continue in relation to the liability of social media platforms, media organisations and others in respect of third party posts and comments in a variety of internet contexts. In Fairfax Media Publications Pty Ltd & Ors v Voller [2021] HCA 27, the High Court determined that media organisations were publishers of third party comments on their Facebook posts. The intermediary liability issues arising from “likes” and “comments” on social media have been judicially considered in cases such as Bolton v Stoltenberg [2018] NSWSC 1518 and Aldridge v Johnston [2020] SASCFC 31. Additionally, there have been several cases where parties have applied for court orders against digital platforms requiring them to hand over personal information of third-party authors of online content, to enable the commencement of proceedings against those third parties. As the High Court made clear in Trkulja v Google LLC [2018] HCA 25, whilst the applicable principles are long-standing and well-understood, each new internet defamation case requires a fresh consideration of them in context.

Against this backdrop, there is a widespread view that statutory reform would be helpful to increase certainty and to put in place clear parameters in relation to matters such as take-down time frames and principles.

The reform process has followed a winding path. It is worth remembering the context in which the latest proposed reforms have been published. In 2018, the Council of Attorneys-General (CAG) Defamation Working Party was reconvened to facilitate a “cyber-age” reboot of Australia’s model defamation provisions (MDPs). The review would consider widespread changes to Australian defamation laws. In December 2019, CAG released proposed defamation law reforms (including the introduction of the serious harm threshold), but also determined that issues dealing with internet intermediary liability were complex and should be addressed by way of a separate reform process, which became “Stage 2”, while other broader defamation law reforms would be rolled out earlier in “Stage 1”. Stage 1 was implemented from mid-2021 onward. In March 2021, a separate discussion paper for Stage 2 was released which looked at a wide range of issues and options, from maintaining the status quo to an unconditional safe harbour for internet intermediaries.

That process was ongoing when, in December 2021, the Commonwealth government proposed a legislative response to internet “trolling”: the Social Media (Anti-Trolling) Bill 2022. After several rounds of consultation and review, as well as a change of government, that Bill is not proceeding.

The CAG process has once again moved to the forefront of reform in this area, with the publication of:

  • draft Part A Model Defamation Amendment Provisions;
  • a Background Paper; and
  • a Summary Paper.

In total, MAG has made seven recommendations.

Key concepts

The proposed reforms will regulate the liability of “digital intermediaries”. A digital intermediary means a person, other than the author, originator or poster of digital matter, who provides an online service in connection with the publication of the matter. Although it is not explicitly stated in the definition, it appears that forum administrators, that is, individuals or organisations that use online platforms to host forums that allow or invite third-party comments, fall within the definition of “digital intermediaries”. This would include those who publish social media pages and allow comments to be posted.

An online service means a service provided to a person to enable the person to access, search or otherwise use the internet, and includes:

  • a transmission or storage service,
  • a content indexing service,
  • a service to provide, encourage or facilitate social or other interaction between persons, and
  • a service to allow the use of a search engine.

Matter that is transmitted, posted or stored using an online service is referred to as “digital matter”.

Exemptions from liability for “mere conduits”

Two statutory conditional exemptions from liability are proposed for digital intermediaries who are “mere conduits”. These comprise Recommendations 1 and 2.

Recommendation 1 is that there be a conditional, statutory exemption from defamation liability in respect of third-party content for certain “passive” digital intermediaries. Digital intermediaries providing “caching services”, “conduit services” and “storage services” would be exempt from liability. These include ISPs, cloud service providers and email providers. From a policy perspective, this is a recognition that entirely passive digital intermediaries should not be liable for third party content.

To have the benefit of this exemption, the digital intermediary must not be the author, originator, or poster of the relevant online material. Importantly, the exemption also requires the intermediary to establish that the intermediary did not:

  1. initiate the steps required to publish the matter, or
  2. select any of the recipients of the matter, or
  3. encourage the poster of the matter to publish the matter, or
  4. edit the content of the matter, whether before or after it was published, or
  5. promote the matter, whether before or after it was published.

Recommendation 2 is that there be a conditional, statutory exemption from defamation liability for providers of search engines. Even if the search engine is made aware of the search result complained of, the provider will not be liable for it. By way of example, Google would not be liable for search result snippets.

This safe harbour will apply only to search results generated by the search engine from search terms inputted by the user of the engine. “Terms automatically suggested by the engine” are not exempted from liability, which means that autocomplete search term suggestions are explicitly excluded from the protection afforded to search engine providers. A search engine provider would therefore face different risks in relation to defamatory search result snippets, compared with autocomplete search suggestions.

Similarly, if the search results are promoted by the search engine because of payment the search engine provider received in respect of those search results, the search engine provider would not be exempt from liability in respect of that material. The basis for this is a matter of policy – the exemption is only available for digital matter in respect of which the provider is said to be “content neutral”, that is, content that it has no monetary or other particular interest in promoting, outside the search engine’s normal functioning. This is not to say, however, that the search engine provider would not be able to rely on either of the further defences set out in Model A or Model B below.

Recommendations 1 and 2 would be implemented by way of a new s 9A in the MDPs.

Defences for digital intermediaries

There are two alternative defences proposed for digital intermediaries who are not mere conduits of digital matter, which comprise Recommendations 3A and 3B. The fork in the road is significant – each model would have different ramifications for internet users and providers of online services.

Recommendation 3A is to provide for a safe harbour defence for digital intermediaries if the complainant has sufficient information about the originator to issue a concerns notice or commence proceedings against the originator. This is known as “Model A”.

To establish the defence the digital intermediary must prove that it:

  1. was a digital intermediary in relation to the publication, and
  2. had an easily accessible complaints mechanism (similar to that contemplated by the Anti-Trolling Bill), and
  3. within 14 days of receiving a “duly given” complaints notice, either:
    1. provided the plaintiff with sufficient identifying information about the poster of the matter (with the poster’s consent). For consent of the poster to be given, the defendant must provide the poster with a copy of the complaints notice; or
    2. took reasonable access prevention steps in relation to the digital matter. An access prevention step, in relation to the publication of digital matter, means a step to remove, block, disable or otherwise prevent access by some or all persons to the matter.

A complaints notice is only duly given if:

  1. before giving the notice and after taking reasonable steps to obtain the information, the plaintiff was unable to obtain sufficient identifying information about the poster. Such information is sufficient if it enables both a concerns notice to be given to, and defamation proceedings to be commenced against, the poster;
  2. the notice was in writing and set out:
    1. the name of the plaintiff,
    2. the location where the matter could be accessed, for example, a webpage address,
    3. an explanation of why the plaintiff considered the matter to be defamatory and, if the plaintiff considered the matter to be factually inaccurate, a statement to that effect,
    4. the harm that the plaintiff considered to be serious harm to the plaintiff’s reputation caused, or likely to be caused, by the publication of the matter, and
    5. the steps taken by the plaintiff to obtain sufficient identifying information about the poster of the matter; and
  3. the notice was given using the defendant’s complaints mechanism or given to the defendant in another way permitted by s 44 of the MDPs.

This would be a new s 31A of the MDPs.

Digital intermediaries will need to give careful consideration to the practical ramifications of these requirements. For example, the information that must be provided by the digital intermediary must enable the giving of a concerns notice and the commencement of proceedings. Section 44 of the MDPs provides for the manner in which a concerns notice may be delivered to a person, including in person or by way of an email address or postal address specified by the person for the giving or serving of documents.

These requirements will affect commercial providers of online services as well as individual internet users. Digital intermediaries will potentially include a wide array of actors in online communications. In order for individuals to have the benefit of this defence in respect of third-party comments on their page, they will need to have established a mechanism for the submission of complaints notices, and will need to respond to those complaints, including by providing identifying information in respect of posters on the page.

This defence will also fail if the digital intermediary was actuated by malice in providing the online service. Malice in this context is explored in the Discussion Paper. The examples given of what may constitute malice in this context are illustrative:

  1. A person sets up a Facebook group entitled ‘Principal X is a terrible school principal – list his faults here so we can get him fired’.
  2. A social media platform launches in Australia with the promotional tagline ‘Free speech, no take down, to the limit of the law’. Users are encouraged to use pseudonyms, and no contact details or identification are required. The platform has a moderation policy which operates within the boundaries of the proposed reform, but states that, as a general policy, the platform will not engage with complaints and will simply take down the material before 14 days have elapsed.

Finally, there will be a protection for digital intermediaries, so they are able to moderate content. The digital intermediary may still have access to the safe harbour if they took steps to detect or identify, remove, block, disable or otherwise prevent access to the relevant digital matter.

Recommendation 3B is a new innocent dissemination defence for digital intermediaries in relation to third-party content. This is the alternative proposal to Model A. The digital intermediary would have a defence provided that, once given a written complaints notice, it removes the content within 14 days. This is known as “Model B”.

Model B is in largely the same form as Model A, but there is no automatic defence where the complainant knew the originator’s identity or could, with reasonable steps available to an ordinary person, have identified the originator. Model B would not provide for a process by which the digital intermediary can ask the originator for consent to disclose identifying information to the complainant. Under Model B, after a failure to remove the digital matter complained of, the complainant could seek a remedy from the originator, the internet intermediary or both. The internet intermediary must take the material down to have the benefit of the defence.

Contrast this with Model A, whereby the digital intermediary can also get the benefit of the defence if it provides identifying information (with consent) to the complainant or removes the material. The intermediary can keep the material up if identifying information is made available to the complainant.

With either option, the intermediary must have a mechanism for the receipt of complaints notices, but in the case of Model A, the complaints notice would also need to set out the steps taken to identify the originator.

Carving out defamation law from the operation of s 235 of OSA

Recommendation 4 is that the Commonwealth Government consider whether it is desirable to exempt defamation law from s 235(1) of the Online Safety Act 2021 (Cth) (OSA).

In early 2022, s 235(1) of the OSA came into effect. It provides that a law of a state or territory, or a rule of common law or equity, has no effect to the extent that it:

  1. subjects an Australian hosting service provider to liability for hosting or carrying ‘internet content’ where they are not aware of the nature of the internet content, or
  2. requires the internet content host or internet service provider to monitor, make inquiries about, or keep records of, internet content that is hosted or carried.

The precise operation of this section in the context of defamation law is unclear. This recommendation is made on the basis that clarity would avoid complex disputes in defamation litigation to test the applicability of this section to digital intermediaries.

New court powers to order blocking of online defamatory content

Recommendation 5 is to empower courts to make orders (interim or final) requiring a person who is not a party to a defamation dispute to remove, block or disable access to the online matter within scope. Critically, such an order is not limited to intermediaries but may be made against any non-party, so long as the order relates to digital defamatory material. There would be no change to the existing threshold for the granting of interim orders. An order could be issued to any non-party with a role to play in limiting access to digital defamatory material, including an “immune” non-party such as an ISP (see Recommendation 1). For example, an order could be made against an ISP to prevent the publication of digital matter to internet users, if an applicant successfully obtains injunctive relief against a defamation defendant in respect of that digital matter. Conceptually, this reform is similar to the s 115A site-blocking mechanism under the Copyright Act 1968 (Cth).

Changes to preliminary discovery in the defamation context

Recommendation 6 is to impose new requirements on a court considering whether or not to order preliminary discovery, including so-called “Kabbabe orders”. Such orders are typically used to obtain information from an intermediary about the originator, to enable the commencement of proceedings against the originator. This recommendation recognises the potential for abuse of such orders and their possible chilling effect on whistle-blowing disclosures. The court would need to consider:

  1. the objects of the MDPs; and
  2. privacy, safety or public interest considerations.

Changes to offers to make amends in online context

Recommendation 7 is to amend the mandatory requirements for the content of an offer to make amends (under the MDPs) to allow the intermediary to offer simply to prevent access to the digital matter complained of. This would replace the requirement to offer to publish a reasonable correction or clarification (which is plainly unfeasible for many types of internet intermediaries).

Next Steps

The proposed reforms will be the subject of further consultation. Stakeholders are invited to provide written submissions by 9 September 2022. This is an important opportunity for stakeholders to work through, and make submissions about, the practical ramifications of the reforms.

Tags

online, defamation, internet, internet law, media law, digital platforms inquiry, online regulation, australia