LEGAL & REGULATORY COMPLIANCE CONSULTANTS

Online Safety International

 Online Safety / Online Harms - International Perspectives

1. Australia

2. European Union

AUSTRALIA

The Online Safety Act 2021, enforced by the eSafety Commissioner, introduced new and expanded obligations on a wide range of online service providers.

Through the Basic Online Safety Expectations Determination, obligations are imposed on online service providers to take steps to minimise illegal content and activity on their services, as well as other ‘online harms’ such as bullying and abuse, and to implement measures to enforce their own terms of service and provide mechanisms for reporting abuse.

In addition, the Act introduces the ‘Adult Cyber Abuse Scheme’, which grants the eSafety Commissioner the power to require the removal, within 24 hours of provision of a notice, of online abuse (other than defamatory content) that targets an Australian adult with the intention of causing serious physical or psychological harm and which is, in all the circumstances, menacing, harassing or offensive. Failure to comply can attract civil penalties or fines of up to AUS $555,000. Individual posters can also be held to account, with civil penalties or fines of up to AUS $111,000 for failure to remove offending material.

This supplements the pre-existing Cyberbullying Scheme relating to children, which has also been strengthened under the Act, and now requires the removal, usually within 24 hours of provision of a notice, of material which is intended to target an Australian child, and which has the effect of seriously humiliating, harassing, intimidating, or threatening the child.

Online service providers have 24 hours from provision of a notice to remove intimate images and videos, i.e. those depicting private body parts or private activities, or even the absence of clothing of religious or cultural significance, including images and videos that are fake, digitally altered or merely tagged.

More generally, the Online Content Scheme created by the Act enables the eSafety Commissioner to issue removal and/or remedial notices in connection with two classes of content under the National Classification Code, thereby achieving consistency of approach between online and offline content: Class 1 material, which includes child sexual exploitation material, pro-terrorist material, and material that promotes or incites crime; and Class 2 material, which includes non-violent sexual activity, or anything that is ‘unsuitable for a minor to see’.

Industry associations (BSA | the Software Alliance (BSA), the Australian Mobile Telecommunications Association (AMTA), Communications Alliance, the Consumer Electronics Suppliers Association (CESA), the Digital Industry Group Inc (DIGI), and the Interactive Games and Entertainment Association (IGEA)) are consulting between 1 September and 2 October 2022 on draft codes relating to the protection of Australians from ‘Class 1’ content, i.e. child sexual exploitation, pro-terror, extreme crime and violence, crime and violence, and drug-related content.

The Act also imposes obligations in relation to transparency and reporting, and grants various powers of enforcement to the eSafety Commissioner.

In addition, the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 permits the eSafety Commissioner to issue notices to content and hosting providers alerting them to the fact that they are providing access to Abhorrent Violent Conduct material, that is to say material that promotes, incites, instructs in or depicts abhorrent violent conduct, such as kidnapping, rape, torture, murder, attempted murder and terrorist acts, and which is likely to cause significant harm to the Australian community. Certain exemptions are provided, including in relation to news or current affairs reporting in the public interest by someone working in a professional capacity as a journalist.

Separately, the eSafety Commissioner may issue a non-enforceable blocking request, or a blocking notice for a period of up to 3 months requiring the blocking of access to material. Consecutive blocking notices can be issued. Failure to comply with a blocking notice can currently attract a civil penalty of up to AUS $111,000 for individuals and up to AUS $555,000 for companies. Blocking notices may be subject to appeal, initially to the eSafety Commissioner and subsequently to the Administrative Appeals Tribunal.

Various regulatory guidance on compliance with the requirements is available.

The eSafety Commissioner has published Safety by Design assessment tools, to enable companies to conduct a ‘health check’ of their current compliance and identify further improvements to online safety.

EUROPEAN UNION

On 27 October 2022, Regulation (EU) 2022/2065 on a Single Market For Digital Services and amending Directive 2000/31/EC (‘the Digital Services Act’) was published in the Official Journal and comes into force 20 days thereafter.

The Act takes a scaled approach to compliance, depending on the nature and scale of the online platform. It is intended to enhance transparency obligations, grant users more control over their experience, establish mechanisms to report and pursue complaints regarding content which is illegal and/or in breach of the platform’s terms and conditions (including, in some cases, by external complaints adjudication), force platforms to take action against repeat posters of illegal content, protect children online, improve the labelling and the transparency of targeting of online advertising, and enable the monitoring and enforcement of compliance. The Act draws inspiration from some of the mechanisms established in the GDPR.

Member States are required by 17 February 2024 to designate a Digital Services Co-ordinator to be responsible for all matters relating to supervision and enforcement of the Act in that Member State (Article 49), and the Act sets out the mandatory powers of (Article 51) and requirements for the authority (Article 50). Measures are in place to establish the primacy of Co-ordinators, and to enable cross-border complaints handling and resolution (Articles 56-60). This is supported by the establishment of a European Board for Digital Services comprised of Digital Services Co-ordinators.

The Act serves largely to replicate the intermediary liability provisions applicable to information society services acting as a ‘mere conduit’, or involved in ‘hosting’ or ‘caching’ content, that had been set out in Articles 12 – 14 of the Electronic Commerce Directive (Directive 2000/31/EC) (E-Commerce Directive), save that in relation to the hosting defence a new exception is introduced in connection with the liability of online platforms under consumer protection law in so far as the platform allows consumers to conclude distance contracts with traders in such a way that it appears that the product is offered by the platform itself or by someone acting on its behalf (Articles 4 – 6). The Act similarly maintains the prohibition on imposing any general monitoring obligation on information society services (Article 8). The Act goes further than the Directive, however, by removing the disincentive against conducting content moderation in respect of illegal content: Article 7 provides an assurance that “voluntary own-initiative investigations” and “other measures aimed at detecting, identifying and removing, or disabling access to, illegal content”, or other measures taken to comply with EU or compatible national laws, will not prevent reliance on the intermediary liability defences, provided they are conducted “in good faith and in a diligent manner”.

The Act imposes obligations on Member States in relation to the content of any judicial or administrative order for the removal of illegal content to be served on information society services (Article 9) or for the provision of information (Article 10).

Information society service providers must appoint and publish the contact details of a single point of contact for electronic communications for government (Article 11) and for users of the service (Article 12) (albeit not for any third parties who may be affected by content on the service).

The Act replicates the EU representative provisions contained in Article 27 of the General Data Protection Regulation (‘GDPR’) Regulation (EU) 2016/679, requiring intermediary service providers which are not established in the EU but which offer services in the EU to appoint a legal representative in one of the Member States in which they offer their services, with that representative being notified to the relevant Digital Services Co-ordinator and potentially being held liable for non-compliance. Similar to the position under the GDPR, this is likely to lead to forum shopping, with representatives being appointed in Member States where regulators are perceived to be under-resourced or otherwise lenient.

Online intermediaries are required to produce and publish terms and conditions, including in a way which can be understood by minors where services are aimed at or otherwise predominantly used by them, which must include details of content moderation measures, and the terms and conditions must be applied and enforced “in a diligent, objective and proportionate manner” which has regard for the fundamental rights of the users of the service (Article 14). Additional obligations are imposed on providers of “very large” online platforms and search services to translate those terms and conditions into the official language of each Member State in which they operate, and to provide a summary of the terms and conditions, including remedies and redress mechanisms.

To enhance transparency, online intermediaries (other than micro or small enterprises, unless they are designated “very large” online platforms) are required to publish transparency reports at least annually detailing information regarding orders to remove illegal content, orders to provide information regarding illegal content, the deployment of content moderation (including automated tools), and complaints regarding content moderation (Article 15). Additional reporting requirements are imposed on “very large” online platforms and search engines, as well as obligations in relation to the frequency and timing of reports (Article 42).

Hosting providers are required to implement electronic reporting mechanisms enabling anyone, not only service users, to report potentially illegal content, and the Act mandates the fields that ought to be made available (Article 16). If the reporter provides their contact details, they are required to be provided with an acknowledgement of receipt of the report and subsequently with the provider’s decision in respect of the report, including if that decision was taken or contributed to by automated means. The effect of such notices is to give rise to actual knowledge or awareness for the purpose of the defence against intermediary liability.
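
By way of illustration only, the sketch below shows the kind of information a notice-and-action form might capture to reflect the fields contemplated by Article 16(2): an explanation of why the content is said to be illegal, the exact electronic location of the content, the notifier’s name and email address, and a statement of good faith. The field names and structure are our own hypothetical labels, not terms drawn from the Regulation, and the text of Article 16(2) itself governs what must be collected.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class IllegalContentNotice:
        """Illustrative shape of an Article 16-style notice; field names are hypothetical."""
        explanation: str               # substantiated explanation of why the content is considered illegal
        locations: List[str]           # exact electronic location(s) of the content, e.g. URLs
        notifier_name: Optional[str]   # name of the individual or entity submitting the notice
        notifier_email: Optional[str]  # contact used for the acknowledgement and the decision
        good_faith_statement: bool     # confirmation of a good-faith belief that the notice is accurate and complete

    # Hypothetical example of a submitted notice
    notice = IllegalContentNotice(
        explanation="Listing appears to offer counterfeit goods.",
        locations=["https://example.com/listing/12345"],
        notifier_name="A. Notifier",
        notifier_email="notifier@example.com",
        good_faith_statement=True,
    )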

Where users have provided their contact details to online platforms, they are required to be provided with a statement of reasons where their content is the subject of content moderation or they are otherwise subject to account restrictions in relation to the posting of illegal content or breaches of terms and conditions (Article 17), unless it relates to deceptive high-volume commercial content or is the result of a judicial or administrative order pursuant to Article 9 of the Act.

The Act imposes an obligation to report to the relevant law enforcement or judicial authorities any information giving rise to a reasonable suspicion that a criminal offence involving a threat to the life or safety of a person has taken place, is taking place or is likely to take place (Article 18).

With the exception of small and micro entities (unless they are designated very large online platforms), additional obligations are imposed on online platforms:

  • They are required to implement internal complaints handling systems for users and any reporter, enabling them to lodge a complaint for up to 6 months after a decision in relation to action taken regarding content deemed to be illegal or otherwise in breach of the platform’s terms and conditions, and to provide reasoned decisions which are not solely based on automated decision making, together with details of further redress mechanisms (Article 20).

  • Digital Services Co-ordinators in Member States are required to certify out-of-court dispute settlement bodies meeting specified requirements, and platforms must submit to the mechanism of any certified body to which their users and/or any reporter submit a complaint, regardless of whether a complaint has been determined or resolved (Article 21). The services of the out-of-court dispute settlement body must be free to complainants, and charges to platforms must be reasonable and not exceed the costs incurred by the body. Complaints should be determined by the certified out-of-court dispute settlement body within 90 days or, in relation to highly complex disputes, within 180 days. Such bodies are required to report annually to the Digital Services Co-ordinator and may have their certification revoked.

  • Entities can apply to the Digital Services Co-ordinator to be designated as a trusted flagger to online platforms, and platforms must establish mechanisms to give priority to notices submitted by such trusted flaggers (Article 22). Trusted flaggers can have their designation revoked by the Digital Services Co-ordinator, to whom poor conduct can be flagged by online platforms.

  • The Act mandates that users who frequently post manifestly illegal content must, after being provided with a warning, have their accounts suspended by online platforms for a reasonable period (Article 23).

  • To prevent misuse of the notice and internal complaints systems, those who frequently submit manifestly unfounded reports or complaints are required to be suspended by online platforms from making further reports and complaints for a reasonable time (Article 23).

  • Additional transparency measures are imposed, requiring the publication of information regarding the number of disputes submitted to the out-of-court dispute settlement bodies and the number of suspensions imposed, as well as information on the average number of monthly active recipients of the service in the EU in the preceding 6 months (Article 24).

  • The Act attempts to tackle so-called ‘dark patterns’ by requiring that online interfaces are not designed in a way that deceives or manipulates users, or that otherwise distorts or impairs their ability to make free and informed decisions in relation to the use of the service (Article 25).

  • Online platforms will be required to ensure that adverts are clearly identifiable as such, with the beneficiary of the advert as well as its funder being clearly identified, with access from the advert to information about why the advert was presented to the relevant user (Article 26).

  • Recommendation algorithms are also targeted, with online platforms being required to inform users in terms and conditions of the main parameters used in recommendation systems, with details of how to modify them, including where the system affects the relative order of recommended content (Article 27). “Very large” online platforms are required to offer at least one recommendation option which is not based on profiling (Article 38).

  • Platforms which are accessible to minors are required to implement “appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors”, and are prohibited from serving advertising based on profiling where they believe a user is a minor (Article 28).

Regardless of the size of the platform, users and other bodies mandated to exercise their rights can lodge complaints with the Digital Services Coordinator (Article 53).

Where online platforms make available the option for users to conclude distance contracts with traders, except where the platform qualifies as a micro or small enterprise and is not designated a “very large” online platform, the Act imposes additional obligations:

  • Online platforms must obtain, and satisfy themselves as to the accuracy of, information from the trader, together with confirmation that it will only offer products and services compatible with EU law, and must make that information available to users and retain it for 6 months after the relationship ends (Article 30) (see the sketch after this list).

  • Platforms must design their services in a way that enables traders to meet their pre-contractual obligations to consumers (Article 31).

  • Platforms must inform users who purchased goods or services in the preceding 6 months where the platform subsequently becomes aware that those goods or services were illegal, and must issue a public notice where it is unable to do so (Article 32).
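
As a purely illustrative sketch of the kind of trader information Article 30 contemplates a platform collecting and verifying, the record below captures details such as the trader’s contact information, identification, payment account, any trade register entry and a self-certification of compliance with EU law. The field names and the retention helper are our own assumptions, not terms drawn from the Regulation.

    from dataclasses import dataclass
    from datetime import date, timedelta
    from typing import Optional

    @dataclass
    class TraderRecord:
        """Illustrative 'know your trader' record; field names are hypothetical."""
        name: str
        address: str
        phone: str
        email: str
        identification_document_ref: str        # reference to a copy of an ID or other electronic identification
        payment_account_details: str
        trade_register_number: Optional[str]     # where the trader is registered in a trade or similar register
        self_certifies_eu_compliance: bool       # commitment to offer only products/services compatible with EU law
        relationship_ended_on: Optional[date] = None

        def retain_until(self) -> Optional[date]:
            # Assumed 6-month retention after the end of the relationship, per the summary above
            if self.relationship_ended_on is None:
                return None
            return self.relationship_ended_on + timedelta(days=183)

    # Hypothetical example
    trader = TraderRecord(
        name="Example Trader BV", address="1 Example Straat, Amsterdam", phone="+31 20 000 0000",
        email="sales@example-trader.eu", identification_document_ref="doc-ref-001",
        payment_account_details="NL00BANK0123456789", trade_register_number="KVK 00000000",
        self_certifies_eu_compliance=True, relationship_ended_on=date(2023, 1, 31),
    )
    print(trader.retain_until())  # approximate retention cut-off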

“Very large” online platforms and search engines are defined as those which have “a number of average monthly active recipients of the service in the Union equal to or higher than 45 million”, a figure corresponding to roughly 10% of the EU population, which may be adjusted if that population changes.

“Very large” online platforms and search engines are required to conduct risk assessments in relation to systemic risks relating to illegal content; impacts on fundamental rights; civic discourse, electoral processes and public security; gender-based violence; the protection of public health and of minors; and serious negative consequences to a person’s physical and mental well-being (Article 34), and then to take measures to mitigate the identified risks (Article 35).

Additional obligations are imposed on “very large” online platforms in relation to advertising, including retaining data regarding the advert, its targeting and its reach for a year after its last publication (Article 39).

They are required to have their compliance with their obligations audited annually at their own expense (Article 37).

Digital Services Co-ordinators are granted rights to require “very large” online platforms and search engines to provide information regarding their compliance with the requirements of the Act (Article 40).

“Very large” online platforms and search engines are required to establish independent compliance functions to monitor compliance with the Act, with the head of the function being notified to the Digital Services Co-ordinator and the Commission (Article 41).

“Very large” online platforms and search engines must pay an annual supervisory fee to the Commission, which can itself institute investigations and exercise investigatory powers against “very large” online platforms and search engines and can ultimately impose fines (Articles 65 – 83).

Member States are granted the right to set their own penalties for non-compliance with the Act, subject to a maximum of 1% of the annual worldwide turnover of the provider of intermediary services concerned in the preceding financial year in respect of failure to comply with certain information obligations, and otherwise 6% of the annual worldwide turnover of the provider of intermediary services concerned in the preceding financial year (Article 52).
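
As a rough worked illustration of how those caps operate, the sketch below applies the 1% and 6% maxima to a preceding-year worldwide turnover; the turnover figure is entirely hypothetical and only the two percentage rates come from the Act.

    # Hypothetical illustration of the Article 52 maximum penalty caps.
    def max_penalties(annual_worldwide_turnover_eur: float) -> dict:
        return {
            "information_obligation_cap": 0.01 * annual_worldwide_turnover_eur,  # 1% cap
            "general_cap": 0.06 * annual_worldwide_turnover_eur,                 # 6% cap
        }

    print(max_penalties(2_000_000_000))
    # {'information_obligation_cap': 20000000.0, 'general_cap': 120000000.0}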

Failure to comply also gives rise to a right to compensation on the part of users where damage or loss is suffered as a result of an infringement (Article 54).

The Act grants the Commission the power to require “very large” online platforms and search engines to take certain actions in response to crises, which are defined as extraordinary circumstances leading to a serious threat to public security or public health in the EU or a significant part of it (Article 36).

The Commission also has a number of responsibilities in promoting voluntary standards (Article 44), the creation of codes of conduct (Article 45) (including in relation to online advertising (Article 46) and accessibility (Article 47)), and crisis protocols (Article 48).