Online Safety, Content Moderation & Regulation
Safe-harbour provisions intended to protect online hosts and other intermediaries from liability for third-party and user-generated content are being gradually chipped away. Liability for data protection breaches, defamation and other unlawful content can now fall on organisations that fail to monitor and moderate content appropriately, or that do not respond swiftly and effectively when faced with a complaint.
First the EU’s Digital Services Act and now the UK’s Online Safety Act 2023 have expanded the range of organisations that need to take these issues into account and increased the obligations on those affected, bringing additional regulation, complexity and cost.
We advise organisations on their obligations, help them put appropriate procedures in place, and guide their responses to specific decisions and complaints, balancing competing principles and rights.
We also work with organisations on what forthcoming legislation could mean for them, and support them in their lobbying efforts.
Access our Online Safety and Online Harms Resources page.
Follow our dedicated online safety Twitter/X account.
The new Labour government today (23 October 2024) introduced the Data (Use and Access) Bill in the House of Lords, the latest attempt to reform the UK’s data protection regime as set out in the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications Regulations.