Choking off harmful online pornographic content
“The Independent Pornography Review calls for the prohibition of certain forms of pornographic content, including incest content, and for the prohibition of, or the enforcement of the existing prohibition on, non-fatal strangulation or choking content.
Drawing on the approach to child sexual exploitation and abuse material, the proposals set out in the Independent Pornography Review would impose additional obligations on pornographic content producers, platforms and the wider ecosystem supporting them, including payments providers and financial institutions, expanding on the requirements of the Online Safety Act 2023.
Developers of generative AI models and apps would also be impacted by proposed AI regulation, including a ban on nudification apps and requirements to introduce safeguards.
Proposals to permit adult performers/creators to withdraw consent to the publication of material go further than existing legal entitlements.”
The Independent Pornography Review, launched in July 2023 and led by Independent Lead Reviewer Baroness Bertin, published its report ‘Creating a Safer World - the challenge of regulating online pornography’ on Thursday 27 February 2025.
The Review’s objectives as set out in its Terms of Reference included to “review obligations on online pornography providers, including new rules included in the Online Safety Act, and how this compares to existing physical and broadcast media regulation, to assess the case for options for aligning the online and offline regulation of pornographic content” and “what changes to enforcement or the criminal law may be needed”, with a particular focus on “the current regulatory landscape, including relevant regimes such as the Video Recordings Act 1984, the Communications Act 2003 and the Licensing Act 2003”.
In 2020, a literature review commissioned by the Government Equalities Office had concluded that “there is substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women”, although it recognised that “The evidence base cannot establish a direct causal link between the use of pornography and harmful sexual attitudes or behaviours” and “The nature and strength of this relationship varies across the literature, and there are many potential moderating (potentially even mediating) variables that require further investigation”.
How is pornography currently regulated in the UK?
The Obscene Publications Act 1959 makes it an offence to publish, whether by way of showing, playing, transmitting, selling, lending or otherwise distributing, any film, recording, written content, photograph, or other material that would “tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it”.
The Video Recordings Act 1984 creates various offences relating to the supply, or possession with a view to supply, of unclassified video recordings shown as moving pictures on a disc, tape or other storage device, and the supply of R18 rated video recordings other than in a licensed sex shop. The British Board of Film Classification (BBFC) classifies video recordings in accordance with the Video Recordings Act 1984 according to its categories of U, PG, 12A, 12, 15, 18 and R18. It also classifies video content distributed by certain video-on-demand (VOD) and streaming services under a voluntary self-regulatory licensing arrangement, and content distributed via UK mobile telephone networks under the voluntary BBFC Mobile Classification Framework. The R18 category relates to video recordings that may be shown only in specially licensed cinemas, or supplied only in licensed sex shops, and to adults only; it is intended for explicit pornographic works of consenting sex or strong fetish material involving adults which falls short of constituting a criminal offence. The BBFC may also determine that a video recording is unsuitable for classification. The BBFC Classification Guidelines suggest that this will be the case for video recordings containing:
material in breach of the criminal law, including material judged to be obscene under the current interpretation of the Obscene Publications Act 1959;
material (including dialogue) likely to encourage an interest in sexually abusive activity, which may include adults role-playing as non-adults;
the portrayal of sexual activity that involves real or apparent lack of consent;
any form of physical restraint that prevents participants from indicating a withdrawal of consent;
the infliction of pain or acts that are likely to cause serious physical harm, whether real or (in a sexual context) simulated;
penetration by any object likely to cause physical harm; and/or
sexual threats, humiliation or abuse that do not form part of a clearly consenting role-playing game.
Online pornographic content providers are not subject to these Guidelines.
The Licensing Act 2003 governs the exhibition of films in cinemas. Section 1(1)(c) and Schedule 1 provide that the provision of regulated entertainment, which includes the exhibition of a film in the presence of an audience for the purposes of entertainment, to members of the public or a section of the public, for consideration and with a view to profit, is a licensable activity. Section 20 provides that it is a mandatory condition of a licence that the admission of children be restricted in accordance with the recommendations of the BBFC or, where none are specified, the licensing authority.
Section 63(1) Criminal Justice and Immigration Act 2008 makes it an offence to possess an extreme pornographic image, other than one contained within a classified work. An image is caught where it:
(i) must reasonably be assumed to have been produced solely or principally for the purpose of sexual arousal;
(ii) portrays, in an explicit and realistic way, and such that a reasonable person looking at the image would think that any person or animal shown was real: an act that threatens a person's life; an act which results, or is likely to result, in serious injury to a person's anus, breasts or genitals; an act which involves sexual interference with a human corpse; a person performing an act of intercourse or oral sex with an animal (whether dead or alive); an act which involves the non-consensual penetration of a person's vagina, anus or mouth by another with the other person's penis; or an act which involves the non-consensual sexual penetration of a person's vagina or anus by another with a part of the other person's body or anything else; and
(iii) is grossly offensive, disgusting or otherwise of an obscene character.
Part 5 Online Safety Act 2023 imposes specific duties on publishers of regulated provider pornographic content with links to the UK, requiring them to ensure that children are not normally able to encounter such content on the service, by implementing highly effective age verification or age estimation (or both), and to maintain relevant records. The duty under s.81 Online Safety Act 2023 came into force on 17 January 2025. The regulator, Ofcom, published its Age Assurance and Children’s Access Statement on 16 January 2025 and is affording affected services until July 2025 at the latest to achieve full compliance, on the expectation that measures are already being implemented.
Non-consensual intimate image abuse (‘revenge porn’) and sexually explicit deepfakes
The Online Safety Act 2023 introduced amendments to the Sexual Offences Act 2003 to create offences, at sections 66A to 66D, of sharing, or threatening to share, a photograph or film of an individual in an intimate state where that individual does not consent and the sender does not reasonably believe that they consent, or where it is shared with the intention of causing alarm, humiliation or distress.
The offence under section 33 Criminal Justice and Courts Act 2015 of disclosing, or threatening to disclose, private sexual photographs and films without consent and with intent to cause distress was consequently repealed by the Online Safety Act 2023.
In September 2024, Baroness Owen introduced a Private Member’s Bill, the Non-Consensual Sexually Explicit Images and Videos (Offences) Bill (HL Bill 26), to introduce further offences into the Sexual Offences Act 2003 of taking, or soliciting the taking of, a non-consensual sexually explicit photograph or film, and of creating or soliciting a non-consensual digitally produced sexually explicit photograph or film.
During the House of Lords Report stage of the Data (Use and Access) Bill on 28 January 2025, following campaigning by Baroness Owen, the government committed to bringing forward further amendments at Third Reading to criminalise the creation of sexually explicit deepfakes. At Third Reading on 5 February 2025 the government did put forward an amendment to the Data (Use and Access) Bill to criminalise the non-consensual creation of purported intimate images of adults, but it was defeated when Baroness Owen secured: the provision’s extension to address the solicitation of such images; the removal of a “reasonable excuse” defence in circumstances where a defendant intentionally creates the purported intimate image, the victim does not consent and the defendant does not reasonably believe that the victim consents; and the amendment of the sentencing provisions to provide for a term of imprisonment in addition, or in the alternative, to the originally proposed fine.
The government introduced its Crime and Policing Bill in the House of Commons on 25 February 2025. Section 56 and Schedule 8 of the Bill would amend the Sexual Offences Act 2003 by creating new offences of taking or recording an image or film of an individual in an intimate state without consent, either with the intention of causing alarm, distress or humiliation or without a reasonable belief that they consent, and of installing equipment to commit such an offence.
What does Baroness Bertin’s Independent Pornography Review recommend for the regulation of online pornography?
The Review concluded that “The current criminal justice response is ineffective in tackling illegal pornography online” finding “the Obscene Publications Act 1959 and Criminal Justice and Immigration Act 2008 (which includes the possession of extreme pornography offence) to be largely ineffective in leading to the charging, and later prosecution, of those disseminating illegal pornographic content online”.
The Independent Pornography Review’s recommendations are concerned with protecting against violence against women and girls, seek to achieve parity between the offline and online regulation of pornography, and draw on the approach taken to the prevention of child sexual abuse and exploitation content. In so far as relevant to content regulation, online safety and the regulation of artificial intelligence, they include:
Either (i) a Safe Pornography Code should be introduced under the Online Safety Act 2023 (preferred) or (ii) a new publication offence should be enacted to prohibit degrading, violent and misogynistic pornographic content and content encouraging an interest in child sexual abuse (suggested to include content marketed as featuring ‘barely legal’ teens), supported by CPS guidance;
New duties should be incorporated into the Online Safety Act 2023 to require platforms to address harmful pornographic content in the same way as illegal content;
Fees should be levied on pornography companies under the polluter pays regulatory principle;
Platforms should be mandated to adopt specific safety-by-design measures, which it is suggested could include incorporating consent messaging in content, keyword monitoring for “problematic search terms” such as “young”, “girl” or “drunk” (a minimal illustration of such screening follows this list), algorithms which avoid the promotion of harmful content, and spending limits on pornographic content;
The criminalisation of the possession, distribution and publication of non-fatal strangulation or choking pornography under the Obscene Publications Act 1959;
The criminalisation of the ‘taking’ and ‘making’ of non-consensual real or deepfake intimate images;
Platforms should be required to implement education and awareness campaigns about intimate image abuse;
The establishment of a body to conduct content audits, to ensure platforms hosting pornographic content are tackling illegal and prohibited content effectively, with powers to expedite concerns to Ofcom for investigation and enforcement, with an expansion of the role of the BBFC suggested;
Design and implementation of an accreditation scheme indicating compliance with legal and regulatory requirements to tackle illegal and prohibited pornographic content online, which could support access to services and reduce the incidence of de-banking and financial exploitation or discrimination;
Industry should collaborate on a ‘watch-list’ of types of pornographic content, including step-incest content, which would be restricted or purposely made harder to find, for example through downranking or exclusion from the home page, so that it is only available to users who intentionally seek it out;
Measures should be implemented both within the industry and among its wider ecosystem to enable the swift removal of illegal and legal but harmful pornographic content, the latter being defined by reference to content that would not receive a BBFC classification for video release or cinema exhibition, with an expanded list of categories identified in the Report as including:
material (including dialogue) likely to encourage an interest in sexually abusive activity, which may include adults role-playing as non-adults;
the portrayal of sexual activity which involves real or apparent lack of consent and any form of physical restraint which prevents participants from indicating a withdrawal of consent;
the infliction of pain or acts which are likely to cause serious physical harm, whether real or (in a sexual context) simulated;
penetration by any object likely to cause physical harm;
sexual threats, humiliation, or abuse which do not form part of a clearly consenting role-playing game;
racist content;
content in respect of which a performer has withdrawn their consent;
A sanctions regime should be imposed under the Online Safety Act 2023 on the industry and its wider ecosystem in respect of failures to ensure the swift removal of illegal and legal but harmful pornographic content, with senior manager accountability and penalties including terms of imprisonment, with site removal or disruption where warranted, and with the suggestion that ancillary service providers could be expected to take commercial action ahead of regulatory or legal action;
The Advertising Standards Authority (ASA) and Committees of Advertising Practice (CAP) should review their approach to the regulation and enforcement of advertising on online pornography sites, to ensure that sites are not featuring prohibited adverts;
An ombudsman or commissioner should be established to receive reports of incidents of intimate image abuse and wider control, coercion, and trafficking in the pornography sector, and to provide support and mediate between victims, law enforcement, health and support services, as well as to spearhead research;
Consistent industry-wide safety protocols, processes, and safeguards should be designed and implemented by companies hosting pornographic content to ensure that all performers/creators are consenting adults aged 18 or over and have not been exploited or coerced into creating content, including, for example, written and documented proof of consent, age verification requiring multiple pieces of identification, and bank account checks;
Adult performers/creators should be entitled to withdraw consent for the publication of content in which they appear and to have content removed from sites without requiring any justification for the removal;
Industry should develop clear and standardised processes to facilitate content removals at the request of performers/creators;
Platforms hosting pornographic content should have robust protocols and processes to prevent and respond to stolen content, including easy reporting and removal tools, which could use the hashing and digital stamping methodologies employed to tackle child sexual exploitation and abuse material (an illustrative sketch of hash-based matching follows this list);
A legislative review of the regime governing illegal pornography online, including the criminal justice response, should be undertaken;
Incest porn (but not “step-incest” porn) should be banned by including it within the definition of extreme pornography under section 63 Criminal Justice and Immigration Act 2008, on the basis that it “could potentially encourage devastating attitudes and actions”, while acknowledging that “much of this evidence is anecdotal”;
Individuals who upload illegal pornographic content or child sexual abuse material (CSAM) should be prevented from uploading further user generated content (UGC);
Tools utilised in the fight against CSAM should be required to be used proactively by hosts of pornographic content to identify and remove intimate image abuse (IIA) content;
Government should urgently explore what proactive technology could be most effective to identify and tackle deepfake/AI-generated IIA and CSAM;
‘Nudification’ or ‘nudify’ apps should be banned, with device-level blocking if necessary to prevent download;
Developers of AI models should build safety mechanisms into AI tools that allow sexually explicit content, so as to ensure illegal pornographic content and CSAM is not created, including by recognising pornographic content and banning certain prompts;
The government should consider imposing requirements on developers of AI models to build in safeguards against illegal pornographic content and CSAM in future AI regulation legislation; and,
A single, focused online safety regulator should be empowered, and Ofcom’s capacity to undertake that role should be reviewed, assessing whether any legacy areas of Ofcom’s remit would be better suited to a different regulator, enabling it to focus on regulating the online landscape.
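The Review does not prescribe how keyword monitoring for “problematic search terms” would work in practice. Purely by way of illustration, and assuming a platform maintains its own configurable term list, a basic screening check might resemble the following minimal Python sketch; the term list, function name and flagging behaviour are hypothetical, and any real deployment would be considerably more sophisticated (phrase matching, misspellings, multiple languages and human review).

```python
import re

# Hypothetical, configurable list of monitored terms of the kind cited in the
# Review ("young", "girl", "drunk"); this is an illustrative placeholder only.
PROBLEMATIC_TERMS = {"young", "girl", "drunk"}

def screen_search_query(query: str) -> bool:
    """Return True if the query contains a monitored term and should be
    flagged for downranking, blocking or human review (a policy choice)."""
    tokens = re.findall(r"[a-z]+", query.lower())
    return any(token in PROBLEMATIC_TERMS for token in tokens)

if __name__ == "__main__":
    for q in ("amateur couple", "drunk girl at party"):
        print(q, "->", "flag" if screen_search_query(q) else "allow")
```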
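Similarly, the “hashing and digital stamping methodologies” referenced above are not specified in the Report. As a purely illustrative sketch, assuming a platform holds a list of hashes of content already confirmed as illegal or non-consensual, an upload check based on exact cryptographic hashing could be structured as below; the names KNOWN_CONTENT_HASHES and should_block_upload are hypothetical, and schemes used against child sexual exploitation and abuse material in practice rely on perceptual hashes (which survive re-encoding and resizing) shared between platforms, which is not shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical store of SHA-256 digests of content already confirmed as illegal
# or non-consensual; real-world matching typically uses perceptual hashing.
KNOWN_CONTENT_HASHES: set[str] = set()

def file_sha256(path: Path) -> str:
    """Compute the SHA-256 digest of a file in streaming fashion."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: Path) -> bool:
    """Return True if the upload matches a known hash and should be blocked
    and routed to the platform's removal and reporting processes."""
    return file_sha256(path) in KNOWN_CONTENT_HASHES
```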
Will the Government implement the Independent Pornography Review’s recommendations?
The Review took longer than anticipated to report and the government has, in the meantime, taken steps to progress certain measures now included in the Review’s Recommendations as set out above.
In its response to the Review the government, unlike Baroness Bertin, takes the view that graphic strangulation pornography is already unlawful, being contrary to the prohibition on extreme pornography at section 63 Criminal Justice and Immigration Act 2008, which prohibits the portrayal in an explicit and realistic way of any act which threatens a person’s life. It nevertheless acknowledges that this prohibition is not being enforced in practice and commits to “take urgent action to ensure pornography platforms, law enforcement and prosecutors” tackle “graphic strangulation pornography”, which it describes as an “increasingly prevalent harm”. Subject to that, in announcing the Report’s conclusions, the government committed to take forward the findings of Baroness Bertin’s review.
One proposal of particular interest is that adult performers should be entitled to withdraw their consent to the publication of pornographic content in which they appear, and for which, in the case of professionals, they will have been paid. This would go further than what is required by data protection legislation or the Ofcom Broadcasting Code. Where processing of personal data is based on consent, that consent should be capable of being withdrawn as easily as it was given. The special purposes exemption for processing for the purposes of journalism, art or literature at Schedule 2 Part 5 paragraph 26 Data Protection Act 2018 could apply, but this would require the controller to reasonably believe that publication would be in the public interest, which may invite some regulatory or judicial scepticism even having regard to the public interest in publication itself. Article 21 UK GDPR creates a right to object to the processing of personal data based on the legitimate interests lawful basis at Article 6(1)(f) UK GDPR, and requires the cessation of processing only where the controller cannot establish compelling legitimate grounds to continue. The right to object does not apply where the processing is necessary for the performance of a contract to which the data subject is party. Section 8 of Ofcom’s Broadcasting Code, applicable to broadcasters, requires that in the context of programme making any infringement of privacy must either be undertaken with the individual’s consent or otherwise warranted, and that if the individual withdraws their consent then filming, recording or live broadcasting must be stopped. The proposals would therefore have potentially significant financial implications for the industry and even for performers themselves.
Proposals to restrict legal but harmful content will have some of the greatest impact on bondage and discipline, dominance and submission, and sadism and masochism (BDSM) content.
The suggestion that allegations of non-compliance should be escalated to ancillary service providers, in the expectation that they will disrupt access to or the operation of services pending the conclusion of legal or regulatory investigations, for example by withdrawing payment services, is a concern and raises potential issues of defamation, contractual interference and breaches of human rights.
Other measures proposed could require user accounts and verification checks, which would raise significant cyber security and privacy concerns and require platforms to step up measures to protect both users and those appearing in the pornographic content they host.
Find out more about our Online Safety, Online Harms, Content Moderation and Content Regulation services.
Our Online Safety & Online Harms Resources page has links to relevant documents regarding the passage and implementation of the Online Safety Act 2023.
Find out more about our responsible and ethical artificial intelligence (AI) services.