How Wikipedia's challenge to the Online Safety Act makes a case for Outcomes-Based Influence
by Harry Sidnell on 29 Aug 2025
This blog explains the background to the UK High Court's recent decision in Wikimedia Foundation v Secretary of State for Science, Innovation and Technology, and outlines lessons for firms wishing to influence future policy (regarding the OSA and beyond). It does not constitute legal advice, but instead emphasises the advantage of consistent, politically conscious messaging when engaging legislators. The blog therefore focuses in particular on the first and second limbs of the claim, which were largely based on Ofcom's advice to the Secretary of State, advice itself informed by prior stakeholder engagement. Our conclusion, given Wikimedia's example, is that companies may be best served by showing, pre-emptively, how burdensome regulation can harm both their business and the public good, defined in political terms, rather than by making a purely technical argument about the letter of the law and then hoping for favourable advice from an already overstretched regulator.
Introduction
Earlier this month the High Court of England and Wales handed down its judgment in Wikimedia Foundation v Secretary of State for Science, Innovation and Technology [2025] EWHC 2086, in what is believed to be the first litigation brought in relation to the Online Safety Act 2023 (OSA). The claimants, Wikimedia (best known for hosting Wikipedia, the tenth-most visited website in the world) and a Wikipedia moderator, challenged Secretary of State Peter Kyle's decision to make Regulation 3 of the Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025. The challenge was brought on the following grounds:
- That the Secretary of State failed to comply with a duty imposed by Paragraph 1(5) of Schedule 11 to the OSA to take into account the “likely impact of the number of users of the user-to-user part of a service, and its functionalities, on the ease, speed and breadth of the dissemination of user-generated content”.
- The decision was irrational because it was based on flawed reasoning.
- The decision is incompatible with Articles 8, 10 and 11 of the European Convention on Human Rights (ECHR).
- The decision is incompatible with Art. 14 ECHR because it fails to distinguish between different categories of online provider[s].
Though the claimants were ultimately unsuccessful on each limb of the claim, the case highlights the importance of clear, outcomes-based engagement with regulators and government throughout the secondary-legislative process, and potentially gives preliminary clues about future secondary legislation made following statutory advice from a regulator.
The judgment
Legislative background
The OSA received Royal Assent in 2023. It imposes a range of legal duties on social media companies and search services to protect their users from illegal content and content harmful to children. Platforms are to be categorised as Category 1, 2A or 2B services, depending on the likelihood of content hosted on the platform “going viral”. Duties on Category 1 services include a duty to give users a choice about whether to verify their identity and about what type of content they see, including whether they are able to see content from unverified users, as well as a duty to protect free speech.
Within six months of the OSA’s coming into force, the UK government’s Office of Communications, or Ofcom, was required to research the following:
- “How easily, quickly and widely regulated user-generated content is disseminated by means of user-to-user services;
- The number of users and functionalities of such user-to-user services; and
- Such other characteristics of that part of such services [...] considered to be relevant to the Category 1 thresholds.”
Ofcom was then required to advise the Secretary of State for Science, Innovation and Technology regarding suitable criteria for designating Category 1 services — and regarding relevant regulations which the Secretary could then make. The regulations were required to relate to:
- “The number of users of the service
- Functionalities of that part of the service
- Any other characteristics of that part of the service [...] that the Secretary of State considers relevant”
In making the regulations defining the criteria for categorisation, the Secretary of State was obliged by Paragraph 1(5) of Schedule 11 to the OSA to take into account the “likely impact of the number of users of the user-to-user part of a service, and its functionalities, on the ease, speed and breadth of the dissemination of user-generated content”. Regulation 3, which came into force at the end of February 2025, described the thresholds for Category 1 services thus:
(1) The Category 1 threshold conditions are met by a regulated user-to-user service where, in respect of the user-to-user part of that service, it —
(a)
(i) has an average number of monthly active United Kingdom users that exceeds 34 million, and
(ii) uses a content recommender system, or
(b)
(i) has an average number of monthly active United Kingdom users that exceeds 7 million,
(ii) uses a content recommender system, and
(iii) provides a functionality for users to forward or share regulated user-generated content on the service with other users of that service.
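The two-limbed structure of Regulation 3 can be expressed as a simple predicate. The sketch below is purely illustrative; the class, function and field names are our own and do not appear in the Regulations, which of course also contain definitional detail (e.g. how "monthly active UK users" is measured) that is omitted here:

```python
from dataclasses import dataclass

@dataclass
class Service:
    """Illustrative model of a regulated user-to-user service.

    Field names are our own shorthand, not terms drawn from the Regulations.
    """
    monthly_active_uk_users: int       # average monthly active UK users
    uses_content_recommender: bool     # uses a content recommender system
    has_sharing_functionality: bool    # lets users forward/share regulated
                                       # user-generated content on the service

def meets_category_1_threshold(s: Service) -> bool:
    # Condition (a): very large user base plus a content recommender system
    condition_a = (s.monthly_active_uk_users > 34_000_000
                   and s.uses_content_recommender)
    # Condition (b): smaller (but still large) user base, plus BOTH a content
    # recommender system and a forward/share functionality
    condition_b = (s.monthly_active_uk_users > 7_000_000
                   and s.uses_content_recommender
                   and s.has_sharing_functionality)
    return condition_a or condition_b
```

As the logic makes plain, a service with more than 7 million (but fewer than 34 million) monthly UK users is only caught if it combines a recommender system with a sharing functionality, which is why the interaction between these two functionalities became central to the claim.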
Ofcom’s research and advice
Ofcom based much of its research on the functionalities by which user-generated content can be disseminated. One of the functionalities it focused on was “content recommender systems”; another was the ability to forward and reshare content (herein “sharing functionalities”). It identified these two functionalities as systems that operate to “increase dissemination of content easily, quickly and widely … [A]dditionally, the effects of these features are likely to be increased further as the user base increases and when these features operate in combination” (emphasis added).
Ofcom provided its mandatory advice to the Secretary of State on 29 February 2024. At [41], Johnson J summarised two key points of this advice:
- “Where services have a very large number of users, a content recommender system alone is sufficient for content to be disseminated easily, quickly and widely [or];
- Where services have a lower but still considerable number of users, a content recommender system alone may not be sufficient to disseminate content quickly, easily and widely.”
Thus, Ofcom recommended two sets of thresholds for prescribing Category 1 services: either services with a large user-base which use a content recommender system, or services with smaller user-bases which use both content recommender systems and sharing functionalities.
The Secretary of State’s decision
On 18 March 2024, the (then) Secretary of State was provided with a Submission making “it clear that Category 1 duties were not primarily aimed at pornographic content or the protection of children [but at capturing] services that have a significant influence over public discourse.” Officials had further discussions with Ofcom, who advised that there “was very little information available on how content recommender systems work across different types of service, and that it would be extremely difficult to make robust regulations that differentiated between different types of service.” Ofcom and the Department for Science, Innovation and Technology (DSIT) continued to hold further talks regarding the categorisation process: a workshop was attended by officials from both organisations on 22 May 2024, whilst a range of options for excluding less-risky platforms from Category 1 designation were explored throughout the 2024 election period. However, officials ultimately concluded that, for Ofcom to conduct the further research on content recommender systems needed to avoid the risk of introducing unintended loopholes into the categorisation process, there would be a “significant delay” in the making of the regulations.
On 18 July 2024, the new Secretary of State was first provided with a Ministerial Submission in relation to the regulations. He was informed that several firms, including Wikimedia, had criticised Ofcom’s proposed thresholds for prescribing Category 1 services, and of his statutory requirement to take into account the likely impact on viral dissemination of the number of users of the service, and the functionalities of the service, as well as any other characteristics or factors he considered relevant.
On 31 July 2024, a further Ministerial Submission recommended following Ofcom’s advice; the Minister “reluctantly” agreed, admitting “our hands are largely tied by the constraints of the Act”. On 17 September 2024, the Secretary of State agreed with the recommendation to approve Ofcom’s proposed conditions. In its evidence, Ofcom stated that it did not, as part of its statutory research, provide advice about the specific impact of each examined functionality, but instead undertook its research at a high level of generality. This generality was communicated to the Secretary of State, such that he “was not under a misapprehension that content recommender systems worked in the same way across different types of service, or that they affected viral dissemination to the same degree.”
The claimants, therefore, made the following submissions:
- “The Secretary of State failed to consider the full likely impact of content recommender and sharing functionalities on viral dissemination, as required by Paragraph 1(5) of Schedule 11. Ofcom’s advice, and therefore [the decision], was based on an erroneous assumption that such functionalities are integral to a service and always interoperate, excluding consideration of circumstances where they are used for ancillary purposes or not in conjunction with one another”. [75]
- “Regulation 3 captures services with content recommender systems and sharing functionalities, even if those functionalities are not integral to the service and therefore do not give rise to a risk of viral dissemination”. [77]
In its evidence, Wikimedia described how its content recommender systems are generally used for the purposes of moderation and editing and are ancillary to the core function of the Wikipedia platform. This position was uncontested.
Decision – Ground 1
The first limb of the claimants’ argument was that the Secretary of State failed to comply with [the] duty imposed by Paragraph 1(5) of Schedule 11 to the OSA to take into account the “likely impact of the number of users of the user-to-user part of a service, and its functionalities, on the ease, speed and breadth of the dissemination of user-generated content”.
In granting the claimants permission to claim judicial review on this ground (but then dismissing the claim), Johnson J referred to Ofcom’s advice to the Secretary of State and to the content of the various meetings between the regulator and DSIT officials. He was satisfied that the Ministerial Submissions and further contact between Ofcom and DSIT had equipped the Secretary of State with an “appreciation of the benefits and risks of not accepting Ofcom’s advice, and/or seeking further advice”. He was also content that the Secretary of State appreciated that advice regarding the impact of content recommender systems and sharing functionalities was being provided “at a high level of generality”.
The claimants further argued that Ofcom’s advice “necessarily assumed that the two functionalities would be integral to the service and would operate and interact to disseminate content to users”, with the effect that the Secretary of State “restricted himself to consideration of the impact of content recommender systems and forward and share functions which are integral to a service, and which operate in conjunction with each other.” However, Johnson J was not persuaded that this amounted to a breach of the Secretary of State’s duties under Schedule 11 to the OSA. Instead, he wrote, Ofcom was required to undertake an “indicative assessment” of the effect of various functionalities on the viral dissemination of content. At no point in compiling its research did Ofcom purport to “comprehensively cover all permutations” of the types and potential uses of content recommender systems or sharing functionalities, an understanding which was reflected in Ofcom’s advice to the Secretary of State. The Secretary of State was only required to assess the “likely impact” (emphasis added) of functionalities generally on content dissemination; in Johnson J’s view, “nothing in the statutory language implies that the Secretary of State is required to consider each different functionality of each different service in each different sector”. Thus, this ground of the claim was dismissed.
Decision – Ground 2
The second limb of the claimants’ argument was that the decision [to make the regulations] was irrational because it was based on “flawed reasoning”. As with Ground 1, Johnson J granted permission to claim judicial review on this ground, and then dismissed the claim. The decision to dismiss this ground was primarily based on the fact that neither Wikimedia nor, indeed, any other service has yet been categorised under the OSA, and that the Secretary of State retains powers both to exempt descriptions of user-to-user services, if he considers those services’ risk of harm [to the public] to be low, and to amend the regulations at a later date. According to Johnson J, when the Secretary of State was making the regulations, “it was impossible to know, on the information available, precisely what the impact would be on individual services [and] there was an opportunity later to amend the regulation, or to exempt particular types of service, if it turned out that the regulation had undesirable consequences”. In the absence of “obvious alternative criteria” capable of predicting the impact of categorisation on each individual service, the Secretary of State’s decision to accept Ofcom’s advice and make the regulations was not irrational.
In short, then, Grounds 1 and 2 of Wikimedia’s claim were dismissed, as Johnson J was not persuaded the Secretary of State had failed in his statutory duty to consider the likely impact of the functionalities examined by Ofcom on viral dissemination. Although Wikipedia’s use of the functionalities described is unequivocally ancillary to its core purposes, the Secretary of State’s duty in making the regulations was not to consider the potential impact of categorisation on all platforms — but rather to consider Ofcom’s advice and act on the basis of the likely effect of content recommender systems and sharing functionalities, at a high level of generality. The judge was satisfied that the Secretary of State had complied with his statutory duties and had acted in a rational manner based on the information available to him at the time.
However, in making his concluding remarks, Johnson J stressed that his judgment “does not give Ofcom and the Secretary of State a green light to implement a regime that would significantly impede Wikipedia’s operations”, and that a later decision by Ofcom to categorise Wikipedia as a Category 1 service could still be challenged by means of judicial review. The claimants, therefore, still have time to persuade Ofcom or the Secretary of State that Wikipedia should not be categorised as a Category 1 service, alongside other platforms that may wish to make representations regarding their own categorisation. It is therefore likely that this case is far from the last act in the ongoing drama of the categorisation thresholds, let alone regarding the OSA in general. Platforms such as 4Chan have already stated their intent not to comply with the Act, highlighting the difficulty Ofcom may face in enforcing its provisions and associated fines on services not domiciled in the UK, whilst others have pointed to the increased use of VPNs by consumers in response to the introduction of the OSA and the difficulties this creates for platforms. Thus, Ofcom and DSIT, and potentially the courts, will likely have to contend in the coming months and years with a wide range of evolving technical concerns with the OSA.
We now turn to consider how the case highlights the importance of outcomes-based engagement with regulators, both in respect of future implementation of the OSA and in shaping public policy more generally.
Lessons for engagement
It is unknowable, particularly without access to detailed records of communications between officials and businesses, whether the High Court’s recent decision might have been different had Ofcom’s engagement with industry taken a different path. It is possible that Ofcom, given the resources, time and information available to it, could only ever (sensibly) have arrived at its eventual conclusions as to the risk posed by content recommender and sharing functionalities. It is equally impossible to predict the outcome of future litigation on the OSA or of challenges to any other secondary legislation. And yet Wikimedia’s judicial review of the categorisation regulations is potentially instructive of the importance of outcomes-based engagement with regulators.
As we have seen, Johnson J’s decision to dismiss the claimants’ argument on the grounds of rationality was largely founded on the fact that no decision on categorisation has yet been made; and it was buttressed by the reminder that Wikimedia could potentially have a right to future legal action if Ofcom “impermissibly” categorises Wikipedia as a Category 1 service. Additionally, the Secretary of State retains the power to amend the regulations and/or exempt services from their scope, whilst Ofcom could elect to conduct further research into functionalities affecting viral dissemination. Hence, from a lobbying perspective, Wikimedia (and other platforms with an interest in the OSA) would likely benefit from proactively seeking engagement opportunities with both regulators and government officials. Whilst platforms may be inclined to emphasise the technical aspects of how they use content recommender systems and sharing functionalities, Ofcom has already described these systems as “opaque” and prone to change. Further, the increasingly apparent need for Ofcom to better understand the long-term impacts of public reaction to the OSA, such as increased VPN adoption, will likely limit the time the regulator can spend on technology- and platform-specific investigations. Therefore, using Wikimedia’s case as an example, a better strategy for platforms might be to broadly emphasise how excessively burdensome regulation could impinge on Wikipedia’s business operations despite its unequivocally low-risk status, and the role Wikipedia plays in upholding free speech and access to information. Given the current government’s manifesto pledge to streamline regulation and place growth at the centre of its political agenda, it would be a political and public-relations embarrassment for a platform of such global renown as Wikipedia to be forced to limit its UK operations because of its categorisation under the OSA, even if this did prove legally sound.
Sometimes the political, rather than (merely) legal, front is more promising.
The importance of outcomes-based engagement is not unique to this case. The imposition on a regulator of a duty to conduct research, with a view to informing regulations to be made under an overarching piece of primary legislation, is a tried-and-tested method of policymaking in the UK: the Communications Act 2003 and the Gambling Act 2005 imposed such duties on Ofcom and the Gambling Commission, respectively. It is therefore feasible that future UK secondary legislation on the OSA will follow this basic pattern: i.e., asking a regulator to consult, then using that consultation to provide advice to the Secretary of State, who then bases regulations on that advice. Often faced with a chorus of competing voices, opposing technical positions and limited time, it is likely that regulators, as with Ofcom in recommending categorisation thresholds under the OSA, will have to provide advice at a high level of generality. The uncertainty in waiting for binding decisions or for legal challenges to be settled can hurt business. Hence, companies facing a very real prospect of financial or operational harm may be better served by tying the outcomes they face to a government or regulatory priority, rather than simply stating their technical case and hoping the regulator’s advice does not leave them with no options but legal action or last-ditch lobbying efforts.
Inline is well-placed to connect businesses with regulators in a range of traditional and emerging technology sectors, leveraging the expertise of its staff to not only understand and advance technical concerns to government and regulatory stakeholders but to situate client concerns in the overarching political context and to establish longstanding relationships that survive across changing administrations. If you are interested in engaging with Ofcom in relation to the OSA, or in pre-emptively building relationships with regulators in your sector, we would be happy to help.
Topics: UK politics, Online Platforms, Technology, Innovation, Online Safety Act
Written by Harry Sidnell
Harry provides policy analysis, monitoring and advice to tech clients from Inline’s London office. Before joining Inline, he worked in the analytical department of a major outsourced insurance buyer and as a legal researcher with the Gatehouse Chambers Construction Team. Harry holds a Bachelor of Laws (LLB) from the University of York.