Letter to Ofcom Regarding the Online Safety Act
This letter (lightly edited for formatting and clarity) was sent to Ofcom's Online Safety Team on 2025-01-27 to clarify open questions regarding the Online Safety Act (OSA). The OSA appears to pose significant hazards to community forums, Fediverse instances, online art, and web infrastructure—for instance, hosting providers and object storage services like S3. Woof.group is trying to figure out whether we are a “regulated service” under the OSA, and if so, how to comply. Ofcom responded on 2025-02-05.
Dear Ofcom Online Safety Team,
I write with follow-up questions regarding the Online Safety Act and how it might pertain to Woof.group, a Mastodon instance of approximately seven hundred users. I have spent much of my recent nights and weekends reviewing hundreds of pages of Ofcom's guidance, the text of the act itself, and consulting with people in the UK in an attempt to understand what our risks might be and how we ought to comply. I have many questions which the guidance has yet to clarify, and would appreciate your specific insight. Your answers would also be of use to tens of thousands of other Fediverse operators who now find themselves in similar positions.
1. What constitutes a “significant number” of UK users?
Following further discussions with individuals more familiar with UK law, I am no longer certain what “a significant number” means. Woof.group has ~740 monthly active local users—that is to say, people with an account on woof.group itself. We have 1531 local users with logged IP addresses. Of those, 185 have at least one IP which resolves to a “GB” country code. There are likely some UK people who access woof.group content via RSS readers, the public web interface, the API, or via federation. For a number of reasons (e.g. automated crawlers, NATs, the difficulty of mapping requests to human beings, federation dynamics, etc.) we have no meaningful way to measure how many UK people fall into this category.
In Ofcom's judgement, does this constitute “a significant number”?
2. Do non-commercial, public-interest services have “target markets”?
Woof.group does not sell anything to users, or third parties, and does not run advertisements. We provide a free community service, and are supported entirely by donations denominated in US dollars. We are a US corporation, and until now, have had essentially nothing to say regarding the UK. I believe, and would like Ofcom to confirm, that the UK does not form “one of the target markets” for Woof.group.
3. Does pornography, or “extreme pornography”, constitute a “material risk of significant harm” to individuals in the UK?
I firmly believe that our user-created pornography poses little harm to anyone. However, the text of the OSA repeatedly refers to pornography as “harmful to children”, and Ofcom goes to great lengths to ban access to it. In Ofcom's judgement, does the presence of pornography on a service (which is, we stress, explicitly oriented towards 18+ individuals) pose “a material risk of significant harm”, for purposes of evaluating whether a service is regulated under the OSA? If not, does “extreme pornography” bring a service under scope?
4. What exactly is “extreme pornography”?
The OSA requires that “extreme pornography” be “grossly offensive” or “obscene”, which are “given [their] ordinary meaning[s]”. We are unsure how to establish whether material is grossly offensive or obscene, and would appreciate concrete examples, ideally with reference images.
We are aware that up until recently, physical restraint, watersports, female ejaculation, fisting, spanking, caning, aggressive whipping, and verbal or physical abuse, regardless of consent, were considered illegal pornography in the UK. Does Ofcom intend to ban these kinds of acts again? If not, which acts specifically?
The ICJG defines “extreme pornography” to include any act “which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals”. Does this include fisting? Sounding? Piercing? Punching? Branding? Cutting? Ball beating? Where does the threshold for “serious injury” fall—light or heavy bruising? Abrasion? Swelling? Needle punctures? “Hamburger back”? Scarring? As SM practitioners, we are used to doing painful, scary-looking things to breasts, anuses, and genitals which are nevertheless quite safe and recoverable. However, we are unclear as to how Ofcom and UK courts would interpret those acts.
The ICJG also bans “acts which depict physical endangerment with a material risk of death” including “hanging, strangulation, suffocation and causing life threatening injury.” Is an image of a man's gloved hands over the mouth and nose of another “suffocation”? What about gripping the neck firmly, a tightened collar, or a headlock? An oven bag over the head? Leatherfolk know that these acts and images of them can be arranged with essentially no risk of death. Will Ofcom and the courts agree?
Would, for example, Sotheby's images of Robert Mapplethorpe's “X Portfolio” be considered grossly offensive or obscene? Are its images of scars, needles through nipples, mouth-filling gags, fisting, sounding, and bloody genitals, “extreme pornography”? Will Ofcom fine Sotheby's for publishing this media? Can they still offer it for auction?
5. How are we to interpret reports of non-consensual imagery?
Paragraph 10.17 of the ICJG indicates that “[a] user report which suggests that the content depicts non-consensual acts should be taken as reasonable grounds to believe that there was no consent, unless there is evidence to the contrary.” Suppose a user has consensual penetrative sex, uploads a photograph of that sex to Woof.group, and an anonymous user from (e.g.) another Mastodon instance files a moderation report stating “this is not consensual”. How do moderators establish whether there is evidence to the contrary? Is a statement from the user who posted the image sufficient? If not, does this provide a harassment vector by which anyone can take down posts or accounts by flooding the provider with reports that material is non-consensual?
6. Must services which allow pornography take down public endpoints?
Most Mastodon instances, like Woof.group and Mastodon.social, allow pornography. Almost without exception, these instances also come with public HTTP endpoints which do not require any kind of login or session to access. As shown in the attached image, there are public web pages, RSS feeds, and API endpoints by which users could access material. These users might be under 18 years of age—as with any public HTTP endpoint, there is simply no way to know. Are instances which allow pornography required to take down their public web pages, RSS feeds, and APIs?
7. Must services which allow pornography deny federation?
Woof.group, like most Mastodon servers, is a part of the Fediverse: a loose network of tens of thousands of instances, each run by different people, which exchange content via the ActivityPub protocol. These instances run all kinds of software: Mastodon, Pixelfed, Akkoma, Lemmy, and so on. Just like email, users on one instance can follow, view posts from, and interact with users on other instances.
Even if an instance were to perform age assurance on its own users, it has no way to establish whether users on other instances are also adults. As shown in the attached image, a 17-year-old user of mstdn.ca could see content from users on mastodon.social, which allows pornography. In this scenario, whom does Ofcom hold responsible?
A 17-year-old person in the UK could run their own server; the technical skills required are within the reach of many teens. Imagine a UK 17-year-old were to run an instance for themselves in a datacenter in Brazil. Through that single-user instance, they follow a user on Woof.group, and see pornography. Is the 17-year-old liable for having run their own service which allowed them to view porn? Is Woof.group liable for having allowed another instance—which we have no knowledge or control over, and which appears to be located outside the UK—to federate with us?
In order to robustly deny access to children, it would seem that Fediverse instances which allow pornography must shut down federation altogether. What exactly is Ofcom's position on federation?
8. Does Ofcom expect public object stores like AWS S3 to ban pornography?
Object storage services like S3 or DigitalOcean Spaces offer companies and individuals the ability to store and, via public HTTP endpoints, access arbitrary files. Under the OSA, these appear to be file-sharing services with significant links to the UK. They are also used by many online services (including Woof.group) to store images and video, including pornography. For example, here is one of Mapplethorpe's photographs of fisting, available to any web browser, hosted on AWS S3.
Does Ofcom consider S3 (et al) to be in violation of the OSA? If so, how does Ofcom propose AWS respond? Requiring S3 to perform age assurance would break a huge part of the web: S3's public endpoints have no concept of user accounts, so both S3 and its users would be forced to fundamentally redesign. Banning object stores at the network level in the UK would also render a good swath of the web unusable. Banning pornography would require S3 to identify and remove it. What is Ofcom planning to do?
9. Will community services be given notification and a chance to respond prior to fines and imprisonment?
A good number of sites are planning to shut down due to Ofcom's threats to levy fines of up to £18 million, or to imprison operators. I ask again: can Ofcom promise to contact operators, and give them a chance to respond, before imposing crushing penalties? Doing so would significantly lower all of our stress levels.
10. How does Ofcom intend to enforce the OSA against US corporations and citizens?
Publishing pornography is constitutionally protected speech in the US, but Ofcom has been clear that it expects “overseas ... services run by micro-businesses” to be affected by (e.g.) age assurance rules. What would happen if Woof.group, as a US company, hosted in the US, with US owners, failed to comply with Ofcom's rules? How does Ofcom plan to enforce rulings against US entities?
As I am now orders of magnitude beyond the suggested four hours it should take to understand compliance obligations, I would very much appreciate your help.
Perplexedly yours,
—Kyle Kingsbury
Assessing the Online Safety Act
Ofcom, the UK regulator in charge of implementing the Online Safety Act (OSA), recently released new guidance for service providers. It is unclear how Ofcom intends to enforce the OSA against small communities like Woof.group. The potential costs of non-compliance are severe, and Woof.group does not have the funds to mount a legal defense should Ofcom choose to prosecute. We may be blocked by, or forced to block, the UK.

The OSA has already triggered the shutdown of several community forums. Woof.group is at particular risk because we allow users to share and view a broad variety of content, including advanced mathematics, sewing, and pornography. We may also run afoul of the OSA's prohibition of “extreme pornography”. If the OSA applies to Woof.group, we believe most Fediverse instances will also be at risk.
Is Woof.group in Scope?
Woof.group is based in the US. However, the OSA covers service providers worldwide if they “have links with the United Kingdom”. The most relevant category for Woof.group, a “Part 3 service”, is defined in part 2, section 4:

(5) For the purposes of subsection (2), a user-to-user service or a search service “has links with the United Kingdom” if—

(a) the service has a significant number of United Kingdom users, or

(b) United Kingdom users form one of the target markets for the service (or the only target market).
Woof.group has ~740 monthly active local users—that is to say, people with an account on Woof.group itself. We have 1531 local users with logged IP addresses. Of those, 185 have at least one IP which resolves to a “GB” country code. There are likely some UK people who access Woof.group content via RSS readers, the public web interface, the API, or via federation. For a number of reasons (e.g. automated crawlers, NATs, the difficulty of mapping requests to human beings, federation dynamics, etc.) we have no meaningful way to measure how many UK people fall into this category.

Are 185+ people, or on the order of 12% of our registered users, “significant”? We have no idea. However, Ofcom indicates that the OSA will cover “overseas ... services run by micro-businesses”.
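For the curious, the GB figure comes from a count along roughly these lines. This is a minimal sketch, assuming the free MaxMind GeoLite2 country database and the geoip2 Python package; the input file name and its one-user-and-IP-per-line layout are illustrative stand-ins, not our actual schema.

```python
# Minimal sketch: count local users with at least one logged IP that
# geolocates to "GB". Assumes the free MaxMind GeoLite2-Country database
# and the geoip2 package; "user_ips.csv" (one "username,ip" pair per line)
# is an illustrative stand-in for our actual logs.
from collections import defaultdict

import geoip2.database  # pip install geoip2
import geoip2.errors

users_by_country = defaultdict(set)

with geoip2.database.Reader("GeoLite2-Country.mmdb") as reader, \
        open("user_ips.csv") as f:
    for line in f:
        user, ip = line.strip().split(",")
        try:
            country = reader.country(ip).country.iso_code
        except geoip2.errors.AddressNotFoundError:
            continue  # private ranges, unmapped addresses, etc.
        users_by_country[country].add(user)

print(f"{len(users_by_country['GB'])} users with at least one GB IP")
```

Even this count says nothing about readers who never log in, which is the larger measurement problem described above.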
We believe the UK does not form a “target market” for Woof.group. Woof.group does not sell anything to users or third parties and does not run advertisements. We provide a free community service, and are supported entirely by donations.
(6) For the purposes of subsection (2), a user-to-user service or a search service also “has links with the United Kingdom” if—

(a) the service is capable of being used in the United Kingdom by individuals, and
(b) there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the United Kingdom presented by—
(i) in the case of a user-to-user service, user-generated content present on the service or (if the service includes a search engine) search content of the service;
We are unsure whether Woof.group meets this test. Like almost every web site, we are capable of being used by individuals in the UK. We do not believe our content poses a “material risk of significant harm” to anyone. However, the Online Safety Act goes to great lengths to define pornography (section 61.2) as “harmful to children”. Is that risk “material”, and is the harm of pornography “significant”? We don't know.

Age Verification
Per Ofcom's Statement on Age Assurance and Children's Access:

[A]ll services which allow pornography must have highly effective age assurance in place by July 2025 at the latest to prevent children from accessing it. The Act imposes different deadlines on different types of provider. Services that display or publish their own pornographic content, including certain Generative AI tools, must begin taking steps immediately to introduce robust age checks. Services that host user-generated pornographic content must have fully implemented age checks by July.
Earlier this month, Woof.group wrote to Ofcom to request clarification about legal risks and whether Ofcom was likely to enforce the OSA against small web forums. Ofcom's response reiterated this guidance:

Please note that if your service allows pornography, it will need to use highly effective age assurance to prevent children from accessing that content.
Woof.group does not allow members under 18 years of age. We take aggressive moderation action, including referral to law enforcement, for instances of CSAM. However, performing the “highly effective age assurance” required by the OSA is essentially impossible. Mastodon, the software we run, has no concept of age verification. We would need to pay a third-party service to perform age verification of our users, fork Mastodon, disable federation and all public interfaces, integrate that age verification service, and test and maintain our fork indefinitely. Woof.group runs on a shoestring budget and volunteer hours. We do not have the funds, technical expertise, or engineering time to enact these measures. Maintaining a fork would make it significantly harder to deploy upstream security patches.

Even if we could comply, doing so would break functionality our members and friends on other instances depend on. Nor will Woof.group ban pornography: we believe in the expression of sexuality, and in particular queer leathersex, as a public good. Our paintings, our family portraits, our videos of floggings and marches, our thirst traps and sex work are at the vital heart of queer culture. Woof.group exists to make these things possible, not to repress them.
Extreme Pornography
The OSA also proscribes “extreme pornography”. Until recently, the UK banned a long list of our favorites, including spanking, caning, aggressive whipping, physical or verbal abuse (regardless of consent), physical restraint, female ejaculation, watersports, and fisting. Ofcom's Illegal Content Judgement Guidance defines extreme pornography on pages 158-160. It forbids realistic, “grossly offensive, disgusting or otherwise of an obscene character” images of certain types of acts. “Grossly offensive” and “obscene” are “given [their] ordinary meaning[s]”. We do not know how to apply these terms practically.

Proscribed acts include any “which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals”. It is unclear what constitutes a serious injury. Is a chest-punching or branding scene against the rules? CBT? How will a UK court evaluate the safety of fisting?
The guidance also proscribes acts which threaten a person's life:
‘Acts which threaten a person’s life’ mean acts which depict physical endangerment with a material risk of death. Non exhaustive examples include explicit and realistic hanging, strangulation, suffocation and causing life threatening injury.
It is perfectly safe to press a hand over someone's mouth for the five seconds it takes to snap a photograph. Does the resulting image depict “suffocation”? Leatherfolk would likely judge such an image to have a vanishingly small chance of death. Will Ofcom and the courts rule similarly?

The guidance also bans images of non-consensual penetration. How do providers establish a lack of consent? Paragraph 10.17 explains that any user report qualifies:
A user report which suggests that the content depicts non-consensual acts should be taken as reasonable grounds to believe that there was no consent, unless there is evidence to the contrary.
This raises the prospect that images of events which were consented to in real life, and which a reasonable leather audience would implicitly understand as consensual, could be judged non-consensual on the basis of an anonymous third-party report. This seems a clear vector for harassment. How are moderators to establish evidence to the contrary? Can we simply ask the poster if the media in question was actually consensual? Do we need to secure affidavits from anyone who posts images of penetrative sex?

Can Fediverse Instances Even Comply?
The Online Safety Act assumes services know which human beings are using their services, where they are located, and can control their access. Services which require a login to view content can sort of verify the ages of logged-in users. This is how the Online Safety Act imagines a “child” (legally, any person under 18) might access pornography hosted on a Mastodon instance, like mastodon.social:
One might imagine mastodon.social (or any other instance) could comply with the OSA's regulations on pornography by forcing logged-in users to go through an age-assurance process, somehow finding a way to pay for it, defining a new kind of metadata on posts which identifies pornography (or other content “harmful to children” under the OSA), figuring out a way to federate that metadata via ActivityPub, forcing local and remote users to tag their content appropriately (an enormous moderation burden)—look, we all know this is not realistic, but someone might at least pretend.
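To be clear about how small a piece of that scheme the gate itself would be, here is a purely hypothetical sketch of the check. Every name in it is invented for illustration; Mastodon has no age-verification field, no federated pornography flag, and no assurance-provider integration.

```python
# Purely hypothetical sketch of the easy part of such a scheme: refusing to
# serve a post flagged as pornographic to a viewer who has not passed age
# assurance. Every field here is invented; Mastodon has no such metadata,
# no federated pornography flag, and no age-assurance integration.
from dataclasses import dataclass


@dataclass
class Viewer:
    logged_in: bool
    age_verified: bool  # would require a paid third-party assurance provider


@dataclass
class Post:
    body: str
    pornographic: bool  # would require new, federated, user-applied metadata


def may_view(viewer: Viewer, post: Post) -> bool:
    """Hypothetical gate: pornographic posts only for age-verified viewers."""
    if not post.pornographic:
        return True
    return viewer.logged_in and viewer.age_verified


# The gate is trivial; paying for assurance, tagging every post correctly,
# and federating those tags are the parts nobody has a realistic plan for.
print(may_view(Viewer(logged_in=True, age_verified=False),
               Post(body="…", pornographic=True)))  # => False
```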
Even if this were possible, this is not how Fediverse instances work. Instances generally have public web pages for much of their content, which can be viewed by anyone regardless of whether they have an account. There are RSS feeds which anyone can subscribe to; again, no login required. There are also public API endpoints which anyone on the internet could use to request content.
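To make “no login required” concrete, here is a minimal sketch using the requests library. It fetches an instance's public timeline over Mastodon's REST API and an account's RSS feed with no credentials at all; the account name is hypothetical, and some instances do choose to disable unauthenticated API access.

```python
# Minimal sketch: Mastodon's public endpoints need no account, session, or
# age check. The account name is hypothetical; some instances disable
# unauthenticated API access, but by default many do not.
import requests

INSTANCE = "https://mastodon.social"

# Public timeline via the REST API: no token, no cookie.
statuses = requests.get(f"{INSTANCE}/api/v1/timelines/public",
                        params={"limit": 5}, timeout=10).json()
for status in statuses:
    print(status["account"]["acct"], status["url"])

# Any account's public posts are also available as a plain RSS feed.
rss = requests.get(f"{INSTANCE}/@someuser.rss", timeout=10)
print(rss.status_code, rss.headers.get("Content-Type"))
```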
There is no way to establish the age of any of these users. Each instance allowing pornography would need to shut down its public API, RSS feeds, and public web pages. But this would also not be sufficient, because Fediverse instances are federated using the ActivityPub protocol. This is the entire point of the Fediverse. People on Woof.group can follow people on Mastodon.social, who can favorite posts on Pixelfed.social, who can comment on photos posted on Woof.group. A 17-year-old could create an account on some other instance which does not perform age verification, and follow a porn account on mastodon.social from there. Or they could run their own instance—teens run all kinds of servers.
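Federation itself is just more unauthenticated HTTP, unless an instance enables something like Mastodon's authorized-fetch mode. Here is a hedged sketch of the discovery path any server, including one a teenager runs for themselves, can follow to pull a remote account's public posts; the account name is again hypothetical.

```python
# Hedged sketch of ActivityPub discovery: WebFinger finds the actor document,
# and the actor document links to an outbox of public posts. No step involves
# an age check. "someuser" is hypothetical; an instance in authorized-fetch
# mode would require a signed request instead of plain GETs.
import requests

AP_HEADERS = {"Accept": "application/activity+json"}

# 1. WebFinger: resolve user@domain to an ActivityPub actor URL.
wf = requests.get("https://mastodon.social/.well-known/webfinger",
                  params={"resource": "acct:someuser@mastodon.social"},
                  timeout=10).json()
actor_url = next(link["href"] for link in wf["links"]
                 if link.get("type") == "application/activity+json")

# 2. The actor document describes the account and links to its outbox.
actor = requests.get(actor_url, headers=AP_HEADERS, timeout=10).json()

# 3. The outbox is a paged collection of the account's public posts.
outbox = requests.get(actor["outbox"], headers=AP_HEADERS, timeout=10).json()
print(outbox.get("totalItems"), "public posts, visible to any server")
```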
There is no way for a Fediverse instance to tell whether all the users of another instance are over 18. Instances cannot simply declare this fact: Ofcom is clear that self-attestation is not sufficient for age assurance. Even if the other instance performed age assurance, it would still need to shut down its own public API, RSS feeds, and web pages.
When pornography is federated from one instance to another, it is not clear whether the OSA would consider the originating instance, the receiving instance, or both to be illegal. If services are expected to robustly deny access to children, they might be required to disable federation entirely.
Plans
The OSA's restrictions on user-to-user pornography take effect in July. However, many of the OSA's other provisions, like performing detailed risk assessments and submitting them to Ofcom, come into effect on March 16. We must decide on a course of action by that time.

Woof.group is still attempting to figure out whether it is considered a regulated service under the OSA. We have written letters to Ofcom attempting to ascertain this fact, and to determine answers to some of the questions above; we await Ofcom's response. We are also attending an upcoming discussion with Ofcom.
Woof.group has already invested upwards of thirty hours reading the law, Ofcom's guidance, and corresponding with Ofcom and UK residents in an attempt to understand the law and devise a plan for compliance. There are literally thousands of pages of densely interlinked PDF guidance to review. We remain, like many others, hopelessly overwhelmed by the vagueness and complexity of the act.
If Woof.group is in scope, the potential penalties would be devastating. The UK could impose a fine of up to 18 million pounds or imprisonment. While we doubt the UK would attempt extradition of a US citizen, Woof.group's owner @aphyr visits the UK every few years and prefers his time behind bars to be voluntary and short-lived.
Some sites, like Wikipedia, intend to ignore the act and see what happens. However, Woof.group lacks the financial resources and time to mount a legal defense. It seems most likely we would be forced to block any access from IP addresses associated with the UK. Even if we take no action, Ofcom might order Woof.group banned by UK network providers.
If you are based in the UK, we recommend you export your data by March 15th in case we block access or are banned. You may also wish to ask your representatives for help.
If you run a web site or other kind of online service, you may wish to write Ofcom's Online Safety Team at OSengagement@ofcom.org.uk.
Wikipedia will not perform Online Safety Bill age checks: it says age verification would contravene its commitment to collect minimal data from readers. (Chris Vallance & Tom Gerken, BBC News)