@TCatInReality

This is what I was trying to explain about the UK's #OnlineSafetyAct xxx

mastodon.social/@eff/114976736…


The UK’s Online Safety Act is being sold as a way to protect children. But instead, UK politicians are pushing sweeping censorship and forcing platforms to implement invasive age checks. This is not safety–it's surveillance.
eff.org/deeplinks/2025/08/bloc…


in reply to Totts

I've literally spent hours discussing #OnlineSafetyAct this week.

I am very familiar with the arguments, but remain unconvinced. Ultimately, the few anecdotal "harms" cited from the few weeks it's been active are all decisions made by tech companies through totally opaque processes - companies that then blame the government for their own interpretations and decisions.

Sure, there are risks of data abuse and hacking. But that was already a pervasive risk online; the OSA just adds another degree of it.

1/



in reply to TC Won't Give In To Lies

So, yes, let's work on better guidance - and demand better clarity from tech companies about why they age-gate things. And absolutely, we need better regulation and privacy protections around the age-gating process.

I find it amazing that the tech companies demanding unfettered AI flexibility aren't offering solutions to the issues raised about the OSA - instead they fight to repeal it. I suspect some are making outrageous decisions to feed public anger. The whole thing reeks of astroturfing to me.

2/

in reply to TC Won't Give In To Lies

But what I find totally ignored in the OSA discussion are the real world harms that have gone on for decades - because techbros profit from them.

To me, the fact that the OSA is finally a start at holding tech companies accountable for a safe product/service is the overriding benefit - and why I support #ImproveOSANotRepeal. FFS, the law is about stopping illegal content for everyone and blocking underage access to certain harmful material. Those are social goods, and there MUST be a way to do that.

3/

in reply to TC Won't Give In To Lies

To conclude, to me, #OSA is about basic product/service safety and we will find the right balance.

A key next step is better regs on age gating

But giving up on service safety is a horrible idea ... and it only benefits those who we *know* profit from unsafe services

(and we know from whistleblowers that they know their services are unsafe and just don't care).

/end

Note: I've now stated my view. Take what you will. I'm not prepared to debate every person who responds, but reply if you'd like.

#OSA
in reply to TC Won't Give In To Lies

Quick example to illustrate the point.

If a particular vape shop suddenly started IDing everyone who entered and said government law required it, we wouldn't immediately blame the government for denying people the right to vape and demand the law be changed.

We'd ask the vape shop what they are doing and why they think the law required it.

We should ask the same when Reddit (or anyone) age-gates a Gaza chat. And with dozens of Gaza subreddits, there may (or may not) be a good reason that particular one is gated.

in reply to TC Won't Give In To Lies

The difference is that we have been doing age verification in person for years, and we can do it without any terrible consequences. Unless you count local solutions that are trivial to bypass, we do not have a way to validate the age of someone using a computer remotely that does not involve uploading personal data to the internet. Uploading personal data to the internet makes internet use far more dangerous.

You can of course blame the capitalists for doing what they think will make them the most money, and I would agree with you, but any system that assumes they will act any other way is broken. One of the biggest problems with this law is that it makes it impossible for non-monetised sites to comply, driving many people to the megacorps that do this sort of extreme data collection.

in reply to DavyJones

I fear you missed my point.

I was not talking about the procedural risks of age verification (which I agree carry some risk and are an area I'd like better regulated).

My point was about not taking these restrictions at face value. Many tech companies have a clear incentive to make bad, headline-grabbing decisions and blame the government. So, as in my vape shop example, every reporter's first question should be "what is happening here that you think merits this legal restriction?"

in reply to TC Won't Give In To Lies

To take your vape shop scenario: if there were a law that said the vape shop had to do something, and a privacy policy that said they could sell your data or use it to target you with adverts, would you still need to ask them why they are doing it, or could you assume they are taking an opportunity to make more money?

That is capitalism, and that is the system we have chosen to run the world. Sure, we could change that, and I am all for that plan, but given that the rules we have created require companies to make as much money as possible, and this law gives them an excuse to ask for personal data, we should expect this behaviour. If you think asking companies why they are doing what they are doing is the best way to make a better world then go ahead, but as long as we have laws that incentivise this behaviour I do not see it changing. I think we need to change the laws.

in reply to TC Won't Give In To Lies

The law as written guarantees that companies will have to be over-cautious with age gating in order to avoid being found in violation of the law. Whether this is intentional or not on the part of lawmakers, that’s the practical effect.

More granular, privacy-preserving methods of age verification would be very expensive and would require a long-term, concerted effort to roll out. Lawmakers either didn’t think of this or didn’t care.

1/2

in reply to Misuse Case

In the end it doesn’t matter because it’s setting the groundwork for tracking people’s online activity anyway and broadening the scope of what’s considered “inappropriate” for young people to include information on reproductive health, queerness, and politics.

So “the corporations just don’t want to do this the right way” is a red herring. The OSA is not what it says on the tin after all.

2/2

in reply to Misuse Case

@MisuseCase
While I agree the Tories wrote #OSA rather badly, it is still a critical step toward accountability and safety. So, I disagree with most of what you said.

I'll highlight two:
1) The tech industry is notorious for its lack of caution. I find it absurd to say the law will force over-caution. As always, what matters with any law is the regulatory environment and the courts - both are completely untested with the OSA and likely to favour the tech companies.

Cont'd

in reply to TC Won't Give In To Lies

@MisuseCase
2) The notion that OSA is creating the framework to start "...tracking people’s online activity..." totally ignores the fact it is happening extensively already. And it is happening entirely free from democratic review in the hands of centi-billionaires.

IMO, we should take these very valid concerns about data privacy and tracking and improve OSA, not repeal it and return unfettered control to techbros.

You don't need to agree with me, just recognise there are other views.

in reply to TC Won't Give In To Lies

>> The notion that OSA is creating the framework to start "...tracking people’s online activity..." totally ignores the fact it is happening extensively already. And it is happening entirely free from democratic review in the hands of centi-billionaires.

Then if that’s the problem you want to solve, pass comprehensive privacy legislation that addresses the problem directly, rather than the OSA, which makes it considerably worse.

1/2

in reply to Misuse Case

This legislation does not rein in big tech or techbros, because they are the only entities with the resources to comply with the law at all. It makes it impossible for smaller, independent sites and services to operate.

Maybe this law was sold as reining in the power of big tech and tech bros, but it really doesn’t. It does nothing for privacy, nothing about the exploitation of children, and nothing to regulate big tech in a meaningful way, so it’s 0 for 3.

2/2

in reply to Misuse Case

@MisuseCase
Man, you have drunk the Kool-Aid. Read the Act (gov.uk/government/publications…)

Companies need to do a risk assessment. If their content is all ages, they just need a moderation process. No age-gating, no additional data collection.

The question is: why does so much of the internet fail the risk assessment and require age-gating? IRL isn't like that.

Fact is, it doesn't. It's bad-faith techbros telling you it does because otherwise their model won't work. Don't believe their hype.

in reply to TC Won't Give In To Lies

I have not “drunk any Kool-aid.” The U.K.’s government decided to brush off the feedback of every advocacy group and tech industry group under the Sun when it let the OSA go into effect. Whatever they claim the law is meant to do, or how it’s meant to work, they were warned “it won’t do that and it will be a disaster.” They went through with it and it doesn’t work and it’s a disaster.
Unknown parent


TC Won't Give In To Lies

@angiebaby
About 29 million people voted in the last GE. Surely you see how infinitesimal the risk is in allowing 16-17-year-olds to vote.

On the other hand, we know that certain types of exposure can be harmful to a young individual - varying by age.

So, there really is no equating the two.

(BTW, "sexualised nudity" is permitted on age 15 rated content. So maybe check your facts)