Online safety measures for children come into effect today

upday.com | 20 hours ago

Parents and children can expect to "experience a different internet for the first time" as sweeping new online safety measures came into effect on Friday. Technology Secretary Peter Kyle said he had "high expectations" for the changes, while the head of regulator Ofcom urged the public to "judge us by the impact we secure".

The new protections include mandatory age checks to prevent children accessing pornography and other harmful content. However, some campaigners have branded the measures a "sticking plaster", arguing they lack the ambition needed to protect young people online.

Platforms face hefty fines

Under the new rules, social media platforms must ensure their algorithms do not push harmful content about self-harm and eating disorders towards children. Companies that fail to comply face fines of up to £18 million or 10% of their worldwide revenue, whichever is greater.

Kyle told Sky News: "I have very high expectations of the change that children will experience. And let me just say this to parents and children, you will experience a different internet really, for the first time from today, moving forward, than you've had in the past."

Half a million children exposed

The measures, enforced by Ofcom under the Online Safety Act, require platforms hosting pornography or harmful content to implement age verification using methods such as facial age estimation or credit card checks. Ofcom chief executive Dame Melanie Dawes revealed that half a million eight to 14-year-olds encountered pornography online in the past month alone.

When BBC staff tested the new system and successfully signed up to a pornography site using just an email address, Dawes said sites would be "checking patterns of email use" behind the scenes to verify adult users. She told Radio 4's Today programme: "We've shown that we've got teeth and that we're prepared to use them at Ofcom."

Time limits under consideration

Dawes also backed Government plans to consider limits on children's social media usage. Earlier this week, Kyle said he wanted to tackle "compulsive behaviour", with ministers reportedly considering a two-hour daily limit and potential curfews.

"I think the Government is right to be opening up this question," Dawes told LBC. "I think we're all a bit addicted to our phones, adults and children, obviously particularly a concern for young people."

Mixed response from campaigners

Children's charities including the NSPCC and Barnardo's welcomed the new protections, alongside the Internet Watch Foundation. However, the Molly Rose Foundation, established after 14-year-old Molly Russell took her own life following exposure to harmful online content, criticised the measures.

Foundation chief executive Andy Burrows told Sky News: "We've always had a very simple test for the Online Safety Act, will it stop further young people like Molly from dying because of the harmful design of social media platforms? And regrettably, we just don't think it passes that test."

Major platforms under scrutiny

Ofcom has launched a monitoring programme targeting platforms where children spend most time, including Facebook, Instagram, TikTok, Roblox and YouTube. These sites must submit risk assessments by August 7 and detail their safety measures by September 30.

The regulator warned that safeguards must be "robust and meaningful" rather than treated as "an afterthought" by technology companies.

(PA/London)