Holywood News

Ofcom tells social media sites to protect children from adult content


Ofcom will unveil a code of practice this week to prevent children from accessing adult content on platforms such as X and Meta under the Online Safety Act, which has become a potential flashpoint in UK-US trade negotiations.

People familiar with the matter say the UK media regulator will require platforms to remove or “age-gate” adult content such as pornography, or find other ways to protect children from certain “legal but harmful” content.

The act is being introduced in phases after becoming law in 2023, marking one of the biggest shake-ups of how British people access social media sites including Instagram, X and Facebook.

Ofcom chief executive Melanie Dawes told the Financial Times last year that the industry faced “significant changes” in how it operates.

In practice, the codes mean changes to how algorithms serve up adult content, and a new era of full or stricter age checks to stop under-18s accessing websites and apps that carry any adult content.

Social media sites may be required to use strict age-verification tools, such as asking for credit card details for the first time, or technology including facial age estimation.

Tech groups will also have a number of other ways to prevent children from seeing adult material, including offering “clean” areas of their services, or removing pornography altogether, even on social media sites whose age limits are below 18.

Alongside pornography, under-18s should no longer encounter posts about suicide, self-harm and eating disorders, and should be protected from hateful, violent or abusive material.

The codes propose practical measures that platforms can take to fulfil their duties, including configuring algorithms to filter harmful content out of children’s social media feeds and search results.

Technology companies had to complete so-called children’s access assessments by last week to determine whether their services, or parts of them, are likely to be accessed by children. Facebook, Instagram, Snap, X and TikTok all allow users from the age of 13.

Technology groups must complete a separate assessment of the risks their services pose to children by the end of July, and then begin taking steps to mitigate those risks. Companies that breach the act face fines of up to £18 million or 10 per cent of global revenue.

Ofcom will also launch additional consultations on further measures, including the use of artificial intelligence to tackle illegal content, and the use of hash matching to prevent the sharing of non-consensual intimate images and terrorist content.

Hash matching, or hash scanning, compares content such as videos, images or text against a database of known illegal content.
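In its simplest form, the technique works by computing a digital fingerprint of an uploaded file and checking it against a database of fingerprints of known prohibited material. The sketch below is a minimal illustration using an ordinary cryptographic hash; the hash database and function names are hypothetical, and production systems typically rely on curated industry databases and perceptual hashes that tolerate resizing or re-encoding, which a plain cryptographic hash does not.

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited files.
# Real deployments use shared industry hash lists, not a local set.
KNOWN_HASHES = {
    hashlib.sha256(b"example prohibited file bytes").hexdigest(),
}

def matches_known_content(content: bytes) -> bool:
    """Return True if this exact content appears in the hash database."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_HASHES

# An upload is checked before it is published or shared.
print(matches_known_content(b"example prohibited file bytes"))  # True
print(matches_known_content(b"harmless holiday photo"))         # False
```

Because a cryptographic hash changes completely if even one byte differs, this exact-match approach only catches identical copies, which is why perceptual hashing is used in practice.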

The watchdog will also propose crisis response protocols for emergencies such as last summer’s riots.

Parts of the Online Safety Act are already in force, such as requiring social media companies, search engines and messaging apps to quickly remove illegal material and reduce the risk of such content appearing.

But online safety campaigners fear the US will demand changes to the legislation as part of any trade negotiations, given that the new rules primarily affect social media sites based in the US.

US officials asked about the act at a meeting with Ofcom last month, while Vice-President JD Vance raised alleged free speech infringements affecting US technology companies when Prime Minister Sir Keir Starmer visited the White House in February.

“I can’t imagine that the Keir Starmer government would trade away child safety for a trade deal, because doing so would serve no one,” said Baroness Beeban Kidron, a crossbench peer in the House of Lords and a digital rights campaigner in the UK.

Snap said it “supports the goals of the Online Safety Act and will continue working with Ofcom on its implementation”.

Meta said all teenagers using its platforms, including Instagram and Facebook, had been moved to new “teen accounts” to help comply with the new regulations, although those aged 17 and 18 can opt out of the restrictions.

X said it was “taking all necessary steps to ensure compliance with UK law”, and TikTok said it was also compliant.
