The plan by regulator Ofcom is one of more than 40 practical steps tech companies will need to implement under Britain’s Online Safety Act, which became law in October.
The platforms must also have robust age checks to prevent children from seeing harmful content linked to suicide, self-harm and pornography, the regulator said.
Ofcom Chief Executive Melanie Dawes said children’s experiences online had been blighted by harmful content they couldn’t avoid or control.
“In line with new online safety laws, our proposed Codes firmly place the responsibility for keeping children safer on tech firms,” she said.
“They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age.”
Social media companies use complex algorithms to prioritise content and keep users engaged. However, because these algorithms amplify similar content, children can be exposed to growing amounts of harmful material.
Technology Secretary Michelle Donelan said that introducing the kind of age checks young people experience in the real world, and reining in algorithms, would fundamentally change how children in Britain experience the online world.
“To platforms, my message is to engage with us and prepare,” she said. “Do not wait for enforcement and hefty fines — step up to meet your responsibilities and act now.”
Ofcom said it expected to publish its final Children’s Safety Codes of Practice within a year, following a consultation period that ends on July 17.
Once the codes are approved by parliament, the regulator said it would begin enforcement, including fines for non-compliance.