The UK government has brought its long-awaited Online Safety Bill (OSB) before Parliament, introducing new criminal sanctions for tech company executives and senior managers alongside further criminal offences.
As it currently stands, the OSB would impose a statutory “duty of care” on technology companies that host user-generated content or allow people to communicate, meaning they would be legally obliged to proactively identify, remove and limit the spread of both illegal and “legal but harmful” content.
Failure to do so could result in fines of up to 10% of their turnover, imposed by the online harms regulator, which was confirmed to be Ofcom in December 2020.
The government maintains that the OSB will safeguard freedom of expression online, increase the accountability of tech giants and prevent some of the worst abuses online, including racist hate crimes. It has also claimed the OSB will help to keep children safe online.
Alongside the bill’s introduction to Parliament on 17 March 2022, the government announced that it has significantly reduced the two-year grace period on criminal liability for tech company executives, meaning they could be prosecuted within two months of the OSB becoming law for failing to comply with information requests from Ofcom.
A number of new criminal offences have also been included to make senior managers liable for destroying evidence, for failing to attend interviews with Ofcom or providing false information in them, and for obstructing the regulator when it enters company offices for audits or inspections.
“The internet has transformed our lives for the better. It’s connected us and empowered us. But on the other side, tech firms haven’t been held to account when harm, abuse and criminal behaviour have run riot on their platforms. Instead, they have been left to mark their own homework,” said digital secretary Nadine Dorries.
“We don’t give it a second’s thought when we buckle our seat belts to protect ourselves when driving. Given all the risks online, it’s only sensible we ensure similar basic protections for the digital age. If we fail to act, we risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.”
The government previously introduced three new criminal offences to the OSB at the beginning of February 2022, and further expanded the number of “priority offences” that tech firms will have to proactively prevent on their services.
While terrorism and child sexual abuse were already included in the priority list, the government re-drafted it to include revenge porn, hate crime, fraud, the sale of illegal drugs or weapons, the promotion or facilitation of suicide, people smuggling, and sexual exploitation.
The government has also extended the scope of the OSB on a number of other occasions in 2022, adding measures to deal with anonymous abuse, obliging porn sites to verify their users are aged 18 or over, and placing a new legal duty on the largest social media firms to prevent fraudulent paid-for adverts from appearing on their services.
With the OSB now brought before Parliament, the government has said it will approve exactly what kinds of “legal but harmful” content tech companies must tackle.
Responses to the bill

Commenting on the OSB’s introduction to Parliament, Matthew Fell, chief policy director at the Confederation of British Industry (CBI), said that the goal of the legislation should be to make the UK an international leader, rather than an outlier, in shaping the future of the internet.
“However, the bill in its current form raises some red flags, including extending the scope to ‘legal but harmful’ content. Not only will this deter investment at a time when our country needs it most, but it will also fail to deliver on the aims of this legislation,” he said.
“This is an incredibly complex set of regulations and businesses will be combing through the detail over the coming days. Ensuring the bill is feasible for companies to implement is essential – they will work with policymakers to make that happen as the bill moves through Parliament.”
Geraint Lloyd-Taylor, a partner at law firm Lewis Silkin, added that many lawyers have found the amorphous concept of “harms” within it alarming: “The bill leaves a lot of unanswered questions, but it clearly poses a threat to our democratic freedoms, as it would introduce a new form of censorship (by requiring that platforms censor users) without clear boundaries and safeguards.”
Lloyd-Taylor added that while he understands frustration over how long it has taken government to bring the OSB before Parliament – a process which started with the publication of the Online Harms White Paper in April 2019 – “the stakes are high, and the government – and Parliament – must get this right”.
“It is not an easy task to tackle online harms without excessively impacting our fundamental rights and freedoms, such as freedom of expression and the right to privacy,” he added.
Executive director of the Open Rights Group (ORG), Jim Killock, further described the “legal but harmful” aspect of the OSB as a “censor’s charter”.
“Civil society groups have raised the warning, Parliament has raised the warning, the government’s own MPs have raised the warning, but the government has ignored them all,” he said. “Failure to remove it will ban Brits from doing normal things like making jokes, seeking help and engaging in healthy debate online.”
Killock added: “There are now lots of new and unworkable ideas tacked on at the last minute, making it a monstrous melange of fit-to-fail duties that will make minority groups less safe online.”
Andy Phippen, a fellow of BCS, The Chartered Institute for IT, and a specialist in ethics and digital rights at Bournemouth University, also questioned how effective the OSB would be at achieving the government’s stated goal of protecting children online.
“The rhetoric around the bill seems to be around making children safe, but the entire drive of the proposed legislation is tech sector regulation,” he said.
“When I talk to young people about online safety, they usually say they need supportive adults who understand the issues, and better education. None have ever demanded that big tech billionaires need to be brought to heel.
“I recently spoke to a group of young people who were clear that good online safety comes from good education and the opportunity to discuss and ask questions.”
Read more about online safety in the UK

- Firms working on the UK government’s Safety Tech Challenge have suggested that scanning content before encryption will help prevent the spread of child sexual abuse material – but privacy concerns remain.
- Many of the regulatory bodies overseeing algorithmic systems and the use of data in the UK economy will need to build up their digital skills, capacity and expertise as the influence of artificial intelligence and data increases, MPs have been told.
- Fact-checking experts tell House of Lords inquiry that upcoming Online Safety Bill should force internet companies to provide real-time information on suspected disinformation, and warn against over-reliance on AI-powered algorithms to moderate content.