The Online Safety Bill will introduce ground-breaking laws to protect some of the most vulnerable in our society online, particularly children. That said, I note the concerns raised around the scope of the Bill, and its application to smaller platforms.
Under the new legislation, in-scope companies will need to remove illegal content and limit its spread online. This includes illegal content which encourages or incites suicide, and all companies will be expected to take swift and effective action against such content. In addition, companies whose services have high-risk functionalities and the largest audiences will be required to take action on content which is legal but which may cause harm, such as material relating to self-harm or suicide. These companies will need to set out clearly in their terms and conditions what is acceptable on their services, and enforce those terms and conditions consistently and transparently.
The tiered approach to the regulations seeks to protect freedom of expression and mitigate the risk of disproportionate burdens on small businesses. The Government also believes it will ensure that companies with the largest online presence are held to account, addressing the mismatch between companies’ stated safety policies and many users’ experiences online.
I am encouraged that the Government is also taking measures to ensure that the criminal law is fit for purpose in relation to harmful and dangerous communications online. Social media companies have a responsibility to tackle 'legal but harmful' content, in categories set by the Government and approved by Parliament, such as material promoting self-harm, harassment and eating disorders.