Friday 06 October 2023
A new law to make social media firms more responsible for users’ safety needs to do more to protect those most vulnerable to harm, say researchers from Coventry University.
The Online Safety Bill – which includes new laws around pornography, illegal content on subjects such as suicide, and the information bereaved parents can access – has been passed by peers in the House of Lords and now awaits royal assent from King Charles.
Marcus Maloney and Sarah Kate Merry were part of a team of academics from Coventry University’s Research Centre for Postdigital Cultures that conducted research into the Bill’s effectiveness in making positive cultural change for online behaviours and influencing future policies around harms experienced by young people and adults.
Their report, which was based on workshops held with organisations including the Suzy Lamplugh Trust, the Organisation for the Review of Care and Health Apps, Men’s Health Forum and Samaritans, found significant gaps in the new Bill and made a number of recommendations which were put before various members of the House of Lords.
One of those recommendations was successfully incorporated into the Bill, which now specifically names women and girls as a group particularly at risk of online abuse. Coventry University’s researchers were one of several groups calling for this inclusion.
Four further recommendations were also put forward.
The university’s researchers are now preparing evidence for an inquiry into whether the Government will have the necessary means in place once the Bill becomes law.
The goals of the Online Safety Bill are wide-ranging, but ultimately it seeks to protect young people from seeing illegal and 'legal but harmful' content.
This will not be achieved solely through the Bill’s focus on stricter regulation and harsher penalties for platforms that fail to meet its requirements.
Marcus Maloney, Assistant Professor of Sociology in the Research Centre for Postdigital Cultures
More needs to be done to ensure ongoing work by platforms, including regular risk assessments and use of appropriate content moderation, as well as coordination between platforms and organisations working to protect those who are vulnerable to harm.
Dr Sarah Kate Merry, Research Fellow in the Research Centre for Postdigital Cultures
Find out more about the Research Centre for Postdigital Cultures.