In response to a series of racist riots in the UK, Ofcom has warned social media companies that they will be required to address disinformation and content that stirs up hatred or provokes violence.
Following the tragic stabbing of three girls in Southport on July 29, 2024, false rumors circulated on social media claiming the perpetrator was an asylum seeker of Muslim faith. This led to Islamophobic far-right riots in several English towns and cities targeting mosques, asylum seeker accommodations, and immigration centers.
Ofcom emphasized that combating illegal content online is a priority. The UK’s Online Safety Act (OSA) will impose new duties on tech firms to protect users from illegal content, including content that incites hatred or violence and, in some cases, disinformation. Once the act’s duties take effect, expected in late 2024, tech firms will have three months to assess and address the risk of illegal content on their platforms.
Ofcom’s enforcement powers will focus on platforms’ systems and processes rather than on specific pieces of content or individual accounts. The government denounced comments by Elon Musk, owner of the social media platform X, that it said risked inciting civil unrest and violence, and emphasized that platforms have a responsibility to promptly remove criminal content.
Some criminal offenses under the Online Safety Act are already in force, but it remains uncertain whether they apply to people using social media to incite violence. Despite these potential limitations in tackling misinformation and incitement to violence, the act aims to improve online safety and hold tech firms accountable for illegal content on their platforms.
Concerns have also been raised about Ofcom’s power to require encrypted services to use “accredited technology” to detect illegal content, which could compromise the privacy and security of encrypted communications. However, Ofcom’s chief executive has given assurances that the regulator’s focus is on addressing the root causes of online harm and setting new safety standards for online platforms.