Authors:
Professor Lorna Woods OBE, University of Essex; William Perrin OBE, Trustee, Carnegie UK; and Maeve Walsh, Associate, Carnegie UK
Year: 2023

As part of our Tackling Online Harm programme, Carnegie UK has worked with various civil society organisations, international bodies and academics in recent years on codes of practice to address specific aspects of online harm, such as hate crime and violence against women and girls. These codes have all been underpinned by the systemic approach to regulation that we developed in 2018 as a way to reduce harm arising from social media while recognising international human rights. 

We have now developed this work further in this model code, which has many similarities with the UN Guiding Principles on Business and Human Rights and the OECD Guidelines for Multinational Enterprises. It provides a common framework for a company approach to risk assessment and mitigation, which could be deployed across multiple content domains and jurisdictions.

The model code is based on a four-stage information flow model, which reflects the role of platforms in creating and influencing the flow of content from their users. The four stages comprise: access to the service and content creation; discovery and navigation; user response tools; and platform response. As such, the model provides significant flexibility: a company can develop and apply the framework within its own context; it is future-proofed; and it allows for modular development, depending on the service provider and the requirements of the local jurisdiction.