Carnegie UK Trust has joined forces with other organisations with an interest in digital policy and online harms to call for the government to ensure that their proposed Online Harms Bill covers democratic harms.
Following on from an earlier joint statement on the issue at the time of the government’s consultation, we have again worked with Demos, Doteveryone, the Jo Cox Foundation, the Institute for Strategic Dialogue, Glitch and the Fawcett Society to co-produce a briefing note aimed at Parliamentarians, policymakers and others with an interest in online harms.
The government recently published its “initial” response to the Online Harms White Paper which, as we noted at the time, included many welcome clarifications of their proposed approach, including the systemic, risk-based nature of the statutory duty of care and the measures by which this approach would protect freedom of expression and other fundamental rights. Both of these clarifications align closely with our detailed work on a duty of care.
However, there is little indication in this response that the scope of harms covered by the proposed duty of care will extend to democratic harms, which are likely to be exacerbated by many design decisions for which the platforms should be held accountable. These include:
- disinformation – whether intended to subvert democratic or electoral processes or to spread false or misleading content, such as anti-vaccination material;
- manipulation of the information environment, eg through algorithmic design, data collection or attention-grabbing tools, in order to promote and spread such content; and
- the creation of an environment where abuse or intimidation of public figures, particularly women or minorities, can lead to a silencing effect or their withdrawal from democratic participation.
We have set out in our joint briefing note some of the responses that we believe are needed to address these harms, including greater transparency, stronger redress systems and greater protection for vulnerable groups. Similar considerations also apply to another area of harm that is conspicuous by its absence from the government proposals: economic harms, whether the proliferation of scams, the promotion and sale of illegal or unsafe products, or other fraudulent activity online. We are working with a number of organisations interested in this area to make the case for their inclusion in the legislation too.
The lead on both these areas of harm does not sit in the Department for Digital, Culture, Media and Sport (DCMS), which is developing this legislation. This split in Ministerial and policy responsibility creates the ideal conditions for action on them to fall through the cracks, whether deliberately or otherwise. A Cabinet Office-led consultation on electoral integrity, including a number of proposals with a digital or online dimension, is long overdue. Yet, despite forensic investigations by the DCMS Select Committee in the last session and by Lord Puttnam’s ongoing inquiry into Democracy and Digital Technologies, the DCMS-led online harms consultation response referred back to work on “electoral integrity and related online transparency issues being taken forward as part of the Defending Democracy programme together with the Cabinet Office”. It is all too easy in such a broad and complex policy area for government departments to point the finger at one another while none of them takes the lead. We hope that our joint work with our co-signatories to the democratic harms briefing, along with our allies pressing for action on economic harms, will prevent such a missed opportunity in the development of robust and effective online harms legislation.