Should social network platforms enforce human rights? IGF 2014 Dynamic Coalition on Platform Responsibility

At the 2014 Internet Governance Forum, the Dynamic Coalition on Platform Responsibility met to discuss the role of private intermediaries in enforcing social norms and law.

One of the most interesting points made by Rebecca MacKinnon was that corporations have some (limited) responsibility to protect human rights (see further UN, ‘The Corporate Responsibility to Respect Human Rights’ (2012)).

The UN Guiding Principles on Business and Human Rights explain that:

The responsibility to respect human rights is a global standard of expected conduct for all business enterprises wherever they operate. It exists independently of States’ abilities and/or willingness to fulfil their own human rights obligations, and does not diminish those obligations. And it exists over and above compliance with national laws and regulations protecting human rights.
Addressing adverse human rights impacts requires taking adequate measures for their prevention, mitigation and, where appropriate, remediation.

The obligation is pretty light, but it provides an interesting lens for many current regulatory questions. Whether we’re talking about the ‘Right to be Forgotten’ or responses to hate speech or revenge porn, it becomes very interesting to decouple the obligations of platforms from their potential liability.

The Dynamic Coalition is trying to work through some of these issues. The Coalition’s first steps will be to assess the compatibility of platform Terms of Service with human rights standards and to evaluate the due diligence processes that have emerged for enforcing those standards. The group expects to provide a preliminary report by the end of the year – you can get involved by joining the mailing list.

You can watch a video archive of the event below – or see the transcript here.