Promoting Platform Responsibility for Content Management

4 September 2014 - A workshop held in Istanbul, Turkey

Brief substantive summary of the workshop and presentation of the main issues that were raised during the discussions

Nicolo Zingales set the stage with two comments concerning the difference and relationship between “platform responsibility” and “internet intermediary liability” (IIL). Platform responsibility should be seen as a concept above and beyond IIL: in contrast with IIL, it is not legally enforceable, but meets a social expectation placed on the intermediary (for instance, of proactive enforcement in a particular area of law; of creating a balanced framework for adjudication, making available the defenses provided by law; or simply of ensuring the availability of effective remedies for violations of fundamental rights). The two concepts can, however, be related, since due diligence obligations can be part of IIL legislation as conditions for safe harbors (for example, through the adoption of a code of conduct); on the other hand, absent or overly vague IIL rules may lead to the development of alternative, private “legal systems” with not necessarily responsible Terms of Service (ToS).

Panelists' presentations provided a snapshot of their positions to capture the main issues at stake, following which the floor was opened for Q&A.

Robin Gross began by stressing that it is important not to ask platforms to monitor legality, because doing so would have detrimental effects on innovation and freedom of expression. She also emphasized that platforms are responsible both for their customers and to their customers. On the one hand, the customer has a responsibility not to violate copyright law, and platforms should not encourage illegal behavior, or knowingly and materially profit from it; on the other hand, the law should not make it hard for customers to act legitimately, because that encourages piracy.

Konstantinos Komaitis reminded participants that the Internet has assigned roles and responsibilities to each stakeholder, which should be respected as much as possible in order not to upset the established dynamics. We should not over-regulate, for this may hinder innovation; however, it is also important that we prevent the imposition of top-down solutions by online platforms, and that we try to involve users as part of the social contract.

Paolo Lanteri pointed out that WIPO has been using the term responsibility since 2008/09, referring to something broader than liability. WIPO could be the right forum to discuss at least some of the issues at stake; however, it has a limited mandate. International treaties provide little guidance on the responsibility of intermediaries. One aspect which could be seen as relevant is article 8 of the WCT, which establishes that the provision of physical facilities does not amount to communication. Another is article 14 of the WCT, requiring member States to have effective enforcement measures, including expeditious remedies to prevent infringements and remedies which constitute a deterrent to further infringements.
As early as 2003, the Guide to WIPO Treaties discussed the need for immunities for intermediaries as well as the need to promote cooperation among players (with marketplace solutions to be preferred). However, the limitations and exceptions (including those for access to knowledge, education and research) are a key part of the picture. Combining voluntary agreements with technology has proved to be a successful approach in areas such as access to books for the blind and global data management. There is also a growing interest, in the context of the Advisory Committee on Enforcement, in non-punitive measures, including educational notices and ADR mechanisms.

Marco Pancini described Google’s focus on two important principles: first, the free flow of information; second, the need to be a responsible player. Technology can help find this balance, because it makes it simple to monitor, object, and enforce. However, we are most frequently talking about voluntary measures, so we need a multistakeholder approach in order to coordinate them. The E-commerce Directive can be seen as a best practice; at the same time, discussion is needed on how to improve it: for example, on voluntary measures such as the recently launched “follow the money” approach. How can we ensure effectiveness while respecting the rule of law?

Janine Moolman drew attention to the serious threats that women receive online and the chilling effects these threats have on speech. In addition, they have a number of significant adverse effects on women, such as loss of income, limited mobility, long-term psychological harm through revictimisation and, in some cases, even death through suicide or at the hands of their abusers. It is therefore extremely important that conversations about intermediary liability consider these situations. Women who experience these violations most often have very little recourse; for this reason, APC has just completed a research project that looks at the domestic legal remedies and the remedies available through internet intermediaries for women who experience online violence. APC’s study covered the following companies: platforms such as Facebook, Twitter, Google+, Youtube, Instagram and WordPress; pornography websites such as XVideos and Youporn; internet portals such as Google-Colombia and Yahoo-Philippines; and telephone, mobile and internet service providers such as Telecom in Bosnia and Herzegovina, Claro and ETB in Colombia, Airtel in DRC, and others including national telecom services in Kenya, Pakistan, Mexico and the Philippines.

In relation to social platforms, such as Facebook, Google+, Twitter, Orkut and others, the following specific instances of harassment, bullying, threats and other kinds of violence, both psychological and extending sometimes to physical harm, have been found:
- Creation of imposter profiles of women; often to discredit, defame and damage their reputations.
- Spreading private and/or sexually explicit photos/videos; often with intent to harm, and accompanied by blackmail.
- Pages, comments, posts, targeting women with gender-based hate (misogynistic slurs, death threats, threats of sexual violence, etc.)
- Publishing personal identifying information about these women, including names, addresses, phone numbers and email addresses, without their consent.

In addition, the study found that the ToS of most companies refer to illegal uses involving violation of copyright, financial fraud, extortion and child pornography, but do not specifically mention any human rights abuses, especially those based on gender, sexuality or related issues. Even where there is a clear definition of unacceptable use in the ToS, it is usually not accompanied by a clear, easy-to-access and transparent procedure to deal with complaints of violations by any of the users vis-a-vis others on the website or platform.

She concluded that we have an opportunity to change the conversation: to shift the burden of dealing with technology-related harassment away from the individuals facing it, and from the States or national governments where the companies’ operations are located, onto the companies themselves. The recommendations for companies to adopt a “responsible” corporate policy on violence against women include the following:

1) Companies, especially multinational companies, providing internet/telephony services, social platforms and pornography sharing platforms, must abide by the United Nations Guiding Principles on Business and Human Rights (and the related “Protect, Respect and Remedy” framework). While under the guidelines it is the duty of the State or government to ensure this, the company should make a formal and authentic commitment to upholding human rights, and to taking action to prevent and address violence against women and transgender people.

2) The privacy policy of the company should provide adequate protection to those who are vulnerable to forms of violence and harassment via the company’s services. This privacy policy should protect against technology related violence, and not be a veil behind which the aggressor’s acts of harassment and violence are protected.

3) Companies should invest in capacity building of their staff and personnel in customer service departments, and should also have separate procedures for dealing with complaints of technology-related violence against women and others who are vulnerable. Most of those employed in these departments are not aware of how technology can be used for violence and harassment, nor are they aware of the duties and obligations of the company to protect its users against such violence under national law, international guidelines and its own terms of service.

4) Agreements between the user and the company, such as terms of service and privacy policies, should make explicit reference to technology-related violence against women (not only a general prohibition of illegal use), and they should provide a transparent and easily accessible process for filing complaints in which the privacy of the complainant is protected.

Nicolas Suzor pointed out that there is insufficient discussion of the various tradeoffs, and insufficient focus on empirics, in this debate. While the rule of law requires transparency, public rules, predictability and enforcement in accordance with due process, the problem is that market deals often lack this legitimacy. He suggested that one way to increase legitimacy is to have representation of users at the negotiating table, and that we need transparency in regulation by private actors. In addition, it needs to be borne in mind that the problems of intermediary liability depend on the problems of the substantive law which intermediaries are asked to enforce.

Joy Liddicoat summarized the discussion, emphasizing the important questions of equality between rights, the need to take into account innovation spillovers and side-effects, and the potential for multistakeholder cooperation in this space.

Conclusions drawn from the workshop and possible follow up actions

Multistakeholder cooperation can work in defining roles and responsibilities in online content management. It was repeatedly suggested that multistakeholder cooperation should be encouraged in order to bridge different communities (across different areas of law) and better involve users in the discussion. One way to tackle this important work is through the Dynamic Coalition on Platform Responsibility, which aims to produce model contractual clauses for online platforms to ensure compliance with human rights. Transparency and the involvement of users and civil society in the discussion were recognized by workshop participants as important components for harnessing the power of multistakeholder cooperation, aligning the behavior of online platforms with users' needs, generating trust and steering market forces in the right direction.

The discussion should start from a recognition of the UN Guiding Principles and the UN “Ruggie framework”, and be furthered by the development of principles of due diligence, accountability, communication and user empowerment. This discussion might also be facilitated by identifying cases of clearly illegal activity, where the conflict over legitimacy and due process can be resolved easily. It was suggested that we can think of due process as a pyramid, which builds in safeguards commensurate to the substantive issues at stake. It was also mentioned that a binary approach (due process vs. online harassment; free speech vs. online harassment) is simplistic, and that the tension can be accommodated through a nuanced definition of protected rights.

One of the issues highlighted for further research concerns the role of financial intermediaries, and the due process guarantees that apply when these entities are involved as part of strategies of online content management.

Estimation of the overall number of participants present at the workshop


Estimation of the overall number of women present at the workshop

About half of the participants were women.

Extent to which the workshop discussed gender equality and/or women’s empowerment

It was raised by one or more speakers as an important aspect of the workshop’s theme.

A brief summary of the discussions, in case the workshop addressed issues related to gender equality and/or women’s empowerment

As stated above in the brief substantive summary, gender issues were raised by Janine Moolman and were considered an integral part of the discussion: the question was mainly the lack of specific rules on online platforms regarding behaviour against women online, and the lack of clear and effective mechanisms for complaining about online harassment and having complaints addressed by the platform.

Reported by

No information provided