[Dcpr] Due Diligence Recommendations

Pär Lannerö par.lannero at metamatrix.se
Fri Jan 9 18:48:29 EST 2015


Thanks again for your efforts!

Please accept my small contribution in the form of comments in the attached copy of your document.

Best regards
Pär Lannerö, CommonTerms




From: dcpr-bounces at lists.platformresponsibility.info [mailto:dcpr-bounces at lists.platformresponsibility.info] On behalf of LB at lucabelli.net
Sent: 26 December 2014 16:32
To: dcpr at lists.platformresponsibility.info
Subject: [Dcpr] Due Diligence Recommendations

Dear all,

Thanks a lot for your very constructive inputs and critiques with regard to the original draft of the Due Diligence Recommendations. We have significantly amended the original draft, incorporating your comments and suggestions. You may find the revised version both attached and below (the version below does not have footnotes).

This is by no means the final version yet, and you are all encouraged to share your comments on this second draft so that we can further enhance it. Ideally, the final version of the Due Diligence Recommendations should be released on 31 January.

The second comment period is now open and will last until 20 January. Although we have greatly appreciated your comments via personal emails, we kindly suggest that you circulate your future comments on the mailing list to trigger discussion amongst the DC PR members. Alternatively, you may also comment on the Due Diligence Recommendations using this Google Doc: https://docs.google.com/document/d/1N0_aVbWSNt-S12O0KLebqYVxM6gj5Uz3qqHDUkmlPc0/edit?pli=1#

Lastly, you are all encouraged to share any relevant info (articles, posts, events, etc.) related to online platforms’ responsibility to respect human rights, using this mailing list.

As this year is ending, we wish you, your families and friends an excellent year ahead.
All the best,

Luca, Nicolo and Primavera


DRAFT

Dynamic Coalition on Platform Responsibility
________________________________
Due Diligence Recommendations
on Terms of Service and Human Rights
________________________________

Introduction
The following recommendations aim to foster online platforms’ responsibility to respect human rights by providing guidance on the adoption of “responsible” terms of service. Besides identifying minimum standards for the respect of human rights by platform operators (standards that “shall” be met), these recommendations suggest best practices (which “should” be followed) for the most “responsible” adherence to human rights principles in the drafting of terms of service.

I. Background

Aside from cases of complicity in human rights violations rising to the level of international crimes over which the International Criminal Court has jurisdiction, private entities cannot be held liable under international law for human rights violations, as they are not parties to human rights treaties. Yet, respect for human rights undoubtedly represents an important factor to take into account when assessing the activities of corporations, from the perspective of a variety of stakeholders including governments, investors and, increasingly, consumers.

This is especially relevant in the context of online platforms designed to serve the needs of a global community, which are forced to satisfy different, often conflicting legal requirements across the various jurisdictions where they operate. Thus, in light of the important role that online platforms play in shaping a global information society and the significant impact they have on the exercise of the rights of Internet users, a moral and social expectation has formed that such entities behave “responsibly”, ensuring respect for the rule of law and the human rights of Internet users across the globe.

The existence of a responsibility of private-sector actors to respect human rights, which was recently affirmed in the UN Guiding Principles on Business and Human Rights and unanimously endorsed by the UN Human Rights Council, is grounded upon the tripartite framework developed by the UN Special Representative on Business and Human Rights, according to which States are the primary duty bearers in securing the protection of human rights, corporations have the responsibility to ensure respect for human rights, and both are joint duty holders in providing effective remedies against human rights violations.

As part of this responsibility, corporations are expected to: (1) make a policy commitment to the respect of human rights; (2) adopt a human rights due-diligence process to identify, prevent, mitigate and account for how they address their impacts on human rights; and (3) have in place processes to enable the remediation of any adverse human rights impacts they cause or to which they contribute.

These recommendations focus on one of the most concrete and tangible means for online platforms to bring that responsibility to bear: the contractual agreement to which Internet users are required to adhere in order to utilise their services (the so-called “Terms of Service”), thus becoming platform users. Specifically, the recommendations constitute an attempt to define “due diligence” standards for online platforms with regard to four essential components: privacy, freedom of expression, due process and the protection of children and young people. In doing so, they aim to provide a benchmark for the respect of human rights in the private governance sphere, both in relation to a platform’s own conduct and with regard to the scrutiny of the governmental requests that platforms receive. As recently stressed by the Council of Europe’s Commissioner for Human Rights, guidance on these matters is particularly important due to the current lack of clear standards. This applies a fortiori in the context of online platforms, given the crucial role that these entities play in ensuring practical compliance with fundamental rights on the Internet.

II. Privacy

The first section of these recommendations provides guidance on the rules that online platform operators should adopt in order to protect their users against any unnecessary or unreasonable collection, use and disclosure of personal data.

A. Data Collection

Platform operators should limit the collection of personal information to what is directly relevant and necessary to accomplish a specified purpose. They shall also obtain consent for every type of information collected (by category), rather than through a single general-purpose consent form. If consent is withdrawn, the platform is no longer entitled to process such data. Although withdrawal is not retroactive, i.e. it cannot invalidate the data processing that took place in the period during which the data was collected legitimately, it should, in principle, prevent any further processing of the individual’s data by the controller.
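
The per-category consent and withdrawal mechanics described above can be illustrated with a minimal sketch in Python; all class and field names here are hypothetical, not taken from the draft. Note how withdrawal blocks further processing without invalidating processing that took place while consent was in force:

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class ConsentRecord:
        """Consent granted for one data category (e.g. "location")."""
        category: str
        granted_at: datetime
        withdrawn_at: Optional[datetime] = None

        def permits_processing(self, at: datetime) -> bool:
            # Processing is lawful only between grant and withdrawal;
            # past processing within that window is unaffected.
            if at < self.granted_at:
                return False
            return self.withdrawn_at is None or at < self.withdrawn_at

    class ConsentLedger:
        """One record per category, rather than a single blanket consent."""
        def __init__(self) -> None:
            self._records: dict[str, ConsentRecord] = {}

        def grant(self, category: str) -> None:
            self._records[category] = ConsentRecord(category, datetime.utcnow())

        def withdraw(self, category: str) -> None:
            record = self._records.get(category)
            if record and record.withdrawn_at is None:
                record.withdrawn_at = datetime.utcnow()

        def may_process(self, category: str, at: datetime) -> bool:
            record = self._records.get(category)
            return record is not None and record.permits_processing(at)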

In principle, platform operators should also refrain from collecting data by automatically scanning content privately shared by their users. Admissible derogations to this principle include the need to fight against unsolicited communications (spam) and to ensure network security, and should not extend to commercial or advertising purposes in the absence of specific and express platform-user consent.

Platform operators shall always offer their users the possibility to opt out of the tracking of their behaviour, both by the platform within other services and by other services within the platform.

In order to facilitate user oversight of the application of these principles, platform operators should allow their users to view, copy and modify the personal information they have made available to the platform, and are encouraged to do so by enabling the download of a copy of their personal data in an interoperable format.
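
As a purely illustrative sketch of the “interoperable format” practice above, a platform might let a user download their data as plain JSON, which any other service can parse; the structure and field names are hypothetical:

    import json

    # Hypothetical export of the data a user has made available to the
    # platform, serialised in an open, machine-readable format (JSON).
    profile = {
        "name": "Example User",
        "contacts": ["alice@example.org"],
        "posts": [{"created": "2014-12-26T16:32:00Z", "text": "Hello"}],
    }

    with open("my_platform_data.json", "w", encoding="utf-8") as f:
        json.dump(profile, f, indent=2, ensure_ascii=False)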


B. Data Retention

Platform operators should clearly communicate in their terms of service whether, and for how long, they store any personal data. As a general rule, any retention beyond 180 days should be specifically justified by a function of the platform or by requirements imposed by a “legitimate law”.
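
The 180-day default can be expressed as a simple retention check. This is a sketch under the assumptions stated above (retention past the window must be justified by a platform function or a legitimate law), with hypothetical names:

    from datetime import datetime, timedelta
    from typing import Optional

    RETENTION_WINDOW = timedelta(days=180)

    def must_delete(collected_at: datetime,
                    justification: Optional[str]) -> bool:
        """Flag data for deletion once the 180-day window has elapsed,
        unless a documented justification applies."""
        expired = datetime.utcnow() - collected_at > RETENTION_WINDOW
        return expired and justification is None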

C. Data Aggregation

Aggregation of platform users’ data should only be done subject to express consent. Aggregation of data across multiple services or devices requires extra diligence on the part of the data controller and processor, since it might result in data being processed beyond the original purpose for which it was collected. Although this does not prevent the implementation of cross-device functionalities, it is necessary to ensure that platform users properly understand the scope of the consent they have given.

D. Data Use

Platforms shall obtain consent in order to use personal data (including platform users’ contacts and recipients), unless such use is prescribed by a legitimate law. It is also recommended that such consent be required for platforms to make personal data available to the public through search engines, and that platforms give their users the option to opt out.

The requirement of consent also applies to personal data over which the platform acquires a right of use; as a general rule, such use should never be broader than the original purpose for which the personal data was shared. Oftentimes, the use of personal data is instrumental to the improvement of existing services, or to the development of new services and functionalities. Yet, a broad and open-ended permission to use platform users’ personal data for “future services” can lead to abuses, in particular by making it possible for the platform to offer personalised services on the basis of the information provided and to automatically enroll users into services or functionalities that they did not intend to receive at the time of registration. This conflicts with users’ right to informational self-determination, including the right not to be subject to any decision based solely on automated processing of data, or taken without their views being considered. For this reason, it is recommended that platforms specify in their ToS that the purpose of processing of personal data is limited to the scope of existing services, and that the enrolment of platform users into any new service will require their acceptance of new ToS. Platform operators should also give users the opportunity to object to such usage and to demand the rectification of inaccurate data. Furthermore, platform users shall always be able to obtain information about any predictive or probabilistic techniques that have been used to build their profile, and about their underlying rationale.

Lastly, platform operators shall always permit their users to delete their accounts permanently. Likewise, if there is no other legal reason justifying further storage of the data, the data processor shall proceed with the permanent deletion of all or portions of the relevant data associated with the platform user’s account, within a time frame that is reasonable for its technical implementation.

E. Data Protection vis-à-vis Third Parties

Platforms play a crucial role in enforcing the protection offered by the legal system against interference with users’ right to privacy. Therefore, platform operators should provide effective remedies against the violation of internationally recognised human rights. For this reason, they should establish clear mechanisms for platform users to gain access to all of their personal data held by a third party, as well as to be informed of the actual usage thereof. Platform operators should also enable their users to report privacy-offending content and to submit takedown requests. They should also implement a system to prevent the impersonation of platform users by third parties, although exceptions would need to be made for the impersonation of public figures in ways which contribute to public debate in a democratic society.

A second set of concerns pertains to the possibility of preempting any interference with platform users’ personal data by preventing third-party access to platform users’ content and metadata. Firstly, platform operators should allow users to preserve their anonymity vis-à-vis third parties to the extent permitted by legitimate laws, both within the platform itself and within other websites when the platform is used as an identity service. Secondly, it is recommended that platforms enable end-to-end encryption of communications and other personal information, in the context of both storage and transmission. In that respect, best practice is for the decryption key to be retained by the platform user, except where the provider needs to hold the decryption key in order to provide the service and the platform user has given informed consent.
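
The “user-held key” best practice can be sketched as follows, using the third-party Python cryptography package. This is illustrative only; a real end-to-end design would also need key exchange between communicating users:

    from cryptography.fernet import Fernet

    # Generated and kept on the user's device: the key never reaches
    # the platform, which stores and relays only ciphertext.
    user_key = Fernet.generate_key()
    cipher = Fernet(user_key)

    # Encrypt client-side before handing the message to the platform.
    ciphertext = cipher.encrypt(b"private message")

    # Only the key holder can recover the plaintext.
    assert cipher.decrypt(ciphertext) == b"private message"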

As regards the handing over of platform users’ data upon governmental request, platform operators should specify that they execute such requests only in the presence of a valid form of legal process, and release a periodic transparency report providing, for each jurisdiction in which they operate, the number and type of such requests and the platform’s responses (in aggregate numbers).
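
A transparency report of the kind described above could be built by publishing only aggregate counts, never the details of individual requests; the following sketch uses hypothetical field names:

    from collections import Counter

    # Each entry describes one governmental request (illustrative data).
    requests = [
        {"jurisdiction": "US", "type": "user data", "response": "complied"},
        {"jurisdiction": "US", "type": "user data", "response": "rejected"},
        {"jurisdiction": "FR", "type": "takedown", "response": "complied"},
    ]

    # Aggregate per jurisdiction, request type and outcome.
    report = Counter(
        (r["jurisdiction"], r["type"], r["response"]) for r in requests
    )

    for (jurisdiction, kind, outcome), count in sorted(report.items()):
        print(f"{jurisdiction}\t{kind}\t{outcome}\t{count}")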

III. Due Process

Due process is a fundamental requirement of any legal system based on the rule of law. “Due” process refers to the non-derogability of certain procedures in situations which may adversely affect individuals within the legal system. This includes the application of minimum safeguards such as the clarity and predictability of the substantive law, the right to an effective remedy against any human rights violation, and the right of individuals to be heard before any potentially adverse decision is taken regarding them.

Due process has significant implications with regard to the potential amendment and termination of contractual agreements, as well as the adjudication of potential disputes. This section aims to give content to due diligence in this context.

A. Amendment and Termination of Contracts

Terms of Service should be written in plain language that is easy to understand. Wherever possible, platform operators should provide an accessible summary of the key provisions of the terms of service. Platform operators should give their users meaningful notice of any amendment of the ToS affecting the rights and obligations of the users. Meaningful notice should be provided in a way that enables platform users to clearly see, process and understand the changes. Contractual clauses that permit unilateral termination by platforms without appropriate grounds shall not be used.

In addition, platform operators should consider giving notice even of less significant changes, and should enable their users to access previous versions of the terms of service. Ideally, platform operators should enable their users to continue using the platform without having to accept new terms of service related to additional functionalities which have been added to the platform, unless this would impose significant costs and complexity on the operators. Meaningful notice should also be given in advance, prior to termination of the contract or services. Moreover, to reduce the imbalance between platform users and platform owners when it comes to litigation, it is recommended that the ToS be negotiated beforehand with consumer associations or other organisations representing Internet users. In order to prevent wrongful decisions, it is also recommended that platforms make the termination of a particular platform user’s account possible only upon repeated violations of the ToS.

B. Adjudication

Disputes can arise both between platform users and between a particular platform user and the platform operator. In both cases, platform operators should provide alternative dispute resolution systems to allow for quicker and potentially more granular solutions than litigation for the settling of disputes. However, in view of the fundamental importance of the right of access to court, alternative dispute resolution systems should not be presented as a replacement for regular court proceedings, but only as an addition to them. In particular, platform operators should not impose waivers of class-action rights or other hindrances to the right of effective access to justice, such as mandatory jurisdiction outside the place of residence of Internet users. Any dispute settlement mechanism should be clearly explained and offer the possibility of appealing against the final decision.

IV. Freedom of Expression

Freedom of expression is a fundamental right consisting of the freedom to receive and impart information in a lawful manner, including by way of association. In the online platform context, the effectiveness of this right can be seriously undermined by disproportionate monitoring of online speech and by repeated government blocking and takedowns. The following section provides guidance as to how platforms should handle such matters through their terms of service.

A. Degree of Monitoring

Although there are no rules to determine, in general terms, what kind of speech should or should not be allowed on private online platforms, certain platforms should be seen more as “public spaces” to the extent that they occupy an important role in the public sphere. These actors have assumed functions in the production and distribution of media services which, until recently, had been performed only (or mostly) by traditional media organisations. As a matter of fact, online platforms increasingly play an essential role as speech enablers and pathfinders to information, becoming instrumental both to the media’s outreach and to Internet users’ access to it.

Established principles and best practices can serve to identify certain red lines that should not be crossed. As a general rule, platform operators should not impose any restrictions on the kind of content that they host, with the exception of content that is harmful or explicitly forbidden under applicable legitimate laws (e.g. hate speech, child pornography and incitement to violence, as well as other kinds of undesirable content, such as unsolicited communications for direct marketing purposes or security threats), and only where restriction is necessary and proportionate to that purpose. It is of utmost importance that the rules imposing such restrictions are not formulated in such a way as to affect potentially legitimate content, as they would otherwise constitute a basis for censorship.

Similarly, although platforms can legitimately remove speech in order to enforce their terms of service, either of their own motion or upon complaint, such terms of service should be clear and transparent in their definition of the content that will be restricted within the platform, including the use of certain screen names. To this end, platforms can legitimately prohibit the use of the name, trademark or likeness of others. In addition, platform operators should always provide clear mechanisms to notify those platform users whose content has been removed and provide them with an opportunity to challenge and override illegitimate restrictions.

B. Government Blocking and Takedowns

Transparent procedures should be adopted for the handling and reporting of governmental requests for blocking and takedown, in a way that is consistent with internationally recognised laws and standards. Firstly, platform operators should execute such requests only where there is a legal basis for doing so and a valid judicial order. Secondly, platform operators should notify their users of such requests, ideally giving them an opportunity to reply and challenge the validity of such requests, unless specifically prohibited by legitimate law. Finally, as already mentioned in the context of governmental requests for data, platform operators should implement law enforcement guidelines and release periodic transparency reports.

V. Protection of Children and Young People

A special category of concerns arises in the case of children and young people, toward whom platform operators should exercise a higher level of care. Platform operators should adopt particular arrangements, beyond the mere warnings about inappropriate content and the age verification that legitimate law may impose for certain types of content.

Firstly, although terms of service should generally be drafted in a way that is comprehensible to all, those regulating platforms open to children and young people should include facilitated language or an educational video clip and, ideally, a set of standardised badges to make their basic rules comprehensible to all users regardless of their age and willingness to read the actual terms of use. Secondly, it is recommended that platforms provide measures that can be taken by children and young people to protect themselves while using the platform, such as a “safer navigation” mode. Thirdly, platform operators should consider providing specific mechanisms to ensure the removal or erasure of content created by children and young people.

As an element of media literacy, all platform users should be informed about their right to remove incorrect or excessive personal data.




Annex 1: Definitions

a) Platform:
For the purpose of these recommendations, platforms are understood as cloud-based, public-facing interfaces or “spaces” allowing users to impart and receive information or ideas according to the rules defined in a contractual agreement.

b) Terms of Service:
The concept of “terms of service” utilised here covers not only the contractual document available under the traditional heading of “terms of service” or “terms of use”, but also any other platform policy document (e.g. privacy policy or community guidelines) that is linked or referred to therein.

c) Function of the Platform:
The function that the community has attributed to the platform, on the basis of the legal, commercial and social expectations that the platform has generated. This should not be confused with a platform’s functionalities, which constitute merely one (albeit important) element in identifying its overall function(s).

d) Platform Operator:
Natural or legal person defining, and having the possibility to amend, the platform’s terms of service.

e) Platform User:
Natural or legal person entering into a contractual relationship defined by the platform’s terms of service.

f) Internet User:
Natural or legal person who uses an Internet access service and, in that capacity, has the freedom to impart and receive information. The Internet user may be the subscriber, or any person to whom the subscriber has granted the right to use the Internet access service s/he receives.

g) Data:
Content and/or personal information. Data can belong to both categories simultaneously.

h) Content:
Text, image, audio or video which results from the engagement of a particular platform user with the platform, even on a transient basis. This includes, for example, messages and queries typed by a platform user.

i) Personal Data/Personal Information:
Personal data is any information about an individual that can be used to distinguish or trace an individual’s identity, such as name, social security number, date and place of birth, etc. This is not intended to cover identification which can be accomplished via very sophisticated methods. This notion of personal data equates with that of Personally Identifiable Information (PII), defined as “any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual’s identity, such as name, social security number, date and place of birth, mother’s maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.”
j) Consent:
Consent means any freely given, specific and informed indication of the data subject’s wishes by which s/he signifies her/his agreement to personal data relating to her/himself being processed. To that end, every user shall be able to exercise a real choice, with no risk of deception, intimidation, coercion or significant negative consequences if he/she does not consent to data aggregation.

k) Express Consent:
Express consent is a type of consent which (in contrast with “implicit” or “implied” consent) requires an affirmative step in addition to the acceptance of the general ToS, such as clicking or ticking a specific box or accepting the terms and conditions of a separate document.

l) Privacy:
Privacy is an inalienable human right enshrined in Article 12 of the Universal Declaration of Human Rights, which establishes the right of everyone to be protected against arbitrary interference with their privacy, family, home or correspondence, and against attacks upon their honour and reputation. In the context of online platforms, this encompasses the ability of data subjects to determine the extent to which, and the purpose for which, their personal data is used by data controllers, including the conditions upon which such data can be made available to third parties (right to informational self-determination).

m) Freedom of Expression:
The right to freedom of expression, enshrined in Article 19 of the Universal Declaration of Human Rights, includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers. The right to freedom of opinion and expression is as much a fundamental right in itself as it is an “enabler” of other rights, including economic, social and cultural rights.

n) Hate Speech:
Although there is no universally accepted definition of “hate speech”, the term shall be understood as covering all forms of expression which spread, incite, promote or justify racial hatred, xenophobia, anti-Semitism or other forms of hatred based on intolerance, including intolerance expressed by aggressive nationalism and ethnocentrism, and discrimination on any grounds such as race, ethnicity, colour, sex, language, religion, political or other opinion, national or social origin, property, disability, birth, sexual orientation or other status. In this sense, “hate speech” covers comments which are necessarily directed against a person or a particular group of persons.

o) Due Process:
Due process is a concept referring to procedural rights which are essential for the respect of the rule of law, comprising: (1) the right to an effective remedy by a competent tribunal for any acts violating one’s fundamental rights granted by the law or the Constitution (Article 8 of the Universal Declaration of Human Rights); and (2) the right to an independent and impartial tribunal in the determination of one’s rights and obligations and of any criminal charge against him/her (Article 10 of the Universal Declaration of Human Rights). In the context of online platforms, the urgency and efficacy of the protection of these rights might require an expansion of their scope of application beyond the traditional notion of “tribunal”.

p) Legitimate Law:
Laws and regulations should be deemed legitimate when they respond to a pressing social need and, having regard to their tangible impact, can be considered proportionate to the aim pursued.
The concept of “legitimate law”, which is used to justify a potential restriction, shall be interpreted as referring to a law or regulation which does not manifestly fail the requirements for permissible restrictions identified for freedom of expression (and applicable, mutatis mutandis, to restrictions of other fundamental rights) by the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, specifically:
(a) It must be provided by law, which is clear and accessible to everyone (principles of predictability and transparency);
(b) It must pursue a legitimate purpose (principle of legitimacy); and
(c) It must be proven as necessary and the least restrictive means required to achieve the purported aim (principles of necessity and proportionality).
If it is manifest that the measure would not pass this three-pronged test, the platform operator should deny the request and, to the extent possible, challenge it before the relevant court.







-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.platformresponsibility.info/archive/dcpr/attachments/20150109/352408cf/attachment-0001.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: DRAFT Due Diligence Recommedations_pl.docx
Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Size: 52033 bytes
Desc: DRAFT Due Diligence Recommedations_pl.docx
URL: <http://lists.platformresponsibility.info/archive/dcpr/attachments/20150109/352408cf/attachment-0001.docx>

