[Dcpr] Request for Comments - Due Diligence Recommendations

Rebecca MacKinnon mackinnon at newamerica.org
Sun Nov 30 11:36:24 EST 2014


Hello everybody. The Ranking Digital Rights team is formulating some
relatively detailed comments, but I will not be in a position to finalize
and send them to the list until tomorrow or the next day. Since the Google
Doc appears to be read-only, I may make some specific comments as tracked
changes in a Word doc, working from the version I have.
Best,
Rebecca

On Tue, Nov 18, 2014 at 10:21 PM, Pär Lannerö <par.lannero at metamatrix.se>
wrote:

>  Dear Luca, dear all,
>
> Thanks for a potentially extremely valuable initiative!
>
> Having not attended any IGF meeting, I may be missing some of the
> background, but for what it's worth I have added some comments and a few
> minor edits to the draft document. See attachment. (I couldn't find a
> comment feature in the pad.)
>
> Best regards
>
> Pär Lannerö, CommonTerms
>
> *From:* dcpr-bounces at lists.platformresponsibility.info [mailto:
> dcpr-bounces at lists.platformresponsibility.info] *On behalf of* LB at lucabelli.net
> *Sent:* 4 November 2014 14:01
> *To:* dcpr at lists.platformresponsibility.info
> *Subject:* [Dcpr] Request for Comments - Due Diligence Recommendations
>
> Dear all,
>
> Please find below and in the attachment the DRAFT Due Diligence
> Recommendations. You are all encouraged to share your comments on this
> draft via our mailing list or by modifying the draft using this pad:
> https://pad.lqdn.fr/p/DC_PR_Due_Diligence_Recommendations
>
> The draft will be open for comments until 30 November. A second draft,
> compiling your comments, will be developed and shared for a second
> comment period on 15 December 2014.
>
> The final version of the Due Diligence Recommendations will be presented
> at the 2015 meeting of the DC PR, at the 10th IGF, and will be the basis
> for the elaboration of a set of model contractual provisions for the
> protection of platform users' rights (Platform Users' Protections, or PUPs).
> Thanks in advance for your comments on this draft.
>
> All the best,
> Luca, Nicolo and Primavera
>
> *Dynamic Coalition on Platform Responsibility*
>
> *DRAFT Due Diligence Recommendations*
>
> *Introduction*
>
> The following recommendations aim to foster online platforms'
> responsibility to protect and promote human rights by providing guidance
> on the adoption of "responsible" terms of service. In this context,
> platforms are understood as cloud-based, public-facing interfaces or
> "spaces" that allow users to impart and receive information or ideas
> according to the rules defined in a contractual agreement.
>
> Although international human rights law does not directly bind private
> companies or corporations, respect for human rights undoubtedly
> represents an important factor for a variety of stakeholders, including
> governments, investors and, increasingly, consumers, when assessing the
> activities of multinational companies. This is especially relevant in
> the context of online platforms designed to serve a global community of
> consumers. In light of the important role that such platforms play in
> shaping a global information society, with a potentially significant
> impact on the rights of Internet users, they are increasingly expected
> to behave responsibly in accordance with minimum human rights
> requirements. In particular, according to the UN Guiding Principles on
> Business and Human Rights, businesses have a duty to respect human
> rights and to collaborate with states in order to provide effective
> remedies against human rights violations.
>
> These guidelines constitute an attempt to define minimum "due diligence"
> standards for online platforms with regard to three essential components:
> privacy, freedom of expression and due process.
>
> *Due Diligence Recommendations*
>
> *1) Privacy*
>
> Privacy is an inalienable fundamental right. In the context of online
> platforms, the right to privacy implies users' right to be protected
> against unreasonable intrusions into their private communications, and
> their ability to determine the extent to which their personal data can
> be made available to third parties (the right to informational
> self-determination), be they public or private entities. In line with
> the Council of Europe's Convention for the Protection of Individuals
> with regard to Automatic Processing of Personal Data ("Convention 108"),
> "personal data" is defined as "any information relating to an identified
> or identifiable individual". A person is identifiable if additional
> information can be obtained without unreasonable effort, allowing for
> his/her identification by name.
>
> The following recommendations provide guidance on the rules that online
> platforms should adopt so that their users are protected against any
> unnecessary and unreasonable collection, use or disclosure of personal
> data.
>
> *1.1 Data Collection*
>
> Platforms should obtain informed consent prior to all data collection.
> Informed consent should be obtained for every type of information
> collected, rather than through a single general-purpose consent form. If
> consent is withdrawn, platforms should permanently delete all, or the
> relevant portions, of the data associated with the user's account.
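>
> As a purely illustrative sketch of this per-purpose approach (not part
> of the draft, with all names hypothetical), consent could be recorded
> per data category, with withdrawal triggering deletion of the matching
> data:
>
>     from dataclasses import dataclass, field
>     from datetime import datetime, timezone
>
>     @dataclass
>     class ConsentRecord:
>         purpose: str                  # e.g. "profile", "location"
>         granted_at: datetime
>         withdrawn_at: datetime | None = None
>
>     @dataclass
>     class UserAccount:
>         user_id: str
>         consents: dict[str, ConsentRecord] = field(default_factory=dict)
>         data: dict[str, object] = field(default_factory=dict)
>
>         def withdraw_consent(self, purpose: str) -> None:
>             record = self.consents.get(purpose)
>             if record is None or record.withdrawn_at is not None:
>                 return
>             record.withdrawn_at = datetime.now(timezone.utc)
>             # Withdrawal triggers permanent deletion of the data
>             # collected under that specific consent.
>             self.data.pop(purpose, None)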
>
> Platforms should also refrain from collecting data by automatically
> scanning private user content, except to the extent necessary to fight
> unsolicited communications (spam) or for network security reasons. In
> addition, platforms should always offer users the possibility to opt out
> of the tracking of their behavior on other websites.
>
> *1.2 Data Retention*
>
> Platforms should clearly communicate in their terms of service whether,
> and for how long, they store any data other than a user's IP address. As
> a general rule, any retention beyond 180 days should be justified by a
> specific function of the platform or by requirements imposed by law.
> Furthermore, platforms' terms of service should allow users to fully and
> permanently delete their account, thereby preventing any further use of
> their personal data.
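>
> As a minimal sketch of how such a retention rule might be applied in
> code (hypothetical names; the exception table is an assumed example,
> not taken from the draft):
>
>     from datetime import datetime, timedelta, timezone
>
>     # General rule: retention beyond 180 days must be justified by a
>     # specific platform function or by a legal requirement.
>     DEFAULT_RETENTION = timedelta(days=180)
>
>     # Hypothetical documented exceptions, each tied to a stated reason.
>     RETENTION_EXCEPTIONS = {
>         "billing_records": (timedelta(days=7 * 365), "tax law"),
>     }
>
>     def retention_allowed(category: str, stored_at: datetime) -> bool:
>         # stored_at is assumed to be a timezone-aware datetime.
>         limit, _reason = RETENTION_EXCEPTIONS.get(
>             category, (DEFAULT_RETENTION, "default rule"))
>         return datetime.now(timezone.utc) - stored_at <= limit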
>
> *1.3 Data Aggregation*
>
> Aggregation of users' data across multiple services or devices should
> only take place subject to explicit and informed consent. To this end,
> every user should be able to exercise a real choice, with no risk of
> deception, intimidation, coercion or significant negative consequences
> if he or she does not consent to data aggregation.
>
> *1.4 Data Use*
>
> Platforms should obtain free, informed and specific user consent in
> order to use personal data (including users' contacts and
> interlocutors), unless such use is specifically authorized by law.
> Oftentimes, the use of personal data is instrumental to the improvement
> of existing services or to the development of new services and
> functionalities. Yet a broad, open-ended permission to use platform
> users' personal data for "future services" can lead to abuses, in
> particular by making it possible for the platform to automatically
> enroll users into services or functionalities that they did not intend
> to receive at the time of registration, without adequate respect for
> their informational self-determination. For this reason, it is
> recommended that platforms specify in their terms of service that the
> use of personal data for "future services" is limited to the improvement
> of existing services.
>
> *1.5 Data protection vis-à-vis third parties*
>
> Platforms play a crucial role in enforcing the protection offered by the
> legal system against interference with users' informational
> self-determination. Platforms should draft their terms of service in
> accordance with the applicable law, so as to provide effective remedies
> against the violation of rights recognised by international human rights
> instruments. For this reason, they should establish clear mechanisms for
> users to report inappropriate content and submit takedown requests, and
> they should implement a system to prevent the impersonation of others or
> the unauthorized use of trademarks and goodwill in screen names.
> Furthermore, platforms should clearly define the circumstances in which
> they may act in response to user notifications, within a time frame
> adequate to the protection of the rights at stake.
>
> A second set of concerns pertains to the possibility of preempting any
> interference with users' personal data uploaded to the platform, by
> preventing access to users' content and metadata. Firstly, platforms
> should allow users to preserve their anonymity *vis-à-vis* third
> parties to the extent permitted by law, both within the platform itself
> and on other websites when the platform is used as an identity service.
> Secondly, it is recommended that platforms enable end-to-end encryption
> of communications and other personal information, in the context of both
> storage and transmission. Thirdly, platforms should obtain users'
> specific consent prior to making content available to third parties
> within the platform, in search engines or on other content websites. At
> a minimum, users should be offered the possibility to opt out of making
> their content accessible to third parties.
>
> As regards the handing over of users' data upon governmental request, it
> is recommended that platforms execute such requests only in the presence
> of a valid warrant or judicial order, and that they release a periodic
> transparency report on the number and type of such requests.
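>
> The reporting half of this recommendation amounts to a simple
> aggregation. A sketch under assumed names (the request categories are
> illustrative, not prescribed by the draft):
>
>     from collections import Counter
>
>     # Each entry: (request_type, had_valid_warrant_or_order)
>     requests = [
>         ("user_data", True),
>         ("user_data", False),
>         ("takedown", True),
>     ]
>
>     def transparency_report(entries):
>         # Count requests received by type, and how many were executed
>         # (only those backed by a valid warrant or judicial order).
>         received = Counter(kind for kind, _ in entries)
>         executed = Counter(kind for kind, valid in entries if valid)
>         return {kind: {"received": received[kind],
>                        "executed": executed[kind]}
>                 for kind in received}
>
>     print(transparency_report(requests))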
>
> *1.6 Protection of children and young people*
>
> Finally, a special category of concerns arises in the case of children
> and young people, toward whom platforms should exercise special care.
> These concerns require particular arrangements, beyond the warnings
> about inappropriate content and the age verification that may be imposed
> by law for certain types of content. Firstly, the terms of service of
> platforms open to children and young people should use plain language,
> so that their basic rules are comprehensible to all users regardless of
> age. Secondly, it is recommended that platforms provide measures that
> children and young people can take to protect themselves while using the
> platform. Thirdly, and crucially, platforms are encouraged to provide
> specific mechanisms to ensure the removal or erasure of content created
> by children and young people, regardless of whether this is specifically
> imposed by law.
>
> *2) Due Process*
>
> Due process is a fundamental requirement of any legal system based on
> the rule of law. Due process refers to the non-derogability of certain
> procedures in situations which may adversely affect individuals within
> the legal system. This includes the application of minimum safeguards
> such as the clarity and predictability of the substantive law, and
> users' right to be heard before any potentially adverse decision is
> taken against them. Due process has significant implications with regard
> to the amendment and termination of contractual agreements, as well as
> the adjudication of potential disputes.
>
> *2.1 Amendment and termination of contracts*
>
> The platform should give users meaningful notice of any amendment of the
> terms of service affecting the rights and obligations of its users. Best
> practice would also include giving notice of less significant changes,
> enabling users to access previous versions of the terms of service, and
> allowing them to continue using the platform without being required to
> accept the terms of service of additional functionalities added to the
> platform. Meaningful notice should also be given prior to termination of
> the contract or services. In order to prevent wrongful decisions, it is
> also recommended that platforms terminate a user's account only upon
> repeated violations of the terms of service.
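>
> As an illustration of the versioning and notice points above (a sketch
> under assumed names, not a prescribed implementation):
>
>     from dataclasses import dataclass
>     from datetime import datetime
>
>     @dataclass(frozen=True)
>     class TermsVersion:
>         number: int
>         effective_from: datetime
>         text: str
>
>     class TermsHistory:
>         # Every published version remains accessible to users.
>         def __init__(self):
>             self.versions: list[TermsVersion] = []
>
>         def publish(self, text: str, effective_from: datetime, notify):
>             version = TermsVersion(len(self.versions) + 1,
>                                    effective_from, text)
>             # Meaningful notice is given before the new terms apply.
>             notify(version)
>             self.versions.append(version)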
>
> *2.2 Adjudication*
>
> Disputes can arise both between users and between a particular user and
> the platform. In both cases, the immediacy of the platform enables
> quicker and potentially more granular ways of settling disputes than
> litigation. However, because of the fundamental importance of the right
> of access to court, platforms should generally not impose a particular
> dispute-settlement mechanism at the expense of regular court
> proceedings, but only offer it in addition to them. Any dispute
> settlement mechanism should be clearly explained and should offer the
> possibility of appealing against the final decision.
>
> *3) Freedom of Expression*
>
> Freedom of expression is a fundamental right consisting of the freedom
> to receive and impart information in a lawful manner, including by way
> of association. In the online platform context, the effectiveness of
> this right can be seriously undermined by disproportionate monitoring of
> online speech and by repeated government blocking and takedowns. For
> this reason, platforms play a crucial role as speech enablers and should
> provide a system that ensures minimum safeguards against unjustified
> restrictions of online speech.
>
> *3.1 Degree of monitoring*
>
> Although there are no rules determining, in general terms, what kind of
> speech should or should not be allowed in private spaces (such as online
> platforms), recognised principles can serve to identify certain red
> lines that should not be crossed. Firstly, platforms are encouraged to
> restrict unprotected speech, such as hate speech, child pornography and
> incitement to violence, as well as other kinds of undesirable content,
> such as unsolicited communications for direct marketing purposes,
> security threats, and any other content forbidden by the applicable law.
> Nevertheless, it is of the utmost importance that the rules imposing
> such restrictions are not formulated in such a way as to affect
> potentially legitimate content, as they would otherwise constitute a
> basis for censorship.
>
> Similarly, although platforms can legitimately remove speech in order to
> enforce their terms of service, either on their own motion or upon
> complaint, such terms of service should be clear and transparent in their
> definition of the content that will be restricted within the platform,
> including the use of certain screen names. To this end, platforms can
> legitimately prohibit the use of the name, trademark or likeness of others.
> In addition, platforms should always provide clear mechanisms to notify,
> challenge and override illegitimate restrictions.
>
> *3.2 Government blocking and takedowns*
>
> Transparent procedures should be adopted for the handling and reporting of
> governmental requests for blocking and takedown. Firstly, platforms should
> execute such requests only where there is a legal basis for doing so and a
> valid judicial order. Secondly, platforms should notify users of such
> requests, ideally giving them an opportunity to reply and challenge the
> validity of such requests, unless specifically prohibited by law. Finally,
> as already mentioned in the context of government requests for data,
> platforms should implement law enforcement guidelines and release periodic
> transparency reports.
>
> _______________________________________________
> DCPR mailing list
> DCPR at lists.platformresponsibility.info
> http://lists.platformresponsibility.info/listinfo/dcpr
>


--
Rebecca MacKinnon
Director, Ranking Digital Rights Project, New America Foundation
Author, Consent of the Networked
Co-founder, Global Voices
Twitter: @rmack
Office: +1-202-596-3343