[Dcpr] Due Diligence Recommendations

Jorge Morell Ramos mr.pequod at gmail.com
Fri Jan 30 07:25:06 EST 2015


Good morning,

Although a little late to the party, I think I'm still in time to add
some brief comments.

By the way, very interesting observations and notes so far.

Here are mine:

- In due course, I agree with Julia on the importance of giving examples and
templates on the subject. We definitely have to go beyond just
guidelines and good practices. I think it is something usually lacking
in these kinds of initiatives, and adding it could make a big
difference. I'm really looking forward to working on the standard icons. :D

- I agree with the document's proposal that services give
access to previous versions of their terms, as a way to monitor how they
change. In fact, I can offer data from the more than 6,000 terms of
service that I tracked in 2014: 56% of them did not inform the
user about changes in any way:
http://terminosycondiciones.es/2015/01/14/quien-cambio-mas-sus-terminos-y-condiciones-en-2014/

- Going back to the idea of creating standards, it could be interesting to
propose web structure standards for the way terms and conditions are hosted
and shown to the user. I can tell you first hand that right now there is no
model or norm for the way terms are presented on a website, and that makes
it much harder to track them in large numbers without constant human
intervention. If, for example, more websites adopted a model like
www.example/tos or www.example/privacy, monitoring amendments to the terms
would improve dramatically.
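
To make the idea concrete, here is a minimal sketch of such a monitor in
Python. The domain list and the "/tos" path convention are hypothetical
stand-ins for the proposed standard, not something websites implement today:

    # Poll a standardised ToS location and flag changes via content hashes.
    import hashlib
    import requests

    DOMAINS = ["example.com", "example.org"]  # hypothetical sites to watch
    last_seen = {}  # last known content hash per domain

    def check_tos(domain):
        url = "https://%s/tos" % domain  # the proposed standard path
        text = requests.get(url, timeout=10).text
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if domain in last_seen and last_seen[domain] != digest:
            print("%s: terms changed (new hash %s...)" % (domain, digest[:12]))
        last_seen[domain] = digest

    for d in DOMAINS:
        check_tos(d)

In practice each retrieved version would also be archived, which is what
would make the "access to previous versions" recommendation enforceable at
scale.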

- Finally, here is quite an interesting article about how the terms and
conditions of services like Twitter and Facebook affect freedom of
expression, and how the services' transparency reports do not report on
that fact:
http://motherboard.vice.com/read/how-twitter-and-facebook-censor-content-without-telling-anyone


And that's it! Keep up the good work and have a nice weekend. :-)

Best,

Jorge.


2015-01-20 12:15 GMT+01:00 N. Zingales <N.Zingales at uvt.nl>:

>
> Dear Julia,
>
> many thanks for your words and your thoughtful comments. Below is an attempt
> to address, or at least acknowledge, your concerns:
>
> 1) The objective of the document is to offer some guidance on “platform
> responsibility”. The document needs to have a certain degree of generality
> because it tries to create principles for different kinds of platforms,
> which might require different implementation depending on their needs and
> functionalities. Our hope is to develop the principles' more concrete
> implementation as a next step, launching the elaboration of model
> contractual provisions (for different kinds of platforms) based on these
> recommendations, as discussed at the IGF.
> In the document, we try to provide bright-line rules as much as
> possible (see e.g. the rule about the need to specifically justify data
> retention beyond 180 days, or other “best practices” that we refer to with
> “should”), but this could probably be repeated in a couple more instances
> (e.g. what constitutes meaningful notice and informed consent). So we will
> try to explore these issues further, or at least provide more
> useful examples, in the next draft.
>
> 2) There are currently 6 mandatory conditions: a) specify every type of
> information collected; b) obtain user consent before tracking (both within
> and outside the platform); c) enable users to get information about any
> predictive or probabilistic technique that may be used to build their
> profile, and its underlying rationale; d) permit users to delete their
> account permanently; e) if there is no further ground for processing,
> proceed with the permanent deletion of all or portions of the relevant data
> within the user account, within a time reasonable for its technical
> implementation; f) do not use clauses allowing unilateral termination
> (i.e. without notice) without appropriate grounds.
> In response to your other comments under (2):
> - tracking refers here to the collection and aggregation of data across
> different websites from browsing sessions with the aim of creating a unique
> profile: as such, it can occur either because the platform allows third
> parties to track the behavior of users within it, or because it has an
> established mechanism (e.g. the Facebook Like button or other social plugins)
> to track users on third parties’ websites. It is important to clarify (as
> we will do) that tracking requires prior consent, and our point about
> “opting out” was simply to promote the good practice of giving users an
> additional tool to protect their privacy.
> - right to be informed about the use of predictive or probabilistic
> techniques: this is a principle of Convention 108 which has not been
> sufficiently clarified so far. Obviously we cannot expect disclosure at a
> level of detail that reveals the “secret sauce” of the algorithm, but
> at least (1) the use of such techniques and (2) their basic rationale
> (e.g. Google’s PageRank) should be disclosed.
> - your points about clarifying minimum due process standards and
> strengthening children’s protection are well taken; we will work on that.
>
> 3) As mentioned above, we will try to give more examples (including what
> constitutes consent); but the aim of this document is not to provide a
> template for brief and concise ToS. That could hopefully be one of the next
> steps of the DCPR. In particular, what should be explored in the future is
> the possibility of defining visually intelligible references (e.g. standard
> “icons”) to be associated with different types of contractual clauses
> defining the use of data. “Commonterms” and “Privacy Icons” are already
> setting a very interesting example in that regard.
>
> 4) The impression of inconsistency is probably generated by the fact that
> we didn’t make sufficiently clear that where we require consent, it is
> sufficient that such consent is obtained in the terms of service through a
> general “I agree” to a general document (e.g. the privacy policy), while in
> two of the situations you pointed to (scanning of user data for advertising
> purposes, and data aggregation) we require an additional element
> (respectively: a best practice of not collecting those data, even in the
> presence of consent in the general ToS; and an “express consent” for the
> aggregation, implying an extra affirmative step beyond the traditional “I
> agree”). With regard to the last situation you pointed to (effective
> remedies under II.E), you are right that we would probably be better
> advised to put a “shall” instead of the “should”. Many thanks for the
> constructive suggestion!
>
> 5) It would in theory be possible to define limits on the extraterritorial
> reach of the jurisdiction which can be asserted by one State or another,
> but this probably falls outside the scope of this document. Even
> admitting that (geographically) far-reaching regulation of platforms'
> rights and duties has human rights implications, this would seem to be
> more a matter of state sovereignty than of platforms'
> responsibility to respect human rights. For this reason we thought it
> would be better to leave aside the jurisdictional aspect, because the claim
> that we would make in setting the boundaries of "legitimate law" would
> otherwise lack a solid basis in human rights principles.
>
> 6) Lastly, we didn’t want to isolate copyright as opposed to other
> legitimate laws which can be used to censor speech. For this reason, we
> focused on whether the rules are clear and not likely to provide overly
> broad grounds for speech removal, and wanted to allow platforms to make any
> editorial decisions they wish. The only exception we made to this
> “hands-off” stance was the need to provide mechanisms to remove unprotected
> speech, such as hate speech, because allowing it would in itself constitute
> an interference with other people’s fundamental rights (e.g. the right to
> freedom of expression) within the platform.
>
> Thanks again for your comments, and please feel free to send through any
> follow-up you might have. We are working on a further revision of the
> document, which will be published by the beginning of February. We'll also
> post soon a version of the doc where everyone who participated can find
> their comments addressed. Stay tuned!
>
> Best,
> Nicolo
>
> ____________
> From: dcpr-bounces at lists.platformresponsibility.info [
> dcpr-bounces at lists.platformresponsibility.info] on behalf of Julia Powles
> [jep50 at cam.ac.uk]
> Sent: Thursday, January 08, 2015 6:33 AM
> To: dcpr at lists.platformresponsibility.info; LB at lucabelli.net
> Subject: Re: [Dcpr] Due Diligence Recommendations
>
> Hi Luca and DC PR members,
>
> I had a few comments on this document that I thought I'd send on, taking
> your suggestion to circulate them to the mailing-list. It concerns an
> important subject to which you've clearly dedicated considerable work,
> generating a commendable draft. Hopefully these thoughts are of some use
> -- but if they are too wayward, that's ok too.
>
> All best- Julia
>
> Some thoughts:
>
> 1)      Since specific objectives are concision (“the spectrum of rights
> and remedies that are granted to platform users through the ToS may be
> difficult to comprehend or even read in its entirety”, also III.A of the
> recommendations) and harmonisation (“and similar platforms may be
> regulated through very different provisions that might be unilaterally
> modified by platform providers”), my impression is that the document is
> both too long and too unspecific. Length may be necessary, but in this
> case, it does not fully deliver on the laudable promise of offering
> practical red-lines beyond the old OECD rules and five mandatory
> conditions (my point 2 below), nor provide examples of what is meant
> by, for example, specific consent and specific purposes, and what would
> definitely *not* be ok, or definitely *would* be ok. It seems to me that
> reasonably bright lines are what is needed for easy semantic comparison
> of ToS and for clear implementation by programmers and guarantees by
> operators.
>
> 2)      There are, as far as I can tell, only five mandatory conditions -
> is this right?:
> (a) specific rather than general consent for data collection by category
> (II.A, II.D), involving real choice with no coercion (Annex 1 (j));
> (b) right to opt out of behavioural tracking – what does this mean?
> Tracking is not defined. And, arguably, it extends to most data
> collection, so this to me seems incompatible with the rest of this
> section (II.A);
> (c) right to find out information about predictive or probabilistic
> techniques used and their rationale – really? Algorithms are famously
> guarded… what do you mean here? (II.D);
> (d) right to permanently delete an account (II.D);
> (e) prohibition on the platform operator unilaterally terminating an
> account without appropriate grounds (III.A).
> Small point: given that III is prefaced as discussing non-derogable
> procedures (i.e. mandatory “shall” conditions), do you want to be more
> specific about what they actually are in this context? Or otherwise
> eliminate that language, as it’s confusing.
> Also, it seems to me that the protection of children should have some
> more mandatory conditions, not least because informed consent has an age
> threshold.
>
> 3)      There should be examples – what would be appropriate for a
> specified purpose? Can a platform say: “we collect personal information
> to make our service more relevant and personalised to you”? What does
> specific consent mean? II.D goes some way towards addressing this, but
> it could be clearer, and better integrated with the rest of the document.
> How does this document address and set an example for short, concise ToS,
> as opposed to very long-winded ToS that all but make you sign over your
> first child because there’s no practical alternative?
>
> 4)      Consistency is lacking between many sections. E.g., if specific
> consent is mandatory in II.A & II.D, then it is inconsistent that
> automatically scanning for advertising in para 2 of II.A is not also
> mandatory, or that II.C or II.E is not – how can use beyond original
> purpose be specifically consented to, or actual usage by third parties?
> And is the implication that specific consent does not extend to
> knowledge and consent to data retention (II.B), as opposed to
> collection? Further, it seems inconsistent that “everyone’s right to
> remove incorrect or excessive personal data” is mentioned at the end of
> V, but not at all earlier in the document, at least not as a mandatory
> right in such terms.
>
> 5)      Part of what you are trying to address is global operations, but
> you don’t offer protections for users from any state’s meddling, apart
> from the transparency reporting (II.E). Is “legitimate law” the law of
> the data subject, the operator, or the operations?! Is there a way of
> excluding every single possible country that might have an interest in a
> user's data?
>
> 6)      The absence of mention of copyright under IV is noticeable, even
> though I think it appropriate for it not to be discussed on the same
> level. Nevertheless, it is the most significant factor in takedowns on
> online platforms.
>
> Anyway, as I say, these are just some thoughts that I hope are helpful
> and which I think would further your objectives, although I recognise
> you are fairly well progressed in the project. Best of luck and hope to
> see you in the year ahead.
>
>
>
>
>
>
>
> > ---------- Forwarded message ----------
> > From: <LB at lucabelli.net>
> > Date: Fri, Dec 26, 2014 at 3:31 PM
> > Subject: [Dcpr] Due Diligence Recommendations
> > To: dcpr at lists.platformresponsibility.info
> >
> > Dear all,
> >
> > Thanks a lot for your very constructive inputs and critiques with
> > regard to the original draft of the Due Diligence Recommendations. We
> > have significantly amended the original draft, incorporating your comments
> > and suggestions. You may find the revised version both attached and
> > below (the version below does not have footnotes).
> >
> > By all means, this is not the final version yet, and you are all
> > encouraged to share your comments on this second draft so that we can
> > further enhance it. Ideally, the final version of the Due Diligence
> > Recommendations should be released on 31 JANUARY.
> >
> > The second comment period is now open and will last until 20 JANUARY.
> > Although we have greatly appreciated your comments via personal
> > emails, we kindly suggest that you circulate your future comments on
> > the mailing-list to trigger discussion amongst the DC PR members.
> > Alternatively, you may also comment on the Due Diligence Recommendations
> > using this Google Doc:
> >
> https://docs.google.com/document/d/1N0_aVbWSNt-S12O0KLebqYVxM6gj5Uz3qqHDUkmlPc0/edit?pli=1
> > [1]
> >
> > Lastly, you are all encouraged to share any relevant info (articles,
> > posts, events, etc.) related to online platforms’ responsibility to
> > respect human rights, using this mailing-list.
> >
> > As this year is ending, we wish you, your families and friends an
> > excellent year ahead.
> > All the best,
> >
> > Luca, Nicolo and Primavera
> >
> > DRAFT
> >
> > Dynamic Coalition on Platform Responsibility
> >
> > -------------------------
> >
> > Due Diligence Recommendations
> > on Terms of Service and Human Rights
> >
> > -------------------------
> >
> > Introduction
> >
> > The following recommendations aim at fostering online platforms’
> > responsibility to respect human rights, by providing guidance with
> > regard to the adoption of “responsible” terms of service. Besides
> > identifying minimum standards for the respect of human rights by
> > platform operators (standards that “shall” be met), these
> > recommendations suggest best practices (which “should” be
> > followed) for the most “responsible” adherence to human rights
> > principles in the drafting of terms of service.
> >
> > I. Background
> >
> > Aside from cases of complicity in human rights violations rising to
> > the level of international crimes over which the International
> > Criminal Court has jurisdiction, private entities cannot be held
> > liable under international law for human rights violations, as they
> > are not parties to human rights treaties. Yet, respect for human rights
> > undoubtedly represents an important factor to take into account when
> > assessing the activities of corporations from the perspective of a
> > variety of stakeholders, including governments, investors and,
> > increasingly, consumers.
> >
> > This is especially relevant in the context of online platforms
> > designed to serve the needs of a global community, and forced to
> > satisfy different, often conflicting legal requirements across the
> > various jurisdictions where they operate. Thus, in light of the
> > important role that online platforms are playing in shaping a global
> > information society and the significant impact they have on the
> > exercise of the rights of Internet users, a moral and social
> > expectation has formed that such entities behave “responsibly”,
> > ensuring respect for the rule of law and the human rights of
> > Internet users across the globe.
> >
> > The existence of a responsibility of private sector actors to respect
> > human rights, which was recently affirmed in the UN Guiding Principles
> > on Business and Human Rights and unanimously endorsed by the UN Human
> > Rights Council, is grounded upon the tripartite framework developed by
> > the UN Special Representative on Business and Human Rights, according to
> > which States are the primary duty bearers in securing the protection
> > of human rights, corporations have the responsibility to ensure
> > respect for human rights, and both entities are joint duty holders in
> > providing effective remedies against human rights violations.
> >
> > As part of this responsibility, corporations are expected to: (1) make
> > a policy commitment to the respect of human rights; (2) adopt a human
> > rights due-diligence process to identify, prevent, mitigate and
> > account for how they address their impacts on human rights; and (3)
> > have in place processes to enable the remediation of any adverse human
> > rights impacts they cause or to which they contribute.
> >
> > These recommendations focus on one of the most concrete and tangible
> > means for online platforms to bring that responsibility to bear: the
> > contractual agreement which Internet users are required to adhere to
> > in order to utilise their services (the so-called “Terms of
> > Service”), thus becoming platform users. Specifically, the
> > recommendations constitute an attempt to define “due diligence”
> > standards for online platforms with regard to four essential
> > components: privacy, freedom of expression, due process and protection
> > of children and young people. In doing so, they aim to provide a
> > benchmark for respect of human rights in the private governance
> > sphere, both in relation to a platform’s own conduct and
> > with regard to the scrutiny of the governmental requests that platforms
> > receive. As recently stressed by the Council of Europe’s
> > Commissioner for Human Rights, guidance on these matters is
> > particularly important due to the current lack of clear standards.
> > This applies a fortiori in the context of online platforms, given the
> > crucial role that these entities have in ensuring practical compliance
> > with fundamental rights on the Internet.
> >
> > II. Privacy
> >
> > The first section of these recommendations provides guidance on the
> > rules that online platform operators should adopt in order to ensure
> > the protection of their users against any unnecessary and unreasonable
> > collection, use and disclosure of personal data.
> >
> > II.A. Data Collection
> >
> > Platform operators should limit the collection of personal information
> > to what is directly relevant and necessary to accomplish a specified
> > purpose. They shall also obtain consent for every type of information
> > collected (by category), rather than through a single general-purpose
> > consent form. If consent is withdrawn, the platform is no longer
> > entitled to process such data. Although withdrawal is not retroactive,
> > i.e. it cannot invalidate the data processing that took place in the
> > period during which the data was collected legitimately, it should, in
> > principle, prevent any further processing of the individual’s data
> > by the controller.
> >
> > In principle, platform operators should also refrain from collecting
> > data by automatically scanning content privately shared by their
> > users. Admissible derogations to this principle include the need to
> > fight against unsolicited communications (spam) and to ensure network
> > security, and should not extend to commercial or advertising purposes
> > in the absence of specific and express platform-user consent.
> >
> > Platform operators shall always offer their users the possibility to
> > opt out from the tracking of their behavior both by the platform
> > within other services, and by other services within the platform.
> >
> > In order to facilitate user oversight of the application of these
> > principles, platform operators should allow their users to view, copy
> > and modify the personal information they have made available to the
> > platform, and are encouraged to enable the download of a copy of
> > their personal data in an interoperable format.
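> >
> > As a minimal illustration of what such a download could look like (a
> > sketch only: the fields and file name are hypothetical, not prescribed
> > by these recommendations), a platform might serialise a user’s data as
> > JSON, an open and machine-readable format:
> >
> >     # Sketch: export a user's personal data in an interoperable format.
> >     import json
> >
> >     user_record = {  # hypothetical fields, for illustration only
> >         "profile": {"name": "Ada Example", "email": "ada@example.com"},
> >         "consents": [{"category": "location", "granted": False}],
> >         "content": [{"type": "message", "text": "hello",
> >                      "date": "2015-01-30"}],
> >     }
> >
> >     with open("export.json", "w") as f:
> >         json.dump(user_record, f, indent=2)
> >
> > The choice of an open, documented format such as JSON or CSV is what
> > allows the copy to be re-imported elsewhere, which is the point of
> > interoperability.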
> >
> > II.B. Data Retention
> >
> > Platform operators should clearly communicate in their terms of
> > service whether and for how long they are storing any personal data.
> > As a general rule, any retention beyond 180 days should be
> > specifically justified by a function of the platform or by
> > requirements imposed by a “legitimate law”.
> >
> > II.C. Data aggregation
> >
> > Aggregation of platform users’ data should only be done subject to
> > express consent. Aggregation of data across multiple services or
> > devices requires extra diligence on the part of the data controller
> > and processor, since it might result in data being processed beyond
> > the original purpose for which it was collected. Although this does
> > not prevent the implementation of cross-device functionalities, it is
> > necessary to ensure that platform users properly understand the scope
> > of the given consent.
> >
> > II.D. Data Use
> >
> > Platforms shall obtain consent in order to use personal data
> > (including platform users’ contacts and recipients), unless such use
> > is prescribed by a legitimate law. It is also recommended that such
> > consent be required for platforms to make personal data available to
> > the public through search engines, and that platforms give their users
> > the option to opt out.
> >
> > The requirement of consent also applies to personal data over which the
> > platform acquires a right of use; as a general
> > rule, such use should never be broader than the original purpose for
> > which the personal data was shared. Oftentimes, the use of personal data
> > is instrumental to the improvement of existing services, or the
> > development of new services and functionalities. Yet, a broad and
> > open-ended permission on the use of platform users’ personal data
> > for “future services” can lead to abuses, in particular by making
> > it possible for the platform to offer personalised services on the
> > basis of the information provided, and to automatically enrol users into
> > services or functionalities that they did not intend to receive at the
> > time of registration. This is in conflict with the right of users to
> > informational self-determination, including the right not to be
> > subject to any decision based solely on automated processing of data
> > or without taking their view into consideration. For this reason, it
> > is recommended that platforms specify in their ToS that the purpose of
> > processing of personal data is limited to the scope of existing
> > services, and that the enrolment of platform users into any new service
> > will require their acceptance of new ToS. Platform operators should
> > also give users the opportunity to object to such usage and to demand
> > the rectification of inaccurate data. Furthermore, platform users shall
> > always be able to obtain information about any predictive or
> > probabilistic techniques that have been used to build their profile,
> > and their underlying rationale.
> >
> > Lastly, platform operators shall always permit their users to delete
> > their account permanently. Likewise, if there is no other
> > legal reason justifying the further storage of the data, the data
> > processor shall proceed with the permanent deletion of all or portions
> > of the relevant data associated with the platform user’s account,
> > within a time that is reasonable for its technical implementation.
> >
> > II.E. Data protection vis-à-vis third parties
> >
> > Platforms play a crucial role in enforcing the protection offered by
> > the legal system against interference with users’ right to privacy.
> > Therefore, platform operators should provide effective remedies
> > against the violation of internationally recognised human rights. For
> > this reason, they should establish clear mechanisms for platform users
> > to gain access to all of their personal data held by a third party, as
> > well as to be informed of the actual usage thereof. Platform operators
> > should also enable their users to report privacy-offending content and
> > to submit takedown requests. They should also implement a system to
> > prevent the impersonation of platform users by third parties, although
> > exceptions would need to be made for the impersonation of public
> > figures in ways which contribute to public debate in a democratic
> > society.
> >
> > A second set of concerns pertains to the possibility of preempting any
> > interference with platform users’ personal data, by preventing third
> > parties’ access to platform users’ content and metadata. Firstly,
> > platform operators should allow users to preserve their anonymity
> > vis-à-vis third parties to the extent permitted by legitimate laws,
> > both within the platform itself and within other websites when such
> > platform is used as an identity service. Secondly, it is recommended
> > that platforms enable end-to-end encryption of communications and
> > other personal information, in the context of both storage and
> > transmission. In that respect, best practice is for the decryption
> > key to be retained by the platform user, except where the provider needs
> > to hold the decryption key in order to provide the service and the
> > platform user has provided informed consent.
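> >
> > To illustrate the “key retained by the platform user” practice, here is
> > a minimal sketch using Python’s “cryptography” package; symmetric
> > encryption stands in for a full end-to-end protocol, and the upload
> > call is hypothetical:
> >
> >     # The key is generated and kept client-side; the platform only ever
> >     # sees ciphertext, so it cannot read or hand over the plaintext.
> >     from cryptography.fernet import Fernet
> >
> >     key = Fernet.generate_key()  # stays on the user's device
> >     cipher = Fernet(key)
> >
> >     ciphertext = cipher.encrypt(b"a private message")
> >     # upload_to_platform(ciphertext)  # hypothetical platform API
> >
> >     # Only the key holder can recover what the platform stored.
> >     assert cipher.decrypt(ciphertext) == b"a private message"
> >
> > A real end-to-end scheme would add asymmetric key exchange between
> > users, but the property at stake is the same: decryption is possible
> > only on the platform user’s side.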
> >
> > As regards the handing over of platform users’ data upon
> > governmental request, platform operators should specify that they
> > execute such requests only in the presence of a valid form of legal
> > process, and release a periodic transparency report providing, for
> > each jurisdiction in which they operate, the number and type of such
> > requests, and the platform’s responses (in aggregate numbers).
> >
> > III. Due Process
> >
> > Due process is a fundamental requirement for any legal system based on
> > the rule of law. “Due” process refers to the non-derogability of
> > certain procedures in situations which may adversely affect
> > individuals within the legal system. This includes the application of
> > minimum safeguards such as the clarity and predictability of the
> > substantive law, the right to an effective remedy against any human
> > rights violation and the right to be heard before any potentially
> > adverse decision is taken regarding themselves.
> > Due process has significant implications with regards to potential
> > amendment and termination of contractual agreements, as well as the
> > adjudication of potential disputes. This section aims to give content
> > the due diligence in this context.
> >
> > III.A. Amendment and termination of contracts
> >
> > Terms of Service should be written in plain language that is easy to
> > understand. Wherever possible, platform operators should provide
> > an accessible summary of the key provisions of the terms of service.
> > Platform operators should give their users meaningful notice of any
> > amendment of the ToS affecting the rights and obligations of the users.
> > Meaningful notice should be provided in a way that enables platform
> > users to clearly see, process and understand the changes. Contractual
> > clauses that permit unilateral termination by platforms without
> > appropriate grounds shall not be used.
> >
> > In addition, platform operators should consider giving notice even of
> > less significant changes, and enabling their users to access previous
> > versions of the terms of service. Ideally, platform operators should
> > enable their users to continue using the platforms without having to
> > accept new terms of service related to additional
> > functionalities which have been added to the platform, unless this
> > would impose significant costs and complexity on the operators.
> > Meaningful notice should also be given in advance, prior to
> > termination of the contract or services. Besides, to reduce the
> > imbalance between platform users and platform owners when it comes to
> > litigation, it is advisable that the ToS be negotiated beforehand
> > with consumer associations or other organisations representing
> > Internet users. In order to prevent wrongful decisions, it is also
> > recommended that platforms make termination of the accounts of
> > particular platform users possible only upon repeated violation of the
> > ToS.
> >
> > III.B. Adjudication
> >
> > Disputes can arise both between platform users and between a
> > particular platform user and the platform operator. In both cases,
> > platform operators should provide alternative dispute resolution
> > systems to allow for quicker and potentially more granular solutions
> > than litigation for the settling of disputes. However, in view of the
> > fundamental importance of the right of access to court, alternative
> > dispute resolution systems should not be presented as a replacement for
> > regular court proceedings, but only as an addition to them. In
> > particular, platform operators should not impose waivers of class
> > action rights or other hindrances to the right of effective access
> > to justice, such as mandatory jurisdiction outside the place of
> > residence of Internet users. Any dispute settlement mechanism should
> > be clearly explained and offer the possibility of appealing against
> > the final decision.
> >
> > IV. Freedom of Expression
> >
> > Freedom of expression is a fundamental right consisting of the freedom
> > to receive and impart information in a lawful manner, including by way
> > of association. In the online platform context, the effectiveness of
> > this right can be seriously undermined by disproportionate monitoring
> > of online speech and by repeated government blocking and takedowns. The
> > following section provides guidance as to how platforms should handle
> > such matters through their terms of service.
> >
> > IV.A. Degree of monitoring
> >
> > Although there are no rules to determine, in general terms, what kind
> > of speech should or should not be allowed on private online platforms,
> > certain platforms should be seen more as “public spaces” to the
> > extent that they occupy an important role in the public sphere. These
> > actors have assumed functions in the production and distribution
> > process of media services which, until recently, had been performed
> > only (or mostly) by traditional media organisations. As a matter of
> > fact, online platforms increasingly play an essential role as speech
> > enablers and pathfinders to information, becoming instrumental to
> > media outreach as well as to Internet users’ access to media.
> >
> > Established principles and best practices can serve to identify
> > certain red-lines that should not be crossed. As a general rule,
> > platform operators should not impose any restrictions on the kind of
> > content that they host, with the exception of content that is harmful
> > or explicitly forbidden under applicable legitimate laws (e.g. hate
> > speech, child pornography and incitement to violence, as well as other
> > kinds of undesirable content, such as unsolicited communications for
> > direct marketing purposes or security threats), and only if the
> > restriction is necessary and proportionate to that purpose. It is of
> > utmost importance that the
> > rules imposing such restrictions are not formulated in such a way as
> > to affect potentially legitimate content, as they would otherwise
> > constitute a basis for censorship.
> >
> > Similarly, although platforms can legitimately remove speech in order
> > to enforce their terms of service, either on their own motion or upon
> > complaint, such terms of service should be clear and transparent in
> > their definition of the content that will be restricted within the
> > platform, including the use of certain screen names. To this end,
> > platforms can legitimately prohibit the use of the name, trademark or
> > likeness of others. In addition, platform operators should always
> > provide clear mechanisms to notify those platform users whose content
> > has been removed and provide them with an opportunity to challenge and
> > override illegitimate restrictions.
> >
> > IV.B. Government blocking and takedowns
> >
> > Transparent procedures should be adopted for the handling and
> > reporting of governmental requests for blocking and takedown in a way
> > that is consistent with internationally recognised laws and standards.
> > Firstly, platform operators should execute such requests only where
> > there is a legal basis for doing so and a valid judicial order.
> > Secondly, platform operators should notify their users of such
> > requests, ideally giving them an opportunity to reply and challenge
> > the validity of such requests, unless specifically prohibited by
> > legitimate law. Finally, as already mentioned in the context of
> > government requests for data, platform operators should implement law
> > enforcement guidelines and release periodic transparency reports.
> >
> > V. Protection of children and young people
> >
> > A special category of concerns arises in the case of children and
> > young people, towards whom platform operators should exercise a
> > higher level of care. Platform operators should adopt particular
> > arrangements, beyond the mere warnings about inappropriate content and
> > the age verification that can be imposed by legitimate law for certain
> > types of content.
> >
> > Firstly, although terms of service should generally be drafted in a
> > way that is comprehensible to all, those regulating platforms open to
> > children and young people should include simplified language or an
> > educational video-clip and, ideally, a set of standardised badges to
> > make their basic rules comprehensible to all users regardless of their
> > age and willingness to read the actual terms of use. Secondly, it is
> > recommended that platforms provide measures that can be taken by
> > children and young people in order to protect themselves while using
> > the platform, such as a “safer navigation” mode.
> > Thirdly, platform operators should consider providing specific
> > mechanisms to ensure the removal or erasure of content created by children
> > and young people.
> > As an element of media literacy, all platform users should be informed
> > about their right to remove incorrect or excessive personal data.
> >
> > Annex 1: Definitions
> >
> > a) Platform:
> > For the purpose of these recommendations, platforms are understood as
> > cloud-based public-facing interfaces or “spaces” allowing users to
> > impart and receive information or ideas according to the rules defined
> > in a contractual agreement.
> >
> > b) Terms of Service:
> > The concept of “terms of service” utilised here covers not only
> > the contractual document available under the traditional heading of
> > “terms of service” or “terms of use”, but also any other
> > platform policy document (e.g. privacy policy, community
> > guidelines, etc.) that is linked or referred to therein.
> >
> > c) Function of the Platform:
> > Function that the community has attributed to the platform on the
> > basis of the legal, commercial and social expectations that it has
> > generated. This should not be confused with a platform’s
> > functionalities, which constitute merely one (albeit important)
> > element to identify the overall function(s).
> >
> > d) Platform Operator:
> > Natural or legal person defining, and having the possibility to amend,
> > the platform’s terms of service.
> >
> > e) Platform User:
> > Natural or legal person entering into a contractual relationship
> > defined by the platform’s terms of service.
> >
> > f) Internet User:
> > Natural or legal person who is using an Internet access service, and in
> > that capacity has the freedom to impart and receive information. The
> > Internet user may be the subscriber, or any person to whom the
> > subscriber has granted the right to use the Internet access service
> > s/he receives.
> > g) Data:
> > Content and/or personal information. Data can belong to both
> > categories simultaneously.
> >
> > h) Content:
> > Text, image, audio or video which results from the engagement of a
> > particular platform user with the platform, even on a transient basis.
> > This includes, for example, messages and queries typed by a platform
> > user.
> >
> > i) Personal Data/Personal Information:
> > Personal data is any information about an individual that can be used
> > to distinguish or trace an individual’s identity, such as name,
> > social security number, date and place of birth, etc. This is not
> > intended to cover identification which can be accomplished via very
> > sophisticated methods. This notion of personal data equates with that
> > of Personally Identifiable Information (PII), defined as “any
> > information about an individual maintained by an agency, including (1)
> > any information that can be used to distinguish or trace an
> > individual’s identity, such as name, social security number, date
> > and place of birth, mother’s maiden name, or biometric records; and
> > (2) any other information that is linked or linkable to an individual,
> > such as medical, educational, financial, and employment
> > information.”
> > j) Consent:
> > Consent means any freely given, specific and informed indication of
> > the data subject’s wishes by which s/he signifies her/his agreement
> > to personal data relating to her/himself being processed. To that end,
> > every user shall be able to exercise a real choice with no risk of
> > deception, intimidation, coercion or significant negative consequences
> > if he/she does not consent to data aggregation.
> >
> > k) Express Consent:
> > Express consent is a type of consent which (in contrast with
> > “implicit” or “implied” consent) requires an affirmative step
> > in addition to the acceptance of the general ToS, such as clicking or
> > ticking a specific box or acceptance of the terms and conditions of a
> > separate document.
> >
> > l) Privacy:
> > Privacy is an inalienable human right enshrined in Article 12 of the
> > Universal Declaration of Human Rights, which establishes the right of
> > everyone to be protected against arbitrary interference with their
> > privacy, family, home or correspondence, and against attacks upon their
> > honour and reputation. In the context of online platforms, this
> > encompasses the ability for data subjects to determine the extent to
> > which and the purpose for which their personal data is used by data
> > controllers, including the conditions upon which such data can be made
> > available to third parties (right to informational
> > self-determination).
> >
> > m) Freedom of Expression:
> > The right to freedom of expression, enshrined in Article 19 of the
> > Universal Declaration of Human Rights, includes freedom to hold
> > opinions without interference and to seek, receive and impart
> > information and ideas through any media and regardless of frontiers.
> > The right to freedom of opinion and expression is as much a
> > fundamental right in its own right as it is an “enabler” of other
> > rights, including economic, social and cultural rights.
> >
> > n) Hate Speech:
> > Although there is no universally accepted definition of “hate
> > speech”, the term shall be understood as covering all forms of
> > expression which spread, incite, promote or justify racial hatred,
> > xenophobia, anti-Semitism or other forms of hatred based on
> > intolerance, including: intolerance expressed by aggressive
> > nationalism and ethnocentrism, discrimination on any grounds such as
> > race, ethnicity, colour, sex, language, religion, political or other
> > opinion, national or social origin, property, disability, birth,
> > sexual orientation or other status. In this sense, “hate speech”
> > covers comments which are necessarily directed against a person or a
> > particular group of persons.
> >
> > o) Due Process:
> > Due process is a concept referring to procedural rights which are
> > essential for the respect of the rule of law, comprising: (1) the
> > right to an effective remedy by a competent tribunal for any acts
> > violating one’s fundamental rights granted by the law or the
> > Constitution (ex Article 8 of the Universal Declaration of Human
> > Rights); and (2) the right to an independent and impartial
> > tribunal, in the determination of one’s rights and obligations and
> > of any criminal charge against him/her (ex Article 10 of the Universal
> > Declaration of Human Rights). In the context of online platforms, the
> > urgency and efficacy of the protection of these rights might require
> > an expansion of the scope of application of these rights beyond the
> > traditional notion of “tribunal”.
> >
> > p) Legitimate Law:
> > Laws and regulations should be deemed legitimate when they respond
> > to a pressing social need and, having regard to their tangible impact,
> > they can be considered proportional to the aim pursued.
> > The concept of “legitimate law” which is used to justify a
> > potential restriction shall be interpreted as referring to a law or
> > regulation which does not manifestly fail the requirements for
> > permissible restrictions identified for freedom of expression (and
> > applicable, mutatis mutandis, to restrictions of other fundamental
> > rights) by the UN Special Rapporteur on the promotion and protection
> > of the right to freedom of opinion and expression, specifically:
> > (a) It must be provided by law, which is clear and accessible to
> > everyone (principles of predictability and transparency);
> > (b) It must pursue a legitimate purpose (principle of legitimacy); and
> > (c) It must be proven to be necessary and the least restrictive means
> > required to achieve the purported aim (principles of necessity and
> > proportionality).
> > If it is manifest that the measure would not pass this three-pronged
> > test, the platform operator should deny the request and, to the extent
> > possible, challenge it before the relevant court.
> >
> > _______________________________________________
> >  DCPR mailing list
> >  DCPR at lists.platformresponsibility.info
> >  http://lists.platformresponsibility.info/listinfo/dcpr [2]
> >
> >
> >
> > Links:
> > ------
> > [1]
> >
> https://docs.google.com/document/d/1N0_aVbWSNt-S12O0KLebqYVxM6gj5Uz3qqHDUkmlPc0/edit?pli=1
> > [2] http://lists.platformresponsibility.info/listinfo/dcpr
> _______________________________________________
> DCPR mailing list
> DCPR at lists.platformresponsibility.info
> http://lists.platformresponsibility.info/listinfo/dcpr
>