Preparing Europe for the Digital Age has been a priority for the current Commission from its very beginning. For this purpose, a new portfolio was created and entrusted to the Commissioner for Competition, Margrethe Vestager, no doubt taking into account her experience with tech giants.

 

In February of this year, the European Commission published a number of communications on various topics related to digitalization and technology. Among these, it announced the Digital Services Act package initiative, based on the “Shaping Europe’s Digital Future” communication. The Digital Services Act package is the Commission’s attempt to revamp the current legal framework for digital services, and it rests on two main pillars: (1) clear rules framing the responsibilities of digital services, to address the risks faced by their users and to protect their rights, and (2) ex-ante rules covering large online platforms acting as gatekeepers, which currently set the rules of the game for their users and their competitors.

 

In June, the Commission took a further step towards turning the Digital Services Act package into reality by launching a consultation meant to identify issues that may require intervention at the EU level. The consultation closed yesterday, September 8th.

 

At The Good Lobby, we believe that the voice of civil society organizations must be heard in such consultations, as this legislation stands to have a serious impact on human rights and democracy. We also want to ensure that it is not only the views of tech companies and trade organizations that are taken into account, but also those of NGOs and citizen movements. We have therefore submitted our feedback, focusing on the matters most aligned with our mission, vision, and expertise, and expressed our support for:

 

 

  • Regulating large online platforms acting as gatekeepers;
  • Imposing on such platforms an ex-ante regulatory mechanism;
  • Including in the Digital Services Act an interoperability requirement for these platforms;
  • Ensuring consumers know why and how they are targeted with certain ads, through mandatory ad libraries and transparency requirements;
  • Giving consumers the right to opt out completely from personalised ads; and
  • Imposing stronger requirements on political ads, including thorough verification of their authors and overall limits on their number.

 

The reasons for our stance, together with our feedback in full, are set out below.


 

The Good Lobby is happy to be able to provide its views as part of this consultation. We have decided to focus our feedback on the two issues most in line with our mission, vision, and expertise: (1) the gatekeeper role attained by certain platforms and (2) online advertising, algorithms, and disinformation. Both issues have a clear impact on democracy, citizenship, and human rights, which makes them important for us to address.

 

Great power should bring not only great profits but also great responsibility. This is our stance on the way in which the Commission should approach the regulation of the so-called gatekeepers. Beyond their high degree of market power and the risks it raises from a competition law perspective (which we will not address here), these gatekeeper platforms also have the power to influence opinions, sell users’ data to various third parties, sway elections, and ultimately threaten the very fabric of democracy as we know it.

 

The vast amounts of data that these platforms possess, and the ways in which they monetize them, also raise serious concerns related to fundamental rights broadly and privacy in particular. Such data can be, and often is, used to target consumers with personalised ads (more on which below) meant to encourage them to take a particular course of action – be it purchasing a product or voting in a certain way.

 

The list of associated risks is long and serious enough not to require further detail, nor a debate on whether regulation is needed. Experience has also shown that expecting such platforms to “self-regulate” is not an effective solution. We therefore strongly support the adoption of an ex-ante regulatory instrument for large online platforms acting as gatekeepers. With reference to the European Commission’s impact assessment, we believe that this instrument should take the form of tailor-made remedies imposed on these platforms on a case-by-case basis, where necessary and justified.

 

Further, we share the view that, in order to address the risks posed by platforms with such power, the Digital Services Act should also contain an interoperability requirement. We also wish to take this opportunity to re-emphasise the importance of data sharing, with reference to the report on business-to-government (B2G) data sharing prepared by a High-Level Expert Group and published by the Commission.

 

The second issue ties in very closely with the first. As mentioned above, the platforms that act as gatekeepers monetize the vast amounts of data they have gathered from their users by selling them to third parties, which in turn use them to better target potential customers through personalised ads. On the issue of personalised ads, we see two dimensions, which we will address separately: first, ads generally and, second, political ads, which we believe deserve particular attention.


We strongly believe that three elements are needed when it comes to these ads: mandatory ad libraries for all ads, transparency, and a right to opt out completely from personalised ads. Users have a right to know who targets them, how, and for what purposes. We therefore believe it is important that every user has access to clear, easy-to-understand explanations of the way in which optimisation algorithms work. For the same purposes, and to ensure transparency, users should also have full access to their personal data – including inferred personal data – and to their advertising profiles.

 

On the issue of algorithms more broadly, it has to be remembered that algorithms are only as good as the people behind them, and that they too can be, and indeed are, biased. We therefore suggest an obligation for platforms to conduct and publish human rights impact assessments of the algorithms they use to target ads.

 

Then there is the issue of political ads, which is more delicate still. Given their nature and the impact they can have on democracy broadly, stronger measures are needed in their case. Firstly, a decision needs to be made on what qualifies as ‘political’, so as to avoid potentially dangerous loopholes. Secondly, not everybody should be allowed to publish such ads: those who want to do so must be thoroughly verified, so that the targets of these ads, the voters, know exactly who is behind each ad and who pays for it.

 

This requirement ties into the issue of transparency, which is all the more important in the case of such ads. Finally, the number of these ads should be limited, so as to avoid the risk that certain organisations create a ‘flood’ of ads that is difficult to scrutinise and keep in check.

 

Given the risks raised by the two issues we have chosen to focus on, we hope that the European Commission will take the serious regulatory action needed to minimise them.

 

  1. Digital Services Act package – ex-ante regulatory instrument of very large online platforms acting as gatekeepers, Timeline, available at https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12418-Digital-Services-Act-package-ex-ante-regulatory-instrument-of-very-large-online-platforms-acting-as-gatekeepers
  2. This was presented as option 3b in the impact assessment cited above.
  3. See Joint letter to the EU’s Commission: a call to include interoperability provisions in the Digital Services Act, available at https://www.article19.org/resources/joint-letter-to-the-eus-commission-a-call-to-include-interoperability-provisions-in-the-digital-services-act/
  4. Towards a European strategy on business-to-government data sharing for the public interest, Final report prepared by the High-Level Expert Group on Business-to-Government Data Sharing, available at https://ec.europa.eu/digital-single-market/en/news/commission-appoints-expert-group-business-government-data-sharing
  5. See C. Osborne, “When algorithms define kids by postcode: UK exam results chaos reveal too much reliance on data analytics”, available at https://www.zdnet.com/article/when-algorithms-define-kids-by-postcode-uk-exam-results-chaos-reveal-too-much-reliance-on-data-analytics/, for a discussion of a very recent example from the United Kingdom.

 

Photo Credit: Unsplash