The European law on big technology has been approved. Now comes the hard part

The EU's new rulebook for online content – the Digital Services Act – is now a reality after the European Parliament voted overwhelmingly in favor of the law earlier this week. The last hurdle, a mere formality, is for the Council of the European Union to sign off on the text in September.

The good news is that this landmark legislation includes some of the most extensive platform transparency and accountability obligations to date. It will give users real control and insight into the content they engage with, and offer protection from some of the most pervasive and harmful aspects of our online spaces.

The focus now shifts to implementation, as the European Commission begins to develop enforcement mechanisms in earnest. The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, known under the law as Digital Services Coordinators (DSCs). It will rely heavily on the creation of new roles, the expansion of existing responsibilities, and seamless collaboration across borders. What is clear is that, for now, the institutional capacity to enforce this law effectively simply does not exist.

In a brief overview, the Commission offered some insight into how it proposes to overcome some of the more obvious implementation challenges – such as how it plans to oversee large internet platforms and how it will try to avoid the problems that have plagued the General Data Protection Regulation (GDPR), including inconsistent national regulators and selective enforcement. But its proposal only raises new questions. A large number of new staff will need to be hired, and the new European Centre for Algorithmic Transparency will need to attract world-class data scientists and engineers to help implement the new algorithmic transparency and data access obligations. The Commission's preliminary vision is to organize its regulatory responsibilities by thematic area, including a social affairs team tasked with overseeing some of the new due diligence obligations. Insufficient resources are a cause for concern here, and would ultimately risk turning these hard-won commitments into empty box-ticking exercises.

One critical example is the obligation for platforms to conduct assessments to address systemic risks on their services. This is a complex process that will have to take into account all of the fundamental rights protected by the EU Charter. To do this, tech companies will need to develop human rights impact assessments (HRIAs) – an evaluation process that aims to identify and mitigate potential human rights risks arising from a service or business, in this case a platform – something civil society urged them to adopt during the negotiations. It will then be up to the board, composed of the DSCs and chaired by the Commission, to annually assess the most prominent systemic risks identified and outline best practices for mitigation measures. As someone who has contributed to the development and evaluation of HRIAs, I know this will not be an easy feat, even with independent auditors and researchers participating in the process.

If they are to have an impact, these assessments need to establish comprehensive frameworks, concrete impact analyses, evaluation procedures, and stakeholder engagement strategies. The best HRIAs incorporate a gender-sensitive approach and pay particular attention to systemic risks that disproportionately affect people from historically marginalized communities.

This is the most concrete way to ensure that all potential rights infringements are captured.

Fortunately, international human rights frameworks, such as the UN Guiding Principles on Business and Human Rights, offer guidance on how best to develop these assessments. Still, the success of the provision will depend on how platforms interpret and invest in these assessments, and even more on how well the Commission and national regulators enforce the obligations. At current capacity, though, institutions are nowhere near able to develop guidelines and best practices, or to evaluate mitigation strategies, at the scale the DSA will require.
