It is all part of the Digital Services Act (DSA) passed by the European Union, whose main objective is to fight the proliferation of illegal content on the internet while safeguarding users’ rights.
The Act applies to all digital services that connect consumers to goods, services or content, imposing new and comprehensive obligations on online platforms to combat illegal content.
This regulation is aimed at companies that individually reach more than 10% of the EU population, i.e. approximately 45 million people. In case of non-compliance, the European Commission can impose fines of up to 6% of a company’s annual worldwide turnover and, for repeated serious infringements, even ban it from operating in the EU single market.
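Purely as an illustration of the figures above (the thresholds and fine cap cited in the regulation), the scope and sanction rules can be sketched in a few lines; the function names and structure here are hypothetical, not part of the DSA itself:

```python
# Illustrative sketch of the DSA figures cited above; names are hypothetical.
VLOP_THRESHOLD = 45_000_000   # ~10% of the EU population, per the regulation
MAX_FINE_RATE = 0.06          # fines of up to 6% of annual worldwide turnover

def is_very_large_platform(active_eu_users: int) -> bool:
    """A platform reaching more than ~45 million EU users falls under the regime."""
    return active_eu_users > VLOP_THRESHOLD

def max_fine(annual_worldwide_turnover_eur: float) -> float:
    """Upper bound of a Commission fine for non-compliance."""
    return annual_worldwide_turnover_eur * MAX_FINE_RATE

print(is_very_large_platform(50_000_000))  # True
print(max_fine(10_000_000_000))            # 600000000.0
```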
Examination of algorithms
With this regulation, the European Commission aims to have a clear legal text ensuring that these platforms “live up to their responsibilities to reduce the amount of illegal content and limit other harm, as well as to protect the fundamental rights and safety of users,” says Margrethe Vestager, Executive Vice-President for A Europe Fit for the Digital Age.
Internal Market Commissioner Thierry Breton says these new rules “mark the beginning of a new era” where large platforms “will no longer behave as if they are too big to worry about”.
“Everyone can do business in the EU as long as they comply with these new rules,” says Breton.
Content policy
This regulation is expected to come into operation next summer. The Seville office will analyse how the algorithms designed by technology companies prioritise the information citizens see on the internet, in order to understand why platforms show some content ahead of other content.
The Commission’s idea, according to Brussels, is that by first understanding how algorithms work, it will be possible to anticipate and prevent some of the negative effects of algorithmic systems, such as recommending illegal or harmful content to vulnerable groups like children.
The European Centre for Algorithmic Transparency, which is expected to begin its work in January, will be made up of some thirty people, twenty of whom will work from the Andalusian capital, and the rest in the centres in Ispra (Italy) and Brussels.
For this monitoring of large platforms, ECAT will bring together a multidisciplinary team of computer scientists, economists, and experts in big data and artificial intelligence, working under the direction of the European Commission to help ensure that large platforms comply with the ethical requirements the EU imposes on them.
Reports against illegal information
This new digital services law, which the EU has been working on for two years, will oblige social networks, search engines and platforms such as Amazon to produce annual reports to demonstrate to authorities that they are taking measures to combat illegal information.
In addition, it prohibits the display of personalised advertising to minors and requires companies to inform users why certain content is being recommended to them based on their profile.
The regulation also gives users the option to see information that is not selected on the basis of their profile.
Another benefit for citizens included in the regulation is the prohibition of “dark patterns”: tricks designed to mislead and manipulate consumers into making decisions that are likely to be contrary to their interests.
Entry into force
Following the entry into force of the Digital Services Act on 16 November, platforms have three months, until 17 February 2023, to publish the number of their active end-users on their websites.
In addition, the Commission invites all online platforms to notify it of their user numbers, on the basis of which it will assess whether a platform should be designated a very large online platform or search engine.
Once the Commission has designated the category to which a platform belongs, the entity will have four months to comply with the obligations under the Digital Services Act, including carrying out its first annual risk assessment and communicating it to the Commission.
In short, it is a step towards making large internet companies take responsibility for ensuring that the content reaching citizens is free of harmful or false material.