Opinions - 28.01.2025 - 09:00
Many people use large digital platforms such as Instagram, TikTok, YouTube or X for information and entertainment. The risks of consuming information via social media are well documented: recommendation algorithms can reinforce the formation of so-called ‘filter bubbles’, which can lead to radicalisation and endanger democratic debate. How do algorithms promote such bubbles? And how can users escape them? Researchers are investigating these questions, but they face challenges when it comes to analysing the fundamental causes of these dangers.
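The basic mechanism can be illustrated with a minimal simulation: when a feed favours content a user has already engaged with, a slight initial preference is amplified over time until that content dominates. The sketch below is purely illustrative – the topic names, weights and update rule are assumptions, not any platform's actual algorithm.

```python
import random

# Illustrative feedback loop: engagement boosts a topic's weight,
# which makes it more likely to be recommended again.
TOPICS = ["politics_left", "politics_right", "sports", "music", "science"]

def recommend(weights, k=5):
    """Pick k topics, favouring those the user engaged with before."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS], k=k)

def simulate(rounds=50, slight_preference="politics_left"):
    # Start nearly uniform, with a tiny initial preference.
    weights = {t: 1.0 for t in TOPICS}
    weights[slight_preference] = 1.2

    for _ in range(rounds):
        feed = recommend(weights)
        for topic in feed:
            # Assumed behaviour: the user engages mainly with the preferred
            # topic, and every engagement boosts its future weight.
            if topic == slight_preference:
                weights[topic] *= 1.1

    total = sum(weights.values())
    return {t: round(w / total, 3) for t, w in weights.items()}

if __name__ == "__main__":
    random.seed(42)
    print(simulate())  # the slightly preferred topic ends up dominating the feed
```

After a few dozen rounds, the initially small preference accounts for almost the entire feed – a stylised picture of how a filter bubble can emerge without anyone intending it.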
As part of its digital strategy, the European Commission has created a basis for stricter rules with the Digital Services Act (DSA). In Switzerland, too, the Federal Council is discussing a similar law, prompted by an interpellation from National Councillor Jean Tschopp of the Canton of Vaud.
The Digital Services Act aims to counter systemic risks such as the spread of fake news, the manipulation of users and the dissemination of hate speech on large platforms such as Instagram, TikTok, YouTube or X. It does so primarily by imposing stricter transparency obligations on these platforms.
Transparency reports disclose the practices used by the platforms to select and display content for users. In addition, content moderation databases provide insights into how hate speech spreads and how content is moderated in the run-up to important political events. Ad targeting repositories, which show which ads EU citizens see and how personalised advertising is implemented, are a further important transparency tool. All three instruments share one limitation: the data is provided for the entire European Union, but not for Switzerland and not for users in Switzerland.
Furthermore, the Digital Services Act provides access to information about the platforms' internal mechanisms, such as recommendation systems and the criteria they use – data that platforms were not previously required to disclose. This extended access is intended to promote independent research to better identify and minimise risks. The legal basis for this is Article 40 of the Digital Services Act, which is currently in a public consultation phase. The Online Safety Act in the UK pursues a similar objective, defining which data researchers can view and which data platforms must make available.
Despite the Digital Services Act, Swiss researchers face additional hurdles when accessing data under EU rules. In practice, it remains to be seen how this will affect universities and colleges – especially for empirical research that relies on good data availability. We know from our own experience, however, that the hurdles are currently too high to effectively investigate the practices of large platform operators. These hurdles should be removed.
Access rights to platform data, as provided for in the EU's Digital Services Act, are a central element in ensuring public and independent oversight of social media – quite apart from any regulations and prohibitions. Such rights must also be discussed in Switzerland and put on the Federal Council's agenda.
Due to the restrictions mentioned above, researchers are increasingly resorting to alternative methods to collect data about platforms and their recommendation systems. One example is the tool SOAP (System for Observing and Analyzing Posts) developed at the University of St.Gallen, which offers valuable insights into the dynamics of filter bubbles on social media. Such tools are crucial for analysing critical developments – for example the upcoming federal elections in Germany, or changes to content moderation on platforms like Facebook and Instagram and their impact on digital spaces.
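The general idea behind such observation tools can be sketched as an automated audit: several controlled personas repeatedly record what a platform recommends to them, and the resulting feeds are compared. The code below is not SOAP's actual implementation or API; the fetch function and persona names are hypothetical stand-ins for real platform observations.

```python
from collections import Counter
from typing import Callable, List

def audit(personas: List[str],
          fetch_recommendations: Callable[[str], List[str]],
          sessions: int = 20) -> dict:
    """Record what each automated persona is shown over many sessions."""
    exposure = {p: Counter() for p in personas}
    for persona in personas:
        for _ in range(sessions):
            for topic in fetch_recommendations(persona):
                exposure[persona][topic] += 1
    return exposure

def concentration(counts: Counter) -> float:
    """Share of a persona's feed taken up by its single most frequent topic."""
    total = sum(counts.values())
    return counts.most_common(1)[0][1] / total if total else 0.0

if __name__ == "__main__":
    import random
    random.seed(0)

    # Stubbed fetcher standing in for real observations of a platform.
    def stub_fetcher(persona: str) -> List[str]:
        pool = ["politics", "sports", "music", "science"]
        # Assumed: the heavily engaged persona is shown a narrower mix.
        weights = [6, 1, 1, 1] if persona == "engaged_user" else [1, 1, 1, 1]
        return random.choices(pool, weights=weights, k=10)

    for persona, counts in audit(["fresh_account", "engaged_user"], stub_fetcher).items():
        print(persona, dict(counts), "concentration:", round(concentration(counts), 2))
```

Comparing the concentration of different personas' feeds is one simple way to make the narrowing effect of a recommender system visible from the outside, without access to its internal workings.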
The insights gained can help to develop sound guidelines for algorithmic systems as part of the Digital Services Act. Current EU proceedings against TikTok – for example in connection with risks during the elections in Romania – underscore the urgency of better data access and intensive research. Independent research and innovative approaches are essential to understanding the challenges of social media and finding long-term solutions – both in Switzerland and internationally.
An article by Prof. Dr. Simon Mayer, Full Professor of Interaction- and Communication-based Systems at the University of St.Gallen, Prof. Dr. Miriam Buiten, Assistant Professor of Law and Economics, doctoral candidate Luka Bekavac, and Dr. Aurelia Tamò-Larrieux of the University of Lausanne.
Image: Adobe Stock / Thipphaphone