
Research - 29.02.2024 - 09:30 

AI regulation: how should Switzerland deal with the technology?

Following the adoption of the AI Act by the EU, the question also arises for Switzerland: do we need an AI law or sector-specific regulations? What risks to fundamental rights and democracy do current developments entail, and what experiences have other countries had? We spoke with two HSG experts who will be discussing these topics at the AI conference on 22 March 2024 in Zurich. Isabelle Wildhaber and Melinda Lohmann are two of the six co-founders of the HSG's Law & Tech Lab.

Following the adoption of the "AI Act" by the EU, Switzerland is also faced with the question of whether and how we want to regulate artificial intelligence (AI). What exactly is the AI Act about?

Isabelle Wildhaber: The EU AI Act is a horizontal regulation: it targets artificial intelligence (AI) technology as such rather than individual sectors. It follows a risk-based approach, i.e. the rules differ by risk category: the higher the risk, the stricter the regulation, up to an outright ban. The European Parliament, the Council and the Commission agreed on a final version of the AI Act in December 2023. All member states accepted this version in February 2024, so nothing stands in the way of it coming into force in mid-2024.

How sensible are these rules from your point of view?

Melinda Lohmann: At HSG, a team of experts subjected the EU AI Act to a stress test in the summer of 2023. In this "Grand Challenge", teams from all over the world competed to assess specific AI products or AI services in the light of the new AI regulation. The Grand Challenge showed how extensive the regulations are and how difficult they are to implement. In our view, it is important that legislators take a close look at AI and consider where there are regulatory gaps. This has been done before with the emergence of new technologies such as nuclear energy or genetic engineering. In our opinion, however, the EU AI Act is too comprehensive and too far-reaching. It will spell the end for some companies and thus also stifle innovation. In Switzerland, we need to think carefully about how we want to proceed.

What should Swiss regulation of AI in the public and private sectors look like?

Isabelle Wildhaber: It is important to take a differentiated look at the various use cases and not lump them all together. That's why we recommend sector-specific regulations for Switzerland. We don't need a horizontal AI law. Different principles apply in the public sector than in the private sector. For example, citizens interacting with a government app that uses an AI-based chatbot have more extensive rights to information than users of a private service. And road traffic raises different problems than medicine or the media.

What risks to fundamental rights and democracy do current developments entail?

Melinda Lohmann: Like all new technologies, AI brings opportunities and risks. As lawyers, we deal a lot with risks for professional reasons, but it is also important for us to emphasise the innovation potential of AI, e.g. in tackling the climate crisis. Regulation must be designed so that it does not stifle innovation but instead enables safe AI that complies with fundamental rights. Regulation can also create legal certainty and thus stimulate innovation. One risk of AI systems, for example in HR recruiting, is discrimination. AI systems are built with historical data, and this data often contains a bias that is then perpetuated in the system. Of course, the AI itself is not prejudiced and does not "want" to hire only men, for example; it simply works with the historical data that we humans provide, and that data contains bias. It is important that we are sensitised to these possible effects.

What experiences can Switzerland learn from in other countries?

Isabelle Wildhaber: Switzerland is in the luxurious position of being able to wait and observe other countries before regulating. What is happening in the EU is certainly the most relevant for us and will affect us whether we like it or not. On the one hand, the AI Act will have legal effects beyond the EU; on the other, the so-called Brussels Effect will ensure that it sets de facto standards (with the AI Act, the EU has of course now set a marker in the regulatory landscape).

In your opinion, when is artificial intelligence so risky that it should be banned in Switzerland?

Melinda Lohmann: Dystopian scenarios such as social scoring and similar practices must be prevented altogether. In principle, however, we believe that we should be very careful about banning the technology across the board at this early stage; bans should only be imposed for specific applications.

The Law & Tech Lab is organising a conference on the question of how AI should be regulated in Switzerland. What is the aim of the conference?

Isabelle Wildhaber: The Federal Council has tasked DETEC with providing an overview of possible regulatory approaches for Switzerland by the end of 2024. The aim of our conference on the regulation of artificial intelligence on 22 March 2024 at Technopark Zurich is to critically examine these options with interested parties and to discuss a Swiss AI strategy. Experts from business, politics, law and research will take part. Federal Councillor Beat Jans, Head of the Federal Department of Justice and Police (FDJP), will also be a guest; he will give a keynote speech and discuss aspects of Swiss AI regulation with us. The conference is hosted by the Law & Tech Lab at the University of St.Gallen (HSG), which brings together six HSG law professors who jointly advance research and practice in Law & Tech.

Finally, how useful is AI currently for your own research work? 

Melinda Lohmann: First and foremost, AI is a fascinating subject of research for us. But of course we also enjoy experimenting with new tools such as ChatGPT ourselves. In teaching in particular, we can try out innovative formats and let students gain valuable experience with AI and other "LegalTech" tools.
