Research - 08.01.2026 - 11:30
Social media is a place of free expression. However, it can also be misused to disseminate hate speech and offensive comments. Many online platforms intervene in discussions only once harmful content is already online. This kind of after-the-fact moderation is often perceived as censorship that does not address the actual causes of uncivilised behaviour.
Researchers wanted to try a new approach: making users aware of the emotional consequences of their comments and thus encouraging a more conscious, civilised style of digital exchange. So-called “emotion monitoring dashboards” are designed to stimulate people's empathy while they use social media. The idea behind using empathy-promoting cues to reduce hate speech: social media users who are more aware of their own feelings might communicate in a more reflective manner, or even refrain from posting negative comments in the first place.
For their study, the researchers developed two types of dashboards using the open source tool text2emotion: one for “self-monitoring”, which showed users the emotional tone of their own posts, and a “peer-monitoring” dashboard, which broke down the feelings of other commenters. In a study with 211 participants, the effectiveness of these tools was evaluated in a simulated online discussion on the topic of abortion. In addition to Dr Naim Zierau from the Institute of Information Systems and Digital Business (IWI-HSG) at the University of St.Gallen, researchers from ETH Zurich, Seoul National University in South Korea and Bern University of Applied Sciences also participated in the study. “We are interested in how AI can help curb social problems such as cyberbullying or hateful, hurtful comments”, says Naim Zierau.
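To make the idea concrete, here is a minimal sketch of what such an emotion analysis might compute per comment. It is not the researchers' implementation: the study used the open-source text2emotion library, which scores five emotions per text, whereas this stand-in uses a tiny hypothetical keyword lexicon purely for illustration.

```python
from collections import Counter

# Hypothetical mini-lexicon (illustrative only; the study used the
# open-source text2emotion tool, which has a far richer model).
EMOTION_LEXICON = {
    "hate": "Angry", "furious": "Angry", "awful": "Angry",
    "scared": "Fear", "worried": "Fear",
    "sad": "Sad", "hopeless": "Sad",
    "glad": "Happy", "great": "Happy",
}

def emotion_profile(text: str) -> dict:
    """Return a normalised emotion distribution for one comment (sketch)."""
    counts = Counter(
        EMOTION_LEXICON[w] for w in text.lower().split() if w in EMOTION_LEXICON
    )
    total = sum(counts.values()) or 1
    return {emotion: n / total for emotion, n in counts.items()}
```

A "self-monitoring" dashboard would show this profile to the author of a post, while a "peer-monitoring" dashboard would aggregate such profiles over all commenters in a thread.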
Their study results suggest that these intervention methods can actually increase users' emotional awareness and reduce hate speech. However, an unexpected side effect was that the dashboards also led to a stronger expression of other negative emotions such as anger, fear and sadness when discussing sensitive topics. Naim Zierau found the gender-specific response particularly interesting: While female participants expressed less hate speech despite increased negative emotions, the opposite trend was observed among male participants – they reduced anger and fear but left more hate speech online. With self-monitoring, some users also voiced concerns about the accuracy of the emotion analysis and in some cases felt that the dashboard was censoring them.
The study provides important insights for designing healthier digital interactions. It shows that cross-platform designs that promote empathy and self-reflection can have a positive impact. However, transparency is crucial: users want clearer explanations of how the algorithms calculate emotions in order to build trust and not feel censored.
But how do the researchers rate the chances that social media companies will actually integrate emotion dashboards permanently to raise ethical standards? Some platforms already offer support in the form of sentiment analysis: these analyses show whether a post contains words with positive or negative connotations. It is conceivable that more complex systems, such as emotion recognition, could be built on this basis. Naim Zierau concludes: “Future research should improve emotion recognition by incorporating contextual nuances and investigating the combined effects of self- and peer-monitoring. We want to develop easy-to-use tools that not only combat hate speech, but also enable genuine understanding and deeper connections between users on social media in the long term.”
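The sentiment analysis described above, which only distinguishes positive from negative connotations, can be sketched in a few lines. The word lists here are hypothetical placeholders; real platforms use much larger lexicons or trained models.

```python
# Minimal sketch of lexicon-based sentiment scoring: the simpler
# building block the article says some platforms already provide.
# These tiny word sets are illustrative assumptions, not a real lexicon.
POSITIVE = {"good", "great", "helpful", "kind"}
NEGATIVE = {"bad", "terrible", "stupid", "hateful"}

def polarity(text: str) -> int:
    """Net connotation score: > 0 leans positive, < 0 leans negative."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
```

Emotion recognition, as in the study's dashboards, goes one step further by replacing the single positive/negative axis with several emotion categories.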
The paper entitled “Emotionally Aware Moderation: The Potential of Emotion Monitoring in Shaping Healthier Social Media Conversations” can be found here. The team of authors: Naim Zierau (University of St.Gallen), Xiotian Su (ETH Zurich), Soomin Kim (Seoul National University, South Korea), April Yi Wang (ETH Zurich) and Thiemo Wambsganss (Bern University of Applied Sciences).
Image: Adobe Stock / terovesalainen