Journal of Online Trust and Safety https://tsjournal.org/index.php/jots <p>The Journal of Online Trust and Safety is a cross-disciplinary, open-access, fast peer-review journal that publishes research on how consumer internet services are abused to cause harm and how to prevent those harms.</p> en-US trustandsafetyjournal@stanford.edu (Journal of Online Trust and Safety) Wed, 29 Apr 2026 00:00:00 +0000 Public Support for Misinformation Interventions Depends On Perceived Fairness, Effectiveness, and Intrusiveness https://tsjournal.org/index.php/jots/article/view/267 <p>The proliferation of misinformation on social media has concerning potential consequences, such as the degradation of democratic norms. While recent research on countering misinformation has largely focused on analyzing the effectiveness of interventions, the factors associated with public support for those interventions have received little attention. We asked 1,010 American social media users to rate their support for and perceptions of ten misinformation interventions implemented by the government or by social media companies. Our results indicate that the perceived fairness of an intervention is the factor most strongly associated with support, followed by its perceived effectiveness and then its perceived intrusiveness. Interventions that supported user agency and transparency, such as labeling content or fact-checking ads, were more popular than those that involved moderating or removing content or accounts. We found some demographic differences in support levels: Democrats and women supported interventions more, and rated them as more fair, more effective, and less intrusive, than Republicans and men did, respectively.
It is critical to understand which interventions are supported and how they are perceived, as public opinion can play a key role in the rollout and effectiveness of policies.</p> Catherine King, Samantha Phillips, Kathleen Carley Copyright (c) 2026 Journal of Online Trust and Safety https://creativecommons.org/licenses/by-nc-sa/4.0 https://tsjournal.org/index.php/jots/article/view/267 Mon, 23 Mar 2026 00:00:00 +0000 Protecting Young Users on Social Media: Evaluating the Effectiveness of Content Moderation and Legal Safeguards on Video-Sharing Platforms https://tsjournal.org/index.php/jots/article/view/251 <p>Video-sharing platforms such as TikTok, YouTube, and Instagram implement content moderation policies to reduce minors' exposure to harmful videos. As video has become the dominant and most immersive form of online content, assessing how effectively these systems protect younger users is increasingly important. This study evaluates the effectiveness of video moderation for different age groups on TikTok, YouTube, and Instagram, based on a focused set of experimental accounts. Accounts were created for simulated users aged 13 and 18, and 3,000 recommended videos were analyzed in two interaction modes: <em>passive scrolling</em> and <em>search-based scrolling</em>. Each video was manually assessed for severity of harm using a unified harm classification framework. While low-severity harm was the most prevalent form, the results show that accounts configured as 13-year-olds encountered harmful videos more frequently and more quickly than accounts configured as 18-year-olds. On YouTube, 15% of videos recommended to 13-year-old accounts during passive scrolling were classified as harmful, compared to 8.17% for adult accounts, with exposure occurring within an average of 3 minutes and 6 seconds. This exposure occurred without any user-initiated search, highlighting weaknesses in algorithmic filtering.
Results from our targeted study point to gaps in video moderation systems, suggesting the need for more effective safeguards to better protect minors from harmful online content.</p> Fatmaelzahraa Eltaher, Rahul Krishna Gajula, Luis Miralles-Pechuán, Patrick Crotty, Juan Martínez-Otero, Christina Thorpe, Susan Mckeever Copyright (c) 2026 Journal of Online Trust and Safety https://creativecommons.org/licenses/by-nc-sa/4.0 https://tsjournal.org/index.php/jots/article/view/251 Mon, 23 Mar 2026 00:00:00 +0000 “I Tend to Run to Problems That People Run Away From” https://tsjournal.org/index.php/jots/article/view/285 <p class="p2">Platform-side Trust and Safety (T&amp;S) is the crucial paid work of responding to and mitigating harmful content and behavior online and beyond. It is characterized by complexity, ambiguity, urgency, and trade-offs among competing values across a constantly morphing landscape of technologies, abuses, and actors. Interviews with 47 T&amp;S professionals suggest that their expertise is rooted in affective-relational skills: seeking multiple perspectives, reflexivity, curiosity, and collaboration. Furthermore, our findings suggest that T&amp;S professionals are motivated by a desire to protect users and spaces, by the intellectual challenges inherent in the work, and by the caliber of their colleagues. However, the fundamental challenges of the work are compounded by other conditions: both internal and external misperceptions about T&amp;S, responsibility with limited autonomy, and organizational structures. In contributing a more nuanced and grounded perspective on platform-side T&amp;S, we argue that emotion is not a liability but rather an essential asset in T&amp;S work.
We call for valuing the affective skills and motivations T&amp;S professionals bring to their work, with the aim of shifting from coping toward well-being and from individually borne responsibility toward organizational support.</p> Toby Shulruff, Amanda Menking Copyright (c) 2026 Journal of Online Trust and Safety https://creativecommons.org/licenses/by-nc-sa/4.0 https://tsjournal.org/index.php/jots/article/view/285 Wed, 22 Apr 2026 00:00:00 +0000