CTS participated in the 2024 Terrorism and Social Media (TASM) international conference, held on 18-19 June at Swansea University’s Bay Campus in Wales, UK.

The conference was a dynamic and insightful gathering of experts from diverse fields, all focused on the critical intersection of terrorism and online platforms. As the head of CTS//circle.responsibleComputing at our Center for Society and Technology (CTS), I was fortunate to attend and contribute to this important event for the terrorism and social media research community. My participation was fully sponsored by the CTS, reflecting our center’s commitment to engaging with the most pressing issues at the nexus of society and technology.

Key Themes and Insights from Day One

The first day of the conference was centered around the evolving tactics of extremist and terrorist groups in the digital landscape. The presentations covered a range of topics, including the exploitation of new technologies and Web3, the strategic use of gaming platforms and visual media, and the spread of polluted information through malign influence campaigns.

One of the standout moments was the keynote by Brian Fishman, co-founder of Cinder and former Global Head of Dangerous Organizations at Facebook (now Meta). Fishman provided an in-depth analysis of how extremist organizations have adapted to changing online environments, offering both historical context and forward-looking insights. His emphasis on the importance of preemptive strategies in combating these groups resonated deeply with the audience.

The breakout sessions that followed offered a wealth of knowledge on specific subtopics. In particular, discussions on the role of gaming platforms as recruitment tools for extremist groups were eye-opening. As gaming communities continue to grow, the potential for these platforms to be co-opted by bad actors is a concern that requires immediate attention from both policymakers and tech companies.

Day Two: Focusing on Responses and Interventions

The second day shifted focus from the problem to potential solutions. Presentations examined various responses, including the effectiveness of current regulatory regimes, the use of AI to detect extremist content, and the balance between counterterrorism measures and the protection of human rights.

I had the honor of presenting my research titled “Content Moderation Interventions in the Age of Borderline Social Media Content: A Bot-Powered Approach to Influence User Attitude and Engagement with Borderline Content.” This project, co-authored with Marten Risius from Neu-Ulm University of Applied Sciences (HNU) and Sabine Matook from the University of Queensland (UQ), explores how automated systems can be used to moderate borderline content that may not overtly violate platform policies but still contributes to harmful narratives.
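To give a flavor of what a bot-powered intervention might look like in practice, here is a deliberately simplified sketch, not our actual study system: a bot that leaves overt violations to standard enforcement, ignores benign posts, and responds to borderline posts with a friction prompt rather than removal. The classifier score, thresholds, and reply text are all hypothetical placeholders.

```python
# Hypothetical illustration of a bot-powered intervention for borderline content.
# Assumes some upstream classifier provides a policy-risk score in [0, 1].

from dataclasses import dataclass

BORDERLINE_LOW, BORDERLINE_HIGH = 0.4, 0.8  # assumed thresholds, for illustration only


@dataclass
class Post:
    post_id: str
    text: str
    risk_score: float  # assumed output of a harmful-content classifier


def intervene(post: Post) -> str | None:
    """Return a bot reply for borderline posts, or None if no intervention applies."""
    if post.risk_score >= BORDERLINE_HIGH:
        return None  # overt violations go to regular enforcement, not the bot
    if post.risk_score < BORDERLINE_LOW:
        return None  # benign content: no intervention
    # Borderline range: nudge the audience instead of removing the post.
    return (
        "This post touches on a sensitive topic. "
        "Consider checking independent sources before engaging or sharing."
    )


if __name__ == "__main__":
    example = Post("42", "An example borderline claim...", risk_score=0.55)
    print(intervene(example) or "No intervention")
```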

The reception to our presentation was overwhelmingly positive, and it was gratifying to see our research spark meaningful discussions. I was able to connect with several extremism prevention workers from Germany and researchers from the UK and the US, laying the groundwork for potential future collaborations.

Sandpit Event: Collaborating on Innovative Solutions

The conference concluded with a sandpit event on 20 June, designed to foster collaboration among attendees. This event was a unique opportunity to work intensively with other experts on developing project proposals that address the challenges discussed during the conference.

I collaborated with Daniel Levenson from Swansea University on a proposal titled “Shifting the Conversation on AI and Terrorism: A Value-Sensitive Approach for Stakeholder Consensus.” Our project focuses on addressing the lack of consensus on mitigating the misuse of AI technologies by terrorist actors. We aim to shift the discourse towards a value-sensitive approach that balances security with ethical considerations, fostering dialogue among stakeholders to develop actionable policy recommendations.

The event was expertly facilitated, culminating in each team pitching its project idea and submitting a full proposal the following week. A month after the conference, we were thrilled to learn that our proposal had won £5,000 in seed funding. This grant will enable us to continue our research, which is vital in the current digital age, where AI is increasingly being integrated into both everyday life and the malicious activities of extremist groups.

Final Thoughts

Attending the TASM conference was an invaluable experience, offering not only the chance to share my research but also to engage with leading experts and practitioners in the field. The conference underscored the importance of interdisciplinary collaboration in tackling the complex issues surrounding terrorism and social media. The insights gained and connections made will undoubtedly inform and enhance the work we do at CTS.

As we move forward, the discussions and ideas from TASM 2024 will play a crucial role in shaping our approach to these challenges. I look forward to continuing our efforts to develop responsible and effective strategies for addressing the misuse of technology by extremist groups, with the support and collaboration of our new partners.