Hands on a laptop keyboard outdoors.

Photo: Unsplash

She explores the relationship between security, social sustainability and AI

A new dissertation examines artificial intelligence (AI) from the perspective of how the technology both influences and is influenced by people, societies, and politics.

“I discuss both the opportunities and risks associated with AI, highlighting that AI systems are not neutral. They reflect the values and priorities of the environments in which they are developed”, says Irja Malmio, doctoral candidate in Systems Science for Defence and Security at the Swedish Defence University.

Artificial intelligence is rapidly transforming how we live, work, and communicate. But as AI becomes an increasingly integral part of society, new questions emerge, particularly regarding its impact on social justice and security.

Risks and opportunities of AI

"By analysing AI as a socio-technical system –a blend of technological and social factors – I explore both its promise and its perils”, she explains.

AI can help address major societal challenges, such as improving healthcare, reducing social inequalities, and contributing to climate action. At the same time, it can also exacerbate existing problems such as discrimination, polarisation, and intolerance. These types of negative effects are referred to in the study as socio-technical harms.

“For instance, biases in AI algorithms can reinforce discrimination, and AI-driven disinformation can deepen societal divides, ultimately threatening security. The technology itself is not neutral; it is shaped by the society in which it is developed”, says Malmio.

Social sustainability and security are interlinked

The dissertation demonstrates that social sustainability – justice, democracy, and human well-being – is closely tied to security. When AI is developed without regard to these values, it can increase insecurity.

"A society that is fair and inclusive is also more secure in the long run. But in practice, these goals often come into conflict, for example when surveillance is used to enhance security but simultaneously threatens individual freedoms”, says Malmio.

The research is based on four articles that examine AI from various theoretical and practical perspectives. It reveals how values, culture, and power dynamics influence the design and use of AI in real-world contexts.

Ethics must be central

One key conclusion is that ethical considerations must be placed at the heart of AI development. The technology should not only be efficient; it must also promote justice, democratic values, and respect for human rights. The study also underscores the importance of a holistic approach to AI regulation, incorporating both technical and social dimensions.

"It’s not just about what the technology can do, but about what kind of society we want to build with it”, says Malmio.

The way forward: Regulation with a holistic perspective

The research also provides a conceptual framework aimed at policymakers and societal planners.

“By understanding the links between AI, social sustainability and security, we can harness AI’s potential to build a more just and secure world, while minimising its risks. We need proactive measures to address socio-technical harms and ensure that AI serves as a force for societal good rather than becoming a source of division and insecurity”, she concludes.

Publication

Irja Malmio (2025): Imagining the impossible – Conflicting Norms and Values on Social Sustainability, Security, and Artificial Intelligence

Irja Malmio defended her PhD thesis at the Division of Risk Management and Civil Protection, Department of Civil and Environmental Technology, Lund University, on 21 May 2025.
