Introduction

Recent advances in generative and agentic AI (AI capable of multi-step reasoning and task completion) have attracted great interest from governments, society and industry, along with massive increases in AI research funding. While the prospects of huge gains in efficiency, safety and scientific breakthroughs are exciting, it is important to remain vigilant about the creeping societal impacts of these new applications.

How will AI impact religious beliefs and cultural resilience?

Attempts to understand the human psyche must grapple with deep religious and cultural dimensions that shape attitudes to family, the place of the individual in the community, authority, gender, time, work, death, the environment and nonhuman beings. How might emerging “generative ghosts” – AI-generated versions of deceased persons – affect religious narratives about the afterlife, or our experience of grieving? How might “Godbots” – chatbots trained on religious texts and speaking in the “voice of God” – affect religious beliefs and practices? There are already multiple cases of chatbots condoning violence in the voice of God or an intimate partner, and cases are emerging of individuals having life-altering conversations with AI, claiming to have “awakened” chatbots and accessed the secrets of the universe before descending into a “GPT psychosis”. It is therefore important to interrogate AI as we would any other source of knowledge: the context from which it arises, its intended purpose and its relationship with systems of power all need discussion. AI can emancipate, include and affirm, but it can also damage, exclude and deceive. Such theological questions need to be reflected on as AI increasingly impacts faith communities.

A major factor in AI safety is the tendency of chatbots to fabricate information: reportedly up to a third of the time, and “on one test they were as high as 79%”. These fabrications are sometimes called “hallucinations”, although this is a controversial term which can falsely imply that chatbots have an inner life, and the problem appears to be growing. While generative and agentic AI poses challenges for security and resilience, some progress has been made in AI safety. Although the UK does not have an AI Act like the EU, a sectoral approach to regulation is being led by the Department for Science, Innovation and Technology (DSIT), which is developing tools and resources for trustworthy AI.

Despite AI’s potentially negative impacts on religious beliefs and on personal and cultural resilience – risks of radicalisation, the justification of unjust power relations, disinformation and hallucinations – there are promising, if contested, advances: for example, in reinterpreting ancient and lost religious texts, engaging with young audiences, generating sermons, devotionals and hymns, and supporting personal resilience (e.g. the SanTO robot offering assistance and support to people). It is important not to depict AI in iconoclastic terms. Its capacity to enable emancipation, human flourishing, social inclusion and community resilience needs to be recognised, and its further development critically embraced but also watched with care. Amid this uncertainty around the societal impacts of AI on religion and cultural resilience, this SALIENT-funded project will begin to answer some of these burning questions.

SALIENT Hub - building a secure and resilient world

SALIENT is committed to “building a secure and resilient world”. In the context of ongoing global instability, SALIENT represents a significant five-year investment by UKRI in building the UK’s security and community resilience in an uncertain world. Its five work packages span Global Order, Technology, Supply Chains, Natural and Built Environments, and Behavioural and Cultural Resilience. The first funding round supported seven projects across these themes, including this one under the fifth theme, Behavioural and Cultural Resilience.

Our project asks what happens when AI meets religious faith, and what the implications are for personal and cultural resilience. Our conceptual starting point is a recognition that religion remains a cornerstone of personal and community resilience for many of the British public. Recent data suggest that the numbers of religious believers are growing not only in migrant and marginalised communities but also among white communities, and among young men in particular.

Religion, culture and resilience

Religion plays a key role in personal and public resilience. In times of trauma – the loss of a job or a loved one – religious narratives are key for many in overcoming personal challenges. Following public tragedies such as the Grenfell Tower disaster or terrorist attacks, religious leaders, symbols and institutions play a key role in uniting, consoling and facilitating grieving. Religion can enhance resilience: Jesus’ crucifixion and resurrection is the ultimate story of resilience and, particularly, of solidarity with those who are most excluded. This theme of resilience is common to the major world religions, from the Islamic concept of sabr, or patience, to the Jewish tradition of the Passover, the Hindu concepts of karma and dharma, and the Buddhist idea that suffering is universal, requiring mindfulness and detachment. Religion seeks to address the human condition.

As generative and agentic AI bumps up against religious beliefs and practices in unforeseen ways, new questions arise for people and communities of faith that our project will seek to understand. Will religious organisations adopt or reject AI in their life and practice? How might religions differ in their approaches? What pastoral implications arise for religious organisations as AI threatens to displace millions of jobs, and increasing amounts of public funds are being diverted to defence instead of social welfare?

This interdisciplinary project, combining security, resilience and theological approaches, will speak to 30-40 religious leaders across the six major faiths in the UK about AI and its impacts on religion and faith. Our goal, in line with SALIENT’s mission, is to anticipate the potentially profound disruptive effects of AI on religious communities and cultural resilience and, we hope, to offer some guidance as we navigate the uncharted waters of powerful AI technologies.

"Cultural Resilience, Religious Faith and the intersection of Generative and Agentic Artificial Intelligence".

- This research is supported by the SALIENT Hub at the University of Manchester, funded by the Arts and Humanities Research Council (AHRC), part of UK Research and Innovation [grant reference: AH/Y505316/1].

Dr Adam James Fenton

Assistant Professor, Centre for Peace and Security, Coventry University, UK

Dr Chris Shannahan

Associate Professor of Political Theology, Centre for Peace and Security, Coventry University, UK
