Christians join faith leaders in raising ethical concerns about artificial intelligence
Christian leaders in the fields of science, technology and theology have added their names to an open letter calling for faith and belief communities to be part of the national and international discussion around Artificial Intelligence.
The letter to science minister Michelle Donelan has been signed by Graham Budd, executive director of The Faraday Institute for Science and Religion, Rev Prof Philip McCormack, Principal of the Centre for Digital Theology at Spurgeon's College, London, and Nathan Mladin, senior researcher at the Theos think tank, among others.
They were among the 30 faith and belief leaders who took part in a meeting supported by the Abu Dhabi Forum for Peace and the Vatican's Pontifical Academy for Life, and hosted by Google. The meeting, which took place ahead of the UK's AI Safety Summit last week, was chaired by former home secretary Sajid Javid.
At their next meeting planned for December, they will together launch a new UK-based AI for Faith and Civil Society Commission "with the aim of harnessing the opportunities of Artificial Intelligence for human flourishing while protecting communities from potential harm".
Writing to Donelan after their meeting and ahead of the summit, they said that faith and belief communities, along with civil society, have an important role to play in helping to develop an ethical framework for AI.
They warned that the implications of AI raise "significant ethical and arguably existential questions that demand our collective attention", and that there is "clearly a risk ... that short-term commercial and economic interests will outweigh long-term social and ethical concerns if we do not find ways to engage a wide range of religious and cultural perspectives".
Admitting they "may not be experts on AI", they nonetheless argue that it is "imperative" that the "diverse set of concerns, viewpoints and recommendations" from faith, belief and civil society leaders "are given due consideration".
"It is our shared belief that, in addition to bridging the gap between technological experts and the broader public, faith or belief and civil society organisations serve as critical watchdogs holding both AI developers and policymakers accountable," they said.
"We also discussed how faith or belief and civil society groups are often the first to identify harms which may affect specific communities."
The letter makes a number of recommendations, including a call to ensure accountability and the development of ethical guidelines, as well as efforts towards "closing the AI literacy gap".
"There is a lack of AI literacy across civil society sectors and especially among faith or belief communities," they write.
"This must be acknowledged and urgently addressed to avoid widening the technology gap and preserve people's right to make informed choices about AI risks."