We publish a Parish Magazine four times a year, and each issue includes a letter from the vicar or a member of the Ministry team.
Letter from the Vicarage
While recently reading a compelling new biography of Martin Luther, I was reminded of just how much time and effort was spent in the late medieval world worrying about one's relationship with God, with death, judgement and hell very much part of the equation. The post-Enlightenment world brought a seismic shift: people began worrying less about God and more about their relationships with one another. In the political sphere - it was an age of revolution - the old order collapsed; at the personal level, agony columns filled with angst about psychotherapy and sex. Novelists have used their characters to chart these changes through careful and compelling stories. More recently still, non-human animals became the focus of attention, with some primates even being afforded the equivalent of human rights. This has led to changing attitudes towards eating, farming, hunting, and even zoos; some of that conversation can be acrimonious and even quite violent. In some quarters it has seemed acceptable for people to consider harming or injuring other people in defence of non-humans.
A new conversation is now taking place as we consider how to relate to non-humans in the shape of robots with artificial intelligence. In most supermarkets, it is possible to go through the store and on through the checkout quite legitimately without interacting with another human person. Google is experimenting with driverless cars and lorries; within a generation, it is said, the latter will become commonplace for long-distance haulage. It has, of course, been possible for some time to travel on driverless tram and rail systems and to think nothing of it. Already, in universities where robotics is a flourishing academic discipline, we hear stories of a generation of 'sentient' robots that can learn, adapt and take decisions. This is no longer the preserve of sci-fi films and novels. These sentient robots will, we are assured, work for us, alongside us, assist us and interact with us, not only in defence and transport but in the oil and gas industries, health care, renewable energies and space exploration, to name but a few. What this will mean for meaningful employment for most citizens is difficult to envisage at present. What we do know is that it raises moral and practical issues about the right to work, the dignity of labour, and so on. Full employment could become nigh-on impossible; that poses a serious challenge to societies as well as individuals, given that employment has become so intertwined with identity and personal fulfilment.
Another challenging arena is so-called lethal autonomous weapon systems. Drones programmed and guided from airbases within the United States and the United Kingdom have already been operational in combat zones. The United States is probably not the only government funding research programmes that will create autonomous weapons: fast, lightweight autonomous craft - tiny rotorcraft that will move at high speed through a building, designed to eliminate every person they meet - or aerial vehicles designed to carry out strike missions when enemy signal-jamming makes communication with, and so control by, a human commander impossible.
The moral issues are relatively easy to see. The 1949 Geneva Convention requires any attack to satisfy three criteria: military necessity, discrimination between combatants and non-combatants, and proportionality between the value of the military objective and the potential for collateral damage. It would be fair to say that these criteria have not always, or even often, been satisfied when humans are making the decisions. How, we might ask, would autonomous weapon systems judge for themselves on the last two criteria?
With ethically more demanding relations with non-human animals on one side, and on the other the risk for humans of being supplanted by autonomous robots, there will be plenty for academics in the disciplines of anthropology, IT and computer studies, philosophy and theology to research; and plenty for our elected politicians to think about in the coming decades.
For some, this explosion of artificial intelligence poses a threat to humans real enough that some scientists are advocating precautionary measures. Should governments create Robotics Commissions to monitor and regulate developments so that we do not innovate irresponsibly?
While concerns mostly centre on economics, government, and ethics, there is also a spiritual dimension to what we are making and doing. If we create other things that think and act for themselves, then we raise serious theological issues that will require careful reflection. History lends credibility to this prediction. Galileo's promotion of heliocentrism in the 1600s famously challenged traditional Christian interpretations of certain Bible passages, which seemed to teach that the earth was the centre of the universe. When Charles Darwin popularised the theory of natural selection in the 1800s, it challenged traditional Christian beliefs about the origins of life. The trend has continued with modern genetics.
Churches and their more traditional theologians do not have a particularly good track record in engaging in conversations like this, because it is easier to revisit old questions than to focus on new ones. Yet one might argue that any non-biological, non-human intelligence will present a greater challenge to religion and human philosophy than anything else we have so far encountered. Despite the challenges, artificial intelligence need not undermine faith. Part of the point of religion is to recognise that I, as a contingent human person, am not God, and so I do not have all the answers and will inevitably be wrong about things. This simply confirms what a conflicted Augustinian friar, Martin Luther, discovered: that life is about trusting God and not trusting in our own understanding, which is invariably flawed and partial.
With best wishes to you all,
Nicholas P Anderson