The dystopian tale has a special place in our shared cultural heritage.
Many of us will have a favourite, or perhaps several. I myself adored 1984 and The Handmaid's Tale as a youngster; moved on to JG Ballard, then discovered Philip K. Dick thanks to Minority Report; and in recent years was floored by Black Mirror episodes and videogames such as The Last of Us.
The thrill can be explained by one question: 'What if this horror was actually happening?'
"People say Black Mirror and The Handmaid's Tale are conspiracies, science fiction – but as a philosopher, I can see a lot of the elements in these films and books that are actually happening now," Marie Oldfield, AI ethics consultant to the UK government and other organisations, tells BusinessCloud.
"People are indoctrinated and manipulated by social media… it's very difficult to see that it's happening because it's done at such a low level.

"There are a lot of issues, but it's easier to ignore them and get on with your life… it's going to take one little step to tip that balance and then all of a sudden you're in an episode of Black Mirror."
Amazon Alexa
Oldfield is the founding director of Oldfield Consultancy, which specialises in analytical modelling and ethical artificial intelligence. She has led research in areas such as anthropomorphism and dehumanisation in AI and cyberspace, as well as pedagogy and modelling best practice.
Her aim is to drive greater discussion about the ethical implementation of AI technology in both academia and industry, in order to protect against negative effects on society as a whole.
"Already, in a passive manner, [Amazon's 'smart' assistant] Alexa is having negative consequences," she claimed, speaking to BusinessCloud at the Digital Transformation Expo (DTX) in Manchester. "It is harvesting data, then using it across different platforms to sell to people; and also using techniques to manipulate people into buying things that they may not necessarily need.
"If this then becomes a technology that is increasingly proactive – such as telling you when you should go to bed – my question is: what data is it collecting from your daily life [to arrive at this recommendation] and how is it collecting it?

"Also, why are you giving up control of your life to a machine? This whole phenomenon of dehumanisation – where we're not only devaluing ourselves and other people, but letting machines take control of aspects of our life – is worrying.

"As humans, surely we strive for independence and freedom and decision making… we're now starting to give that away because of the attachments that we're building with this technology, which are not necessarily appropriate, but nevertheless happening.

"It's starting to become a very blurred line between what is technology and what is human."
She adds: "If you're at a point where Alexa is dictating your entire life, you do get to a point, philosophically speaking, where you ask: why are we alive? What are we here to do? How can we fulfil our ambitions and our desires? And how can we have a fulfilling life if actually we're being controlled by the government, social media or technology?"
Elderly carers
Robots are even being deployed in UK care homes to keep elderly people company. This is completely inappropriate and perhaps dangerous, Oldfield suggests.
"If a robot has human features and talks, people may think that it has a mind of its own, that it can make goals for itself and build a relationship with them," she says.

"When the technology doesn't deliver on that, the person can become really angry and upset: they're just not sure what relationship they're supposed to have with it.

"There are enough humans in the world – why are we content to abandon elderly people? How can you build a society where having fruitful interactions with each other as humans is neglected and replaced with a robot which looks a bit like a dog or a human – and people think that's fine?

"I fail to see how that's fine, because that person is obviously lonely or socially isolated."

She adds: "When you have people that are vulnerable in that way, if you have something like a proactive Alexa, it is very easy for it to start selling to and manipulating that person. That's been seen in numerous studies.

"Do you want to leave your grandma or grandad in a room with these objects? There's no assurance or control over that interaction at all."
Children
She is also concerned that children are not equipped with an understanding of these technologies. "A child can become attached in a way that is not necessarily positive: they think that the technology is a human – like a parent or teacher – and if it tells them to do something, they'll do it.
"They're also not getting what they need from that relationship, so they can start to abuse the technology: there are examples of robots being beaten up or Alexas thrown across rooms.

"There is a wide range of emotional reactions that a child could have to the technology."
Oldfield is a fellow and executive board member of the Institute of Science and Technology, as well as an expert fellow for SPRITE+ and a member of the College of Peer Reviewers for REPHRAIN.
She sees a creeping need to take our phones everywhere – for example, to board planes and access sports stadiums – which is moving us closer to a nightmare Minority Report scenario, one that could even lead to crime-predicting pre-cogs.
"How do you operate in a world where you're forced to be a digital citizen? If you're visible, you're therefore controllable," she explains. "Where we are at the minute is a very developing situation, and it's fluctuating quite substantially.

"If governments can get rid of cash, you're more easily trackable. If you've got to have your phone with you at all times, and if you start to connect databases, everybody's constantly being tracked. Where are you going? How are you getting there? When were you there?

"All of a sudden, every single part of your life is in a database. What can then happen is – and these are already in development – algorithms to predict whether you will commit crime."