Time to trade AI power for responsibility
August 2018
Our lives and outlook, and those of our children, are fast being shaped by digital. These changes are unplanned, largely unregulated and already happening. This, warns Adam Thilthorpe, leaves us reliant on the ethical fortitude of developers. He says we urgently need a clear ethical framework for digital innovation.
In a digital world of ubiquitous technology, our lives are increasingly shaped by unintended consequences. Trying to get a measure of this, or to win back some semblance of control over our own lives, is proving not only difficult but a challenge that has everything to do with the very real issue of ethics in IT.
Unintended consequences in our digital world shape our physical reality. When Mark Zuckerberg and friends were kicking around the original ideas for Facebook, they just had in mind a book of pictures of the students at Harvard: literally, a face book. Today, Facebook has grown to be one of the largest corporations in the world and, it is alleged, has been used to undermine the world’s largest democracy.
We’re now raising a generation who won’t recognise a world without communal artificial intelligence. Whether it’s Apple’s Siri or Amazon’s Alexa, parents are being confronted by AI that disrupts the natural ‘call and response’ of learnt conversation in the home, to such an extent that we ask whether it’s still appropriate to teach children to say please and thank you.
Or is the opposite true? It is said that true digital natives can clearly distinguish between human interaction, simple voice recognition and even natural language understanding. But do we really believe that?
It’s not just about being polite. According to an NSPCC/Children’s Commissioner report, 40% of 11-year-olds ‘sext’, and half of 11-16-year-olds report seeing online pornography. How can that be good for the future of human interpersonal relationships?
What role do all of us, as parents, educators and regulators, have to play?
We’re seeing the daily use of biometrics at our borders and in our courts. Police forces are experimenting with AI software that can interpret images, match faces and analyse patterns of communication, all with the aim of speeding up the examination of mobiles.
These are not planned changes; they are in use, here, now. Do you remember being asked whether you wanted, let alone consented to, these incremental but important changes to the way we conduct our lives? No, me neither.
Yet step by technical step, we are seeing a change to the fundamental relationship between citizen and state. Instead of being presumed innocent, are we now simply all unconvicted people?
As our technologies move on, public policy, legislation and our regulators inevitably lag far behind. Nowhere was that more starkly evident than when Mark Zuckerberg appeared in front of a US Senate committee in the spring. As one commentator put it, ‘part of the problem was the clear ignorance, if not befuddlement in the face of technology displayed by most senators, many of whom are of a ripe vintage’.
So where does that leave us? Sadly, at the mercy of the ethical fortitude of those developers, designers, coders and makers who are forging ahead in this digital age, if not at our behest, then certainly with our enthusiasm for greater integration and insight.
Let’s face it, what’s more useful: online ads for a bulk buy of nappies that I’ll never click, or ads for the new road bike I’ve been promising myself?
These developers, designers, coders and makers are the very people who need to understand not only the intentions and motivations behind these technologies but also, importantly, their potential for unintended consequences. IT people must be great sociologists.
The chances are that, if you’re reading this, you know some or all of this already. You’re in the know, and you’ll already have your own opinions about the various issues I’ve raised. That’s what I’d expect.
But the big question, for me, is how do those of us who work in, or at the edges of, some of this technology raise these big, difficult questions with politicians, with civil society leaders and with the public at large? Whose role is it to ensure that the magnitude and complexity of the world being created around us is understood?
The US tech giants? Not a great track record so far. Our own governments and regulators, perhaps. What about our national news media?
For me, it’s simple. We need those who work in the sector, who are developing these technologies, to understand that they owe it to their families, and to society at large, to develop within an ethical framework.
With great power comes massive responsibility.
Adam will be talking at DigitalAgenda’s Power & Responsibility Summit at London’s British Library on 4 October.