Whenever I’m asked, “what does digital mean?” I turn to Tom Loosemore’s pioneering definition: “Applying the culture, processes, business models and technologies of the internet era to respond to people’s raised expectations.”
In my view this is the closest anyone has got to capturing the fact that digital is not just about technology: it requires a mindset change, a different approach to processes and ways of working and, above all, an understanding of the end user.
But to what extent have we seen this applied in government over the past few years?
The launch of the Government Digital Service (GDS) in 2011 sparked a whole movement that radically changed the way central government approached public service design and delivery.
At the time I was responsible for digital at the Department of Health and, along with GDS and my Whitehall “Digital Leader” peers, worked on the first iteration of digital by default government — although in health and care we insisted on the term “digital first” instead.
It may have taken 10 years and a global pandemic for some of the ideas we proposed to become reality (better late than never…), but despite GDS’s best efforts, people in government continued to use the terms digital, technology and IT almost interchangeably.
When the Technology profession was created alongside the Digital profession, the tensions on and off stage became all too apparent. The traditional technologists saw the new digital people as arrogant upstarts, focusing on the front end and ignoring the importance of the boxes and wires behind the scenes.
Thankfully this did begin to change… Now we have the Digital, Data and Technology (DDaT) profession, which brings these two sides together; and the creation of the Central Digital and Data Office (CDDO) for Government, alongside a GDS with a renewed mission and mandate, will, I hope, provide further clarity and focus.
GDS started a decade ago, and since then the pace of digital development has transformed people’s expectations – to use Tom’s term – and continues to do so at pace. Smartphones have become pocket personal assistants from which we expect to manage almost every aspect of our lives. We shop, bank, socialise, organise, learn, take photos, order food and perform myriad other everyday activities.
But how digital are we really? Is using a plethora of apps enough? Have government processes and services been truly transformed and redesigned from a digital perspective using all the tools, techniques and technologies available? And is digital still about responding to people’s raised expectations? Or are those expectations raised instead by the endless possibilities that digital offers? In my view it’s now firmly the latter.
At any point in time we see ourselves, by definition, as being at the state of the art. But there is always further to go, and digital is no exception. One commonly held belief, particularly since the pandemic, is that everyone now uses digital technology. However, the pandemic has starkly demonstrated how quickly inequalities can escalate when a different way of living takes hold.
Our lockdown-informed dependence on digital technology, far from reducing the very real digital divide, has actually deepened it. And this divide is by no means purely a generational issue. More older people than ever before are embracing digital technology and the binary distinction between young digital natives and old digital dinosaurs no longer holds true.
Instead, income and socioeconomic status are now the greater determinants of digital inclusion and exclusion, with people on low incomes of any age much less likely to be digitally active than those more comfortably off.
From a social and economic point of view, digital, rather than being the equaliser and utility for all envisaged at the birth of the World Wide Web, has become another barrier between the haves and have-nots, with significant negative impacts on health outcomes, educational attainment and social mobility.
This is not to deny the progress made with digital capability during the past couple of years. The pandemic created an extraordinary explosion in digital adoption across the UK, catalysing massive change. According to the 2021 Lloyds Bank Consumer Digital Index, we have made five years’ progress in just one. Working, learning and socialising all moved online, with workplaces, schools and private citizens having to adapt literally overnight.
But this speed of adoption also meant an increase in unintended consequences. Security breaches in online meetings, a disproportionate rise in online scams and fraud, a lack of preparedness by organisations for staff working from home, and increased mental health problems and social isolation are all legacies of the pandemic that we need to address urgently with new protocols, skills and education.
Businesses, organisations, schools and universities need to proactively manage online working and learning, ensuring the right safeguards, policies and training are in place. We had no choice but to adapt when Covid arrived. Now we have to ensure we’ve adapted in the right way.
As digital adoption has grown, so too has the fear that all humans will be automated out of the workplace. This is not only pessimistic but also unrealistic. Our machine learning and artificial intelligence capabilities are currently not as advanced as some may think. Rather than robots replacing humans, we are starting to understand that we need the extraordinary processing power of computers working alongside — not instead of — uniquely human skills like empathy, compassion and critical judgement. By combining the two we get far more than the sum of each part.
For example, we know that machines can analyse medical test results faster and more accurately than even the most highly trained humans, but what they can’t do is provide the human reassurance that people want when it comes to their health. Automation can free staff from repetitive tasks that machines complete more accurately, allowing humans to do what only they can do. It is that symbiotic relationship that we need to develop, and not just in health and care.
But for this to succeed we need to have adequate retraining opportunities for those whose jobs may well be automated. What are we doing to increase their suitability for working with and alongside machines? If the UK is to remain productive, we need to address this urgently.
Retraining and upskilling is a cross-sector responsibility that requires government funding, but also the cooperation and support of the private and not-for-profit sectors. This is a societal shift and everyone must play their part.
Covid-19 has shown that we can make leaps and bounds in our adoption of digital technology when we have to, but unless this is complemented by proper planning, training and education we may create almost as many problems as our increased digitisation solves.
I was encouraged to see the recommendations in TPXimpact’s Transforming Government report, which highlights crucial focus areas for government as it responds to the challenging times ahead.
For me, investing in intelligent automation alongside providing retraining and upskilling opportunities, tackling online harms and prioritising social impact by narrowing the digital divide are particularly key.
To return to the definition of digital: applying the culture, processes, business models and technologies of the internet era is now essential not only for reactively responding to people’s raised expectations, but also for proactively designing, delivering and managing public services.
I hope government is listening.