Who Gets to Understand AI? The Literacy Gap that Could Shape the UK's AI Future

Written by Dawn McAra-Hunter, Programme Manager, Scottish AI Alliance

Artificial Intelligence (AI) has been inescapable over the past few years. No longer a futuristic concept confined to faceless research labs, or the science fiction fantasy that dominated the popular culture of previous decades, AI is now an increasingly unavoidable part of everyday life in the UK, shaping decision-making, influencing the job market, and filtering the information we interact with daily. But as AI is implemented across an ever-expanding range of society, the ability to understand, question, or confidently engage with it is not evenly distributed.

 

AI is Expanding, But Understanding is Not

We tend to talk about AI as though it is something that happens to people, rather than something shaped by people. By positioning AI as mysterious and technologically incomprehensible, we weave a problematic narrative that places AI beyond the reach of everyday citizens. This sense of distance is something I see throughout my research: metaphors of black boxes, sweeping transformation, and "the age of AI" shroud AI in mystery and hype, making it untouchable. More concerning still is the seeping of this rhetoric into public policy, public sector administration, and, well, all aspects of public life.

In reality, AI implementation in many areas of public life is still very much in its infancy. It is neither magical nor incomprehensible, but poorly explained (if explained at all), poorly understood, and poorly utilised. This is not an issue confined to the general public: a deficit of AI literacy cuts across all sectors and social strata. The confidence to critically interrogate claims of objectivity, efficiency, necessity, and transparency is often lacking where it is needed most.

AI literacy is becoming a civic skill, just as digital literacy was a decade or two ago. Those who possess it will have more agency, more opportunities, and more influence over how AI is used. Those who don't will increasingly find life happening around them rather than with them.

 

How AI Narratives Reinforce Inequality

How we talk about AI sends clear signals about who feels entitled to engage with it. AI discourse is not neutral; it reinforces inequalities in three key ways:

  • Highly technical language signals who “belongs” in the AI conversation

When we talk about AI in highly technical, abstract, or opaque terms, we signal to people that "this isn't for you". There is, of course, a place for technical discussion of AI: it's necessary to truly understand how AI works and to grasp the impact of emerging capabilities. But couching AI in highly technical terms disproportionately affects those with lower levels of digital confidence, people with limited access to learning, and those already marginalised in digital spaces.

  • Narratives of inevitability remove public agency

Inevitability narratives normalise resignation instead of empowerment. When we say "AI is coming whether we like it or not", we shut down debate about whether it should be coming. Inevitability undermines our agency.

  • Media narratives stereotype who uses or benefits from AI

Depictions of AI in the media, online, and even generated by AI systems themselves show AI as something created, controlled, and utilised by young, technically skilled men. Representation shapes participation.

These narratives shape not just public opinion about AI, but policy design, service implementation, and organisational culture. In real terms, these impact hiring decisions, research priorities, and who is invited to participate in discussions and decisions about AI and our future. 

As such, AI literacy has to be about more than explaining algorithms. It must also challenge narratives, empowering and activating the public to engage in the AI decision-making process, and reframing AI as something that belongs to all of us. 

 

Closing the Gap and Living with AI

In 2023, Scotland launched its national AI literacy programme, Living with AI. The programme has delivered AI literacy training to thousands of individual learners through dozens of public sector organisations, with supported options available for groups at risk of digital exclusion. These are groups historically excluded from conversations around digital transformation, and yet they are often the most impacted by these systems.

The main lesson we've learned is that making AI literacy free, publicly available, and a national priority isn't about turning everyone into a technical expert. It is about empowering people to become informed decision-makers around AI use, and it enables higher standards of transparency, security, and accountability to take hold. When you give people the tools to understand AI, they gain confidence. When they gain confidence, they ask better questions. And when they ask better questions, they become active participants in the development of our digital future.

 

Why UK leaders should care about AI literacy

AI literacy is not yet systematically embedded across UK digital strategy. We acknowledge its importance. It's included in frameworks. But it's not treated as vital digital infrastructure.

And yet it should be. 

Why?

  • Public trust in AI depends on better understanding of AI

A public that does not understand AI will not trust its use in public services, even when those systems are safe, fair, and well-governed. 

  • Workforce planning needs more than just technical reskilling

AI will change almost every job, but not in the same way. People need to understand how AI affects their work, what safe and proportionate use looks like, and what systems of accountability are in place. Literacy builds resilience in organisations.

  • Digital inclusion must evolve into AI inclusion

Until now, digital inclusion has focused primarily on connectivity and access to devices. While that work must continue, AI inclusion must also focus on agency: the ability to identify, use, question, and shape AI systems and their uses.

  • AI governance needs an informed and engaged public

Regulatory measures without core literacy create dependency, not empowerment. Democratic oversight of AI systems demands a public that understands what it is overseeing and why. AI literacy that is accessible, inclusive, and narrative-aware is the means by which participation can be broadened.

 

A Call to Action: Building AI Literacy into the Foundations

To create a future where AI implementation is trustworthy, ethical, and inclusive, UK leaders must champion AI literacy as a foundation stone of AI adoption. This looks like:

  • Treating AI literacy as critical national infrastructure, not an optional add-on.
  • Funding community-led AI literacy programmes, especially for groups facing digital exclusion.
  • Embedding ethical and critical AI literacy into workforce development, not just technical upskilling.
  • Embracing clear communication practices in AI deployment, avoiding opaque, sensationalist, or jargon-heavy content.
  • Supporting research into AI narratives and the ways in which public perception is shaped.
  • Recognising that AI literacy is the precursor to trust, and trust is a prerequisite for informed adoption. 

 

Conclusion

Asking "who gets to understand AI?" might sound like a purely philosophical question. On the contrary, it is a practical concern for all UK digital leaders, and one of profound importance as we consider the UK's digital future.

We generally accept that AI is not neutral. But understanding AI is not neutral either. The ability to understand and influence how AI systems are used and deployed will fundamentally shape the ways in which our society operates.

So, if we want an AI future that works for everyone, then understanding AI must be accessible to everyone. We cannot build an equitable digital society if we do not democratise, support, and champion AI literacy. 

This is our opportunity, and responsibility, as UK digital leaders as we head into 2026. 

Let’s make sure we take it. 

