It is 2019, and many women will experience a ‘two-week wait’ for results from their breast screening. One in 7 UK females will be diagnosed with breast cancer in their lifetime, and the risk of developing cancer depends on many factors, including age and genetics.
Fast forward to 2029: an NHS predictive model could flag someone as being at heightened risk of breast cancer. An algorithm takes seconds to compare their mammogram against millions of NHS scans, and a doctor can share the results there and then.
The next step is to develop a tailored intervention and monitoring plan to make sure your daughter never has to experience what you did, while also giving her the advice she needs in a format that makes sense to her and respects her autonomy.
You see, in this future, even though you may not all be treated by a doctor, we will have cut through the hype of AI and worked out how to deploy it so that it delivers the outcomes that the healthcare system, and all the people of the UK who trust and rely on it, want. We will have done this while ensuring that the values of the NHS are maintained, that patients are treated with respect and, above all, that they are kept safe.
However, we know that we could go very wrong on this journey, especially if we do not tackle issues such as transparency, accountability, liability, explicability, fairness, justice and bias.
Because the NHS is for the people and we have a duty to improve their lives, we must create an ecosystem for the development of technologies that will deliver on this data-driven future.
Our use of data-driven tech should be non-maleficent and beneficent. It should respect people and human rights, enable participation, and be accountable for its decisions.
It needs to keep society in the loop, not just the human in the loop.
In other words, we need to create an ecosystem for the safe and ethical development, deployment and use of data-driven technology in which all the constituent parts of the system feel responsible for upholding the values of our NHS. That means the policymakers, regulators, commissioners, healthcare providers, tech vendors, researchers, carers, insurers and data controllers working together with distributed responsibility.
Responsibility for safe tech is distributed across the system
Our Code of Conduct for data-driven tech has now been revised with input from as many people as possible. It sets out a series of gold-standard principles we expect from those providing technology for use in the NHS, so that we do not have to accept technology that is harmful, or anything less than exemplary.
The Code of Conduct is just one of many foundational building blocks that we will have to lay down in the coming years, and we will need to get much clearer about what ‘good’ looks like for principles 7 and 10.
But we cannot make this progress on our own. We will need to work with our regulators, with innovators, with patients, with commissioners, with policymakers and with those on the frontline to make sure we embed the values that matter to all voices in the NHS from the very beginning.
And by this we really mean all voices, including those of individuals who do not want to, or cannot, interact with the NHS digitally.
We need to balance innovation and regulation for the best outcomes
Importantly, this is not about creating barriers to innovation. We love innovation, and we know that over-regulating the tech market can stop good things from happening. That is an outcome we want to avoid at all costs.
But we strongly believe that developers who produce ethical and responsible technology have a competitive advantage. If we aggregate this and make the UK the best place to ‘do’ responsible data-driven health and care tech, it will be a competitive advantage for our economy and society as a whole. It’s about setting clear boundaries that we will not breach, whilst letting technology flourish in the middle.
We are excited about being on this journey, especially as we launch NHSX, and we hope you’ll come along with us.