Our ethics help us navigate what’s right and wrong in the work we do, the decisions we make and the expectations we have of the institutions that affect our lives. To hold themselves accountable, most sectors follow codes of ethics, from medicine and bioscience to journalism and government. Across the public sector in particular, data is emerging as a key focus for innovation, and that brings data ethics into focus for everyone working with data: how should data be collected, shared and used?
Over the last decade we’ve seen a rise in the use of data, affecting not only organisations but our communities too. New technologies and mechanisms for collecting data are increasingly a part of our lives, from machine learning to robotics, automated services to smart devices. This exciting innovation brings a vast range of benefits, from managing health treatments to more easily navigating the world.
As with all forms of innovation, we need to consider the impact on society. By putting ethical practices in place we can build trust and shared ways of working across industries that are truly designed with people at their heart. Defining these ethics also encourages responsible innovation and stronger diversity in the data space.
As issues of monetisation of personal data, bias created by data sources and the impact of under-representation in data become common, there’s a growing need to provide ethics training widely as a part of enhancing data literacy.
For example, we know that collecting and sharing data only about certain groups of people will disadvantage others. After all, how can you design an urban space for everyone when the only data collected is about how white middle-class male business owners use local amenities? The under-representation of everyone else living in society is a huge problem.
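To make this concrete, here is a minimal sketch of one way under-representation can be surfaced: comparing who appears in a dataset against the population it is meant to describe. The group names and figures are entirely hypothetical, and real demographic analysis is far more involved; this only illustrates the basic idea.

```python
# Hypothetical example: compare who appears in a survey dataset
# against the population it is meant to represent.

# Share of each group in the local population (assumed figures).
population_share = {"group_a": 0.48, "group_b": 0.32, "group_c": 0.20}

# Respondents actually captured by the data collection.
survey_counts = {"group_a": 870, "group_b": 90, "group_c": 40}

total = sum(survey_counts.values())

for group, pop_share in population_share.items():
    sample_share = survey_counts[group] / total
    # Flag groups represented at less than half their population share.
    if sample_share < 0.5 * pop_share:
        print(f"{group} is under-represented: "
              f"{sample_share:.0%} of sample vs {pop_share:.0%} of population")
```

A check like this won’t fix a biased collection process, but it can at least make the gap visible before decisions are built on the data.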
But it doesn’t stop there. Ethics issues must also be considered in the collection and use of non-personal data. We’ve seen this in the problems caused by not publishing the location of bus stops in poorer neighbourhoods. This means that the benefits of smartphone map apps may not be available to people who live in those areas, once again increasing existing inequalities.
These examples are just two ways in which data ethics can affect ordinary people every day, so it’s important to make sure we’re doing everything we can to prevent negative effects. We need to take a continuous improvement and development approach to data ethics and bring communities together from diverse backgrounds and disciplines including developers, users, customers and citizens. We don’t want data ethics to be a barrier – we want it to be an enabler to designing services, technologies and places in the best way possible.
While many ethics frameworks are available or in development, they often include similar guidance. These points usually focus on making sure that data is collected fairly and for a specific purpose. The frameworks also aim to make sure organisations managing personal data are open about their processes, policies and uses.
Ethics frameworks need to be developed that put the best interests of people and society at their core. We believe frameworks should also cover areas not involving personal data, including guidance on data handling, such as its collection, sharing and use in designing models and algorithms. They should also consider issues beyond privacy and user control, such as bias in design and practices that reinforce inequalities or stereotypes.
To help us think seriously about data ethics, organisations must work in the open and share examples that we can discuss and develop as we engage with real people. Good examples show us that ethical problems aren’t simple: they’re multi-faceted, and often there’s no single right answer.
We’re hoping to see data ethics continue to develop as we discuss opinions and points of view of diverse communities of developers, users, customers and citizens, and we’re always happy to have a conversation with you about using data in responsible, fair and empowering ways.
Originally posted here