Data is the oil of the digital age. It powers our economies, underpins e-government services and, as we’ve seen from the Facebook/Cambridge Analytica scandal, has become a valuable commodity in its own right.
Ensuring data flows unhindered is fundamental to our businesses and institutions, but its value means it has to be protected. This presents something of a dilemma for organisations looking to transform.
Transformation aims to open up processes, remove data silos and make data more fluid and freely available. Yet regulations are becoming more stringent than ever, with the likes of the GDPR (General Data Protection Regulation) far more prescriptive about data handling than the DPA (Data Protection Act) that preceded it.
Transformation projects must therefore factor in data security while at the same time sharing data more widely; that means putting in place measures and controls that secure clear consent and protect data from the moment it is created, throughout its lifecycle, to the moment it is redacted or destroyed.
Even before the Facebook/Cambridge Analytica furore, concerns over data security had been dominating the political agenda for some time. For instance, the Safe Harbour agreement with the US was struck down in 2015 and replaced by the Privacy Shield in 2016, following concerns over the governance of EU citizens’ data once it was sent offshore.
The Privacy Shield seems to have pacified some government agencies, with the likes of the NHS agreeing to the storage of patient data overseas back in January (patient data can now be stored in the EU, in countries “deemed adequate” by the EU, and in the US by cloud service providers compliant with Privacy Shield). But concerns remain over the regulation of the standard, with no ombudsman in place, and some departments still insist on data sovereignty, i.e. UK data held in UK-based cloud data centres.
In banking, the revised Payment Services Directive (PSD2) will enable third-party providers not just to access data but to analyse it, recommend services and even initiate payments using APIs. Data that previously sat in walled gardens will become accessible to the market, creating new challenges for the banks both legally and technically: any compromise of that customer data will see the user’s bank deemed liable, so the pressure is on for the incumbents to put in place robust security measures.
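To make that model concrete, the Python sketch below shows roughly how a third-party provider might call an open-banking-style API once a customer has granted consent. The endpoint, headers and token handling are illustrative assumptions rather than the actual PSD2/Open Banking specification.

```python
# Illustrative only: the endpoint and headers are assumptions,
# not the real PSD2/Open Banking API.
import requests

BANK_API = "https://api.examplebank.com/open-banking/v1"  # hypothetical base URL

def get_account_balances(consent_token: str) -> dict:
    """Fetch balances on behalf of a customer who has granted explicit consent.

    The bank is expected to validate the consent token (scope, expiry,
    revocation) before releasing any data, and every request is attributable
    to the registered third party.
    """
    response = requests.get(
        f"{BANK_API}/accounts/balances",
        headers={
            "Authorization": f"Bearer {consent_token}",   # token issued when the user consented
            "x-fapi-interaction-id": "d1f6a8e2-example",  # request traceability (assumed header)
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```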
The regulatory authorities will of course attempt to restrict access to this data – processes will be prescribed, third parties will need to gain approval, and explicit user consent will need to be granted – and we’re by no means there yet. The associated Regulatory Technical Standards (RTS) on Strong Customer Authentication (SCA) and secure communication are currently in draft form and are due to come into force 18 months after they are finalised by the European Banking Authority (EBA) and the European Commission, i.e. towards the end of 2019.
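SCA, in broad terms, requires at least two independent authentication factors before a customer is granted access or a payment is approved. The sketch below is a minimal illustration of that idea – a password check combined with a time-based one-time code from the third-party pyotp library – and is not the RTS itself; credential storage, rate limiting and device binding are all omitted.

```python
# Minimal two-factor check in the spirit of SCA: something the user knows
# (password) plus something they possess (a TOTP device). Illustrative only.
import hashlib
import hmac
import pyotp

def verify_password(supplied: str, stored_hash: str) -> bool:
    # Illustration only: a real system would use a slow, salted KDF (e.g. bcrypt or argon2).
    return hmac.compare_digest(hashlib.sha256(supplied.encode()).hexdigest(), stored_hash)

def strong_customer_authentication(supplied_password: str, stored_hash: str,
                                   totp_secret: str, totp_code: str) -> bool:
    """Grant access only when both independent factors check out."""
    knowledge_factor = verify_password(supplied_password, stored_hash)
    possession_factor = pyotp.TOTP(totp_secret).verify(totp_code)
    return knowledge_factor and possession_factor
```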
Movement in both of these sectors reveals how important data sharing has become, and why other sectors will need to embrace open data access. Users now expect their data to travel with them, whether they are using public or private services.
Regulations such as Privacy Shield and PSD2 are indicative of a growing appetite for data sharing as governments seek to move services to the Cloud for efficiency and cost gains and competition authorities seek to open up the market.
Yet, at the same time, GDPR compels those same organisations to tighten risk assessment, data handling processes and data storage and retrieval mechanisms.
What we have, then, is some forewarning of how data handling is likely to evolve. The needs of third parties will have to be accommodated, and organisations will need to put in place procedures for secure access when embarking upon transformation projects. Transparency will be key to demonstrating compliance, particularly with respect to user consent, and remedial processes will also be needed in the event of a breach of the GDPR, to quickly alert users and reduce reputational damage and liability.
Organisations should adopt a ‘security by default’ stance, with data protection built into the architecture of the infrastructure.
In particular, dual permissions-based access is a must, ensuring that data can only be accessed by specific teams or individuals and preventing unauthorised access or manipulation of data. This can be achieved by housing data on a dual architecture that partitions customer records, with data on both sides encrypted using a managed key encryption service, so that only the organisations that need direct access can obtain it.
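As a rough illustration of that pattern, the Python sketch below partitions records, encrypts each partition with its own data key and only releases plaintext to teams holding the matching permission. The in-memory “key service” and the permission map are stand-ins for a managed KMS and a real role model, i.e. assumptions for the purpose of the example.

```python
# Minimal sketch of permission-gated, per-partition encryption. The in-memory
# KEY_SERVICE stands in for a managed key service; a real deployment would use a KMS.
from cryptography.fernet import Fernet

# One data key per partition, held by the (simulated) managed key service.
KEY_SERVICE = {
    "customer_records": Fernet.generate_key(),
    "payment_data": Fernet.generate_key(),
}

# Which teams may access which partition (illustrative roles).
PERMISSIONS = {
    "support_team": {"customer_records"},
    "payments_team": {"payment_data"},
}

def encrypt(partition: str, plaintext: bytes) -> bytes:
    """Encrypt a record under the data key for its partition."""
    return Fernet(KEY_SERVICE[partition]).encrypt(plaintext)

def read(team: str, partition: str, ciphertext: bytes) -> bytes:
    """Decrypt only if the team has been granted access to that partition."""
    if partition not in PERMISSIONS.get(team, set()):
        raise PermissionError(f"{team} may not access {partition}")
    return Fernet(KEY_SERVICE[partition]).decrypt(ciphertext)

if __name__ == "__main__":
    token = encrypt("customer_records", b"name=Jane Doe")
    print(read("support_team", "customer_records", token))    # permitted
    # read("payments_team", "customer_records", token)        # raises PermissionError
```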
With both processes and data access mechanisms in place, organisations that embrace these changes today will have futureproofed their transformation. They will be able to roll out future services safe in the knowledge that, regardless of changes in the market, they can provision secure access that prioritises data integrity.