The new General Data Protection Regulation (GDPR) has been the subject of frenzied commentary since its announcement. It represents a much-needed attempt to drag data regulation kicking and screaming into the 21st century. Despite the relentless growth in the amount and variety of data that companies have been processing over the last decade or so, privacy regulations had not seen a significant shake-up since the adoption of the Data Protection Directive (DPD) in 1995. Change has been long overdue.

Formally passed on 14 April 2016, the GDPR is designed to better protect citizens’ data and harmonise legislation across the European Union. Despite the outcome of Britain’s referendum on EU membership, it will still apply to UK companies. With it comes a raft of new guidelines and requirements for controllers and processors of Personally Identifiable Information (PII), as well as an auditable assurance process that all companies controlling or processing PII will need to demonstrate. Data centre operators, and their customers, will undoubtedly have to change at least some of their data handling practices as a result.


Severe fines

With these new requirements come powerful sanctions: data handlers that fall foul of them can expect to be liable for severe fines of up to 4 per cent of global annual turnover or €20 million (roughly £15.8 million), whichever is the greater sum. The weight of these sanctions has added a sense of urgency, and businesses only have until May 2018, when the regulation takes effect, to implement the required changes. There is, however, a potential stumbling block in the path of compliance: the lack of a standard that specifies whether the Technical and Organisational Measures (TOMs) that an organisation has implemented can be deemed appropriate.

As it is, the GDPR neither cites any existing or future technical standards nor identifies a specific body to manage the process of accreditation. Instead, it uses terminology such as ‘appropriate’ and ‘state of the art’, leaving us with a complicated piece of legislation that prescribes no specific measures and leaves far too much to an individual organisation’s interpretation.

For example, the GDPR speaks frequently of ‘privacy by design’ without substantiating what it might actually look like in the day-to-day functioning of a company, apart from mentions of pseudonymisation (separating data from direct identifiers so that an individual cannot be identified without additional information that is held separately) and the adoption of appropriate staff policies.
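
The regulation does not mandate any particular technique for this, but as a rough illustration of the idea, the sketch below (in Python, with hypothetical names such as `pseudonymise` and `lookup_store`) shows one way direct identifiers might be swapped for random tokens, with the token-to-identity mapping held in a separate, separately secured store.

```python
import secrets

# Illustrative sketch only: direct identifiers (here, 'email') are replaced
# with opaque random tokens, and the token-to-identity mapping is kept in a
# separate store with its own access controls.

def pseudonymise(records, lookup_store):
    """Strip the 'email' identifier from each record, replacing it with a token."""
    pseudonymised = []
    for record in records:
        token = secrets.token_hex(16)          # random token, carries no identity data
        lookup_store[token] = record["email"]  # mapping held separately from the dataset
        safe_record = {k: v for k, v in record.items() if k != "email"}
        safe_record["subject_token"] = token
        pseudonymised.append(safe_record)
    return pseudonymised

if __name__ == "__main__":
    lookup = {}  # in practice this would live in a separately secured system
    data = [{"email": "alice@example.com", "purchase": "server rack"}]
    print(pseudonymise(data, lookup))
```
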

‘Privacy by design’ is intended to put the GDPR’s privacy measures at the heart of a company’s culture, ensuring that they run through the core of everything it does and are built into the way the company operates. However, without standards indicating what TOMs might be taken towards ‘privacy by design’, each organisation is left to decide for itself which measures constitute suitable steps. It is all too woolly and leaves a lot of room for slip-ups and omissions.


Why is a standard that is explicitly linked to TOMs important here? A standard substantiates changes on an organisational basis, giving companies something to aim for. TOMs make these changes concrete on an operational level, specifying the exact measures that would indicate compliance. A standard would also support accreditation, making it easier for end customer organisations to identify third parties that are fully compliant with the GDPR.

Between theory and practice

The implication is that at some point in the future, someone will be tasked with creating a standard that helps organisations understand the GDPR’s requirements in the context of TOMs. Ideally, this would define what needs to be done to demonstrate compliance with the standard and would support accreditation. This task is undoubtedly complex – not least because the industry and the threats we face are always evolving – and there is a risk that the creation of a standard could simply be left to policy makers and lawyers.

The danger here is that, with a lack of insight into the ‘real world’ of information security practice, the resulting standards would be rendered unworkable, ineffective, or both. The issue goes beyond the legalistic policy challenges of deciding what is legally necessary to protect data and privacy. The GDPR needs to drive a consistent set of behaviours and promote a different kind of culture, as highlighted above by the ‘privacy by design’ issue. Any standards therefore need to be rooted in the day-to-day realities of companies’ working lives. A legalistic approach could lead to a focus on the fear factor of the GDPR, rather than a positive impetus towards cultural change.

The need for standards for the GDPR represents an opportunity for information security professionals of all kinds to evangelise the right approach to security. In particular, those data centres that have a strong security culture can make the case for putting privacy and security at the heart of operational practices. However, even if the potential standard does not materialise, these data centres still have the opportunity to use their expertise to guide customer organisations through a confusing landscape.

Phil Bindley is CTO at The Bunker