The EU Commission has further modernized the rules and regulations for the data economy. The GDPR's record-breaking fines and the high number of cases over the last ten months reveal that companies are overwhelmed by compliance with the existing rules. And the task is getting more complex, as the amount of data and the number of rules keep growing. It's time for companies to rethink how they manage their data and regain control to avoid compliance risks.

To date, some 1.6 billion euros in fines have been issued since the General Data Protection Regulation (GDPR) came into force on 25 May 2018. And the fines are not limited to EMEA: in the past 12 months, European data protection authorities have imposed major penalties on five prominent American technology companies, and these five cases alone account for more than $1.2 billion in fines. The number of reported violations has also grown faster than ever, from 639 to 1,037 over the same period. When the regulation marks its fourth anniversary on 25 May, authorities will have recorded a record year of penalties. Anyone who expected a more lax interpretation of the rules because of the pandemic will certainly be surprised.

During this period, other trends have emerged. Accelerated digitization, coupled with the pandemic and remote working, has created more data in more and more places. Many companies are clearly no longer able to keep up with this explosion of data and data silos, especially since the regulatory framework is changing rapidly once again.

So in March, Ursula von der Leyen, President of the EU Commission, and US President Joe Biden jointly announced new rules for transatlantic data traffic. It is not yet clear when this Trans-Atlantic Data Privacy Framework, which some are calling Privacy Shield 2.0, will come into force as the new legal foundation. But the ruleset will affect how companies with international business handle their data and how they transfer, store, and archive personal data. Just a month earlier, the EU Commission had presented the Data Act, a new approach to regulating the data market in Europe.

A distorted view of the data

Many companies have distributed their data across a variety of storage locations and, as the fines show, are struggling to properly manage this archipelago of proprietary, siloed islands of data. No doubt, IT teams spend significant time and resources tackling governance, archiving, and compliance issues. This fragmented picture causes a number of glaring problems that grow with the amount of data and the number of regulations: it is almost impossible to see whether data is redundant, whether critical personal data is stored in risky locations, or whether some data has been overlooked in the backup plan.
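To illustrate how basic even the first step of such discovery can be, here is a minimal, hypothetical Python sketch that scans a file share for one kind of personal data (email addresses). The path and file type are placeholders; real data classification tools detect many more identifiers across many more formats.

```python
# Minimal sketch: scanning a directory for files that may contain personal
# data (here: email addresses) so risky storage locations can be reviewed.
# The pattern, path, and file type are illustrative assumptions.
import re
from pathlib import Path

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scan(root: str) -> None:
    for path in Path(root).rglob("*.txt"):  # placeholder file type
        text = path.read_text(errors="ignore")
        if EMAIL.search(text):
            print(f"Possible personal data in {path}")

scan("/mnt/file-share")  # hypothetical storage location
```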

A company can attempt to get these data islands under control with processes and point-product solutions, but it may face high infrastructure and operating costs, a lack of integration between products, and increasingly complex architectures. And it is questionable whether all data in such a fragmented environment is protected from ransomware, and whether critical tasks such as rapid recovery can be carried out quickly and reliably enough to keep the business running.

Instead, companies should break away from the archipelago and look for a next-gen approach to data management, one that enables them to improve data compliance, advance security, remove data silos, and reduce complexity.

1) Recognize access and value

Businesses need to know what data they own and what value it has. Only then can they answer questions of governance and compliance. They also need to be clear about who has access to that data. For example, can they detect users who have excessive access rights? Can they use AI/ML technology to identify unusual backup or access patterns, or other anomalous behavior? These indicators can help identify potential internal and external attacks, such as ransomware, at an early stage, so countermeasures can be put in place quickly.
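As a rough illustration, a baseline-deviation check over per-user access counts might look like the following sketch. The log format, user names, and threshold are illustrative assumptions, not any particular product's interface; production anomaly detection would use far richer signals.

```python
# Minimal sketch: flagging unusual access volumes per user against that
# user's own historical baseline, using a simple z-score.
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical access log: (user, files accessed per day)
access_log = [
    ("alice", 40), ("alice", 35), ("alice", 42), ("alice", 38),
    ("bob", 10), ("bob", 12), ("bob", 9), ("bob", 480),  # sudden spike
]

history = defaultdict(list)
for user, count in access_log:
    history[user].append(count)

def unusual(counts: list[int], threshold: float = 3.0) -> bool:
    """Flag the latest count if it deviates strongly from the baseline."""
    if len(counts) < 4:
        return False  # not enough history to judge
    baseline, latest = counts[:-1], counts[-1]
    sigma = stdev(baseline) or 1.0  # avoid division by zero
    return abs(latest - mean(baseline)) / sigma > threshold

for user, counts in history.items():
    if unusual(counts):
        print(f"Anomalous access volume for {user}: {counts[-1]} files")
```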

2) Gain a unified view of the data

Ideally, all of these functions, along with an overview of the entire data landscape, are accessible through a single console that only authorized users can reach, protected by multi-factor authentication and access control lists, regardless of whether the data is stored on-premises, in a hybrid cloud, or in a SaaS service.
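Reduced to its essentials, the gatekeeping logic behind such a console could look like the sketch below. The ACL entries and data sources are placeholders, not any vendor's actual API.

```python
# Minimal sketch: gating a unified data-inventory view behind multi-factor
# authentication and an access-control list. All names are hypothetical.
ACL = {"dpo@example.com", "it-admin@example.com"}  # who may open the console

DATA_SOURCES = {  # one inventory across all storage locations
    "on-premises": ["hr-share", "erp-db"],
    "hybrid-cloud": ["s3-archive"],
    "saas": ["crm-backup"],
}

def unified_view(user: str, mfa_verified: bool) -> dict:
    """Return the cross-location inventory only to authorized, MFA-verified users."""
    if not mfa_verified:
        raise PermissionError("multi-factor authentication required")
    if user not in ACL:
        raise PermissionError(f"{user} is not on the access-control list")
    return DATA_SOURCES

print(unified_view("dpo@example.com", mfa_verified=True))
```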

3) Establish resilient, highly scalable infrastructure

The data itself should be backed up on a next-gen data management platform, ideally based on a hyperconverged file system, that scales easily and goes beyond the Zero Trust security model. In addition to the strict access rules and multi-factor authentication already mentioned, enterprises should be able to utilize immutable snapshots, which no external application or unauthorized user can modify. Organizations should also use modern data management technology that encrypts the data, both in transit and at rest, to further harden it against cyber threats such as ransomware.
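One way to approximate immutable backups outside a dedicated platform is WORM (write-once-read-many) storage such as S3 Object Lock, sketched below. The bucket and key names are placeholders, and the bucket would need Object Lock enabled at creation; the client talks to S3 over HTTPS, covering encryption in transit, while the server-side encryption flag covers data at rest.

```python
# Sketch: writing a backup object in WORM mode with S3 Object Lock, one way
# to approximate an immutable snapshot. Bucket and key are placeholders.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")  # transfers go over HTTPS (encryption in transit)
retain_until = datetime.now(timezone.utc) + timedelta(days=30)

with open("db.dump", "rb") as body:
    s3.put_object(
        Bucket="example-backup-bucket",          # placeholder bucket
        Key="snapshots/2022-05-25/db.dump",
        Body=body,
        ServerSideEncryption="aws:kms",          # encryption at rest
        ObjectLockMode="COMPLIANCE",             # no one can shorten or remove
        ObjectLockRetainUntilDate=retain_until,  # immutable for 30 days
    )
```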

4) Available as a service

Some companies today no longer want to manage infrastructure entirely themselves, perhaps because their IT teams need to concentrate on other business-critical tasks. In these cases, they could consider a vendor that offers Data Management as a Service (DMaaS), designed to give enterprise and mid-size customers a radically simple way to back up, secure, govern, and analyze their data.

Outlook

Governments will certainly develop new rules for the data economy, as it plays a major role in every sector. IT teams will therefore have to keep adapting to new requirements. To finally escape the complexity trap, organizations should consider consolidating their data silos onto a next-gen data management platform. They can then benefit from the synergy effects: enhanced security, governance, and compliance, as well as time and cost savings, since managing the infrastructure becomes much simpler.
