The news has never been so full of data scare stories. Facebook is leading the pack, facing widespread criticism after users’ personal information was harvested by data analytics firm Cambridge Analytica. And Yahoo’s recent disclosure that all three billion of its user accounts were affected by a 2013 cyber attack made history as the biggest data breach ever.

Despite the outrage that these stories inevitably cause, the majority of us are resigned to the idea that the next major breach is inevitable and that the numbers affected will only grow. Whilst GDPR is going to help quell some fears, there is a much-documented sense of wariness towards big data applications and the commercial use of personal data by private companies.

Of course, data is a long way from being all bad. It’s at the core of today’s innovation - powering significant advancements in all areas of modern life, from consumer electronics to clinical medicine. Even when we don’t notice it, it’s making our lives better - driving improved customer service, targeted consumer deals and sophisticated loyalty schemes. The big data cat is firmly out of the bag and there’s no putting it back.

So how can we best harness the good of data and mitigate the bad? Getting it right is no easy feat.

The building blocks

Virtus London 4 data center – Virtus Data Centres

The data center sits at the heart of an organisation. You might be forgiven for thinking that the IT department is no longer the natural home of innovation and business leadership it once was, but the big data revolution can only be delivered efficiently from purpose-built, highly efficient data centers. Getting the data center strategy right means that companies have an intelligent and scalable asset that enables choice and growth. But get it wrong, and entire businesses are at risk.

It is this idea that is the cornerstone of big data success.

According to IBM, 90 percent of the data in the world today has been created in the last two years. This data comes from everywhere: sensors used to gather shopper information, posts to social media sites, digital pictures and videos, purchase transactions and cell phone GPS signals, to name just a few. And one of the key characteristics of big data applications is that they demand real-time or near real-time responses.

These two trends put intense pressure on the security, servers, storage and network of any organisation - and the impact of these demands is being felt across the entire technology supply chain. IT departments need to deploy more forward-looking capacity management to proactively meet the demands that come with processing, storing and analysing machine-generated data.
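As a rough illustration of what that forward-looking capacity management might look like in practice, the short Python sketch below projects when provisioned storage will run into its safety buffer, based on a simple linear growth trend. The figures, the linear model and the 20 percent headroom are illustrative assumptions only, not a recommendation.

```python
import math

def forecast_capacity(monthly_usage_tb, provisioned_tb, growth_headroom=0.2):
    """Estimate months until storage hits its safety buffer, using a linear trend.

    monthly_usage_tb: historical storage consumption per month (TB), oldest first.
    provisioned_tb:   total storage currently provisioned (TB).
    growth_headroom:  fraction of capacity kept free as a safety buffer.
    """
    if len(monthly_usage_tb) < 2:
        raise ValueError("need at least two months of history")

    # Average month-on-month growth from the historical deltas.
    deltas = [later - earlier for earlier, later in zip(monthly_usage_tb, monthly_usage_tb[1:])]
    avg_growth = sum(deltas) / len(deltas)

    usable_tb = provisioned_tb * (1 - growth_headroom)
    current_tb = monthly_usage_tb[-1]

    if avg_growth <= 0:
        return None  # usage is flat or shrinking; no exhaustion forecast
    return max(0, math.floor((usable_tb - current_tb) / avg_growth))

# Example: six months of (hypothetical) history against 500 TB provisioned.
history = [210, 228, 251, 270, 294, 321]
print(forecast_capacity(history, provisioned_tb=500))  # months until the buffer is hit
```

In reality capacity planning would cover power, cooling and network as well as storage, and would use more than a straight-line fit - but the principle of projecting demand forward rather than reacting to it is the same.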

The final say on build vs. buy

For even the biggest organisations, the cost of building and maintaining a wholly owned data center can be prohibitively high, so in the perennial build vs. buy debate, buy is winning. Outsourcing to a third party provides the best protection against increasing data center complexity, cost and risk, and removes the need to worry about uptime. Carrier-neutral connectivity, offered by many providers, means that companies within the data center environment can choose the carrier service provider that best fits their needs, and leasing a facility carries a substantially lower up-front cost than building one. In addition, data center providers allow companies to scale seamlessly, quickly and easily handling growing storage needs.

High Performance Computing (HPC), once seen as the preserve of large mega-corporations, is now being looked at as a compelling way to address the challenges presented by big data. High-density strategies can also maximise productivity and efficiency, increasing the available power density and per-square-foot computing power of the data center.

For many, cloud computing is an HPC user’s dream, offering almost unlimited storage and instantly available, scalable computing resources. Cloud is certainly compelling, offering enterprise users the very real opportunity to rent infrastructure they could not otherwise afford to purchase – and enabling them to run big data queries that could have a massive, positive impact on their organisation’s day-to-day strategy and profitability.
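To make the “big data query” idea concrete, the sketch below shows the kind of aggregation an enterprise might run over purchase-transaction data using Apache Spark. The file path and column names (store_id, ts, amount) are hypothetical placeholders; the point is that the same query runs unchanged whether it is executed on a single machine or on a rented cloud cluster with many more executors.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a Spark session; on a rented cloud cluster the identical code simply
# runs with more executors - nothing in the query itself has to change.
spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

# Hypothetical purchase-transaction files with store_id, ts and amount columns.
transactions = spark.read.csv("data/transactions/*.csv",
                              header=True, inferSchema=True)

# Aggregate revenue and transaction counts per store per day.
daily_revenue = (
    transactions
    .withColumn("day", F.to_date("ts"))   # assumes ts is an ISO-style timestamp
    .groupBy("store_id", "day")
    .agg(F.sum("amount").alias("revenue"),
         F.count("*").alias("transactions"))
    .orderBy("day", "store_id")
)

daily_revenue.show(20)
spark.stop()
```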

The big security challenges

Perhaps most crucially, the ‘buy’ option, when it comes to data center strategy, addresses reliability and security concerns - the biggest issues when it comes to public trust in data. But for many, these concerns mean that a wholesale move to standard public cloud – where security may not be as advanced – isn’t an option. Instead, the savviest organisations are quickly recognising that deploying a hybrid cloud strategy within a shared environment means IT can expand and grow more easily, without compromising security or performance.

By choosing colocation, organisations get access to a range of security services – including DDoS mitigation, intrusion detection management, managed security monitoring, penetration testing/vulnerability assessments and compliance advice - that are unlikely to be available to the same level in-house.

In addition, colocation or managed services can help to deal with disaster recovery needs. There’s a growing recognition and acceptance that, wherever your data resides, sooner or later it will be compromised, so it’s important to know how to deal with the inevitable rather than trying to defend against the impossible. When you buy a service from an expert, it’s their business to get you up and running again quickly.

By choosing colocation, companies effectively get the best of both worlds: renting a small slice of the best uninterruptible power and grid supply, backup generators, super-efficient cooling, 24/7 security and resilient-path multi-fibre connectivity that money can buy, with direct access to public cloud platforms to provide the full array of IT infrastructure - all for a fraction of the cost of buying and implementing it themselves.

So, data isn’t a demon - but we need to be mindful of the potential for corruption and risk, as well as the significant benefits. Proper infrastructure and good data management can only help to control the bad and make the good better.

Fundamentally, big data success starts in the data center. Get that wrong and even the most innovative application will fail. Whilst we’ll continue to see big data scandals making waves, the savviest companies will be focusing on what happens in the data center, not in the media limelight.

Darren Watkins is managing director for VIRTUS Data Centres