By its very nature the IT world has always been fast-moving. Unshackled by the complicated processes and protocols that surround and shroud traditional businesses, the IT crowd has lived on a diet of supply and demand since the computer became a workplace essential.

The Olympic movement may have cornered the motto ‘Faster, Higher, Stronger,’ but the computer and IT industry can surely adapt that maxim.

Since the early days of computer acceptance, the world has moved on to the point where there is barely anything we do that is not influenced or aided by IT systems in some way or another.

Natick gets dunked – Microsoft

The next step

The Internet has been the conduit for much of this development, and now the Internet of Things (IoT) – the network of everyday devices such as cars, phones and smartwatches that are embedded with sensors and actuators that exchange data – is building on the seemingly endless stream of information we produce.

Thanks to cloud computing, this data is pushed through the network, distributed to and processed at custom-built, centralized data centers dotted around the world. But the proliferation of data to be processed has prompted a move towards the lower-trafficked outskirts of the network – the ‘Edge.’

Edge computing processes data near the edge of the network, closer to where it is generated. In theory, this means information can be ingested and returned more quickly than if it were routed through a centralized data-processing warehouse, helping to eliminate delays – particularly important in businesses that rely on live feeds, for example where contractors depend on the immediate relay of concrete core temperatures to judge when best to strip formwork, or where architects have to disseminate design information instantly.
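The latency argument can be sketched with some simple arithmetic. The figures below are assumed for illustration only – the source quotes no numbers – but they show why halving the distance a reading travels matters for a time-critical feed such as live concrete-temperature monitoring.

```python
# Illustrative sketch: round-trip time for a sensor reading, comparing a
# distant centralized data center with a nearby edge node. All latency
# figures are assumptions, not values from the article.

def round_trip_ms(propagation_ms: float, processing_ms: float) -> float:
    """Time for a reading to reach a processor and for the result to return."""
    return 2 * propagation_ms + processing_ms

# Assumed figures: ~40 ms each way to a centralized data center versus
# ~2 ms to an edge node on the network's outskirts; 5 ms of processing.
central = round_trip_ms(propagation_ms=40.0, processing_ms=5.0)
edge = round_trip_ms(propagation_ms=2.0, processing_ms=5.0)

print(f"centralized: {central:.0f} ms, edge: {edge:.0f} ms")
```

Under these assumed numbers the edge round trip comes in roughly an order of magnitude faster, which is the whole of the case made above.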

Processing the data produced by edge devices at the fringes of the network in this way is known as ‘fogging,’ because it extends the cloud computing concept out to the edges of the network. In many ways, this plays directly into the hands of colocation centers, or ‘colos.’

These are generally smaller data centers where space and bandwidth are available for rent to customers, and they are typically already close to an urban location or on the periphery of towns and cities where demand is greatest.

Already some large technology firms are taking space at these colos, in addition to their own vast data centers, in response to the demand on their central networks. But as space in the colos is exhausted, more centers need to be brought online.

But therein lies the problem. Our IT needs are evolving at such a pace that the lumbering planning process never gets the chance to catch up. Data centers do not sit comfortably with planning bodies such as the Greater London Authority. They are clearly light industrial units, but because they do little to raise local employment levels, town planners are loath to sign off on their development, preferring more traditional industrial uses with quantifiable impact.

And then there is the small question of cooling. Data centers and the equipment they house consume an enormous amount of energy and create a huge heat load that must be dealt with. They are heavy users of air-conditioning equipment and could fall foul of incoming EU legislation designed to limit emissions from medium-sized combustion plants. The Medium Combustion Plant (MCP) Directive seeks to regulate pollutant emissions from plants with a rated thermal input equal to or greater than 1MWth (megawatt thermal) and less than 50MWth.
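The directive's range reduces to a simple bounds check. The two thresholds below are those quoted above; the example plant ratings are hypothetical.

```python
# The MCP Directive's scope as described in the text: rated thermal input
# >= 1 MWth and < 50 MWth. Example ratings below are hypothetical.

MCP_LOWER_MWTH = 1.0   # inclusive lower bound
MCP_UPPER_MWTH = 50.0  # exclusive upper bound

def in_mcp_scope(rated_thermal_input_mwth: float) -> bool:
    """True if a plant's rated thermal input falls within the MCP range."""
    return MCP_LOWER_MWTH <= rated_thermal_input_mwth < MCP_UPPER_MWTH

print(in_mcp_scope(5.0))   # a mid-sized data center plant: in scope
print(in_mcp_scope(0.5))   # below the 1 MWth floor: out of scope
```

Note the asymmetric bounds: a 1MWth plant is caught by the directive, while a 50MWth plant falls outside it (larger plants are covered by separate legislation).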

With every data center comfortably above the lower threshold, the directive has the potential to throw another spanner into the planning process when the emission limit values it sets out for new MCP plants are enforced in December. Linking new data centers into district heating and cooling systems might be one answer to the sticky heat problem, but ultimately the UK needs to do something to help data centers gain planning approval. Currently it can take around 18 months to get through the planning process – a virtual lifetime in a sector as fast-moving as IT. Larger data centers can take even longer.

Recognizing their true worth to the community and to businesses at a national level would help speed that process up. So might accepting the data center’s role in our lives as a utility rather than a business center.

Boundaries are being pushed in a bid to sidestep planning issues and manage heat and emissions more easily. Microsoft recently sank a new underwater data center in the cooling waters of the North Sea off the Orkney Islands in Scotland. It houses 864 servers and 27.6 petabytes of storage – enough for five million films – and will be deployed for up to five years.
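A quick back-of-the-envelope check shows those two figures are consistent: the implied size per film is derived here, not stated in the source, and decimal petabytes are assumed.

```python
# Sanity check on the Project Natick figures quoted above: 27.6 petabytes
# described as room for five million films. The per-film size is derived,
# not stated; decimal units (1 PB = 1,000,000 GB) are assumed.

PETABYTE_GB = 1_000_000

storage_gb = 27.6 * PETABYTE_GB
films = 5_000_000
gb_per_film = storage_gb / films

print(f"{gb_per_film:.2f} GB per film")
```

At roughly 5.5GB per title – a plausible size for a standard-definition-to-HD film file – the claim holds together.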

While the waters off the Orkneys may not be the ideal location for smaller, more localized data centers, the theory behind it is sound.

What is clear, though, is that the prospect of data centers being marooned for months, even years, in the planning process despite such obvious demand for their services is unthinkable. In a post-Brexit world, anything that hampers the development of the data centers that bolster the UK’s reputation as a global center of technology must be avoided at all costs.

Robert Thorogood is executive director of Hurley Palmer Flatt.