Technology is changing human existence more rapidly than ever before, and the digital infrastructure is being built faster than any other building project in history. The scale of the project is unprecedented, and the underlying technology is being invented as the project proceeds.

We are building at webscale, and the size and shape of the infrastructure is dictated by massive issues and tiny concerns. Where do you build a data center that can store and process billions of pages for billions of people? And how do you collect data and send instructions to the minuscule sensors that monitor and control more and more of our lives?

Cover feature: webscale landscape – DCD / Chris Perrins & Holly Tillier

Colossal questions

The architects and designers of the world’s digital infrastructure are gathering at the DatacenterDynamics Webscale event in San Jose, on July 19-20, and the agenda includes the technical, the organizational, and the political questions raised by the arrival of these colossal, granular, interconnected systems.

Some see the emergence of a new professional class: of people who understand and drive the creation of digital structures. These people are “Infrastructure Masons”, according to Dean Nelson, who has set up a group of that name. The masons of medieval Europe built cathedrals and created organizations to develop and preserve knowledge of how to build in stone and regulate the morality of the profession.

“I am proud to be a part of the global community of people who build and manage this digital infrastructure,” says Nelson, who previously ran eBay’s data centers. “I think it’s time the world recognizes them and appreciates the work they do.”

The word ‘Masons’ gets across the scale of the projects and the need for human ingenuity in their delivery. In the Industrial Revolution and the Machine Age, similar roles were taken by people like Isambard Kingdom Brunel and Henry Ford, who understood the possibilities and honed the technology of their times.

Morality should be at the heart of the new infrastructure, according to Patrick Flynn, director of sustainability at Salesforce (a DCD Webscale speaker), partly because data centers are so big: “The internet is a species-wide central nervous system,” he says in a TEDx talk. “Data centers are the information factories. They are the biggest thing we will ever build, consuming more electricity than all but two countries on Earth.”

The morality of infrastructure

Previous big structures have had unintended consequences, he says. The highway system enabled transport and communications but created smog, traffic congestion and the loss of community.

When our children look back on the information system, Flynn wants them to approve: “Unlike past mega projects, this one can have morals and values.” Just making sure that the structures don’t harm humans or the environment is a step towards this: control logic embeds morals into machines, he argues.

Unlike past mega projects, this one can have morals and values

Patrick Flynn, Salesforce

Flynn thinks that making data centers efficient is a moral imperative, and he’ll bring that perspective to the DCD Webscale event. Twenty percent of servers in today’s data centers are comatose – forgotten servers still running and using power but doing no useful work. That gets Flynn’s goat: “If a server were moral, it would send a message if it thought it was forgotten.”
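Flynn’s “moral server” could be sketched in a few lines. The example below is a hypothetical illustration, not anything Flynn or Salesforce has shipped: it assumes a server can read its own recent CPU utilization samples, and the 5 percent threshold is an arbitrary assumption.

```python
# Hypothetical sketch of a server that flags itself as comatose.
# The utilization feed and the 5% threshold are assumptions for
# illustration, not details from the article.

def is_comatose(cpu_samples, threshold=0.05):
    """True if every recent CPU utilization sample is below the threshold."""
    return bool(cpu_samples) and all(s < threshold for s in cpu_samples)

# A week of hourly samples from a forgotten server: powered on, doing nothing
samples = [0.01] * 168
if is_comatose(samples):
    print("This server appears forgotten - decommission or repurpose it")
```

In practice the signal would need to cover more than CPU (network traffic, disk I/O, login activity), but even this crude check would surface many of the forgotten machines Flynn describes.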

Infrastructure Masons doesn’t yet have a formal work plan, but it has made a start on morality. Its first get-together, in California, got members sharing ideas and gave them a first look at some unreleased technology – and it also raised $50,000 to pay for a school in India.

Patrick Flynn, Salesforce – Salesforce

Who is in charge?

Back from the moral frontline, building at webscale involves rethinking the organizations whose infrastructure we are in charge of. Data centers have been inefficient because they are isolated from the consequences of their actions. IT systems use power, but the IT department doesn’t pay for it directly, and the systems sit in buildings that are controlled by other divisions of the business.

Facilities management has been separate from the technology division, and that has led to a disappointing – and arguably immoral – situation. Power and cooling systems need to be managed more intelligently, but all too often the systems with the brainpower to do that can’t talk to them.

Even when air-conditioning systems are connected, they often use old technology such as dial-up modems. “Those systems are designed to never change,” says Scott Noteboom, another Webscale speaker, and a former data center builder at Apple and Yahoo. “They’re completely isolated from each other.”

Merging those two silos results in cost savings at the very least, Noteboom points out, but it can also enable the kind of moral systems that Flynn wants to see. A system can’t reduce its impact on the environment until it is connected in a way that lets it see what that environmental impact is.

Fragmenting infrastructure

The Internet of Things brings new connectivity, but it also has the potential to fragment infrastructure controls ever further. Here, Noteboom again reminds data center people of the importance of avoiding the silo effect – in this case between IT and facilities – where different technology requirements lead to independent infrastructures being developed.

Noteboom says data center facilities management currently often takes a “North Korea” approach to providing security for the physical infrastructure. The machinery of the physical plant is hidden and isolated, while the other features of the data center get the advantage of an open and highly visible approach to security and management.

Scott Noteboom, LitBit – LitBit

Noteboom is putting forward practical solutions to this with LitBit, based on the Apache Iota project, which is designed to automate information gathering and control in the Internet of Things, the emerging galaxy of networked devices that could make our lives better.

LitBit’s RhythmOS is an open-source orchestration backbone with open APIs for the development of third-party applications, tools, and compatible hardware and software that manage sensor-driven devices. It has obvious potential applications within and beyond the data center.

LitBit is built around the group that originally started the Apache Iota project, which aims to eliminate the silo effect in the Internet of Things while providing a new and practical approach to dealing with these issues.

Project Iota set out to create industrial-grade open-source IoT tools, including hardware components and software platforms. Since the Iota platform itself is a Linux computer, more capable hardware platforms increase the size of the managed device environment. Improved performance can be achieved by the use of multiple management platforms in parallel.

Initial hardware platforms under consideration include devices on the scale of the Raspberry Pi. The system lets users create “Maestros” – orchestration components that add user- and system-defined orchestration capabilities. These user-created tools allow devices on LitBit’s RhythmOS platform to work together, and can be shared.
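The “Maestro” pattern described above can be sketched in miniature. This is a hypothetical illustration of the general idea – a user-defined rule watching a set of devices and issuing actions – and does not use the real Apache Iota or RhythmOS APIs; all class and rule names here are invented for the example.

```python
# Hypothetical sketch of the "maestro" orchestration pattern: a
# user-defined component that watches registered devices and applies a
# rule to each. Not the real Apache Iota / RhythmOS API - names and the
# temperature rule below are assumptions for illustration.

class Device:
    """A managed device that reports a single named reading."""
    def __init__(self, name, reading):
        self.name = name
        self.reading = reading

class Maestro:
    """Orchestration component: polls devices, collects control actions."""
    def __init__(self, rule):
        self.rule = rule      # callable: Device -> action string, or None
        self.devices = []

    def register(self, device):
        self.devices.append(device)

    def run(self):
        """Poll every registered device once and gather any actions."""
        actions = []
        for device in self.devices:
            action = self.rule(device)
            if action:
                actions.append((device.name, action))
        return actions

# Example rule: flag cooling units reporting inlet temperatures over 27 C
cooling_maestro = Maestro(lambda d: "raise_setpoint" if d.reading > 27 else None)
cooling_maestro.register(Device("crac-1", 24.5))
cooling_maestro.register(Device("crac-2", 29.1))
print(cooling_maestro.run())  # [('crac-2', 'raise_setpoint')]
```

The appeal of the pattern is that the rule is just user code: the same maestro machinery could coordinate cooling units, power meters, or racks of servers, which is what makes such components shareable.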

Webscale generally refers to the cloud providers and telcos that provide the massive capacity, the servers and communications, required by the users of the internet.

Customer-facing firms such as Google and Facebook have billions of users and petabytes of data to store and manage.


Enterprise opportunity

One step down, firms such as LinkedIn (now subject to a $26 billion takeover bid by Microsoft) are still webscale, but at a smaller level. It’s still worth their while making their own kit and acting as good infrastructure builders.

Another Webscale speaker is Yuval Bachar, principal engineer at LinkedIn. He has presided over a program where the social media site designs its own network switches and propounds its own data center architecture.

LinkedIn has designed the “Pigeon” network switch, so all parts of its data centers get 100Gbps through a single fiber pair. Inside the racks, this feeds servers operating at such high densities that Bachar has adopted cabinets with liquid-cooled doors.

“There are no hot and cold aisles like you would find in a typical data center,” says Bachar. “Everything is cold aisle. The hot aisle is contained within the rack itself.” Savings from this can reach millions of dollars in CapEx, and the resulting systems run better and faster.

It’s a standout example where moral and technical considerations are perfectly in line – where one of the builders of the digital infrastructure can take pride in his work.

This article appeared in the May/June 2016 issue of DatacenterDynamics magazine.

DatacenterDynamics Webscale is at the San Jose Convention Center on July 19-20