
DCD FOCUS: Before we try to understand how software-defined data centers (SDDCs) play into ‘smarter’ cities, we should first ask: what is your definition of a smart city?
Rashik Parmar:
When we think about ‘smarter’ cities we have to remember the history of cities themselves.

Cities were spaces we sought for safety and sanctuary. When you look at the evolution of mankind you have to consider how governments formed to provide order. The city was the platform on which that order could be delivered in a manageable way.

As cities evolved they became a way of betterment – how people improved their lives has been based on cities and, through this, we have the emergence of business.

Now we are in an evolution that will see us develop sustainability – a notion that we still don’t really understand today. The city has to drive this – not just in the sense of resources but in terms of people’s livelihoods. This is where IBM’s work around smarter cities comes into play.

IBM has more than 2,500 projects taking place in cities around the world. Each incorporates three broad elements.

The first is a technology element. No matter if it is a problem with transportation, health care, jobs or sustainability, one of the first pieces any leader needs to consider is where they are today, and how technology in the form of IT helps.

The second is dealing with what the future will look like and developing a journey towards this. It will take into account people, societal change and customs. It is a complex change agenda and, within that framework, technology has a much smaller role to play as people take on a much bigger one.

As these projects move forward, and some ‘people’ changes are embedded, technology comes in again with a larger role as the enabler of new jobs, with tools for creating sustainable platforms that set new sustainable norms.

So what do you define as being an SDDC?
To understand the SDDC, you again have to look at evolution – this time going back to the invention of the transistor in 1947. This led to the notion of a computer, which we now take for granted, and the technology kept evolving. By 2010 we had reached the point of a billion transistors. Transistors, and technologies like them, have become abundant, and this has allowed us to rethink how the underlying fabric of the technology can best provide economies of scale and value propositions.

We have moved on from the era when a tiny data center was the norm for each business. As platforms become bigger, more powerful, cheaper and more efficient, it becomes less viable for each organization to run its own dedicated services. It is more economical to put operations onto a shared-use platform. We saw this with the telco industry, which went through the same shift with small businesses that had their own MPLS backbones and PBXs. Now we are seeing it with the Cloud.

So how does this notion of the SDDC tie into the way we look at smarter cities, or is it the other way around?
When we look at smarter cities we see the need for good data center service providers in a city. They must be competitive and provide an infrastructure platform. Just as you have roads, water and energy, IT provision in the form of the Cloud becomes a key part of the armory for a city itself.

You need to be able to adapt that IT platform to the plethora of needs a city may have. We need to be much more agile. This is where the SDDC becomes critical. It allows a common fabric of infrastructure – storage, compute, communications and networking – that can be reconfigured using software, so you can make optimal provision of service to meet bigger needs.

The SDDC also allows you to bring in different services by transforming the standard infrastructure – the hardware platform – to meet different demands these services bring. This is where we see a need for self-cognitive learning systems. In future these will mean people won’t even have to rewire systems as tasks like this will be done dynamically, as required.
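As a rough sketch of what that reconfiguration means in practice, the toy Python example below reallocates a shared pool of compute among city services entirely in software. The service names, pool size and proportional policy are all hypothetical, invented for illustration.

```python
# Toy illustration of software-defined reallocation: the same physical pool
# is redistributed among services with no physical rewiring.
POOL_VCPUS = 1000  # total compute in the shared fabric (illustrative)

def reallocate(demand: dict[str, float]) -> dict[str, int]:
    """Assign the pooled vCPUs to services in proportion to current demand."""
    total = sum(demand.values())
    return {svc: round(POOL_VCPUS * d / total) for svc, d in demand.items()}

# Daytime: transport dominates. Overnight: health monitoring takes over.
print(reallocate({"transport": 700, "health": 200, "admin": 100}))
print(reallocate({"transport": 100, "health": 600, "admin": 300}))
```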

So the first step to having a smarter city then is the Cloud?
Yes. The city of Sunderland in the UK is a real example of this. We have been working with Sunderland for three to four years. The CIO and city leadership realized they had to become a service provider for business. They could either let another provider come in to do that or they could do it as a city administration. Together, we created roads (networks) and infrastructure. You might argue a city does not have the best people to do that – and we at IBM would say the same – but it can partner with organizations that do. The city partnered with IBM to build compute and cloud infrastructure, which the council now uses for its own services and makes available to start-ups. The council offers these services at a competitive – in some cases better-than-market – price point, making the region’s businesses more competitive.

This is a very enlightened city.

In Europe, 80% of our discussions with city leaders are about similar initiatives. They realize they have to link with cloud providers and, in some cases, they ask about getting IBM or SoftLayer (IBM’s cloud services company) – or Amazon or Microsoft, for that matter – to put a pod in their city so they can compete on price.

But cloud is not, essentially, the software-defined data center…
We do elements of it. We started with the virtual desktop, or you could say the software-defined desktop. But the maturity of the SDDC is not at the level where we could deploy at scale. The OpenStack platform, and some other technologies, are evolving and you can now virtualize a lot of stuff. But there are still huge gaps. We are at a point where we have adopted as much as we can without bringing too much commercial risk into the platform. But we are aggressively moving this forward and, in some cases, we are trialing things that haven’t been done before.
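To make that concrete, here is a minimal sketch of software-defined provisioning using the openstacksdk Python client. It assumes a cloud named ‘city-cloud’ is configured in clouds.yaml, and the image and flavor IDs are placeholders; this illustrates the idea rather than IBM’s actual deployment.

```python
import openstack

# Connect to a hypothetical OpenStack cloud defined in clouds.yaml.
conn = openstack.connect(cloud="city-cloud")

# Network, compute and storage are all driven through the same software API,
# with no manual rewiring of the underlying hardware.
network = conn.network.create_network(name="tenant-net")
subnet = conn.network.create_subnet(
    network_id=network.id, name="tenant-subnet",
    ip_version=4, cidr="10.0.0.0/24")

server = conn.compute.create_server(
    name="app-server",
    image_id="IMAGE_UUID",    # placeholder
    flavor_id="FLAVOR_UUID",  # placeholder
    networks=[{"uuid": network.id}])
server = conn.compute.wait_for_server(server)

# Block storage comes from the same fabric and is provisioned in software too.
volume = conn.block_storage.create_volume(size=50, name="app-data")
```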

What is holding the SDDC movement back, and not just for the smarter city?
The gaps are in how the pieces of the current stage fit together. There are point solutions for each aspect of software defined, but what you don’t have today is the ability for that software-defined environment to work together in an integrated way. This is a big gap. You can’t afford to have people separately transforming network, compute and storage at the same time. These things need to join up. Software defined is really the automation and integration across those three areas. We see cases where software defined can work on its own – in networking, or in storage – but to get to an SDDC environment we need everything to work together. IBM is aggressively bringing levels of automation and integration together to address this.

Where we are thinking about this is in the business and commercial space. There are parts of the picture that are ‘shared everything’, and a lot of cloud providers focus on that shared-environment model. This is fine and appropriate for a large number of consumer use cases, but our marketplace is predominantly enterprises, and they want guaranteed levels of service.

Our strategy with SoftLayer is to manage this at the bare-metal layer and make commitments that would not be possible any other way.
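As a hypothetical illustration of working at that layer, the sketch below uses the open-source softlayer-python client to list bare-metal servers on an account (it assumes SL_USERNAME and SL_API_KEY are set in the environment, and is not IBM’s internal tooling):

```python
import SoftLayer

# Authenticate from environment variables (SL_USERNAME / SL_API_KEY assumed).
client = SoftLayer.create_client_from_env()
hardware = SoftLayer.HardwareManager(client)

# Bare-metal machines are listed and managed through the same API as virtual
# guests, which is what makes dedicated hardware workable at cloud scale.
for server in hardware.list_hardware():
    print(server["hostname"], server.get("primaryIpAddress"))
```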

We get picked on in the marketplace because we don’t have a cloud platform, but we can do this. We are focusing on the enterprises that want a controlled, managed hybrid cloud platform but want to keep some stuff – their crown jewels – in their own data center. This creates a massive technology challenge.

So are we seeing cities take advantage of hybrid models, and bursting into the Cloud?
Rio de Janeiro in Brazil will use cloud bursting. It has a challenge – coping with the 2014 World Cup football games and 2016 Olympics. It knows it will have to cope with a volume of spectators and traffic, both in IT and people terms. This will be unprecedented for the city and it is why we built a control center for it – so it can cope with traffic and in future manage natural disasters in a way the city could not do otherwise.

It is not just normal disasters we are referring to; we are talking about large future events that create an unexpected need for cloud bursting. This control center includes every type of fluid architecture.
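The mechanics of cloud bursting boil down to a simple overflow rule. The Python toy below is a hypothetical sketch – the capacity figure and request rates are invented – showing steady-state load staying on local infrastructure while event-scale spikes spill over to a public cloud:

```python
LOCAL_CAPACITY = 100.0  # requests/sec the local platform absorbs (illustrative)

def route(request_rate: float) -> dict:
    """Split incoming load between local capacity and a public-cloud burst."""
    local = min(request_rate, LOCAL_CAPACITY)
    burst = max(request_rate - LOCAL_CAPACITY, 0.0)
    return {"local": local, "burst": burst}

print(route(60.0))   # normal day:  {'local': 60.0, 'burst': 0.0}
print(route(350.0))  # event spike: {'local': 100.0, 'burst': 250.0}
```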

The Internet of Things is only going to exacerbate this need for smart infrastructure. Would you agree?
You have to go back again and look at the foundation of why we went into smarter cities in the first place. It came from work the IBM Academy of Technology did, called the Instrumented Planet. This looked at our ability to instrument things – cars, roads, fridges, food.

Technologies that focus on these areas are becoming more widely available and affordable for people around the world.

As we start to build the Internet of Things we will get more data about the real world, and that data, teamed with what we already know about the world, will allow us to make more intelligent decisions. This will allow us to be much smarter in how we use resources. I am careful to use the word ‘smarter’ instead of ‘smart’ because we see this as a journey – to start to instrument things and get new insight. That insight allows you to improve services and get to new places, which then become the new ‘norm’ from which you can evolve.

The Internet of Things is really about instrumentation and intelligence. As you look at ways to deploy that technology and improve systems, the Internet of Things becomes one tool in your kit for doing that.

Look at the city of Potenza in Italy. It came to us with the challenge of how to cope better with an ageing population. It wanted its people to be able to live in their homes for longer. IBM developed a system that was trialed in 50 homes that linked care providers to patients.

We discovered during this trial that we could use data about water use to predict health. The elderly have a definite pattern to how they live their lives – the time they shower, make tea, have lunch. Any shift in these can be an early indicator of the onset of illness. This has helped care providers identify potential illnesses or situations within days, so they can fast-track visits and provide extra care.
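The underlying technique is routine-deviation detection. The sketch below is a hypothetical Python rendering of the idea – the hourly readings and alert threshold are invented – comparing a resident’s water use today against their typical daily pattern:

```python
from statistics import mean

def deviation_score(today: list[float], baseline: list[float]) -> float:
    """Mean absolute difference between today's 24 hourly water readings
    and the resident's historical average for each hour (litres)."""
    return mean(abs(t - b) for t, b in zip(today, baseline))

# A typical day: morning shower and kettle, lunch, evening routine.
baseline = [0]*6 + [30, 45, 10, 5, 5, 20, 25, 5, 5, 10, 5, 15, 25, 10, 5, 0, 0, 0]
quiet_day = [0.0] * 24  # no shower, no tea: a marked break in routine

if deviation_score(quiet_day, baseline) > 8.0:  # threshold is illustrative
    print("Routine change detected: flag for an early care visit")
```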



Smart insight: Rio de Janeiro’s ‘smarter’ city control room, installed by IBM.


I visited Microsoft’s home of the future around ten years ago, and have seen numerous examples of these ‘smart’ initiatives since then, but few actual deployments until more recently. What has held all this technology back?
When you did the cost/benefit analysis there was a point at which the technology was not viable as it was cost prohibitive. Now cost has moved to a completely new space.

What software defined does is move this price point down again. I think one of the biggest shifts that will accelerate adoption, and provide a much larger set of use cases, will be software defined. We are also finding that people today, including the elderly, are much more comfortable with the technology. So consumers have moved as well as the technology itself.

What does IBM see as being the ‘missing piece’ when it comes to unleashing all the capabilities that come with being a ‘smarter’ city?
A big challenge we see in our IBM technical teams is that when we get into these kinds of systems, the complexity of managing them – even of understanding them – is at a scale beyond the capacity of any one human.

We have to look for machine learning – or cognitive capacity – that will really allow these software-defined environments to not just deliver service but dynamically configure themselves and, in some cases, predict demands based on patterns of usage and communities of users. Systems have to be able to adapt and be ready for the time when demand comes through.
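A minimal sketch of that predictive idea, assuming hourly demand readings and a naive same-hour-of-day forecast (real systems would use far richer models), might look like this:

```python
from statistics import mean

def forecast(history: list[float], hour: int, period: int = 24) -> float:
    """Predict demand at a given hour as the mean of that hour on prior days."""
    return mean(history[i] for i in range(hour, len(history), period))

# Two days of hourly load with a morning peak (invented numbers).
history = ([5.0]*8 + [80.0, 90.0, 60.0] + [20.0]*13) * 2

predicted = forecast(history, hour=9)
capacity = predicted * 1.2  # 20% headroom: an illustrative policy
print(f"Pre-provision for ~{capacity:.0f} req/s at 09:00")  # ~108 req/s
```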

This is what we have been doing with Watson (IBM’s computer system that uses cognitive technology, around which IBM has since built an entire business unit).

By using cognitive technology we have been building real machine-learning capacity at a scale that has not been possible in the past. We are doing work in Canada, looking at diseases using cognitive technology with Watson, and this is helping us develop new machine-learning algorithms, many of which we had never predicted.

This gives us fantastic capacity that we can apply to the software-defined data center space.

This article first appeared in FOCUS issue 36.