IBM has begun offering its Bluemix Container Service from the company’s local cloud data centres in Sydney and Melbourne.
The company in March unveiled a beta of the service at the IBM InterConnect conference in Las Vegas. Bluemix Container Service offers orchestration based on the open source Kubernetes system, with Docker as the container engine.
“As we looked at the container landscape it became clear to us that the combination of Docker as the container engine and Kubernetes as the orchestration layer was the right stack for our customers to be able to realise the benefits of containers,” explained Jason McGee, IBM Fellow and VP and CTO, IBM Cloud Platform.
The Bluemix Container Service became generally available at the end of May and then launched in Australia at the end of last week.
The service “takes the benefits of containers – automated scheduling, automated failure recovery, elasticity, security – and delivers that as a service on cloud, so development teams don’t have to worry about ‘How do I run my container system?’,” McGee told Computerworld.
“The cloud takes care of that with a really simple user experience. They can spin up a Kubernetes cluster, they can deploy applications on it, they can scale that environment, they can add more resources as they need to, they can secure it.
“It provides all the tools and systems that they need to help them run container-based workloads in the cloud.”
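The workflow McGee describes — deploying applications to a cluster, scaling them and adding resources — can be sketched with the standard Kubernetes CLI. This is an illustration only: the image name and cluster are hypothetical, and on Bluemix the cluster itself would be provisioned through IBM's own tooling before `kubectl` takes over.

```shell
# Deploy an application to an existing cluster from a container image
# (image name is illustrative)
kubectl create deployment web --image=registry.example.com/myapp:1.0

# Scale the environment out as load grows
kubectl scale deployment web --replicas=5

# Expose it to traffic behind a load balancer
kubectl expose deployment web --port=80 --type=LoadBalancer
```

From there Kubernetes handles the scheduling and automated failure recovery McGee refers to: if a pod or node dies, the deployment controller replaces it to maintain the declared replica count.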
Andrew Kupetz, CTO cloud, IBM Australia and New Zealand, said a number of the company’s local customers participated in the Bluemix Container Service beta and he expected them to move workloads into production. “A number of those customers will be banks,” he added.
Having local data centres boosts the opportunities for heavily regulated sectors, such as banking and finance, to use the service for production workloads. In addition, many Australian customers outside regulated industries also see value in retaining certain categories of data onshore, Kupetz said.

“My experience so far is that the local market here has been faster than average in seeing and understanding the benefits of containers and has been aggressive about adopting the technology,” McGee said.
Data sovereignty is a requirement, not just a “nice to have”, for a significant segment of IBM’s customer base, he added.
“The other thing that I think makes the local deployment valuable is of course latency. Geography doesn’t play in Australia’s favour when it comes to network latency, so having a local presence allows customers to achieve better performance.”
Analyst firm Gartner has forecast that by 2020, more than 50 per cent of global enterprises will be running containerised applications in production, up from fewer than 20 per cent today.
“Like many technologies there is a lot of hype, but that hype, I think, in the container case is backed with reality and a lot of real usage,” McGee said.
The use of containers during software development “is essentially mainstream at this point,” he added.
“There’s no developer I’ve talked to recently, including at all the enterprises we interact with, who is not using containers as part of their development process. It’s essentially the de facto way people are building software these days.”
Predictably, use in production is yet to reach the same level: Early adopters are already running containers in production workloads, but a mass-market shift to the technology outside of dev and test only really began late last year, the IBM exec said.
“I think we’re starting to see that switch from early adopters to more of the mainstream use in production. I think there’s real fire there – it’s not just discussion or hype.”
Those early adopters tended to be web companies, cloud service providers and other technology companies.
“If I look at IBM, we have significant production deployments of containers in our cloud, for example,” McGee said. “Most of Watson, as a concrete example, is running on containers in production.”
“I think if you look at enterprises – there are actually a lot of people who are using containers in production but they haven’t adopted a container-style operational model,” he added.
“They’ve taken a container and used it as a packaging mechanism and they shoved it inside of a virtual machine and their operational model is still a virtual machine model.
“That’s what I think is starting to change: They see the speed and agility they can achieve with containers, they see some of the efficiencies they can achieve. But they realise if they’re really going to do DevOps properly, they have to carry that container model all the way through to production.”
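The distinction McGee draws — a container used merely as a packaging format inside a VM, versus a genuine container operational model — can be illustrated with a declarative Kubernetes deployment. In the VM-style model, an image is copied onto a long-lived machine and run by hand; in the container model, the desired production state is declared once and Kubernetes keeps the cluster converged on it. All names in this sketch are made up for illustration.

```shell
# VM-style usage: a container as packaging only, run manually on one host,
# e.g. `ssh vm-host docker run orders:1.4.2` — the VM is still the unit of ops.

# Container operational model: declare the desired state and let the
# orchestrator own scheduling, recovery and scaling in production.
kubectl apply -f - <<EOF
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service
spec:
  replicas: 3            # Kubernetes replaces failed pods automatically
  selector:
    matchLabels:
      app: orders
  template:
    metadata:
      labels:
        app: orders
    spec:
      containers:
      - name: orders
        image: registry.example.com/orders:1.4.2   # illustrative image
        ports:
        - containerPort: 8080
EOF
```

Because the manifest, not the machine, is the unit of operations, the same container definition used in development carries through unchanged to production — the “DevOps properly” point McGee makes above.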
That in turn is helping drive the uptake of orchestration platforms like Kubernetes, he said.
The appeal of cloud-based containers is the combination of the speed to develop and innovate with the ease of moving software into production and scaling it, McGee said.
“They have a development value of speed and they have an operational value of efficiency and scale and resiliency,” he said. “The combination makes them really powerful.”
The portability of containers also sits well with IBM’s emphasis on multi-cloud.
“IBM understood the value of hybrid cloud environments probably much earlier than most people in the market, and understood that for most of our enterprise clients, they are going to live in a world that is a blend of on-premise systems and public clouds – and in most cases more than one public cloud,” McGee said.
“It’s just a pragmatic reality that you’re going to want to be able to blend these services and blend providers,” he added.
“With containers, with systems like Kubernetes, which we’re delivering in both public cloud with Bluemix and on-premises as well, those technologies work in other cloud providers.
“Our clients have an ability to blend workloads and move workloads if they need between those environments without making substantial changes to how those applications are architected and implemented.
“I think that’s a value that the container space offers and we’ve been actively working on enabling that story to be real for our customers.”