
Mitchell Hashimoto talks about new technologies and DevOps tools

"The place where I concern myself with paradigm shifts is generally what overall we describe as application, development, deployment and maintenance," Mitchell Hashimoto said as he opened his keynote address at last week's All Things Open conference. "This is potentially DevOps, depending on how you look at it, but it's really both development and the management of servers."

A few weeks earlier, when I'd talked with him to kick off IT Pro's coverage of ATO, I purposefully didn't ask him about his upcoming conference talks because I didn't want to spoil them for him or his audience. That he would talk about DevOps tools was a given. After all, HashiCorp, the company he co-founded and where he's CTO, is known for tools like Vagrant, Packer, Terraform, Consul and Vault, which are designed to help DevOps teams secure and operate distributed application infrastructures. In this keynote he would be talking about automation tools in general; later in the day, he'd conduct a workshop focused specifically on his company's products.

"Generally, you start with one or two servers," he said. "Very quickly after that, you jump to a handful and management starts to become a little more complex. At some point, data centers evolved to support virtualization, which definitely increased complexity quite a bit. And of course, more recently its been containerization popping up as well, which both runs on physical servers as well as VMs."

He noted that with the rise of SaaS, many data center functions are now being farmed out to third-party providers. "These are things like DNS providers, CDNs, load balancers, and databases. It's very easy, especially as a newer company, to offload a lot of what would be considered critical data center technology to third-party providers. In the grand scheme of things, I also consider this part of what is becoming more and more an abstract word -- your data center."

According to Hashimoto, the increasing complexity of data center operations not only creates a need for tools that take pressure off DevOps staff, but also presents challenges for those developing the tool sets.

"If you're building tooling for helping application deployment today, the tooling really needs to be built keeping in mind support for a typology sort of like this: support the idea that you could be running on physical servers; you could be running on VMs; you could be running on containers; it could be containers on VMs; it could be containers on physical servers; the network could be virtual; there might be multiple data centers; you might be going across a huge LAN to another region.

"All of this just needs to be taken into account," he said. "Whether you support it or not, it doesn't really matter, as long as you take it into account and can understand what's going to happen."

His experience is that tool development lags measurably behind the new technologies it supports, but that the tools then serve to propel more widespread use of those technologies.

"The general idea is that you start at point zero and you start introducing new technology, let's say virtualization," he explained. "Virtualization comes into the mix, and because it's new there's not a lot of tooling to support it. The complexity to use it is really quite high. The tooling complexity, on the other hand, is going down because the tooling is becoming simpler, but not for that technology. It's for the previous technology. It's catching up. It's delayed.

"The complexity of the technology gets harder and harder, and during this time, people are manually doing things. They're just dealing with the pain. They're just going at it, because it's a new technology and they just accept there's no tooling to do it. At some point you kind of reach the maximum pain of that technology and you're not willing to do any more without a little help. This is the point where the tooling complexity starts coming out to deal with the technology complexity."

This is sort of a sweet spot for people like Hashimoto who design and develop tools for a living. "It does feel like there's a magic point somewhere in the middle where the tooling is about the right level of abstraction -- the right level of complexity and difficulty to learn -- to manage the technology in a very productive way. This would be like a 1.0 tool, with books published about the tool, comprehensive blog posts, and resources with various scenarios on how to use that tool for the technology. This is a very nice place to be."

It's at that point that the use of a successful tool begins to take off and it becomes a standard feature in the DevOps toolbox. "That's when there's enough resources that someone who isn't willing to deal with bugs and really figure out how things work has enough to just get going really quickly."

Eventually, this leads to more complexity being added to the tool as features are added that won't be used by the average user, but which are advantageous to those who need them.

"What this leads to is a point where your tool is so advanced and the technology is basically mastered that people start pushing the limits and start moving to the next thing," Hashimoto said. "That's when I think you hit the next paradigm."
