Q&A: DevOps and Containers with Shannon Williams of Rancher Labs

Containers. Containers Everywhere. (Photo Courtesy of Flickr)

Here at Load Impact, we're focused on enabling DevOps teams to create more resilient software with massive scalability. In recent years, we have seen container technology enable DevOps teams to move faster, test more often and create higher-quality software.

Shannon Williams is the co-founder and VP of Sales and Marketing for Rancher Labs, which builds Rancher, a software platform for deploying a private container service. The company has also developed RancherOS, a minimalist OS built explicitly to run Docker.

Rancher Labs prides itself on developing the next generation of cloud software, and their innovative technology enables customers to fully realize the benefits of containers in a scalable production environment.

Note: Load Impact contributing writer and consultant Peter Cannell conducted this Q-and-A. He's previously worked with Shannon at other startups.

Q: Where would you say the industry is in terms of adoption of DevOps today?

A: That's a great question, and a fun one, because it's changing as we speak. Five years ago we were talking to customers about deploying applications in the cloud and we got a lot of blank stares. People would literally point to their ITIL volumes on the shelf. Customers had spent the previous 10 years getting to the point where applications ran reliably, but with that came the era of change control.

Fast forward to today and you see banks that have DevOps teams, and healthcare organizations that are responding to pressure from the business to be more agile. These businesses have to respond to competitors that are able to roll out powerful technology in a fraction of the time, all while giving users amazing experiences.

The pendulum has swung from the change control era to the golden age of DevOps, or DevOps 2.0. The emergence of containers, cloud services and microservices has allowed for rapid-fire upgrades to applications and is a testament to what is happening out there today.

I'm always excited when I talk to teams using Rancher who have been given a mandate by leadership in operations or even the CIO to drive agility. From our perspective it's a brilliant time to be involved in DevOps as the broad majority of customers build, test and deploy software better than before.

Q: I couldn't agree more. It's great to hear your confirmation of the same dynamics we are seeing in the marketplace.

A: What is amazing is that the people in these DevOps roles used to be viewed as working on the least sexy part of IT. Now, all of a sudden, these teams are a strategic advantage. There has been an enormous increase in salaries and demand for people who understand application lifecycles and operations today.

Q: Is it just that we are getting faster at deploying features today, or is the overall quality better as well?

A: Good question. I think it's both, but it's not uniform across the customer base. Moving to infrastructure as a service and continuous integration doesn't mean that everything becomes easy. There are plenty of pitfalls, growing pains and political resistance. Fundamentally, we are seeing better quality as people design applications and infrastructure for failure. On top of that, so many new tools have been developed, such as the automated load testing Load Impact is doing, automated inspection and monitoring.

Companies are rethinking logging, monitoring and testing to be as dynamic as the rest of what they are doing, and the net outcome is a dramatic improvement in both the quality and quantity of code. Not only does feature delivery accelerate, but application stability and resilience improve as well.

I see this with all the customers I work with. It's not that the infrastructure itself is better (servers and networks still fail), but designing for failure and building for resilience are now ingrained in developers.

Q: I love the concept of resilience. What we see, and tell me if you agree, is development teams running more tests, and earlier in the development cycle, than ever before. Developers are running load and performance tests that used to be the domain of operations and networking. Are you seeing this dynamic as well?

A: We are absolutely seeing this! What is amazing is that with containers, customers are now bringing up a mirror of production in their development environment. They are running feature-functionality and systems tests throughout the development cycle on their own clusters. We enable this with Rancher as a container service for organizations adopting Docker. As Rancher is rolled out, one thing we see immediately is better testing, because development mirrors production. Upgrades get more reliable, and the frequency of updates improves dramatically.
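As a rough illustration of mirroring production in development, a team might describe its whole stack in a single Compose file; the service names and images below are hypothetical, and the same file can bring up an identical stack on a laptop, a CI runner or a shared test cluster:

```yaml
# docker-compose.yml -- hypothetical three-tier app
version: "2"
services:
  web:
    image: example/web:1.4.2    # same tagged image tested in dev and shipped to prod
    ports:
      - "8080:8080"
    depends_on:
      - api
  api:
    image: example/api:1.4.2
    environment:
      DB_HOST: db               # containers reach each other by service name
  db:
    image: postgres:9.4
    volumes:
      - dbdata:/var/lib/postgresql/data
volumes:
  dbdata:                       # named volume so test data survives restarts
```

Because the stack definition travels with the code, "works on my machine" and "works in production" converge on the same artifact.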

Customers want that magical, unicorn-like experience of faster releases, better stability and more reliability. There is another aspect I want to add: easier onboarding of new developers. The environment is more accessible to new developers and easier to build upon. As microservices permeate application development, the interdependencies become easier to understand and new developers can ramp up faster.

Q: Let's switch gears a little and touch on containers vs. VMs in customer environments.

A: What is amazing is that containers can't really take off without the infrastructure of virtual machines that is out there today. I don't see a fundamental conflict between containers and VMs. There are cases where a VM is only being used for automation, and in those cases containers are a good replacement. But containers and VMs tend to be very compatible, and most Rancher (and likely Docker) customers are running on top of VMs. I don't think VMs are going away anytime soon from a host-resource management perspective.

Q: Let's talk more about Rancher. Where would you typically see Rancher deployed in an organization? Is it running on laptops like you might see with Docker images?

A: No, Rancher is more of a cluster management component. Moving a container from a developer's laptop to multiple hosts in the datacenter or a cloud environment is where a container service like Rancher becomes really compelling. The value of Rancher comes in when you are moving an application through the DevOps lifecycle. It is an orchestration and automation platform tied together with infrastructure services that are specific to containers.

With containers you have portability of the image, making them ubiquitously deployable on Linux and, in the near future, on Windows; you have this new component that can run anywhere. Fundamentally, though, the networking, storage and all the other elements surrounding the container are still very different from cloud to cloud, host to host, and so on.

Rancher implements consistent storage and networking around the containers. By creating "micro-SDNs" between containers and deploying consistent storage services attached to containers (which can be ported between environments), an organization gains a layer of computing that runs identically on any infrastructure.

The same way a container runs identically on your laptop and in the cloud, an entire application blueprint can be created, and not only will the containers run properly but the networking and storage will as well. Without having to integrate with the underlying cloud service, you get orchestration, management and even load balancing that is consistent. The real value here is complete portability of an entire service.
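To make the blueprint idea concrete, here is a sketch of how networking and load balancing can travel with the application definition itself. It uses plain Docker Compose syntax rather than Rancher's own stack format, and the names and images are hypothetical stand-ins:

```yaml
# Hypothetical blueprint: the load balancer and network topology are
# part of the definition, not hand-configured per environment.
version: "2"
services:
  lb:
    image: nginx:1.9            # stand-in load balancer for illustration
    ports:
      - "80:80"
    networks:
      - frontend
  app:
    image: example/app:2.0      # hypothetical application image
    networks:
      - frontend
      - backend
  cache:
    image: redis:3.0
    networks:
      - backend                 # reachable from the app only, not from outside
networks:
  frontend:                     # the "micro-SDN" between lb and app
  backend:                      # a second isolated network for internal traffic
```

The point is that the isolation between tiers is declared once in the blueprint, so it behaves the same whether the stack lands on a laptop, a datacenter host or a cloud VM.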

Q: Who are the typical users of Rancher within a large enterprise today?

A: Today Rancher is open source (and in beta) and we have tens of thousands of downloads. We also have over 1,000 companies who have formally joined the beta program. What we find is that DevOps teams, cloud architects and the development teams that own applications are the ones who get most excited about Rancher within an organization.

Once Rancher is in place and a team has deployed an application, it tends to get shared. The platform is very collaborative and designed to be multi-tenant, so it expands quickly. Rancher's users range from a single point-focused project all the way to an entire IT department making containers a priority and deploying them enterprise-wide.

Q: Let's step back in time a bit and think back to our days at Teros and application security. We haven't talked much about security on this call, and in the past security has been viewed as slowing things down, being slow to adopt change and not really fitting any of the dynamics we have talked about.

Where do you see security being incorporated in these new environments?

A: I think it depends on the organization. Customers want to understand how we isolate environments and how we handle encryption and network traffic. Key and secret management is an important part of running an application as well.

I think security, like so many other things, is a reflection of the size and maturity of the organization. For organizations that have aggressively adopted DevOps, security has joined the party. Those teams see this as a way to accelerate innovation, and instead of resisting change they are deploying new tools to enhance security.

In organizations where DevOps is a newer phenomenon, security can be an issue. You may run into teams that are more traditional and tend to push back on the pace of change. That is a fight (resisting change) that security will always lose.

It's just a matter of time before security organizations that haven't adopted this pace of change realize they are a critical part of the process. They need to be coming up with new solutions instead of resisting where the business is going.

Most security teams want to be part of this dynamic, faster changing world. Many organizations we talk to have either started to get security on board or are starting that process now. I don't think we will be talking about this problem in 3 or 4 years.

Q: I think we will see a number of new security innovations as these microservices evolve. We are starting to see a new term, SecOps, emerge as well. I'm excited to see what it brings.

A: With the ability to do things like distribute agents as microservices alongside containers, inspect the network and model behaviors (which containers talk to which), security will just get better. We are seeing machine learning and big data applied to take all of this new information about how applications normally communicate and behave, and use it to apply security.

A wink of the Load Impact eye to Shannon for taking the time for this interview and his market insight. This is an exciting time to be in the evolving world of DevOps and it is great to hear directly from one of the most innovative companies in this space.
