What You Missed at PuppetConf 2016

by Brian Baggett

Last month, Puppet’s annual PuppetConf was held in San Diego, and if you weren’t able to attend, you missed a great conference. Fortunately, we can tell you what you missed.

Back in April, Puppet began helping customers deploy container technologies like Docker, Kubernetes, Mesosphere, and CoreOS. Puppet Docker Image Build lets teams automate container builds the same way they automate server builds. All of this comes courtesy of Puppet’s Project Blueshift, a project focused on helping users adopt emerging technologies; the current focus is containers. There were also great sessions on deploying Puppet in both AWS and Azure. As deployments in public clouds have grown, Puppet has adapted to keep up.
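As a rough sketch of that workflow, you describe the image’s contents in an ordinary Puppet manifest and let the image_build tooling produce the container. The manifest below and the image name are hypothetical, and the exact `puppet docker build` invocation may differ by module version:

```puppet
# nginx.pp -- a hypothetical manifest describing what goes into the image.
# With the puppetlabs/image_build module installed, something like:
#
#   puppet docker build nginx.pp --image-name examplecorp/nginx
#
# builds a Docker image from this manifest, just as the same manifest
# could configure a server.

package { 'nginx':
  ensure => present,
}

file { '/etc/nginx/conf.d/default.conf':
  ensure  => file,
  source  => 'puppet:///modules/profile/nginx/default.conf',
  require => Package['nginx'],
}
```

The appeal is that the same declarative resources drive both server and container builds, so existing Puppet code carries over.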

Another big announcement was the introduction of Phased Deployments. This feature allows automators to use Facter facts stored in Puppet, such as environment or location, and deploy changes only to matching targets. In essence, this enables very focused, targeted deployments in large environments.
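The underlying idea can be sketched in plain Puppet code by gating a change on a fact. The `datacenter` fact and the version numbers below are hypothetical, chosen only to illustrate fact-based targeting:

```puppet
# Roll the new package version out only to nodes whose Facter fact
# matches the current deployment phase; all other nodes stay on the
# known-good version until the phase expands.
if $facts['datacenter'] == 'us-west-staging' {
  $nginx_version = '1.11.5'   # canary phase
} else {
  $nginx_version = '1.10.2'   # current stable
}

package { 'nginx':
  ensure => $nginx_version,
}
```

Phased Deployments lifts this pattern out of hand-written conditionals and into the orchestration layer, so operators can promote a change from one fact-defined group to the next.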

Puppet introduced “situational awareness” into its tools, starting with the newly announced Puppet Enterprise 2016.4. This technology is focused on giving developers insight into the state of their infrastructure and the impact of proposed changes.

We were happy to attend and speak about our partnership with Puppet at the Partner conference. Puppet is an instrumental part of our cloud management platform, and it was a pleasure to be recognized as the 2016 Puppet Solution Partner of the Year.

If there was one recurring theme of the conference, it was that Puppet sees the shift from managing traditional, on-premises services to off-premises, container-driven platforms, and is transitioning to become a platform that handles diverse technological needs.

Brian and Puppet