Leading experts bare all about the DevOps movement

The DevOps movement is rising, and an increasing number of IT professionals are keen to adopt this new way of working in order to achieve optimum collaboration between their Development and Operations departments. The ability to react quickly to customer demands is a top priority for businesses all over the world, and the benefits of DevOps are rapidly becoming recognised as offering fantastic business value.

Rackspace has released an Ebook and infographic, The DevOps Mindset: Real-World Insights from Tech Leaders, to help you realise and implement your own DevOps practices within your organisation.

The Ebook shares valuable insights from practising DevOps leaders, with a key focus on the need for enhanced collaboration, measurement and sharing across all aspects of a business. The DevOps Mindset showcases their unique perspectives, challenges and achievements, as well as the catalysts that led them to adopt a DevOps mindset.

By balancing the technical and social sides of your development and operational processes, you can learn and advance much more quickly towards your company goals. Letting one side outpace the other results in automation without collaboration, and too little thought about how your ideas and services will actually reach your customers.

DevOps advocate Jim Kimball, Chief Technology Officer at HedgeServ, says: “I think the fundamental shift toward DevOps started when we got away from focusing on individual team goals and elevated our conversation to organizational goals and let the teams drive toward them.”

“To achieve true DevOps collaboration, you need your employees to really think and act as one, not just be merged together in name only. By pushing communication from the start, everyone gets a better feel for others’ needs and how they do their jobs,” said James Kenigsberg, Chief Technology Officer at 2U, Inc.

This awesome Ebook takes a Q&A format and delves deeper into how this new form of agile collaboration is sweeping through the software and IT industries.

It also offers useful takeaway tips for business leaders considering transforming their company culture towards a DevOps methodology.

We think this Ebook from Rackspace is a grade A piece, and an asset to anyone contemplating DevOps and this innovative way of reaching new levels of productivity.

Recent Rackspace study shows businesses adopting DevOps practices at a remarkable rate

What do you think – is DevOps just a fad, or is it here to stay? Well, Rackspace recently commissioned the independent technology market research specialist Vanson Bourne to answer that very question. The study surveyed 700 global technology decision-makers and found that businesses now recognise DevOps as an established practice, with adoption figures soaring at an extraordinary rate. Companies are seeing significant business value in implementing DevOps as part of their everyday practices.

So let’s look at the facts according to the Rackspace DevOps Adoption Study

What was previously recognised as a niche domain, implemented by only a select few, is now seeing widespread adoption and considerably transforming the way IT is viewed across a huge range of industries.

61% of those surveyed highlighted customer satisfaction as the key incentive for DevOps adoption: it enables businesses to deliver better value to their customers through technology and to cut inefficiency, reducing delivery time to the customer.

Of those utilising DevOps practices and setting clear business goals at the beginning of every project, 57% saw an increased customer conversion or satisfaction rate.

The official Adoption Study infographic highlights that 66% of respondents have already implemented DevOps practices, and that 79% of those who have not plan to do so by the end of 2015.

It is clear DevOps is increasingly being recognised as delivering real business value. A massive 93% reported setting clear end goals for their DevOps initiatives, showing a definite focus on significantly improving customer satisfaction for a long-term positive impact on the business as a whole.

In a nutshell – DevOps allows businesses to consider the ways in which they organise and structure their company to initiate better ways of working. It creates opportunities for businesses to deliver better experiences to their customers faster, broaden the range of services they offer and better serve their business by using data more proactively.

A big thank you to Rackspace for these figures from the Adoption Study. It’s fantastic to see this industry expanding so rapidly, and we’re looking forward to seeing what the future holds in this space.

3 more #DevOps litmus tests…

Back in August we wrote about 3 “litmus test” questions for #DevOps in your organisation. We’d like to add 3 more that focus on the operational aspects of DevOps.

(1) Does Ops attend your Scrums?

Not every organisation can easily restructure to a DevOps model with Dev & Ops fully integrated into cross-functional teams (or at least not at first).

But one thing we can do to build a bridge between Dev & Ops is to participate in the Agile/scrum development process.

So does Ops have representation at your daily standups? Are they participating in your sprint planning and retrospectives?

If they aren’t, then you fail this DevOps litmus test!

(2) Where’s your Ops repo?

One of the key tenets of the DevOps CALMS model is A for Automation. Automation generally implies code, and code should be in source control.

We generally find that most Ops teams moving towards a DevOps model and investing in automation have their own source code repository to store their code. Some people might argue that automation code related to a specific application or service should actually be in the repo with the application code, which is also a valid pattern.

Either way – the litmus test is “where’s your Ops repo?”. If the answer is “we don’t have one”… you’re doing scripting, but you’re not doing DevOps!

(3) What’s your github username?

OK, I know that not everyone uses GitHub (or Git as their DVCS), but you tend to find that most DevOps people have a GitHub account so they can contribute back to the community: sharing their work, contributing to open source projects, creating their own projects or forks, and so on. I’d argue that having a basic working knowledge of Git is just about a “must have” skill for modern DevOps people.

The generic form of the question for Ops people is “what are your login details, and how do you access your Ops repo (from question #2 above)?”. If the answer is “I don’t have one” or “only the people in the DevOps team have access”, you may have created a DevOps silo that will become a barrier to wider DevOps adoption across your organisation.

-TheOpsMgr

DSC Tooling: cDscResourceDesigner and DscDevelopment

In PowerShell v4 – the first release of Desired State Configuration – the experience of authoring DSC resource modules isn’t terribly pleasant.  You have to produce both a PowerShell module and a schema.mof file, both of which have to follow certain rules, and agree with each other about the parameters that can be passed to the module’s commands.  If you get it wrong, your resource won’t work.
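To make this concrete, here is a minimal, purely illustrative sketch of the two halves in v4 (the resource name and properties are invented for this example); the point is that the param() blocks in the PSM1 must mirror the properties declared in the schema.mof.

MyModule_cFile.schema.mof:

    [ClassVersion("1.0.0"), FriendlyName("cFile")]
    class MyModule_cFile : OMI_BaseResource
    {
        [Key]   string Path;
        [Write] string Contents;
        [Write, ValueMap{"Present","Absent"}, Values{"Present","Absent"}] string Ensure;
    };

MyModule_cFile.psm1 (abridged):

    # Get-TargetResource reports the current state of the resource.
    function Get-TargetResource
    {
        param (
            [Parameter(Mandatory)] [string] $Path,
            [string] $Contents,
            [ValidateSet('Present','Absent')] [string] $Ensure = 'Present'
        )

        if (Test-Path -Path $Path)
        {
            return @{ Path = $Path; Contents = (Get-Content -Path $Path -Raw); Ensure = 'Present' }
        }

        return @{ Path = $Path; Contents = $null; Ensure = 'Absent' }
    }

    # Test-TargetResource returns $true/$false and Set-TargetResource enforces the
    # desired state; both must accept exactly the same param() block shown above.
    # Let the parameter names or types drift out of sync with the schema.mof and
    # the resource simply won't load.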

One of the extra DSC-related goodies Microsoft has released is an experimental module called xDscResourceDesigner.  It contains several functions that help to take the pain out of this authoring experience.  For example, you can run a Test command on a resource to make sure that its PSM1 file and schema.mof are both legal and in sync.  DscDevelopment, another of Steve Murawski’s modules from Stack Exchange, takes this concept a step further, and allows you to do things like automatically generate a schema.mof file from a PSM1 file.
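For example, scaffolding and then testing a resource looks roughly like this (cmdlet and parameter names are as we recall them from the experimental module; run Get-Command -Module xDscResourceDesigner to confirm what your copy actually exposes):

    # Describe the properties, then let the module generate a matching
    # schema.mof and skeleton PSM1 instead of writing them by hand.
    Import-Module xDscResourceDesigner

    $properties = @(
        New-xDscResourceProperty -Name Path     -Type String -Attribute Key
        New-xDscResourceProperty -Name Contents -Type String -Attribute Write
        New-xDscResourceProperty -Name Ensure   -Type String -Attribute Write -ValidateSet 'Present', 'Absent'
    )

    New-xDscResource -Name MyModule_cFile -FriendlyName cFile -Property $properties -Path 'C:\Source\MyModule'

    # Later, check that the PSM1 and schema.mof are still legal and in sync.
    Test-xDscResource -Name 'C:\Source\MyModule\DSCResources\MyModule_cFile'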

cDscResourceDesigner is a community-modified version of the original xDscResourceDesigner module.  Most of our focus, so far, has been on the Test-cDscResource function, which is called from Invoke-DscBuild and becomes part of our continuous delivery pipeline.

Microsoft’s experimental version of this Test command was pretty good, but it would report failures on some modules and schema files that are perfectly valid, as far as WMI and the Local Configuration Manager are concerned.  In fact, several of Microsoft’s own DSC resources (both experimental and those included in PowerShell 4.0) would fail the tests of the original version of this command.  We’re working on identifying and fixing these types of false failures, so the Test-cDscResource command can be trusted as a reliable reason to fail a build.

We’ve also got an eye on the future.  One of the big DSC-related enhancements coming in PowerShell v5 is the ability to author resources using a new class definition language.  There will no longer need to be a schema.mof file separate from the PowerShell code, as the class definitions will contain enough information for DSC to do everything it needs to do.  Once that new version is released and deployed, you’ll have a much better experience by using this new resource definition language.  At that time, we plan to add new commands to DscDevelopment or cDscResourceDesigner which will allow you to automatically convert resource modules between v4 and v5 syntax, in either direction.
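Based on the preview material Microsoft has shown so far (so treat this as a sketch that may well change before release), the v5 class-based equivalent of a simple resource looks roughly like this, with no separate schema.mof at all:

    # Hypothetical class-based resource; syntax per the public v5 previews.
    [DscResource()]
    class cFile
    {
        [DscProperty(Key)]
        [string] $Path

        [DscProperty()]
        [string] $Contents

        [DscProperty()]
        [ValidateSet('Present', 'Absent')]
        [string] $Ensure = 'Present'

        # Get() reports current state, Test() checks it, Set() enforces it.
        [cFile] Get()
        {
            if (Test-Path -Path $this.Path) { $this.Ensure = 'Present' }
            else { $this.Ensure = 'Absent' }
            return $this
        }

        [bool] Test()
        {
            return ((Test-Path -Path $this.Path) -eq ($this.Ensure -eq 'Present'))
        }

        [void] Set()
        {
            if ($this.Ensure -eq 'Present') { Set-Content -Path $this.Path -Value $this.Contents }
            else { Remove-Item -Path $this.Path -Force }
        }
    }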

PowerShell DSC tooling updates released!

Over the past few weeks, we’ve been working on incorporating Desired State Configuration into our continuous deployment pipeline.  We’ll be using this technology internally, eating our own dog food, but we also want to be able to help our clients leverage it.

To get started, we decided to take advantage of the excellent work and experience of Steven Murawski.  During his time at Stack Exchange, Steven was one of the first and most visible adopters of DSC in a production environment.  Over time, he developed a set of PowerShell modules related to DSC, which have been published as open source over at https://github.com/PowerShellOrg/DSC.

We’ve been working on some updates to this code and are now pleased to announce that it’s publicly available via GitHub at https://github.com/devopsguys/DSC/tree/development

A list of the changes that Dave Wyatt has been working on within DevOpsGuys over the past couple of months can be found at https://github.com/PowerShellOrg/DSC/pull/80

The most extensive changes, and many of the design decisions, were made in collaboration with Steven Murawski. There’s still more work to be done before this branch is ready to be merged into master; we just wanted to get this code out into the public repo as soon as possible. The motivation behind these changes was to get the tooling modules to a point where we can train clients to use them in a continuous delivery pipeline.

The biggest priorities were creating examples of the folder structures required by DscConfiguration and DscBuild, and simplifying the Resolve-DscConfigurationProperty API and code.

We’ve also tried to improve the overall user experience by making minor changes in other places: making Invoke-DscBuild run faster, making failed Pester tests or failed calls to Test-cDscResource abort the build, making Test-cDscResource stop reporting failures for resource modules that are actually valid, and so on.

We’d love your feedback on the changes we are making, so please get in touch with your comments.

DSC Tooling: The DscConfiguration module

The DscConfiguration module is one of the tooling modules originally written by Steven Murawski for Stack Exchange.  It has since been released as open source, and the main repository for the code is hosted by PowerShell.org.

The DscConfiguration module provides us with two main pieces of functionality.  First, it allows us to store the ConfigurationData hashtable in several smaller, more manageable data files.  This is a vast improvement over maintaining the entire ConfigurationData hashtable in a single large file.  Once you scale out beyond a few machines, you quickly wind up with a file that is thousands of lines long, and it is easy to overlook errors in a file of that size.

The other major piece of functionality provided by the DscConfiguration module is the introduction of the concept of Sites, Services, and Global settings, in addition to Nodes.  Instead of needing to define every setting directly on each Node – which could introduce a large amount of duplication throughout the ConfigurationData table – you can define settings that are inherited by all nodes, by nodes in a specific site, or by arbitrary groups of similar nodes (which are referred to as Services in the DscConfiguration module).
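As a rough illustration of the idea (the key names here are indicative only; the module’s example folder structures define the exact layout it expects, and each section actually lives in its own small data file), the merged ConfigurationData ends up shaped something like this:

    @{
        AllNodes = @(
            @{ NodeName = 'WEB01'; Site = 'London';  ServiceList = 'FrontEnd' }
            @{ NodeName = 'WEB02'; Site = 'Seattle'; ServiceList = 'FrontEnd' }
        )

        # Inherited by every node
        GlobalSettings = @{ NtpServer = 'time.example.com' }

        # Inherited only by nodes in the named site
        SiteData = @{
            London  = @{ DnsServer = '10.1.0.10' }
            Seattle = @{ DnsServer = '10.2.0.10' }
        }

        # Inherited by arbitrary groups of similar nodes ("Services")
        Services = @{
            FrontEnd = @{ WebsiteName = 'MyApp'; AppPoolName = 'MyAppPool' }
        }
    }

When a property is resolved, the most specific value wins; as we understand it, a setting defined on the node itself overrides one defined for its service, which in turn overrides site-level and global values.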

When we began working with the Stack Exchange modules, the DscConfiguration module’s code had the most complexity.  As Steven described it, the module had grown organically over time, in response to changing needs within Stack Exchange.  We have collaborated with Steven to simplify the code in this module, making it easier to understand and, in places, better documented.

Here are some of the changes we’ve made to the module at DevOpsGuys:


We moved the Resolve-DscConfigurationProperty function from the DscBuild module to the DscConfiguration module.

It makes more sense for it to live here, with all of the rest of the code that has explicit knowledge of how the ConfigurationData hashtable has been structured.  We speculated that this function was originally in the DscBuild module due to a block of code which would look up the value of the $ConfigurationData variable if it were not explicitly passed into the function.  Refer to this gist for the original code and our updates.

Presumably, the intention of that code was to allow you to call Resolve-DscConfigurationProperty from a configuration script without having to bother to explicitly pass along a value to the -ConfigurationData parameter on every call.  The problem with the original implementation is that it used the Get-Variable cmdlet, which will only resolve variables in the same script module as its caller.  If you were using Invoke-DscBuild, you wouldn’t notice the difference, because that command also contains a $ConfigurationData variable that would eventually be resolved.  If you tried to call a configuration from outside of Invoke-DscBuild, though, the calls to Resolve-DscConfigurationProperty would fail (unless you passed in -ConfigurationData explicitly).

By converting this block to use $PSCmdlet.GetVariableValue() instead, the function can now be called from a configuration even if you’re not using the Invoke-DscBuild command from the DscBuild module, and without explicitly using the -ConfigurationData parameter.
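In simplified form (this is a sketch of the pattern, not the actual function body), the change looks like this:

    function Resolve-DscConfigurationProperty
    {
        [CmdletBinding()]
        param (
            [hashtable] $ConfigurationData,
            [string] $PropertyName
        )

        if (-not $ConfigurationData)
        {
            # Old approach: Get-Variable resolves against this script module's own
            # scope chain, so callers outside Invoke-DscBuild never get a value.
            #   $ConfigurationData = Get-Variable -Name ConfigurationData -ValueOnly -ErrorAction SilentlyContinue

            # New approach: GetVariableValue() resolves the variable in the
            # caller's session state, so a configuration script can call this
            # function directly without passing -ConfigurationData.
            $ConfigurationData = $PSCmdlet.GetVariableValue('ConfigurationData')
        }

        # ... the property lookup itself continues from here ...
    }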


We took out the concept of “Applications”, replacing it with the ability to define a hierarchy of properties.

In the original Resolve-DscConfigurationProperty function, there were two ways you could use it:  you could pass in a -PropertyName string, or an -Application string.  If you used -Application, the function would look up values in a slightly different place, and would return a hashtable containing whatever values were needed to install a particular application (Name, ProductId, SourcePath, Installer, etc.)  The code itself was quite complex, due to having checks for whether $PropertyName or $Application was being used in virtually every helper function.

We speculated that the reason for this Application parameter’s existence was that Stack Exchange needed a way to return a container for other properties, instead of just a single flat PropertyName.  In order to simplify the code and make it more flexible for users, we extended this concept to all properties.  You can now pass in a value for PropertyName which looks like a file system path, and it will resolve those values by looking through nested hashtables.  Refer to this gist for examples of how this looks for the caller, before and after our changes.  One advantage of this new approach is that you can override individual values of the nested hashtables without duplication; with the old Application implementation, you would have to repeat the entire table even if you only wanted to override one key/value pair.
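Since the gist isn’t reproduced here, here is a rough before-and-after sketch (the data layout, the separator character and the surrounding parameters are all illustrative):

    # ConfigurationData fragment: a nested hashtable under one property name.
    #   MyApp = @{ Name = 'MyApp'; ProductId = '...'; SourcePath = '\\server\share\MyApp.msi' }

    # Old style: -Application returned the whole table, and overriding any single
    # value at a more specific scope meant repeating the entire table.
    #   $app = Resolve-DscConfigurationProperty -Application 'MyApp'

    # New style: a path-like PropertyName walks the nested hashtables, so one key
    # can be overridden per node, site or service without duplicating the rest.
    $sourcePath = Resolve-DscConfigurationProperty -PropertyName 'MyApp/SourcePath'
    $productId  = Resolve-DscConfigurationProperty -PropertyName 'MyApp/ProductId'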

DSC Tooling: The DscBuild module

The DscBuild module contains the Invoke-DscBuild function.  This is essentially a complete implementation of a DSC continuous delivery pipeline which takes care of producing all of the artifacts which need to be published to your pull servers:  MOF documents, zip files for resource modules, and checksums for both.  It also uses Pester to run any unit tests on your resource modules, and runs the Test-cDscResource command from the cDscResourceDesigner module on them as well.  Failures in either of these two types of tests will abort the build.

It’s not strictly necessary to use Invoke-DscBuild; you could reproduce this process in a product like TeamCity using individual steps for each part of the process: running unit tests, compiling configurations, etc.  The Invoke-DscBuild function can serve as a template for setting up your own continuous delivery pipeline, or you can use it as-is.
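If you did want to break the process into explicit build steps, the skeleton looks roughly like this (the paths are invented, and the exact parameters of Invoke-Pester, Test-cDscResource and New-DscChecksum may differ between versions):

    # 1. Run the Pester unit tests and fail the build on any failure.
    $results = Invoke-Pester -Path .\Tests -PassThru
    if ($results.FailedCount -gt 0) { throw "Pester reported $($results.FailedCount) failed test(s)." }

    # 2. Validate each resource's PSM1/schema.mof pair.
    Get-ChildItem -Path .\DSCResources -Directory | ForEach-Object {
        if (-not (Test-cDscResource -Name $_.FullName)) { throw "Resource failed validation: $($_.Name)" }
    }

    # 3. Compile the configuration into MOF documents ($configData assembled elsewhere).
    . .\Configurations\MyConfig.ps1
    MyConfig -ConfigurationData $configData -OutputPath .\Build\Configuration

    # 4. Zip each resource module for the pull server, then checksum everything.
    Add-Type -AssemblyName System.IO.Compression.FileSystem
    [System.IO.Compression.ZipFile]::CreateFromDirectory('C:\Source\Modules\MyModule', 'C:\Build\Modules\MyModule_1.0.zip')
    New-DscChecksum -ConfigurationPath .\Build\Configuration -Force
    New-DscChecksum -ConfigurationPath .\Build\Modules -Force

In practice, we just let Invoke-DscBuild orchestrate all of those stages for us from the build server.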

We have made some small changes to the Invoke-DscBuild process to improve its performance and functionality.  Calls to Test-cDscResource were tweaked slightly to cut down on the number of calls to Get-DscResource (which is an extremely slow command, for whatever reason), and we modified the process so it only tests and rebuilds module zip files if the version number of the module has changed from what is already built on disk.  When we first reviewed the module, failed unit tests were not causing the build to abort; this has been corrected as well.

On the whole, we found the DscBuild module to be a great starting point for producing a DSC-based CD solution.
