
DevOps Transformation: Violating The Prime Directive

By Bart Driscoll, Global Innovation Lead – Digital, Dell Technologies Consulting | October 28, 2014

DevOps success is antithetical to an enterprise’s natural tendency to operate within the boundaries of Star Trek’s Prime Directive. Rather than collaborate across planets (a.k.a. departments) and raise the collective capacity and capability of the enterprise, many organizations prefer …

“No identification of self or mission. No interference with the social [or technological] development of said planet. No references to space or the fact that there are other worlds or civilizations.”  -Star Trek – Prime Directive.

For DevOps, this model does not compute.

DevOps is disruptive; it is a framework for transforming application delivery within an organization.  DevOps targets the people, processes, and tools associated with application development and deployment operations to highlight inefficiencies in how changes are created and promoted through the software development lifecycle, or SDLC.  Once inefficiencies in the SDLC are revealed, DevOps leverages the principles and practices of lean and agile to eradicate waste, improve consistency, and accelerate throughput systematically.

Most enterprises struggle in this space because traditional organizational structures and methodologies foster knowledge silos and employ heavy governance practices designed to slow down change in an attempt to improve stability and resiliency. This focus isn’t surprising: in most enterprises, the only thing worse than missing a release deadline is taking down production. The problem is further exacerbated by disconnected tooling and manual procedures that are complex, highly error-prone, and dependent on tribal knowledge to deploy applications successfully. Many enterprises, large and small, are realizing that this model is not sustainable in the new economy, where speed and agility separate the disruptors from the disrupted, and where high costs prevent IT from making strategic investments that would improve the profitability and productivity of the business.

DevOps addresses these challenges head-on by employing an incremental, iterative approach borrowed from agile and by leveraging concepts from lean, like value stream mapping, to identify steps along the SDLC that are particularly inefficient, expensive, and repetitive. Solutions are developed by cross-functional teams of experts from Development, Quality Assurance, Operations, Infrastructure, and/or Security/Compliance, so that the unique perspectives and goals of all SMEs are considered. As a result, DevOps solutions are often able to optimize or eliminate manual checks and other handoffs that were originally created to ensure that “an expert” reviewed the change prior to deployment. In addition, DevOps focuses heavily on building an integrated, automated tool chain, both to eliminate repetitive manual tasks, like server builds or application deployments, and to shorten the feedback loop to developers and operations when an issue or defect is found.
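A value stream map boils down to simple arithmetic: for each SDLC step, compare the time a change is actively worked on against the time it sits idle in queues, handoffs, and approvals. The sketch below illustrates the calculation with made-up stage names and numbers (they are not from the article), showing how wait-dominated steps reveal themselves.

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    work_hours: float   # time someone is actively working on the change
    wait_hours: float   # time the change sits idle (queues, handoffs, approvals)

# Hypothetical value stream for one release; all figures are illustrative.
stream = [
    Step("code & unit test",      16.0,   8.0),
    Step("manual QA handoff",      4.0,  72.0),
    Step("change-approval board",  1.0, 120.0),
    Step("manual deployment",      6.0,  24.0),
]

total = sum(s.work_hours + s.wait_hours for s in stream)
for s in stream:
    waste = s.wait_hours / (s.work_hours + s.wait_hours)
    print(f"{s.name:24s} waiting {waste:5.1%} of the time")

# Process efficiency: fraction of elapsed time spent doing real work.
efficiency = sum(s.work_hours for s in stream) / total
print(f"process efficiency: {efficiency:.1%}")
```

Steps where waiting dominates (here, the approval board) are the candidates for the optimized or eliminated handoffs the paragraph above describes.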

DevOps accomplishes this tool chain vision by directing that every system, configuration, and application along the delivery lifecycle be treated as code and managed via version control. Storage, load balancers, network configs, application code, run-time configurations, etc. are all managed through APIs and custom scripts that can be deployed and/or rolled back on demand. Versions of applications and the corresponding infrastructure are managed in parallel to preserve the integrity of the overall solution. For example, introducing new query functionality into a legacy application may require a newer version of the database, a code deployment, and schema changes to the data model. In this case, it is critical to version the application code, software libraries, and data model as a single instance because of the interdependencies between code and data, as well as between data and software configuration.
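One simple way to version an application and its environment "as a single instance" is a release manifest kept in version control. The sketch below is hypothetical (the service name, versions, and migration file are invented for illustration); it pins the app, database, and runtime together and fingerprints the combination so any change to any component produces a new, reviewable version.

```python
import hashlib
import json

# Hypothetical release manifest: the application, its runtime, and the
# database schema are pinned together so they deploy, or roll back, as one unit.
manifest = {
    "release": "2014.10-r3",
    "app":      {"artifact": "orders-service.war", "version": "5.2.0"},
    "database": {"engine": "postgres", "version": "9.3",
                 "schema_migration": "V42__add_query_index.sql"},
    "runtime":  {"jvm": "1.7", "heap_mb": 2048},
}

# A content hash of the whole manifest identifies this exact combination;
# committing the manifest to version control makes every change auditable.
fingerprint = hashlib.sha256(
    json.dumps(manifest, sort_keys=True).encode()
).hexdigest()[:12]
print(f"release {manifest['release']} fingerprint {fingerprint}")
```

Because the hash covers code, schema, and runtime together, a rollback means redeploying an earlier manifest in its entirety, never mixing yesterday's schema with today's code.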

By treating the environment and application as a packaged instance and managing that instance through code, an enterprise can start moving toward the ultimate goal: Platform-as-a-Service, or PaaS. In PaaS, automation handles most provisioning, configuration, testing, monitoring, and deployment tasks along the SDLC. By shifting these tasks to a machine, people and dollars are freed up for more strategic initiatives. Additionally, automation eliminates manual handoffs and error-prone procedures, thereby accelerating the overall delivery pipeline while improving stability and resiliency.
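The provision-configure-test-deploy flow above can be sketched as a fail-fast pipeline: each stage runs in order, and the first failure stops the run and reports back immediately, which is the short feedback loop the text emphasizes. The stage names and checks below are illustrative, not a real toolchain.

```python
# Minimal pipeline sketch; each stage mutates a shared environment record
# and returns True on success. Real stages would call IaaS/config/test APIs.

def provision(env):
    env["servers"] = 2            # e.g. request VMs from an IaaS layer
    return True

def configure(env):
    env["configured"] = True      # e.g. apply versioned run-time config
    return True

def run_tests(env):
    return env.get("configured", False)   # fail fast if config was skipped

def deploy(env):
    env["deployed"] = True        # e.g. push the release manifest live
    return True

PIPELINE = [provision, configure, run_tests, deploy]

def run_pipeline(env):
    for stage in PIPELINE:
        if not stage(env):
            return f"FAILED at {stage.__name__}"   # immediate feedback
    return "SUCCESS"

print(run_pipeline({}))
```

Because every stage is code, the whole pipeline can itself live in version control and be improved incrementally, one automated handoff at a time.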

EMC IT recognized this challenge and kicked off the ePaaS project in 2012. Through their efforts to date, they have streamlined a legacy provisioning and procurement process for infrastructure that once took approximately four months to complete. This costly, slow process was replaced with an automated solution, accessed through a simple services portal, that not only reduced the timeline to less than one hour but also provided the framework for metering and tracking the cost associated with these systems.

EMC, like many enterprises, didn’t try to build Rome in a day. Rather, they focused on delivering core DevOps services, like IaaS, build/integration, and deployment engines, and then started layering on additional services such as testing and monitoring. These add-on capabilities extend the platform and continue to drive out cost while increasing the efficiency, speed, and agility of the automation platform.

A second recommendation is to build the core services to support a single application environment. Once that is complete, enterprises can broaden these core services to support a wider array of application environments and configurations. The choice of which environment to build first can be driven either by upcoming development projects or by working with EMC PS to conduct a portfolio rationalization exercise in which high-value applications in need of modernization are prioritized based on business value.

This pace-based approach was recently recommended by Gartner’s own Chief DevOps Architect, Ruston Vickers, as an enterprise standard and best practice.

So why commit to this much change and disruption?

In short, there really isn’t an option. Organizations that embrace DevOps and invest in building automation, like your competitors, typically see dramatic improvements in speed and agility that ultimately help drive cost and waste out of the system. While results may vary, a number of studies have shown that companies that invest in DevOps and implement automation have experienced…

  • 30x more frequent code deployment
  • 2x higher rate of success deploying changes
  • 12x faster mean-time-to-recovery (MTTR)

Additionally, over 80% of senior IT leaders in a recent Agile State of Union Survey reported that leveraging agile and DevOps within their organization has led to improvements in productivity, quality, visibility, alignment, and responsiveness. With ROI like this, change is necessary.

In the immortal words of the Star Trek nemeses, the Borg, “Resistance is futile.”


About Bart Driscoll


Global Innovation Lead – Digital, Dell Technologies Consulting

Bart Driscoll is the Global Innovation Lead for Digital Services at Dell Technologies. This practice delivers a full spectrum of platform, data, application, and operations related services that help our clients navigate through the complexities and challenges of modernizing legacy portfolios, implementing continuous delivery systems, and adopting lean, DevOps, and agile practices. Bart’s passion for lean, collaborative systems, combined with his tactical, action-oriented focus, has helped Dell Technologies partner with some of the largest financial services and healthcare companies to begin the journey of digital transformation.

Bart has broad experience in IT, ranging from network engineering to help desk management to application development and testing. He has spent the last 22 years honing his application development and delivery skills in roles such as Information Architect, Release Manager, Test Manager, Agile Coach, Architect, and Project/Program Manager. Bart has held certifications from PMI, Agile Alliance, Pegasystems, and Six Sigma.

Bart earned a bachelor’s degree from the College of the Holy Cross and a master’s degree from the University of Virginia.
