
We are witnessing another IT revolution

We live in a fantastic time… Not because the need to deliver quality software @ speed has changed, but because today we can combine people, processes and technology to make it happen.

There’s always been the need to release quality software @ speed in order to deliver business value, but the gap between IT methodologies and technology left that need unmet.

Today we can answer the question: “How can we turn business requirements into fully working production systems in the shortest possible time frame?”

This is possible thanks to the technology enablers which, in turn, have opened the doors to a change in how people interact, processes are followed and ultimately IT systems are delivered.

Today, after the Turing machine and the Internet, we are witnessing another IT revolution… It’s called DevOps.

Technology is the enabler

We got to this point in small steps: first, computers were invented. These machines could perform calculations much faster than human beings, which created new possibilities, and as a consequence people became more demanding. Computers were soon used in space programmes, the military, education and, of course, business.

But computers and programmers weren’t cheap. Mistakes were costly, and we adapted by creating software delivery methodologies whose goal was to prevent the delivery of buggy IT systems. This is when the Waterfall methodology bloomed. There was the need to get the requirements absolutely right, then the design, then the development and finally the testing. This was all driven by the illusion that, since computer programs are combinations of zeros and ones, we could decide in a similarly binary fashion what we wanted out of them, until we discovered that, as humans, we are not that binary. The Waterfall gated approach didn’t deliver on expectations. The majority of IT programmes driven by this methodology failed or delivered much later than required, often providing functionality that differed from the original specs and at a cost significantly higher than what had been budgeted.

Technology, once again, was the enabler. With computers getting faster and cheaper, and with the advent of the Internet, a number of new software systems emerged. Programmers were no longer part of a restricted elite: programming became mainstream. With more developers empowered to turn their ideas into reality, the amount of “utility software” grew as well. We also saw best practices and patterns emerge; concepts such as Source Code Management (SCM), Test Driven Development (TDD) and Continuous Integration aimed at breaking some of the barriers represented by the Waterfall gated approach. With these barriers partially removed, the need arose for greater collaboration between those who requested features (the business) and those who delivered them (IT). So in the late ’90s a new mindset made its way into the IT industry: Agile. Its goal was to bring business and IT together, to favour dialogue over gates and collective responsibility over finger pointing, and to break complex problems into small deliveries which, as a consequence, were faster, more responsive to business needs and of higher quality.

But we were still a long way from delivering quality systems at pace. The problem, once again, was that we started from the details rather than looking at the bigger picture.

The Bigger Picture

Yes, fundamentally the bigger picture has never changed since IT made its appearance. We need to deliver entire systems, not just parts of a system.

IT software delivery can be compared to a manufacturing plant: there are work centres that depend on each other. If part A enters the first work centre, once processed it becomes part A2; this is then delivered to the next work centre, which processes it to become, say, part B, and so on. Relating this to software delivery, we can easily see that business requirements enter the manufacturing line as part A, engineering processes them to produce an IT business delivery, say part B, which is then passed to the next work centre, operations, for deployment into production.

Every business can be thought of as the combination of three fundamental parts:

  • Inventory
  • Operational cost
  • Throughput

In order to be successful, every business needs to:

  • Reduce Inventory
  • Reduce Operational cost
  • Maximise Throughput

When we produce artefacts, whether parts or software systems, we are producing Inventory. Operational cost is the cost of the people, machines and processes that created the artefact. It’s only when the part is sold, or, by analogy, when the software system goes live, that Inventory becomes Throughput. Any barrier to product delivery therefore affects the business negatively; by contrast, every product delivery increases business success.
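
As a rough worked example (the figures below are invented purely for illustration), imagine a quarter in which ten features are built but only four go live. The sketch simply computes the three quantities defined above:

    # Illustrative sketch only: the figures are invented, not taken from a real programme.
    features_built = 10          # artefacts produced this quarter
    features_released = 4        # features actually live in production
    cost_per_feature = 20_000    # operational cost (people, machines, process) per feature
    value_per_release = 50_000   # business value realised per released feature

    operational_cost = features_built * cost_per_feature      # 200,000 spent
    throughput_value = features_released * value_per_release  # 200,000 realised
    inventory = features_built - features_released            # 6 features held as inventory

    print(operational_cost, throughput_value, inventory)
    # Every feature stuck in inventory has incurred operational cost
    # without yet producing any throughput.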

A system will only be delivered at the speed of its slowest work centre, a.k.a. the Theory of Constraints. What we’ve done with computers first, and with IT methodologies such as Waterfall and Agile later, has only sped up one work centre (engineering), not the entire system. Much of the disillusionment surrounding Agile nowadays is caused by this lack of vision for the bigger picture. The business is asking: why is IT still delivering software systems so slowly even though we are using Agile? Wasn’t Agile meant to deliver software quicker?

The answer is that Agile has sped up the “engineering” work centre, but systems are the result of the collaboration of various actors in the process: business for requirements, engineering for development and testing, operations for Continuous Delivery. Software systems are delivered at the speed of their slowest work centre. In order to speed up software delivery, we need to identify the slowest work centres (a.k.a. the bottlenecks), subordinate everything we do to the bottleneck, elevate the bottleneck until we break it and start all over again, all the while making sure that inertia doesn’t become the bottleneck.
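
A minimal sketch of that idea, using made-up weekly capacities for each work centre, shows why end-to-end delivery can never exceed the slowest step and why elevating anything other than the bottleneck changes nothing:

    # Hypothetical capacities (features per week) for each work centre.
    capacities = {
        "business (requirements)": 8,
        "engineering (dev + test)": 12,  # already fast, thanks to Agile
        "operations (release)": 2,       # the current constraint
    }

    def system_throughput(caps):
        # The pipeline delivers at the rate of its slowest work centre.
        return min(caps.values())

    print(system_throughput(capacities))  # 2: operations is the bottleneck

    capacities["engineering (dev + test)"] = 20
    print(system_throughput(capacities))  # still 2: speeding up a non-bottleneck changes nothing

    capacities["operations (release)"] = 10
    print(system_throughput(capacities))  # 8: the constraint has moved to business,
                                          # and the improvement cycle starts again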

So what are the bottlenecks to break?

The answer varies from organisation to organisation. I’m pretty sure that people reading this article will be familiar with at least one of the following scenarios:

  • Infrastructure provisioning takes too long
  • Developers and Operations work in isolation, as distinct teams. Developers think that their job is done when they’ve finished coding. Operations, dreading that moment, want smooth, reproducible, automated and reliable production releases
  • Lack of test automation (or lack of testing in general)
  • Manual software and database deployments
  • Different environment landscapes, e.g. Dev is different from Integration, which is different from QA, which is different from UAT, which is different from Production (see the parity-check sketch after this list)
  • Change Requests for production releases take too long
  • Server software installed / maintained by external teams
  • Lack of Continuous Integration tooling
  • Lack of support for operations in the code (logging, resiliency)
  • Lack of monitoring
  • External teams imposing on Developers which tools to use
  • Inadequate tooling (e.g. SCM, CI, Testing automation, etc)
  • Poor requirements (e.g. not following a BDD mindset, lack of executable acceptance criteria)
  • Red tape, e.g. controls imposed by organisations to safeguard the stability of production systems
  • Unskilled staff (e.g. in development and Scrum)
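
To make one of these concrete, the environment-drift problem can be surfaced early with an automated parity check over environment definitions kept as code. The sketch below uses invented configuration values and is only meant to illustrate the idea:

    # Hypothetical environment definitions, held in version control as code.
    environments = {
        "integration": {"java": "17", "postgres": "15", "heap_gb": 2},
        "qa":          {"java": "17", "postgres": "15", "heap_gb": 2},
        "production":  {"java": "17", "postgres": "14", "heap_gb": 8},
    }

    def drift_report(envs, baseline="production"):
        """List every setting where an environment differs from the baseline."""
        base = envs[baseline]
        report = []
        for name, config in envs.items():
            if name == baseline:
                continue
            for key, value in config.items():
                if base.get(key) != value:
                    report.append(f"{name}: {key}={value}, {baseline} has {base.get(key)}")
        return report

    for line in drift_report(environments):
        print(line)
    # Any line printed here is exactly the kind of drift that makes a
    # production release behave differently from every rehearsal before it.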

By operating with a DevOps mindset, organisations are able to break these constraints and therefore deliver Quality @ Speed.

DevOps to the rescue

If you look at all of the above bottlenecks you’ll notice that they’ve all got one thing in common: lack of collaboration between people. DevOps is a mindset that aims to increase collaboration between people by getting them to work together as one team. A team is composed of all the actors who collaborate towards the delivery of a Business Capability. By collaborating, they can ensure that requirements are written following a BDD approach, code is developed with Operations’ needs in mind, infrastructure is provisioned and configured automatically, code is continuously built, tested and deployed to integration environments, and the delivered solution addresses non-functional requirements (NFRs) and is of high quality. The picture below gives a high-level overview of the DevOps mindset.


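As one concrete illustration of “requirements written following a BDD approach”, a requirement can be captured as an executable acceptance criterion. The sketch below uses a hypothetical discount rule and plain Python assertions purely to show the Given/When/Then shape; it is not taken from any real system:

    # Hypothetical business rule, expressed as an executable acceptance criterion.
    def order_total(items, loyalty_customer):
        """Loyalty customers get a 10% discount on orders over 100 (invented rule)."""
        total = sum(items)
        if loyalty_customer and total > 100:
            total *= 0.9
        return round(total, 2)

    def test_loyalty_discount_applied():
        # Given a loyalty customer with a basket worth more than 100
        items, loyalty = [60, 60], True
        # When the order total is calculated
        total = order_total(items, loyalty)
        # Then a 10% discount is applied
        assert total == 108.0

    test_loyalty_discount_applied()
    print("acceptance criterion passed")
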
In order to deliver on the DevOps vision, all stakeholders in a project need to collaborate, each delivering their part. The cornerstones of a successful DevOps strategy are:

  • Technology as an enabler, e.g. being able to use the Cloud allows for scalability and elasticity
  • Automation. Every phase in the SDLC needs to be automated, from requirements validation, to testing, to infrastructure provisioning, to configuration management, to Continuous Delivery
  • Repeatability. By automating, we remove the chance of human error. By repeatedly configuring our infrastructure and validating our system’s correctness and quality, we achieve consistent and fast feedback, which in turn allows us to react to changing scenarios. Additionally, repeatability allows us to treat any phase of the SDLC as BAU: a deployment to an integration environment should not be different from a deployment to production (see the sketch after this list).
  • Consistency. Automation and Repeatability allow our activities to be consistent. Consistency is the key to transparency and speed. Transparency allows us to identify improvement opportunities quickly and therefore to adapt the way we operate, continuously identifying and elevating our constraints to increase overall throughput.
  • Collaboration. If all people involved in a Business Capability delivery collaborate as one team, technology and people can deliver the best results.
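
A minimal sketch of the Repeatability point above: one deployment routine, parameterised only by the target environment, so that a production release follows exactly the same code path as an integration release. The environment names, artefact name and steps are invented for illustration:

    # Illustrative sketch: the same deployment steps run for every environment,
    # so the production release is a rehearsed, repeatable event rather than a special case.
    ENVIRONMENTS = ("integration", "qa", "uat", "production")

    def deploy(artifact, version, environment):
        if environment not in ENVIRONMENTS:
            raise ValueError(f"unknown environment: {environment}")
        steps = [
            f"provision infrastructure for {environment}",
            f"apply configuration for {environment}",
            f"deploy {artifact}:{version}",
            "run smoke tests",
            "publish release notes and update monitoring dashboards",
        ]
        for step in steps:
            # In a real pipeline each step would invoke provisioning,
            # configuration-management and deployment tooling.
            print(f"[{environment}] {step}")

    deploy("orders-service", "1.4.2", "integration")
    deploy("orders-service", "1.4.2", "production")  # identical steps, different target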