When we researched the topic of Agile development back in 2008, it was against a backdrop of widespread suspicion of the approach. While hype was building around iterative development that appeared to shun the structure and discipline of traditional methods, many equated it with corner-cutting. Others expressed concerns over the apparent prioritisation of speed over ‘proper’ upfront analysis, design and planning activity.
A particular accusation was that Agile encouraged a ‘lazy’ trial-and-error approach that didn’t sit well with the needs of development teams working on business-critical applications in a mainstream environment.
Agile or waterfall: it all needs discipline
Such concerns and objections were partly based on ignorance. As we discussed in that report over 12 years ago, Agile was always just as much about discipline and structure as the waterfall methods that everyone was more used to back then. Indeed, you might legitimately argue that you needed to be a lot more organised and process-driven when operating in a fast-moving, feedback-driven environment, in which requirements are never set in stone.
But whatever the roots of their concerns, Agile sceptics could hold out against all of the evangelism because, quite frankly, most didn’t have a strong motivation to change. Everyone said they were under more delivery pressure than ever, but when have developers ever said anything different?
The truth is that the level of volatility businesses and development teams experienced back in 2008 was nothing compared to what they face today. Deliver a highly differentiated digital innovation nowadays, and the chances are it will be table stakes for operating in your market within a month. Leave it another few months and it’ll probably be classed as legacy, as competitors take it to the next level or change things up completely.
Trial-and-error versus fail-fast
Against this background, it’s interesting to note that while the term ‘trial-and-error’ used to have negative connotations, in 2021 taking a science-driven, experimental approach, and applying the ‘fail fast’ principle, is considered essential to keeping up and staying relevant in an ever-changing digital world.
With the Agile debate now largely over, the conversation has moved on to DevOps, Continuous Delivery and how far to push automation across the entire application lifecycle. Indeed, most of the discussions we now have in this space with IT leaders and practitioners concern how to build on success in niche areas. A recurring question is how to scale the use of iterative delivery approaches – from development, through deployment, to operations and continuous improvement – in a highly automated manner.
Unlike with Agile development, however, many organisations moving towards full DevOps don’t have the luxury of time to work through their hang-ups. So if you’re now facing DevOps scepticism, download our 2008 report as a reminder of the kind of misplaced perceptions that can stand in the way of progress. I suspect a lot of the objections will feel very familiar…
Dale is a co-founder of Freeform Dynamics, and today runs the company. As part of this, he oversees the organisation’s industry coverage and research agenda, which tracks technology trends and developments, along with IT-related buying behaviour among mainstream enterprises, SMBs and public sector organisations.