Published/updated: April 2013
By Dale Vile
There’s a trick we use at Freeform Dynamics when trying to figure out the true significance, if any, of the ‘latest big thing’ being promoted by IT vendors and pundits. We ask ourselves what will be left when the marketeers get bored with the associated buzzword or hot phrase and transition their messaging to the ‘next big thing’, which they inevitably will.
As a simple example, consider ‘Web 2.0’. It’s a term that most of us became sick of very quickly because it was so abused as a label to push product. When we teased it apart, though, it was clear that two underlying developments were taking place that were important to acknowledge – the internet becoming a more interactive place, and Web interfaces becoming a lot richer. Those developments have clearly stayed with us, and have had a big impact on how businesses exploit the Web and the way in which developers design applications to run over it.
With ‘SOA’ (Service Oriented Architecture), another term that’s now gone out of fashion, what stayed with us was software componentisation and open APIs becoming the norm rather than the exception when designing and building software. Whether people are using service bus technology, publishing catalogues of business services, and transforming the way they operate in the way the marketeers envisaged is secondary to this.
Road to nowhere
Coming back up to date, we ended up with some interesting conclusions when we looked beyond the messaging and posturing associated with cloud computing. We might baulk at the top-line claims of some of the more extreme evangelists who argue that we are heading towards a total shift from on-premises to hosted computing. We might also grow weary of the messaging contortions that are apparent as technology vendors attempt to explain how even the most mundane and traditional products can enable your ‘journey to the cloud’. An Ethernet port on a storage device does not a ‘Personal Cloud’ make.
However, unless you are totally closed to new ideas, you can’t help but recognise that there are some interesting things underpinning all of the bluster around cloud. We have seen some pretty big strides in terms of what’s possible in areas such as infrastructure virtualisation, software architecture and systems management over the past decade that have shifted many of the traditional lines that exist between what’s possible, what’s practical and what’s commercially viable. Things that would not have been considered sensible options for mainstream businesses five or ten years ago because of prohibitive cost and/or complexity are now there for the taking.
As an example, think back a few years to when we were all being bombarded with the hype around ‘Grid Computing’, one of the ‘big things’ of that era. Back then, the idea of creating pools of compute power to boost hardware utilisation, facilitate smoother growth, and deal better with fluctuations in demand was pretty neat. So too was the notion of utility computing, in which compute cycles could be consumed on demand, just like electricity. The problem was that a lot of the solutions and services were incomplete, immature and/or extremely expensive. You typically needed a lot of money, specialist skills and courage if you were going to go for it.
While a lot of the messaging we hear today around cloud computing is arguably just grid and utility computing redux, the big difference is that IT vendors and service providers can now actually deliver on the promise. Sure, we are missing some standards in places, and some of the licensing and subscription models remain, shall we say, a work in progress. But fundamentally it all works, and provided you select the appropriate solution for the problem at hand, bearing in mind your environment and the relevant constraints, you no longer need black-magic skills and silly amounts of money to take advantage of it.
Mentioning the ‘appropriate solution’ here brings us to one of the two most significant things about cloud that will remain with us once the marketeers move on – increased choice. Given a requirement today, e.g. for a new application, you can elect to deploy it in a traditional manner on its own dedicated stack of hardware and platform software, or on a so-called ‘private cloud’ that you have created by pooling server and storage resources in your datacentre or computer room. You then also have various hosted service options, from traditional co-location, through individual virtual servers or virtual private clouds, right up to the Software as a Service (SaaS) model in which application functionality is delivered ‘on tap’.
For the foreseeable future, all of these forms of deployment and delivery will co-exist, and the vast majority of mainstream organisations will mix and match them as needs dictate. You might choose to keep your next ERP deployment in-house on dedicated hardware, for example, but take advantage of SaaS for your email and collaboration requirements because it’s easier to deliver a rich experience to mobile and remote users while maintaining security. Even then, you might elect to keep some office tools on the desktop rather than moving everything into a hosted environment.
Such an architecture illustrates what’s increasingly referred to as the ‘hybrid approach’, in which you use multiple delivery mechanisms in tandem within the same notional system. Another simple example might be keeping your email server in-house, but taking advantage of an online archiving service to ease the headache of long-term retention. The point is that there is no absolute right or wrong; there are simply far more options now that you can weigh up based on functionality requirements, access needs, practical constraints, the type of data being handled and, not least, your preferences.
The consequence of this, and something that’s already happening within organisations that ‘get’ the significance of increased choice, is a shift in mindset within IT from deploying and managing systems to delivering services to the business. This is about moving the centre of gravity of IT decision-making from the traditional focus on ‘how’ capability is delivered to ‘what’ is delivered.
The second, and most significant, thing that will remain after the cloud hype disappears is therefore the emergence of a ‘source agnostic’ approach to IT delivery, in which decisions are made according to business needs and constraints rather than technology ones.