Published/updated: April 2012
Tunes to get you started
'Dynamic workload management' and 'private cloud' are just two of the terms currently in vogue to describe new approaches to the management of IT systems.
Dynamic IT is likely to be marketed as a fully automated environment where everything is controlled in an orchestrated manner.
We know from a recent survey that a growing number of respondents regard dynamic workload management and private cloud initiatives as important. But while interest is quite high, only a few indicate they are well advanced in their implementation.
The idea of a do-everything-at-once orchestration project is almost certain to be considered overkill, beyond any hope of implementation without ripping out all systems and starting again from scratch.
Back to basics
Few companies would be prepared to contemplate such a big-bang approach. Most will be looking for step-by-step improvements, beginning with the fundamentals of infrastructure management.
So what can organisations do to put in place the building blocks for private cloud without breaking the bank? Where do they start?
The first item on the list should be getting key elements of the underlying infrastructure – primarily servers, storage, and networking – working together effectively. The aim is to limit complexity when laying the foundations of an orchestrated environment.
But we know from a number of studies that few organisations feel they have the systems management tools they need to undertake routine daily administration, never mind operating dynamic private clouds.
Even fewer consider the tools they have to be well integrated with each other. In many organisations systems administration is performed in silos, usually using discrete tools.
This often provides a fragmented view of the world, with no coherent picture of systems components and how they work together.
Knowing me, knowing you
It is therefore important for IT administrators to have an accurate and up-to-date handle on just what systems are deployed in the company, what applications they support and how these are related to business services.
Essentially, this amounts to performing some basic inventory discovery, coupled with an elementary staff survey to find out the importance of each service and the numbers of people using it.
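To make the idea of inventory discovery plus a staff survey concrete, the sketch below shows one possible shape for the resulting records: each system tied to the applications it hosts, the business service those support, and the survey findings on usage and importance. All names and fields here are illustrative assumptions, not the schema of any particular tool.

```python
# Minimal sketch of an inventory record linking systems to applications
# and business services, enriched with staff-survey data. Hostnames,
# services, and field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class SystemRecord:
    hostname: str
    applications: list      # applications hosted on this system
    business_service: str   # business service those applications support
    user_count: int         # rough number of users, from the staff survey
    importance: str         # e.g. "critical", "important", "low"

inventory = [
    SystemRecord("srv-db-01", ["orders-db"], "Order processing", 250, "critical"),
    SystemRecord("srv-web-02", ["intranet"], "Staff intranet", 800, "low"),
]

# A simple question the inventory can then answer:
# which systems underpin critical business services?
critical = [r.hostname for r in inventory if r.importance == "critical"]
print(critical)  # ['srv-db-01']
```

Even a flat list like this, kept current, gives administrators the coherent picture of components and dependencies that siloed tools tend to miss.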
Equally, it is a good idea to gather some information on the quality of service being delivered before making any alterations to the underlying infrastructure, to make sure that flexing resources will not degrade services and leave users unhappy.
This is a key area, yet we know from many studies that most organisations have little service quality monitoring in place.
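Where no monitoring is in place, even a rough baseline taken before any changes gives something to compare against afterwards. The sketch below illustrates the idea of capturing response-time samples and summarising them; the probe function is a stand-in (an assumption) for whatever check actually suits the service in question.

```python
# Rough illustration of baselining service response times before an
# infrastructure change. The probe is a placeholder delay; in practice
# it would make a real request to the service being measured.
import statistics
import time

def probe() -> float:
    """Time one round trip to the service (stand-in delay here)."""
    start = time.perf_counter()
    time.sleep(0.01)  # replace with a real request to the service
    return time.perf_counter() - start

samples = [probe() for _ in range(20)]
baseline = {
    "median_s": statistics.median(samples),
    "p95_s": sorted(samples)[int(len(samples) * 0.95)],
}
print(baseline)
# Re-run the same probes after a change and compare against this baseline.
```

The point is not the tooling but the discipline: a known-good baseline turns "users say it feels slower" into a measurable before-and-after comparison.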
Other management tools may become important as flexibility increases. These include asset management and change management systems, perhaps ultimately leading to a service catalogue.
Many of these may already be in place, at least partially. But as part of a joined-up approach to any form of private cloud and dynamic IT service delivery, it is critical that the tools used to manage servers, both physical and virtual, are well integrated with those employed to administer storage systems and networks.
All part of the plan
In terms of prioritising what to include in the new dynamic management environment, it is usually better to start with relatively simple applications or services.
IT staff can then establish the operational procedures to manage the service as a whole, even if this means using multiple tools, without getting bogged down in application-level complexity.
When the processes are in place, any technology updates or changes can be considered case by case, as long as any tools acquired fit into an overall plan.
As things develop, it is likely that issues such as charge back and service accounting and reporting will become important. Process automation systems may also become relevant as the scope of the dynamic infrastructure expands, although they should not be an inhibiting factor at the start of the journey.
The aim is not to try to take on everything in one go and totally transform the whole of your IT delivery.
Start simple, gain confidence and grow from there. Boiling the ocean is rarely effective except in science fiction movies.
By Richard Edwards
By Dale Vile