Published/updated: September 2009
by Dale Vile and Jon Collins
One of the hottest topics in the IT industry at the moment is virtualization, particularly in relation to the x86 servers that have so often proliferated to unmanageable levels. While the principle of server consolidation based on the latest virtualization technologies is now accepted, how far have organizations progressed in this area? And based on adoption experiences, what are the practical considerations when dealing with server proliferation?
Wake-up call: organizations generally have more physical servers than applications
Feedback from a recent study suggests that the IT infrastructure in larger organizations is often supporting several hundred, if not a thousand or more, applications, with even smaller businesses supporting software portfolios in the 10 to 50 application range. While this may be familiar, the wake-up call is that applications are generally outnumbered by the physical servers on which they run. As a result, 85% of respondents highlight existing or emerging issues with server proliferation.
Server proliferation is a function of cultural as well as technical factors
Historically, new applications have been installed on their own dedicated hardware, regardless of whether the full capacity of a server is required – this avoids conflict with other applications, and enables each box to be tuned to run an application in an optimum manner. However, the dedicated server approach reinforces the (administrative and political) expectation of business stakeholders owning everything associated with the applications they fund, with the server and other equipment allocated to their own cost centre.
The consequences of server proliferation are real, but can be tackled
Server sprawl has a direct, negative impact on routine activities such as patch management, application provisioning, and general monitoring and management of performance. This has a knock-on effect with regard to operational overheads and associated costs. Server proliferation also goes hand in hand with poor server utilization and power and space related challenges, which not only translate to elevated costs, but can also constrain development and growth. Those who have server proliferation under control demonstrably suffer significantly fewer problems in all of these areas.
Virtualization technologies are key to driving improvements
Quantitative and anecdotal evidence suggests that there are clear and tangible benefits to be gained from implementing virtualization technology to consolidate and rationalize x86 server estates, and mainstream experience is accumulating rapidly. With the solution landscape still developing, however, it is important to monitor how offerings are evolving in terms of pricing, bundling and capability: an offering that looked current a year ago may no longer do so today.
Adoption experiences highlight the importance of forward planning
When adopting any new technology, it is important to ensure that new problems are not being created for the future. For the unprepared, the unwanted proliferation of physical servers can all too easily be replaced by virtual server sprawl. Understanding implementation and management best practice, and planning accordingly, will reduce the risks and enhance the returns from your virtualization activity.
The study upon which this report is based was independently designed by Freeform Dynamics and executed in collaboration with The Register news site. Feedback was gathered via an online survey of 301 IT professionals from the UK, USA, and other geographies, and an interactive ‘reader workshop’. The study was sponsored by Microsoft.