Published/updated: September 2009
by Dale Vile and Jon Collins
One of the hottest topics in the IT industry at the moment is virtualization, particularly in relation to the x86 servers that have so often proliferated to unmanageable levels. While the principle of server consolidation based on the latest virtualization technologies is now accepted, how far have organizations progressed in this area? And based on adoption experiences, what are the practical considerations when dealing with server proliferation?
Wake-up call: organizations generally have more physical servers than applications
Feedback from a recent study suggests that the IT infrastructure in larger organizations is often supporting several hundred, if not a thousand or more, applications, with even smaller businesses supporting software portfolios in the 10 to 50 application range. While this may be familiar, the wake-up call is that applications are generally outnumbered by the physical servers on which they run. As a result, 85% of respondents highlight existing or emerging issues with server proliferation.
Server proliferation is a function of cultural as well as technical factors
Historically, new applications have been installed on their own dedicated hardware, regardless of whether the full capacity of a server is required – this avoids conflict with other applications, and enables each box to be tuned to run an application in an optimum manner. However, the dedicated server approach reinforces the (administrative and political) expectation of business stakeholders owning everything associated with the applications they fund, with the server and other equipment allocated to their own cost centre.
The consequences of server proliferation are real, but can be tackled
Server sprawl has a direct, negative impact on routine activities such as patch management, application provisioning, and general monitoring and management of performance. This has a knock-on effect with regard to operational overheads and associated costs. Server proliferation also goes hand in hand with poor server utilization and power and space related challenges, which not only translate to elevated costs, but can also constrain development and growth. Those who have server proliferation under control demonstrably suffer significantly fewer problems in all of these areas.
Virtualization technologies are key to driving improvements
Quantitative and anecdotal evidence suggests that there are clear and tangible benefits to be gained from the implementation of virtualization technology to consolidate and rationalize x86 server estates, and experience in the mainstream is being accumulated rapidly. With the solution landscape still developing, however, it is important to monitor the way in which offerings are evolving in terms of pricing, bundling and capability – an offering that looked current a year ago may no longer do so today.
Adoption experiences highlight the importance of forward planning
When adopting any new technology, it is important to ensure that new problems are not being created for the future. For the unprepared, for example, unwanted proliferation of physical servers can all too easily be replaced by virtual server sprawl. Understanding implementation and management best practice, and planning accordingly, will reduce the risks and enhance the returns from your virtualization activity.
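As a purely illustrative sketch of the kind of routine discipline that helps keep virtual server sprawl in check, the snippet below flags virtual machines that appear abandoned in an inventory. All field names, thresholds, and data are hypothetical assumptions for illustration; they are not drawn from the report or any particular virtualization product.

```python
# Hypothetical sketch: flag potential virtual server sprawl from a VM
# inventory. Field names and thresholds are illustrative assumptions.

from dataclasses import dataclass
from datetime import date


@dataclass
class VirtualServer:
    name: str
    owner: str                  # business stakeholder / cost centre
    avg_cpu_utilisation: float  # rolling average, 0.0 - 1.0
    last_admin_access: date     # last administrative login


def flag_sprawl_candidates(vms, cpu_threshold=0.05, idle_days=90, today=None):
    """Return VMs that look idle: low CPU use and no recent admin access."""
    today = today or date.today()
    return [
        vm for vm in vms
        if vm.avg_cpu_utilisation < cpu_threshold
        and (today - vm.last_admin_access).days > idle_days
    ]


# Example inventory: one busy VM, one apparently abandoned VM
inventory = [
    VirtualServer("web-01", "marketing", 0.42, date(2009, 8, 30)),
    VirtualServer("test-07", "finance", 0.01, date(2009, 1, 15)),
]
idle = flag_sprawl_candidates(inventory, today=date(2009, 9, 1))
print([vm.name for vm in idle])  # -> ['test-07']
```

In practice such a check would pull utilization data from the virtualization platform's own management tooling; the point is simply that sprawl criteria (ownership, utilization, idle time) can and should be made explicit and reviewed routinely.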
The study upon which this report is based was independently designed by Freeform Dynamics and executed in collaboration with The Register news site. Feedback was gathered via an online survey of 301 IT professionals from the UK, USA, and other geographies, and an interactive ‘reader workshop’. The study was sponsored by Microsoft.