Published/updated: March 2010
by Jon Collins, Tony Lock and Dale Vile
Most medium and large organisations need to run ‘compute-intensive’ applications of some form. While High Performance Computing (HPC) is not new, it has traditionally been seen as a specialist area – is it now geared up to meet more mainstream requirements?
Compute-intensive application workloads are not industry-specific
Today’s computer systems are more powerful than ever. But the need among medium and large businesses to run highly demanding workloads that make maximum use of available computing power is also growing. Understandably, larger organisations have more such requirements than smaller ones, and these workloads are more prevalent in certain verticals such as financial services, telecoms and research. However, the need is evident across the board.
Not all compute-intensive needs are currently being met
More often than not, such demanding workloads are run in batch mode rather than interactively, which is far from ideal: smaller organisations (those with fewer than 5,000 employees) in particular tell us that their compute-intensive needs are not being met. The hurdles to solving this problem are not just finding sufficient time and resources; they also involve both existing applications and current infrastructure, suggesting that legacy issues are holding organisations back.
The gap is closing between specialist HPC and more mainstream, compute-intensive IT
While traditional HPC may have been about building custom compute platforms for specialised applications, today’s HPC is not as isolated as many might think. Specialists no longer see HPC as a separate domain, and HPC is increasingly reliant on commodity equipment and software. While the gap with mainstream computing may be closing, the journey is not over yet: HPC systems still require considerable customisation compared to general-purpose machines.
The HPC community has much to give in terms of skills and experience
Lessons learned in HPC environments are equally applicable in delivering infrastructure to support more general compute-intensive workloads – for example, architecture and design skills around networking and communications, power, cooling and so on. Indeed, the HPC community is better placed than most to identify candidate workloads that could benefit from the HPC treatment – candidates which might not be evident to those who are not HPC-savvy.
Meanwhile however, the evolution of HPC itself needs to accelerate
While demand for compute-intensive platforms may be high, the traditional HPC supply chain is not changing that fast. Developments in other areas of IT, such as the adoption of virtualisation and cloud-based hosting models, may increase momentum here. In particular, it is generally agreed that automation and configuration tools are lacking, though this will inevitably change as such models become more widely used.
This report is based on the findings of a research study completed in November 2009 in which feedback was gathered from 254 predominantly IT professionals with direct or indirect experience of high end server computing environments. The report was sponsored by Microsoft, though the study was designed, executed, analysed and interpreted on a completely independent basis by Freeform Dynamics.
This report is free of charge and is available as a PDF download or an interactive e-document.