Published/updated: May 2012
Failing to do so could be costly
There’s no need to rattle off another set of numbers about the relentless increase in the amount of data organisations hold. And there should no longer be a need to explain why storage and back-up are important. After all, most of the executives fortunate enough not to have experienced a major data loss event at work will likely have discovered in their personal lives how painful it can be to retrieve lost photos or contact information, if indeed retrieval is even possible.
In reality, though, storage remains a bit of a technology step-child, and it hasn’t become any easier for IT leaders to secure budget for storage-related investments, or to make optimal decisions when allocating the funds that are available. Business executives, for their part, complain about being bombarded with technical jargon, without seeing a clear reason why a particular virtual multi-tiered storage platform should be preferable to a less costly SATA disk array.
Not all data is equal
The starting point for turning this dialogue of the deaf into a more fruitful discussion is the recognition that not all data is equal. Some data is business critical to the degree that the company couldn’t function without it. Other data may need to be retained for compliance reasons, but is hardly ever accessed. Yet other data needs to be instantly accessible, but only until the next update, whether that’s in an hour, a day, or a week. Then there’s data that’s kept ‘just in case’, and no doubt a lot that should have been deleted a long time ago.
Ignoring the differences between the categories of data held by an organisation is not tenable, because it would be expensive in one way or another. Treating all data as ‘highly business critical’ is quite simply unaffordable from a storage perspective. At the other extreme, treating all data as ‘non-critical’ may be cheaper initially, but would represent increased operational risk, to the degree of threatening the viability of the business.
Deciding what matters
So how do you decide in which ‘bucket’ a particular data set belongs? All too often, it’s left to IT to make a best guess. However, it’s the business people who have the best understanding of what matters most in terms of data access and retention, so they need to be involved in the dialogue about storage very early on.
While IT has a responsibility to the business to provide services efficiently and effectively, it’s also incumbent upon the business to provide IT with the information required to make the most appropriate technology selections. In the case of storage, this is all about articulating the requirements for particular data sets in terms of availability, recovery, loss protection, security, longevity and performance.
That’s not to say that IT executives should use this type of terminology with their counterparts in the business. Instead, the dialogue should revolve around the value and importance to the business of the various data sets the organisation holds, along with any external mandates.
Having answers to the types of questions shown in Figure 1 allows IT to propose potential solutions. As trade-offs will likely have to be made, it’s usually a good idea to express both requirements and solutions in terms of a categorisation schema, e.g. gold, silver, bronze; or five star, three star, one star. It doesn’t matter which, as long as it provides business and IT executives with common ground for discussing what’s required for a particular data set from a business perspective, and what IT solutions are available. If the business requirement is for a ‘gold standard’ solution, but this turns out to be prohibitively expensive, the discussion can then move on to assessing the implications of going for the ‘silver’ alternative.
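To make the idea concrete, the mapping from business requirements to a tier can be as simple as a few rules. The sketch below is purely illustrative: the tier names, thresholds and inputs (availability target, acceptable recovery time, access frequency) are assumptions for the sake of the example, not a prescribed schema.

```python
# Illustrative sketch only: mapping a data set's business requirements
# to a storage tier. All thresholds here are hypothetical examples.

def classify_tier(availability_pct, max_recovery_hours, accessed_daily):
    """Return 'gold', 'silver' or 'bronze' for a data set's requirements."""
    if availability_pct >= 99.99 or max_recovery_hours <= 1:
        return "gold"    # business critical: near-instant access and recovery
    if accessed_daily or max_recovery_hours <= 24:
        return "silver"  # important: routine access, same-day recovery
    return "bronze"      # retained for compliance or 'just in case'

print(classify_tier(99.999, 0.5, True))   # business-critical transaction data
print(classify_tier(99.0, 12, True))      # day-to-day working documents
print(classify_tier(95.0, 168, False))    # rarely accessed compliance archive
```

In practice each tier would then be costed against the available technology options, so the trade-off discussion with the business can happen in money terms rather than jargon.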
Other aspects of data storage need to be addressed as well in order to ensure that costs are kept to a minimum. For example, processes should be put in place for deleting data when it’s no longer required. And given how many files are stored in duplicate, triplicate, or even more, it may be worth investing in deduplication / compression technologies that help ensure that everything is stored only once.
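The deduplication idea mentioned above boils down to recognising identical content and storing it once. The toy sketch below illustrates the principle using content hashing; real deduplication products work at block or sub-file level, so this is a simplified illustration, not a description of any particular technology.

```python
# Simplified illustration of content deduplication: identical file
# contents are detected by hash and stored only once.
import hashlib

def dedupe(files):
    """Given {name: bytes}, return (store, index) where store maps a
    content hash to one stored copy, and index maps each name to its hash."""
    store, index = {}, {}
    for name, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        store.setdefault(digest, data)  # keep only the first copy of the content
        index[name] = digest            # every name still resolves to its data
    return store, index

files = {
    "report.doc": b"quarterly report",
    "copy_of_report.doc": b"quarterly report",  # duplicate content
    "budget.xls": b"budget figures",
}
store, index = dedupe(files)
print(len(files), "files ->", len(store), "stored copies")  # 3 files -> 2 stored copies
```

The saving grows with the number of duplicates, which is why deduplication pays off most on file shares and backup data where many near-identical copies accumulate.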
However, the key step to an optimised storage strategy is for IT and the business to engage in a dialogue about the business value of particular data sets, and how this translates into storage requirements and technology options.
A more in-depth look at storage can be found in a detailed white paper, available for free download here. A shorter treatment of the topic aimed at business executives is available here.
By Richard Edwards