What’s your perfect device? Are you a BlackBerry, iPhone or Android smartphone fan? Are you wedded to your MacBook or Windows notebook? Do you see the iPad and the myriad slates that will follow as the future of client computing? Regardless of preferences, the chances are that you have an opinion. You care about devices.
But hold on, how can that be right? Surely, with everything supposedly moving into the cloud, the devices you use to access your internet-based applications and data are not that important. Provided they run a browser, why do you care?
If you put idealism and purism to one side for a minute, the reasons are pretty obvious. Different form factors meet different requirements, simply from an interaction perspective. If you are composing or manipulating content in a big way, then you probably want a decent screen (or screens), a keyboard and a mouse. If it’s lightweight business communications and document handling while on the road, then slates fit the bill pretty well. Just messaging and casual content-related activity, and we are in smartphone territory.
We could go on and talk about entertainment and the options around video and sound, but the point is that depending on what we are doing, we ideally want a different size and shape of device, with different interaction, input and output capability.
The other consideration is local processing power and storage capacity. Even with today’s ‘pervasive’ wireless networks, we are still a long way away from being able to assume a fast and stable connection when out and about. The ability to access at least some of the applications and content we want when disconnected will therefore remain a requirement for some time to come. Whether it’s composing an email or watching a video on the train, you don’t want to be interrupted while your device tries to reconnect after losing the network.
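The offline requirement described above boils down to a simple fallback pattern, sketched below in JavaScript. The names (`fetchRemote`, `localCache`) are hypothetical stand-ins for whatever network call and local store a given platform provides, not any real API:

```javascript
// A minimal sketch of offline fallback, assuming a hypothetical network
// function and a Map-like local cache: try the network first, refresh the
// local copy on success, and fall back to the cache when disconnected.
async function loadContent(id, fetchRemote, localCache) {
  try {
    const fresh = await fetchRemote(id);
    localCache.set(id, fresh); // keep the offline copy up to date
    return { source: "network", content: fresh };
  } catch (err) {
    // Network unavailable: serve the local copy rather than interrupting
    // the user while the device tries to reconnect.
    if (localCache.has(id)) {
      return { source: "cache", content: localCache.get(id) };
    }
    throw err; // nothing stored locally, so there is genuinely no fallback
  }
}
```

The point of the pattern is that the user never notices the transition: the email draft or video keeps coming from local storage until the connection returns.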
Local device processing capability can also have a big impact on the user experience. Even on a reliable network with plenty of back-end horsepower, if all execution takes place server-side, bandwidth constraints and latency can significantly limit the quality of graphics and video, and hamper overall responsiveness. Today, though, this drawback is largely theoretical: most of the devices we use, even the ones we think of as ‘dumb’, have quite a bit of processing and graphics capability on board. It might not be obvious, but even when apparently just accessing Web content through a browser, our device is doing a lot of work locally – compressing and decompressing content, rendering graphics, and even executing application components that are automatically downloaded and run in the browser environment.
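To make the division of labour concrete, here is an illustrative JavaScript fragment of the kind a browser downloads and executes locally. The function and data are invented for the example: the server ships only raw records, and all the formatting work runs on the client’s own CPU:

```javascript
// Hypothetical client-side code: the server sends compact JSON records;
// the browser turns them into markup locally. None of this work touches
// the server once the data has arrived.
function renderMessageList(messages) {
  return messages
    .map(m => `<li><strong>${m.from}</strong>: ${m.subject}</li>`)
    .join("\n");
}

// In a real page the result would be inserted into the document after a
// network request; here we just show the local transformation itself.
const sample = [
  { from: "alice", subject: "Q3 figures" },
  { from: "bob", subject: "Train delayed" },
];
const markup = `<ul>\n${renderMessageList(sample)}\n</ul>`;
```

Multiply this by image decoding, layout and script execution, and the ‘thin’ browser client turns out to be doing a great deal of local computing.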
Beyond the browser, there is then the whole ‘app for that’ phenomenon, initially popularised by the iPhone but now common across the mobile industry. In this model the user is very consciously downloading applications to run locally, even though many of them are simply front ends for Web-based services. Ironically, the iPhone, iPad and similar devices are the very ones many people hold up as examples of hardware designed for the cloud computing era.
All of this raises an important question: if the client device at the ‘edge’ of the cloud is doing so much work, doesn’t that negate many of the benefits touted by cloud advocates? Those benefits are largely to do with the centralisation of complexity, yet the way things are going, the client side of the equation is, if anything, becoming more complex. A developer who wants to deliver that optimised, robust user experience across a range of devices, catering for different user needs and preferences, must now build, deploy, maintain and support multiple versions of the client component – for the iPhone, iPad, Android devices, Symbian devices, Windows phones and anything else that becomes popular – not to mention the good old PC and Mac platforms.
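What the many client versions actually share is little more than the wire contract with the back-end service. The sketch below, with an entirely invented contract, shows the kind of check every platform client ends up re-implementing in its own language – the complexity is multiplied across clients rather than centralised:

```javascript
// Hypothetical wire contract between one cloud service and its many
// clients. Each native client (iPhone, Android, Symbian, PC...) carries
// its own copy of logic like this, so any change here must be rebuilt,
// redeployed and retested on every platform.
const WIRE_CONTRACT = { version: 2, fields: ["id", "title", "body"] };

function validatePayload(payload) {
  return (
    payload.version === WIRE_CONTRACT.version &&
    WIRE_CONTRACT.fields.every(f => f in payload)
  );
}
```

The back end may be centralised, but every duplicated client-side check like this one is complexity living at the edge.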
The truth is that we are seeing a resurgence in client/server computing. OK, so now it’s sexy mobile apps accessing Web services in ‘the cloud’ rather than PC front ends talking to database back-ends, but the consequences are the same from a cost and complexity perspective.
So, like many other aspects of cloud computing, you cannot take the promises of reduced complexity at face value across the board. While the new ways of doing things promoted under the term ‘cloud computing’ provide some great options for optimising the way IT is delivered, it’s amazing how old principles and ideas resurface along the way. And as this happens, we must be careful not to forget the lessons of the past – in this case, those to do with managing complex distributed client/server landscapes.