Cloud computing isn't really a new idea that's just sprung up over the past few months. Rather, it's the summation of a series of technologies that have been converging for some time. Now, cloud computing has become a solid option for organizations of all sizes, so what's holding them back?
A clear requirement for effective cloud working is fast, reliable Internet access, especially for smaller organizations that are using public cloud providers. You need enough bandwidth so that those who need to collaborate and work across cloud technologies are able to do so. That's something that's quite easy to manage when the bulk of workers are in an office - provided the Internet connection is up - but it gets trickier with remote workers and those who travel.
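As a rough illustration of that bandwidth question, here's a back-of-envelope sizing sketch. All of the per-user figures and multipliers are hypothetical assumptions for the sake of the arithmetic, not vendor requirements; a real assessment would depend on the applications in use.

```python
# Back-of-envelope check: is an office connection enough for cloud working?
# All figures here are hypothetical illustrations, not vendor requirements.

def required_mbps(users, mbps_per_user=1.0, concurrency=0.6, headroom=1.5):
    """Estimate the connection speed needed for cloud-hosted apps.

    users          -- number of staff on site
    mbps_per_user  -- assumed average demand per active user (Mbit/s)
    concurrency    -- assumed fraction of users active at once
    headroom       -- safety multiplier for bursts (file syncs, video calls)
    """
    return users * mbps_per_user * concurrency * headroom

# A 50-person office under these assumptions:
print(round(required_mbps(50), 1))  # 45.0 Mbit/s
```

The point of the exercise is less the exact number than the shape of the calculation: it's concurrency and burst headroom, not headcount alone, that determine whether a connection will cope.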
Regular travelers will be aware of the difficulty of finding a workable, secure web connection. And given that one of the main benefits of cloud working is to have access to files and applications wherever you happen to be, variable web access quality has to be a concern.
This, of course, applies equally to people accessing in-house computers over the Internet. But when everyone in an organization is dependent on a cloud platform, there is an understandable concern that there is no fallback position for a vital part of a company's infrastructure, and one that underpins most other operations now.
Losing control of data
It's not just an emotional attachment to a computer room that keeps companies from outsourcing data. It's the fact that there's an inherent feeling of security in having data under close control (assuming there's some kind of remote backup for disaster recovery, of course).
Removing the need for local storage clearly has some cost benefits, but for a generation of system administrators and support staff brought up on a different way of working, it's a change that rings some alarm bells.
One concern in particular: trusting an external source with working data. What happens if access drops, or if someone loses the data? Even appreciating the security that cloud computing offers, there's a leap of faith, and an element of uncertainty (along with a loss of transparency), that creeps in when data management is moved out of the immediate control of an IT department.
Fear of change
People do fear any sort of change, even in a fast-moving industry such as IT. They don't tend to fear upgrades too much (although a jump to, say, the latest version of Microsoft Office can mean some firefighting), but a wholesale alteration in the way systems are supposed to work is a challenge.
It's hard not to have some sympathy with this. In many organizations, IT is a tool, a means to do a job, and nothing more than that. There's a strong argument that the software industry, in particular, has become adept at selling upgrades and alterations that we don't actually need, or that don't make a difference to daily life. So when a change comes along that does, reluctance is hardly surprising.
Persuading people to alter the way that they've done things for years, whether attempted via carrot or stick, is rarely a straightforward battle.
So persuading key decision makers and their staff to embrace cloud computing can be a heck of a job in and of itself. Even the best-deployed cloud solution might therefore still be a bit of a bumpy ride (see also number 10, the human factor).
Security worries
When data is being looked after by another party, it's right and proper that security issues are raised. Every business has confidential information that it likes to keep behind closed doors, and the fear that cloud computing could make such material more vulnerable isn't one that can simply be ignored.
Yet recent times have shown that the biggest sources of confidential document leaks are more likely to be a misplaced USB stick, an unsecured Internet connection or a less-than-honest employee.
Reputable cloud service providers view security as pivotal to what they offer and, with the added help of a bit of common sense, there's a strong argument that most businesses would benefit from more robust security if they do migrate to a cloud service.
Fear of downtime
Clearly, this is a very real and sensible concern. There's no computer network in the world that doesn't have the risk of downtime at some point in its life. However, there's still the comfort blanket of being able to yell at an IT department and get up-to-the-minute information when it's a self-hosted computer network that's at the heart of the problem.
What happens, though, when that's taken out into a cloud environment? Who gets the ear-bashing then? And, more to the point, what happens when a cloud service a business is relying on goes down, even for a short period? With localized working, even without a network, having some machines with working productivity software installed at least means things can get done.
What's often forgotten in this argument, though, is that a cloud service stands a good chance of having a working, operational backup called into action quickly. Furthermore, one by-product of cloud adoption is that the maintenance and repair of problems is also outsourced, so it may well be that any problems are resolved faster than an in-house team can act.
Service providers should, and do, have incentives to ensure maximum uptime.
Paying the bill
There are some obvious economic benefits to adopting cloud services, from a reduction in dependency on in-house IT, to the outsourcing of data management and security, to the potential savings on expensive software licenses and the battle to keep hardware performance up to date.
Yet the future is uncertain. Technology is littered with innovations and developments that were initially designed to reduce costs, and yet many businesses are still investing more heavily than ever in their IT budgets.
Some valid questions arise about cloud computing, then. Is it offering value for money? What guarantees are in place that pricing won't slide upwards as businesses become more and more dependent on cloud services? Is this just software companies trying to get us to switch to a subscription system for licenses, meaning the longer-term cost may actually be higher?
These are appropriate questions, often with no immediate answer. There's an element of leap of faith, and a need to nail down a good service contract. But cast-iron guarantees? They're lacking at present, and for firms with razor-thin balance sheets that has to be an issue.
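The subscription question above is, at its core, simple arithmetic, and a small sketch makes the trade-off concrete. The prices below are entirely made-up figures for illustration; real licence and subscription costs (and what each includes) vary widely.

```python
# Illustrative comparison of a one-off licence against a subscription.
# All prices are hypothetical figures for the sake of the arithmetic.

def cumulative_cost(upfront, per_month, months):
    """Total spend after a given number of months."""
    return upfront + per_month * months

licence = lambda m: cumulative_cost(300, 0, m)   # buy once, own outright
cloud = lambda m: cumulative_cost(0, 12, m)      # pay monthly, indefinitely

# Find the month where the subscription overtakes the one-off purchase.
month = next(m for m in range(1, 121) if cloud(m) > licence(m))
print(month)  # 26: beyond roughly two years, the subscription costs more
```

Of course, the subscription typically bundles hosting, maintenance and upgrades that the one-off licence does not, which is exactly why these comparisons resist cast-iron answers.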
It's too early
Even accepting the comparable maturity of some elements of cloud-based working (many, for instance, trust their email to a webmail service with little worry), there remains a feeling, rightly or wrongly, that this is still an area of computing that's in its infancy. Let's not forget, too, that lots of 'next big things' have gone on to be anything but. As such, many businesses are holding back from adopting cloud services as they wait to see how the assorted offerings develop, and as they let others do the path finding for them.
There's always some sense to not moving business-critical operations to areas where you'd be an early adopter (although cloud adoption can, and should, be done piecemeal), and there's a feeling that the wrinkles need to be resolved in cloud services before more firms embrace the potential on offer.
Yet there's a degree of obvious myth to the argument that the services are immature; after all, Salesforce.com has been around since 1999, an eternity in IT terms. Rather, we are more likely reacting to changes in the way services are packaged and sold, although there are clear reasons to be cautious if, say, the local Internet infrastructure is poor and you want to invest in a public cloud system.
We are not lemmings
Much of the discussion surrounding cloud computing has implicit assumptions built into it. That it's the right thing to do. That it's the logical next step in business technology. That it's a question of when, rather than if, a company should take advantage of what cloud computing offers. In much the same way that it was once assumed that everybody would upgrade their copy of Windows within a couple of years of Microsoft releasing a new version, there's an impression sometimes put across that cloud computing will become compulsory.
But, of course, it isn't. It's doubtful that the argument that cloud is the future has yet been convincingly made. Because, while there are potentially huge benefits to what's being offered, there's no one-size-fits-all answer here. Is cloud computing really the right option for a small business of two or three people? Is it the right way forward for a large organization, with hundreds of employees in many different locations?
There are strong cases to be made in both instances that the answer is yes (again, down to the fact that you can choose what works for you, with public, private and hybrid cloud approaches on offer). But that doesn't mean that the case doesn't have to be made.
The benefits of the cloud have to be defined, be tangible, and be presented properly. It's the users who tumble over the cliff to follow the crowd who will, inevitably, hit problems, and fail to reap the intended benefits of what the cloud can offer.
What, actually, is it?
Arguably one of the biggest challenges facing cloud computing is this: how, exactly, do you define it? Because already, different service providers describe cloud products in very different ways. There doesn't appear to be one widely accepted definition of what cloud computing is, and without that, packaging and selling the benefits to organizations is made that bit more difficult.
Furthermore, part and parcel of the uncertainties surrounding cloud computing is the argument over standards. There's no solid, common and obvious foundation for cloud services to build on. Like it or lump it, people know where they are with a Windows operating system, a copy of Lotus Notes and some variant of an office suite. But what common, unifying tools exist in the cloud? There are not, at this stage, obvious dominant players in the market (although Amazon, Google and Salesforce.com would stake claims), and for companies looking for a big brand name to trust, that has to come into their thinking.
Until cloud computing can be defined in a manner that's as understandable as an operating system or an office suite (and arguably, it can be defined as both), it's going to create some uncertainty within firms as to what exactly they're being sold, and how it allows them to work with others.
The human factor
At the heart of every significant problem to do with technology lies the same factor: people. We have seen time and time again that an IT infrastructure can seem tight and secure, yet a simple human slip opens up an element of risk. Furthermore, someone who doesn't fully understand, or doesn't want to understand, what they're being presented with will always cause some degree of problem.
It may be straightforward to get the MD and finance director to sign off on the financial benefits, but just look at how hard it is to get 'buy-in' for end users to use systems such as CRM (customer relationship management). New web-based systems in the cloud may pose similar problems.
Realistically, of course, every issue we've discussed here has a person at the heart of it, or a fear of what someone can and inevitably will do when given the keys to something new and different (and that's just one individual: the potential dangers multiply exponentially when people hunt in packs).
It's a big problem, and one reason many businesses are keen to retain the technological status quo: it keeps the human-technology balance in the position it has settled into over time.
It's important never to forget the 'people factor' when bringing in any new technology.