June 6, 2010 08:15 AM EDT
The cloud essentially "consumerizes" all of IT, not just relatively unimportant bits like procuring personal hardware and software. This requires a complete rethinking of corporate IT, as the idea of any master design becomes unattainable. How can IT as a species survive this trend, when it may render the education of a whole generation of IT professionals irrelevant? On the brighter side, it caters perfectly to the talents of today's teenagers: consumption as a lifestyle.
The idea of consumerization - users being allowed to freely procure their own personal hardware and software - has been around for a while. But few CIOs, and even fewer heads of IT operations, have embraced it. Beyond some token adoption - letting users choose between an iPhone and a BlackBerry, or giving them a personal budget to order from a company-supplied catalog of pre-approved hardware - we see little uptake of the concept. The real idea is that users can go to any consumer store or webshop, order any gadget they like - an iPad, a laptop, a printer, a smartphone - and configure it, basically while still in the store, to access their corporate mail, intranet and company applications. The idea originated when people wanted to use their 24-inch HD PC with four processors and abundant memory - essential for modern home entertainment and video, and far superior to company standard-issue equipment - to also do some work.
Cloud computing now makes such a consumer approach possible at the departmental level as well. Departments selecting and using SaaS-based CRM applications that corporate IT never approved or endorsed are the most commonly cited example. More interesting are the cases where departments - tired of waiting for their turn in the never-shrinking application backlog of corporate IT - turned to a system integrator to build a custom cloud application to meet their immediate needs. Several system integrators indicate that in more and more of their projects, the prime customer is no longer IT but the business department. Contracts, SLAs and even integrations are negotiated directly between the SI and the business department; in some cases IT is not even involved or aware.
So back to consumerization. Although the trend has been far from wholeheartedly embraced by most corporate IT departments, it is continuing. In my direct environment I see several people who, instead of plugging their laptop into the corporate network at the office, bring a 3G network stick to work. For around 20 euros a month this gives them better performance accessing the applications they care about, not to mention access to applications most corporate IT departments do not care for, like Facebook, Twitter, etc. The question is, of course: can they do their work like that? Don't they need all-day, full-time access to the aforementioned fully vertically integrated ERP system? The answer is no. First of all, the vertically integrated type of enterprise that ERP was intended for no longer exists. Most corporations have outsourced distribution to DHL or TNT, employee travel to the likes of American Express, HR payroll and expenses to XYZ, etc. The list goes on and on.
All these external service providers support their services with web-based systems that can be accessed from anywhere, inside or outside the company firewall. At the same time, the remaining processes that run in the corporate ERP system are so integrated that they hardly require any manual intervention from employees. Consequently, employees don't need to spend their time doing data entry, or even data updates or analysis, on that system. Any remaining required interaction is handled by interfacing directly with the customer via the webshop or other web-based systems. One could say the world has moved from vertically integrated manufacturing corporations to supply-chain-connected extended enterprises.
The question I will address in my next post is how the cloud, by enabling consumerization of enterprise applications, plays a role in this, and what it means for IT moving forward.
On the supply side of IT, it means applications are best delivered as easily consumable services to employees and others (partners, customers, suppliers). One large European multinational is already delivering all its new applications as internet (not intranet) applications, meaning any application can be accessed from anywhere by simply entering a URL and authenticating properly. The choice of which applications to provide internally is based on whether there are outside parties willing and able to provide these services, or whether the company can gain a distinct advantage by providing the service itself.
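The essence of this "internet, not intranet" delivery model can be sketched in a few lines: access is granted based on credentials, not on network location. The header format and token values below are illustrative assumptions, not the multinational's actual implementation.

```python
# Minimal sketch (assumed, not the company's real scheme): every application
# is just a URL, and the only gate is authentication - the same check applies
# whether the request arrives from the office LAN, a home line, or a 3G stick.

def authorize(headers, valid_tokens):
    """Grant access based on credentials, not on network location."""
    token = headers.get("Authorization", "").removeprefix("Bearer ").strip()
    return token in valid_tokens

print(authorize({"Authorization": "Bearer s3cret"}, {"s3cret"}))  # True
print(authorize({}, {"s3cret"}))                                  # False
```

The design point is what is absent: there is no check on the caller's IP range or VPN membership, which is exactly what makes the application consumable from anywhere.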
When speaking about consuming services, one should think broader than just IT services. The head of distribution may be looking for a parcel-tracking system, but ask the CEO or the COO and they are more likely to think of a service as something a DHL or TNT delivers: warehousing and distribution, but also complaint tracking, returns and repairs, or even accounting, marketing and reselling - all including the associated IT parts of those services. It is the idea of everything as a service, but on steroids (XaaSoS). Note that even when an organization decides to provide one of these services internally, it can still source the underlying infrastructure and even applications "as a service" externally (strangely enough, this last scenario is what many an IT person seems to think of exclusively when discussing cloud computing).
On the demand side of IT, the issue is altogether different. How do we ensure continuity, efficiency and compliance in such a consumption-oriented IT world? If it is every person (or department) for themselves, how do we prevent suboptimization? In fact, how do we even know what is going on in the first place - how do we know which services are being consumed? This is the new challenge, and it is very similar to what companies faced when they decided to stop manufacturing everything themselves, abandoning vertical integration where it made sense and taking a "supply chain" approach. Cloud computing is in many respects a similar movement, and here too a supply chain approach looks like the way to go.
Such a supply chain approach means thoroughly understanding both demand and supply, matching the two, and making sure that the goods - or in this case services - reach the right audience at the right time (on demand). IT has invested a fair amount of time and effort in better ways and methodologies to understand demand. On the supply side, IT has until now assumed it was the supplier. In that role it used industry analysts to classify the components required, such as hardware and software. In this new world, IT needs to start thoroughly understanding the full services available on the market. An interesting effort worth mentioning here is the SMI (Service Measurement Index), an approach to classifying cloud services co-initiated by my employer, CA Technologies, and led by Carnegie Mellon University.
After having gained an understanding of both demand and supply, the remaining task is "connecting the dots". This sounds trivial, but it is an activity that analysts estimate will become a multi-billion-dollar industry within just a few years. It includes non-trivial tasks like identifying which users are allowed to do which tasks in this now open environment, and optimizing processes by picking the resources with the lowest utilization and thus the lowest cost - because going forward, scarcity will determine price, especially in the new cloud world (which resembles Adam Smith's idea of a perfect open market far more closely than any internal IT department ever did or will).
Now, of course, all of the above won't happen overnight. Many a reader (and with a little luck the author) will have retired by the time today's vertically integrated systems - many of which are several decades old and based on solid, reliable mainframes - have become services brokered in an open cloud market. A couple of high-profile outages may even delay this by a generation or two. But long term I see no other way. Other markets (electricity, electronics, publishing and even healthcare) have taken or are taking the same path. It is the era of consumption.
PS: Short term, however, the thing we (IT) probably need most is a new diagramming technique. Why? From the above it will be clear that, in such a consumerized world, architecture diagrams are a thing of the past. And an IT person without a diagram is like a fish without water. We need something that allows us to evolve our IT fins into feet and our IT gills into lungs, so we can transition from water to land and not become extinct in the process. One essential aspect: unlike pictures of clouds, and very much like real clouds, the diagrams will need to change dynamically - much like the pictures in a Harry Potter movie (it's magic). Who has a suggestion for such a technique?
[Editorial note: This blog was originally published at ITSMportal.com on May 31, 2010]