Product Innovation vs Product Complication: The “one minute strategy” syndrome

When is product innovation a hindrance?

When it is innovation for innovation's sake.

Let’s look at a company known for innovation and great products – Google; they have it right – or do they? It seems to me that product companies have teams that create wonderful new products (Gmail, Google Maps), and you eagerly await the next one… what is the issue with that, you say? The issue is that these teams seem to stay with a product and tweak it and tweak it and feature-enrich it until it ends up worse with each release. The age-old trade-off of maintainability and reliability against complication ends in poor usability.

These days, more often than not you hear the street complaining about the latest version: “Why did they remove select all?”; “The search in the new version is awful, where is my old search?”; “Why are they auto-filtering my email? If I wanted it in that folder I would have set up a rule…” – all complaints due to playing with the old rather than innovating the new.

Let’s look for a moment at the insurance market – remember years ago when “time to market” was big? What did they do? Tweak the policy administration systems (PAS) to the point where you can set up a new product in three days – to what gain? You cannot create and file a product in that time, so why be able to get the PAS updated so fast? And why do they continue to invest in that feature?

Tweaking an old PAS leads to less reliability and increased maintenance, shortening its lifespan and forcing large injections of capital to replace it. Why? Why not invest those short-term dollars in enabling your enterprise to grow and facilitate change? The new breed of PAS is built on real components, with decoupled services and messaging – a component architecture.

More and more vendors are building tools on such an architecture, removing the heavy integration and allowing the evolution of an enterprise component by component. No more breaking the old by adding the new; instead, a simpler, proven method of extending or replacing the old without the integration and conversion nightmares of the past.

Why do I say product extension and feature tweaking is a “one minute strategy”? Because, just like most enterprise roadmaps we see today, this development and innovation model lacks thought and strategic vision. Instead of taking a moment to right the ship and steer a course into calmer water, these strategies propagate tactical solutions that never get to the nub of the issue. We do not all have $50M to spend on repeated replacement, so we need an alternative, a different mode – and that mode is to embrace the new architecture, get ready for it, and then be able to accelerate change.

So next time you look at your product or your enterprise, try to think out of the box and stop trying to make the steering wheel better – it is round and it works. Look instead for ways to integrate it better, allowing future growth and expansion without the growing pains.

The Cloud Has No Clothes!

Everybody remembers the classic fairy tale in which an emperor and his people are conned into believing he is attired in a fantastically beautiful set of clothes, when in fact he is in the buff. No one was willing to admit they did not have the refined taste and intelligence to see the spectacular cloth and splendid robes. It took the strength of innocence in a child to point out the truth. I am about as far from an innocent child as one can get, but it appears to me the cloud is parading about naked.

Every vendor has a cloud offering, every pundit “agrees” the cloud is the future, investors value every cloud company with a premium, every data center operator is “born again” as a cloud player. Every CIO has a cloud initiative and budget line. Really, I have seen this movie plot before, and it does not end well, especially for the Emperor (and the con-men vendors too).

We have worked internally on projects, as well as externally with clients, to implement aspects of the “cloud”. Results have been mixed, and in the process we have gathered some hard-won experience, which I will condense here (while protecting both the clothed and the naked).

First, Software as a Service (SaaS) will work if adopted with minimal software modification and maximum adoption of its native business processes. It is very cost effective if it precludes investment in internal IT infrastructure and personnel, and not bad if it merely slows the growth of same. Outsourcing well-defined rote functions such as email via the SaaS route works well. Adopting SaaS for new non-strategic functions tends to be successful where there are few users and a high degree of specialization. Data backup into the cloud is an excellent example of a highly specialized solution that takes advantage of the economies of scale available in hardware.

SaaS fails in terms of cost or functionality when it is subject to customization and extension. Service costs tend to swamp the effort from initial modification through long-term maintenance (humans = $$$$). Costs will especially spiral when you combine many users and many customizations. Remember that the “Keep It Simple, Stupid” (KISS) principle saves money and points to success.

Buying virtual machines in the cloud works well if the configuration is simple: few software products, few users, straightforward integration. Development and early deployment are particularly attractive uses, as are start-up companies and software proofs, tests, and trials. Again, the KISS principle reigns supreme. Remember that hardware continues to drop in price and increase in capacity, while packaged software costs are stable. Understand the billing algorithms of the key “clouds”: each has its cost advantages and drawbacks, and they change rapidly under increasing competition and hype. Always benchmark medium- to long-term cloud virtual machines against native hardware virtual machine implementations; the results may surprise you (I have been surprised over and over).
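To make that benchmarking concrete, here is a minimal back-of-the-envelope sketch of the comparison I have in mind. Every rate, the amortization period, and the cost categories are hypothetical placeholders, not any vendor's actual pricing; substitute your own numbers.

```python
# Hypothetical comparison of an always-on cloud VM versus an equivalent
# on-premise virtual machine. All figures are invented for illustration.

HOURS_PER_MONTH = 730

def cloud_vm_monthly(hourly_rate, storage_gb, storage_rate_gb_month,
                     egress_gb, egress_rate_gb):
    """Monthly cost of one always-on cloud VM: compute + storage + data egress."""
    return (hourly_rate * HOURS_PER_MONTH
            + storage_gb * storage_rate_gb_month
            + egress_gb * egress_rate_gb)

def on_prem_vm_monthly(server_cost, vms_per_server, amortization_months,
                       power_cooling_month, admin_month):
    """Monthly share of an amortized on-premise server hosting several VMs."""
    capital = server_cost / amortization_months
    return (capital + power_cooling_month + admin_month) / vms_per_server

cloud = cloud_vm_monthly(hourly_rate=0.20, storage_gb=200,
                         storage_rate_gb_month=0.10,
                         egress_gb=100, egress_rate_gb=0.09)
on_prem = on_prem_vm_monthly(server_cost=6000, vms_per_server=10,
                             amortization_months=36,
                             power_cooling_month=120, admin_month=400)

print(f"Cloud VM:   ${cloud:.2f}/month")
print(f"On-premise: ${on_prem:.2f}/month per VM")
```

Run the same arithmetic over a three-to-five-year horizon with your real utilization; where the crossover lands is usually the surprising part.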

The Emperor’s story is an old one, and so is the cloud concept in principle; remember that its first turn on the karmic wheel of optimizing the highest-cost component was time-sharing. That strategy optimized the high cost of proprietary hardware and software (remember IBM and the Seven Dwarfs, but I digress into another fairy tale). As minicomputers (Digital, Data General, Wang) dropped the price of hardware through competition with IBM, software packages became the gating factor. Workstations continued the trend with another factor-of-ten reduction in the cost of hardware and packaged software, while human service costs kept rising. Wintel and the Internet have driven the marginal cost of raw computing to almost zero compared to the service component. As hardware has followed Moore’s law and software package economies of scale have moved to millions of copies, the human costs have skyrocketed in both relative and absolute terms.

If we can keep history as our lens and focus on our cost pressure points, we can maintain our child-like innocence and see others prancing naked while we keep our kilts and heads about us.

Yammer or SharePoint 2013 for the Social Enterprise?

In buying Yammer last year, Microsoft pretty much acknowledged that it dropped the ball on social and needed to bring in external reinforcements. Acquiring Yammer also fits well with the new cloud services approach of Office 365. The vision, according to Microsoft, is cloud first. They love the ability to roll out changes and fixes at a faster pace, but mostly, they love the business model.

At the same time, SharePoint 2013 includes a much improved set of tools for social collaboration, including a brand new activity stream app. So what should you use: Yammer or SharePoint 2013's built-in social tools?

Here is the timeline and guidance as provided by Microsoft:

If you are a SharePoint cloud user – go with Yammer. There is a basic integration available now, with the promise of single sign-on in the fall. They also promise updates every 90 days.

If you are an on-premise user (and most companies are, since SharePoint 2010 online was not very good) and are moving to SharePoint 2013, the decision is a bit more complicated.

Yammer offers an existing app for SharePoint 2010 that can be integrated if you are a paying Yammer customer, but nothing has yet been announced for SharePoint 2013.

So the only option really is to deploy the SharePoint social services unless you are already using Yammer Enterprise and can wait if/until they support 2013.

The longer-term roadmap beyond 2014 is cloudy as well. Yammer is a cloud offering and will clearly be tightly integrated into Office 365, but as much as Microsoft would like it, not everyone will get on their cloud platform that quickly. In all likelihood, Microsoft will continue to support and even release new versions of SharePoint on-premise, but certain aspects will likely not be improved much, and social seems to be one of them. Yammer will become a selling point and an incentive to go cloud.

Another interesting question is how this will work for hybrid deployments, and how a migration to the cloud will handle the social data or be able to migrate it into Yammer. We’ll have to wait and see.

For more details, see the official blog post from Microsoft and an interesting post on ZDNet about how Microsoft approached social for its internal intranet – apparently using both models and giving users the choice, when creating a collaboration site, based on their primary need: document based (SharePoint) or activity stream (Yammer). Now, if only one site could do both…

Cloud 2012 Redux


You shouldn’t have to commit everything at once

This year will be remembered as the year the cloud moved beyond the realm of “back to time-sharing” or a curio for greenfields and start-ups. While Software as a Service (SaaS) is interesting, it cannot be the centerpiece of your IT infrastructure or strategy due to its limited scope and cost/scalability metrics. By the same token, not every IT system is a greenfield opportunity, and most require a steady evolutionary response incorporating the existing infrastructure’s DNA and standards.

Just illustrating a “private cloud” with a “public cloud” next to it does not cut it. What does that really mean? Ever wonder what is really in that cloud (or clouds)? Better yet, explain it in safe, understandable steps: cost/benefit projections over three, five, and seven years; organizational impact for IT and business process; procedural impact for disaster recovery; and so on. Sorry, “Just buy my product because it is what I have to sell!” does not work; I need a tested, time-phased architectural plan, with contingencies, before I commit my company and my job.

For the first time in the continuing cloud saga, we have been able to put together and test a “non-aligned” approach, which allows an organization to keep IT infrastructure best practice and not “sign in blood” to any individual vendor’s ecosystem. With the proper design, virtual machines (VMs) can be run on multiple vendors’ platforms (Microsoft®, Amazon.com®, etc.) and on-premise, optimized for cost, performance, and security. This effectively puts control of cost and performance in the hands of the CIO and the consuming company.
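As a rough illustration of what “optimized for cost, performance, and security” can mean in practice, here is a minimal placement sketch. The platform names, hourly rates, and the single security rule are hypothetical examples, not a description of any vendor's actual terms.

```python
# Hypothetical placement decision: run each workload on the cheapest platform
# that satisfies its security constraint. Platforms and prices are invented.

PLATFORMS = [
    {"name": "on-premise", "usd_per_hour": 0.35, "allows_sensitive_data": True},
    {"name": "vendor-a",   "usd_per_hour": 0.22, "allows_sensitive_data": False},
    {"name": "vendor-b",   "usd_per_hour": 0.27, "allows_sensitive_data": True},
]

def place_workload(handles_sensitive_data):
    """Pick the cheapest eligible platform for a workload."""
    eligible = [p for p in PLATFORMS
                if p["allows_sensitive_data"] or not handles_sensitive_data]
    return min(eligible, key=lambda p: p["usd_per_hour"])

print(place_workload(handles_sensitive_data=True)["name"])   # vendor-b
print(place_workload(handles_sensitive_data=False)["name"])  # vendor-a
```

The point is not the ten lines of code but who holds the decision: as rates, performance, and policies change, the workload can move without a rewrite.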

In addition, credible capabilities exist in the cloud to handle disaster recovery and business continuity, regardless of whether the supporting VMs are provisioned on premise or in the cloud. Certain discrete capabilities, like email and Microsoft Office™ automation, can be “outsourced” to the cloud while integration with the consuming application systems is maintained, in the same manner many organizations have historically outsourced functions like payroll.

The greatest benefit of cloud 2012 is the ability to phase it in over time, as existing servers are fully amortized and software licenses roll off and require renewal. Now we can start to put our plans together and take advantage of the coming margin-cutting wars of the cloud titans in 2013 and beyond.

Time to Remodel the Kitchen?

Although determining full and realistic corporate valuation is a task I’ll leave to people of sterner stuff than I (since Facebook went public, not many could begin to speculate on the bigger picture of even small-enterprise valuation), I’ve recently been working with a few clients who have reminded me of why one sometimes needs to remodel.

Nowadays, information technology is often seen as a means to an end. It’s a necessary evil. It’s overhead to your real business. You joined the technological revolution, and your competitors who didn’t, well… sunk. Or… you entered the market with the proper technology in place, and, seatbelt fastened, have taken your place in the market. Good for you. You’ve got this… right?

I’m a software system architect. I envision and build out information technology. I often like to model ideas around analogies to communicate them, because it takes the tech jargon out of it. Now that I’ve painted the picture, let’s think about what’s cooking behind the office doors.

It’s been said that the kitchen is the heart of the home. When it comes to the enterprise (big and small) your company’s production might get done in the shop, but sooner or later, everyone gets fed business processes, which are often cooked in the kitchen of technology. In fact, technology is often so integral to what many companies do nowadays that it’s usually hard to tell where, in your technology stack, business and production processes begin. Indeed, processes all cycle back around, and they almost certainly end with information technology again.

Truly, we’ve come a long way since the ’70s, when implementing any form of “revolutionary” information technology was the basis of a competitive advantage. Nowadays, if you don’t have information technology in the process somewhere, you’re probably only toying with a hobby. It’s not news. Technology graduated from a revolutionary competitive advantage to the realm of commoditized overhead well over a decade ago.

Ok… ok… You have the obligatory kitchen in your home. So what?

If you think of the kitchen in your home as commoditized overhead, you are probably missing out on the even bigger value an update could bring you at appraisal time. Like a home assessment, due diligence as part of corporate valuation will turn up the rusty mouse traps behind the avocado refrigerator and under the porcelain sink:

  • Still rocking Windows 2000 Server with ActiveX?
  • ColdFusion skills are becoming a specialty; the local talent pool is probably thin, and it might be expensive to find resources to maintain those components.
  • Did you say you can spell iSeries? Great, can you administer it?
  • No one’s even touched the SharePoint Team Services server since it was installed by folks from overseas.
  • The community that supported your Open Source components… dried up?
  • Cloud SLAs, Serviceability?
  • Compliance?
  • Disaster Management?
  • Scalability?
  • Security?
  • Documentation…?
    • Don’t even go there.

As you can see… “Everything but the kitchen sink” no longer applies. The kitchen sink is transparently accounted for as well. A well-designed information technology infrastructure needs to go beyond hardware and software. It considers redundancy/disaster management, security, operating conditions such as room to operate and grow, and, of course, whether any undue risks or burdens are placed on particular technologies, vendors, or even employees. Full valuation goes further, looking outside the walls to cloud providers and social media outlets. Finally, no inspection would be complete without a look at compliance, of course.

If your information technology does not serve your investors’ needs, your CEO’s needs, your VP of Marketing and Sales’ needs, as well as production’s… but most importantly your customers’, your information technology is detracting from the valuation of your company.

If the work has been done, due diligence will show off the working utility, maintainability, security, scalability, and superior added value of the well-designed enterprise IT infrastructure refresh.

To elaborate on that, a good information technology infrastructure provides a superior customer experience no matter how a customer chooses to interact with your company. Whether it’s at the concierge’s counter, in the drive-through, at a kiosk, on the phone, at your reseller’s office, in a browser or mobile app, your customers should be satisfied with their experience.

Don’t stop with simply tossing dated appliances and replacing them. Really think about how the technologies work together, and how people work with them. This is key… if you take replacement appliances off the shelf and simply plug them in, you are (at best) merely keeping up with your competitors. If you want the full value add, you need to specialize. You need to bend the components to your processes. It’s not just what you’ve got.  It’s how you use it.  It’s the critical difference between overhead and advantage.

Maybe the Augmented Reality Kitchen won’t provide a good return on investment (yet), but… there’s probably a lot that will.

Are you Paralyzed by a Hoard of Big Data?

Lured by the promise of big data benefits, many organizations are leveraging cheap storage to hoard vast amounts of structured and unstructured data. Without a clear framework for big data governance and use, businesses run the risk of becoming paralyzed under an unorganized jumble of data, much of which has become stale and past its expiration date. Stale data is toxic to your business – it could lead you into taking the wrong action based on data that is no longer relevant.

You know there’s valuable stuff in there, but the thought of wading through all THAT to find it stops you dead in your tracks. There goes your goal of business process improvement, which, according to a recent Informatica survey, most businesses cite as their number one big data initiative goal.

Just as the individual hoarder often requires a professional organizer to help them pare the hoard and institute acquisition and retention rules for preventing hoard-induced paralysis in the future, organizations should seek outside help when they find themselves unable to turn their data hoard into actionable information.

An effective big data strategy needs to include the following components:

  1. An appropriate toolset for analyzing big data and making it actionable by the right people. Avoid building an ivory tower big data bureaucracy, and remember, insight has to turn into action.
  2. A clear and flexible framework, such as social master data management, for integrating big data with enterprise applications, one that can quickly leverage new sources of information about your customers and your market.
  3. Information lifecycle management rules and practices, so that insight and action are based on relevant, rather than stale, information (a minimal sketch of such a rule appears after this list).
  4. Consideration of how the enterprise application portfolio might need to be refined to maximize the availability and relevance of big data. In today’s world, that will involve grappling with the flow of information between cloud and internally hosted applications as well.
  5. A comprehensive data security framework that defines who is entitled to use, change, and delete the data, along with encryption requirements and any required upgrades in network security.
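To illustrate the lifecycle point in item 3, here is a minimal sketch of a staleness rule. The data classes and retention periods are hypothetical; in practice they would come from your own governance policy.

```python
# Hypothetical staleness rule: each class of data gets a maximum useful age,
# and anything older is excluded from analysis or flagged for archival.
from datetime import datetime, timedelta

MAX_AGE = {
    "web_clickstream":  timedelta(days=90),
    "customer_profile": timedelta(days=365),
    "transactions":     timedelta(days=7 * 365),
}

def is_actionable(data_class, last_updated, now=None):
    """Return True if a record is still fresh enough to act on."""
    now = now or datetime.utcnow()
    return (now - last_updated) <= MAX_AGE[data_class]

# A click logged in January is too stale to drive a June campaign decision.
print(is_actionable("web_clickstream",
                    datetime(2013, 1, 1), datetime(2013, 6, 1)))  # False
```

The same kind of rule, enforced in your toolset rather than in people's heads, is what keeps the hoard from paralyzing the analysis.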

Get the picture? Your big data strategy isn’t just a data strategy. It has to be a comprehensive technology-process-people strategy.

All of these elements should, of course, be considered when building your big data business case and estimating return on investment.

Is Custom Development Dead?

Is Custom Development dead? After the last two years of custom development’s nuclear winter (following 2008’s financial Armageddon), one would think the Grim Reaper did his best in the blast. I really hope not; designing and building strategic systems is what makes the more mundane aspects of software engineering, the mind-numbing syntactical pain of creation, worth enduring. Nothing like the smell of napalming the competition with a totally new way of doing business in the morning (my view of “Apocalypse Now” with a business bent). Maybe, just maybe, the rumors of Custom Development’s death are greatly exaggerated.

Did Custom Development die of natural causes, or was it pulled off life support by risk-hating executive management as a perverse form of parental control after the financial snafu (Custom Development moves from PG-13 to XXX)? Off-the-shelf software products and the ever-increasing cost of continuing maintenance really hurt Custom Development as a viable systems choice, but is that enough to put it down? Cloud and “nouveau cloud” technologies (read SaaS, Salesforce.com) may have provided the coup de grâce. I seriously doubt it; every time I look into the cloud I get serious PTSD flashbacks to the ’70s and ’80s IBM mainframe world domination (OMG, there is a 3270 in the corner!!). At least there was a lot less hype back then, and the choices were easier (nobody got fired for picking IBM!).

Is it possible Custom Development died offshore (a simple Mickey Finn, a bag over the head, shanghaied and held for ransom)? While business processes and system maintenance have done reasonably well offshore (the castor oil of the corporate world – let Mikey take it!), strategic custom development has had less success. Quality innovation that can transform a corporation really requires a local team steeped both in the host company and in the surrounding culture. Plus, Custom Development tends to have a high infant mortality rate, so it is best attempted in short phases supported by an agile methodology – definitely not in offshore’s financial-model wheelhouse. So I don’t think offshore is implicated.

There is the theory that evolution has spoken and product-based solutions have succeeded Custom Development, just as mammals succeeded dinosaurs. Product companies would like you to believe that, but does it seem plausible (Land of the Lost, Jurassic Park, where are you?)? While product-based solutions have advantages in success rates and cost, by their nature they lack the true freedom that drives raw creativity and innovation. Custom Development is that wellspring.

The only thing we have to fear is fear itself! Aversion to risk is curbing animal spirits, creativity, and innovation… for now. Custom Development is not dead; it will return from its vacation with Puff in the cave by the sea when Jackie Paper locks and loads and we begin some serious innovation and transformation with strategic custom software systems (by the way, that’s when the jobs return too!).

6 ways to get your web presence and infrastructure in shape for 2010

In this lingering recession, everyone is looking for new ways to better position themselves to compete and grow revenue. A lower level of consumer and business spending will require efficiency, careful optimization and leverage of low cost assets and methods. It’s time to get into shape! Here are 6 ways to revamp and strengthen your web sites and infrastructure on a modest budget:

Revamp your web strategy for a web 2.0+ world.

The internet world has changed dramatically in the last three to four years. Social networks, user communities, user-generated content, Twitter, the iPhone and other mobile devices, GPS and location-aware devices, and the other components of Web 2.0 have completely altered the way businesses and users communicate and transact online. Each of the Web 2.0 components comes with its own set of opportunities and challenges. They provide new channels that enable communication at a fraction of the cost while demanding a new approach to openness, transparency, and interactivity. Regulatory, security, and governance concerns are not always easy to address. Chart a path in these new waters by rethinking your web strategy, redefining the role that the web and other digital channels will play in the company’s future, and putting a plan in place for its execution.

Implement a social media strategy and measure its value

Social media tools are a great way to build honest online relationships with customers and other audiences. Doing it right is not always easy. A social media strategy will force you to think through and define where to be and what is to be communicated, set the tone and nature of interactions, set guidelines on how to respond to negative feedback, factor in legal and regulatory implications, and address intellectual property and security issues, among many other aspects that need to be thought through. In addition, measuring the impact of these activities is not always easy, so building a model that can assess and provide value guidelines is very important.

Reduce costs by leveraging open source and cloud web infrastructure components

We have a client who recently came to us asking for advice after a planned $3M Oracle E-Business Suite implementation turned into a projected $15M, three-year project. We recommended they look at OFBiz and other open source e-commerce frameworks. Open source enterprise-level software, SaaS, and cloud computing have matured to the level that major organizations are leveraging these low-cost, scalable solutions to build a robust infrastructure that can replace big investments in hardware, software licenses, and data centers.

Take control of your content – Deploy a Content Management Solution

For many companies, fresh content is key to repeat visits. As sites scale, managing and maintaining them becomes an expensive and difficult task, often dependent on IT or external resources. Content management systems (CMS) give business users the ability to modify and update sites, and global templates make graphical changes easy to implement. They also provide the ability to segment users and to add personalization and social features such as blogs and community without the need for additional software and services.

User Experience Redesign

If your website has not gone through a redesign in the last three years, chances are it looks dated. What looks fresh and relevant changes all the time, and the key in the last few years has been the incorporation of user engagement and interactivity, quality content that speaks more directly to users, content targeting, and using sites as relationship-building tools rather than one-way communication streams. Sites need to add rich content, video, and mobile support as well as dynamic interfaces. All of these changes contribute substantially to improved website ROI.

Optimize sites for goals and conversion

It’s crucial that every marketing and search dollar is well spent. To do this, websites need strong web analytics so that they can be continuously optimized to maximize conversion while avoiding the main pitfalls. Web analytics capability allows businesses to test new ideas, layouts, and promotions and to quickly refine them to drive sales and traffic, as well as optimize search and marketing spend. With Google Analytics and other low-cost services, setting up great analytics does not have to mean big bucks.
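As a tiny illustration of the arithmetic your analytics tool is doing for you, here is a sketch comparing two page variants. The visit and conversion counts are invented; in practice they would come straight from your analytics reports.

```python
# Hypothetical A/B comparison of two landing-page variants. Counts are invented.

def conversion_rate(conversions, visits):
    return float(conversions) / visits if visits else 0.0

variants = {
    "current layout": {"visits": 12400, "conversions": 210},
    "new layout":     {"visits": 12150, "conversions": 263},
}

for name, v in variants.items():
    print(f"{name}: {conversion_rate(v['conversions'], v['visits']):.2%}")

# Relative uplift of the new layout over the current one.
base = conversion_rate(210, 12400)
test = conversion_rate(263, 12150)
print(f"uplift: {(test - base) / base:.1%}")
```

Before acting on a number like that, check it for statistical significance and make sure the test ran long enough to cover normal traffic cycles.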

IT Cost Cutting and Revenue Enhancing Projects

In the current economic climate, CIOs and IT managers are constantly pushed to “do more with less”. However, blindly following this mantra can be a recipe for disaster. These days IT budgets are getting squeezed and there are fewer resources to go around; however, literally trying to “do more with less” is the wrong approach. The “do more” approach implies that IT operations were not running efficiently and there was a lot of fat that could be trimmed – quite often that is simply not the case. It is not always possible to find a person or a piece of hardware that is sitting idle and can be cut from the budget without impacting something. In most IT departments there are still plenty of opportunities to save cost, but the “do more with less” mantra, taken literally, is flawed. The right slogan should be something along the lines of “work smarter” or “smart utilization of shrinking resources” – not exactly catchy, but it conveys what is really needed.

When times are tough, IT departments tend to hunker down and act like hibernating bears – they reduce all activity (especially new projects) to a minimum and try to ride out the winter, not recognizing the opportunity that a recession brings. A more productive approach is to rethink your IT strategy, initiate new projects that enhance your competitive advantage, cut those that don’t, and reinvigorate the IT department in better alignment with business needs and a more efficient cost structure. The economic climate and the renewed focus on cost reduction provide the much-needed impetus to push through new initiatives that couldn’t be done before. Corporate strategy guru Richard Rumelt says,

“There are only two paths to substantially higher performance, one is through continued new inventions and the other requires exploiting changes in your environment.”

Inventing something substantial and new is not always easy or even possible, but as luck would have it, the winds of change are blowing pretty hard these days, both in technology and in the business environment. Cloud computing has emerged as a disruptive technology and is changing the way applications are built and deployed. Virtualization is changing the way IT departments buy hardware and build data centers. There is a renewed focus on enterprise-wide information systems, and the emergence of new software and techniques has made business intelligence affordable and easy to deploy. These are all signs of major changes afoot in the IT industry. On the business side of the equation, the current economic climate is reshaping the landscape, and a new breed of winners and losers is sure to emerge. What is needed is the vision, strategy, and will to capitalize on these opportunities and turn them into competitive advantage. Recently a health care client of ours spent roughly $1 million on a BI and data strategy initiative and realized $5 million in savings in the first year due to increased operational efficiency.
 
Broadly speaking, IT initiatives can be evaluated along two dimensions: cost efficiency and competitive advantage. Cost efficiency describes a project’s ability to lower the cost structure and help you run operations more efficiently. Projects along the competitive advantage dimension provide greater insight into your business and/or market trends and help you gain an edge on the competition. Quite often projects along this dimension rely on an early mover’s advantage, which over time may turn into a “me too” as competitors jump aboard the same bandwagon. The life of such a competitive advantage can be extended by superior execution, but over time it will fade – think of the supply-chain automation that gave Dell its competitive advantage in its early years. Such projects should therefore be approached with a sense of urgency, as each passing day erodes the potential for higher profits. In this framework each project has a component of each dimension and can be plotted along both to help you prioritize the projects that can turn recession into an opportunity for gaining a competitive edge. Here are six initiatives that can help you break the IT hibernation, lower your cost structure, and gain an edge on the competition:

Figure 1: Categorization of IT Projects

Figure 2: Key Benefits
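To make the categorization in Figure 1 concrete, here is a minimal scoring sketch. The projects, the 0-10 scores, and the weighting are all invented; the only point is that putting rough numbers on both dimensions forces an explicit prioritization.

```python
# Hypothetical scoring of candidate IT projects along the two dimensions
# discussed above. Names, scores (0-10), and the weight are invented.

projects = [
    {"name": "Server virtualization",   "cost_efficiency": 8, "competitive_advantage": 3},
    {"name": "BI / data strategy",      "cost_efficiency": 5, "competitive_advantage": 9},
    {"name": "Email migration to SaaS", "cost_efficiency": 7, "competitive_advantage": 2},
]

def priority(project, advantage_weight=1.5):
    # Weight competitive advantage higher because its value erodes as
    # competitors catch up (the "me too" effect described above).
    return (project["cost_efficiency"]
            + advantage_weight * project["competitive_advantage"])

for p in sorted(projects, key=priority, reverse=True):
    print(f"{p['name']}: priority {priority(p):.1f}")
```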

In the current economic climate no project can go far without an ROI justification, and calculating ROI for an IT project, especially one that does not directly produce revenue, can be notoriously hard. While calculating ROI for these projects is beyond the scope of this article, I hope to return to the issue soon with templates to help you get through the scrutiny of the CFO’s office. For now I will leave you with the thought that ROI can be thought of in terms of three components:

  • A value statement
  • Hard ROI (direct ROI)
  • Soft ROI (indirect ROI)

Each one is progressively harder to calculate and requires an additional level of rigor and detail, but each improves the accuracy of the calculation. I hope to discuss this subject in more detail in future blog entries.
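Until those templates arrive, here is a minimal sketch of the hard (direct) ROI component only, using the health care example above; the three-year horizon is an assumption, and the value statement and soft ROI deliberately have no code here because they need their own estimation models.

```python
# Hypothetical hard-ROI calculation. Soft ROI (productivity, agility, risk
# reduction) is intentionally omitted; it requires a separate model.

def hard_roi(direct_savings_per_year, project_cost, years=3):
    """Direct ROI over the evaluation horizon, as a fraction of the cost."""
    return (direct_savings_per_year * years - project_cost) / project_cost

# The BI/data strategy example: roughly $1M spent, $5M saved per year.
print(f"3-year hard ROI: {hard_roi(5_000_000, 1_000_000):.0%}")  # 1400%
```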

Cloud Computing Trends: Thinking Ahead (Part 3)

In the first part of this series we discussed the definition of cloud computing and its various flavors. The second part focused on the offerings from three major players: Microsoft, Amazon, and Google. This third and final part discusses the issues and concerns related to the cloud, as well as possible future directions.

A company may someday decide to bring an application back in-house due to data security or cost concerns. An ideal solution would allow creation of a “private in-house cloud”, just as some product/ASP companies offer the option of running a licensed version in-house or as a hosted service. A major rewrite of existing applications in order to run in a cloud is probably also a non-starter for most organizations. Monitoring and diagnosing applications in the cloud is another concern. Developers must be able to diagnose and debug in the cloud itself, not just in a simulation on a local desktop. Anyone who has spent enough time in the trenches coding and supporting complex applications knows that trying to diagnose complex intermittent problems in a production environment by debugging in a simulated desktop environment is an uphill battle, to say the least. A credible and sophisticated mechanism is needed to support complex applications running in the cloud. Data and metadata ownership and security may also give companies dealing with sensitive information pause. The laws and the technology are still playing catch-up on some thorny issues around data collection, distribution rights, liability, and so on.

If cloud computing is to truly fulfill its promise, the technology has to evolve and the major players have to ensure that a cloud can be treated like a commodity, allowing applications to move seamlessly between clouds without a major overhaul of the code. At least some of the major players in cloud computing today don’t have a good history of allowing cross-vendor compatibility and are unlikely to jump on this bandwagon anytime soon. They will likely fight any efforts or trends to commoditize cloud computing. However, based on the history of other platform paradigm shifts, they would be fighting against market forces and the desires of their clients. Similar situations in the past have created opportunities for other vendors and startups to offer solutions that bypass the entrenched interests and provide what the market is looking for. It is not too hard to imagine an offering or a service that can abstract away the actual cloud running the application.

New design patterns and techniques may also emerge to make the transition from one cloud vendor to another easier. Not too long ago this role was played by design patterns like the DAO (data access object) and various OR (object-relational) layers that reduced database vendor lock-in. A similar trend could evolve in cloud-based applications.
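In the spirit of that DAO analogy, here is a minimal sketch of what such an abstraction layer might look like for blob storage. The interface and the in-memory implementation are hypothetical illustrations, not bindings to any real vendor SDK; a production version would add real adapters behind the same interface.

```python
# A DAO-style abstraction: application code depends only on the interface,
# while vendor-specific details live in interchangeable implementations.
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """What the application depends on, regardless of which cloud runs it."""

    @abstractmethod
    def put(self, key, data):
        ...

    @abstractmethod
    def get(self, key):
        ...

class InMemoryBlobStore(BlobStore):
    """Local/test implementation; a vendor adapter would wrap its own SDK."""

    def __init__(self):
        self._data = {}

    def put(self, key, data):
        self._data[key] = data

    def get(self, key):
        return self._data[key]

def archive_invoice(store, invoice_id, pdf_bytes):
    # Application logic never references a specific vendor.
    store.put(f"invoices/{invoice_id}.pdf", pdf_bytes)

store = InMemoryBlobStore()
archive_invoice(store, "2013-0042", b"%PDF-...")
print(store.get("invoices/2013-0042.pdf")[:4])
```

Swapping vendors then means writing one new adapter rather than touching every application that stores a file.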

All of the above is not meant to condemn cloud computing as an immature technology not ready for prime time. The discussion is meant to arm the organization with the potential pitfalls of a leading-edge technology that can still be a great asset under the right circumstances. Even today’s offerings fit the classic definition of a disruptive technology. Any organization that is creating a new application or overhauling an existing one must seriously consider architecting the application for the cloud. The benefits of instant scalability and “pay for only what you use” are too significant to ignore, especially for small to mid-size companies. Not having to tie up your cash in servers and infrastructure alone warrants serious consideration. Not having to worry about setting up a data center that can handle the load in case your application goes viral is liberating, to say the least. Any application with seasonal demand can also benefit greatly. If you are an online retailer, the load on your website probably surges to several times its average volume during the holiday shopping season. Buying servers to handle the holiday load, which then sit idle for the rest of the year, ties up capital that could have been used to grow the business. Cloud computing in its current maturity may not make sense for every enterprise. However, you should get a solid understanding of what cloud computing has to offer and adjust the way you approach IT today. This will position you to capitalize more cost-effectively on what it has to offer today and tomorrow.
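To put a rough number on that seasonal argument, here is a sketch comparing peak-provisioned servers with paying only for what you use. Every figure is invented, and the rented unit price is deliberately set higher than the owned-server price to show that the seasonal shape of the demand, not the unit cost, is what dominates.

```python
# Hypothetical retailer whose load triples for two months a year: own enough
# servers for the peak, or rent capacity only when it is needed. All figures
# are invented for illustration.

OWNED_SERVER_MONTHLY = 700   # amortized hardware + power + admin, per server
CLOUD_SERVER_MONTHLY = 900   # equivalent rented capacity, per server-month

baseline_servers = 10
peak_servers = 30
peak_months = 2

# Own the peak: pay for 30 servers twelve months a year.
own_the_peak = peak_servers * OWNED_SERVER_MONTHLY * 12

# Pay per use: baseline capacity all year, extra servers only during the peak.
pay_per_use = (baseline_servers * 12
               + (peak_servers - baseline_servers) * peak_months) * CLOUD_SERVER_MONTHLY

print(f"own the peak: ${own_the_peak:,}/year")   # $252,000
print(f"pay per use:  ${pay_per_use:,}/year")    # $144,000
```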