It’s the End of the World As We Know It!

The Holidays are a great time for watching “End of the World” shows on the History Channel. They were a great comfort, actually almost encouraging, because all of the prophecies target 2012. “The Bible Code II”, “The Mayan Prophecies”, and the big 2012 Special compendium of End of the World scenarios, covering Nostradamus to obscure German prophets, all agree that 2012 is the big one (Dec 21 to be exact!) What a relief! The rest of the news reports are trending to canned goods, shotguns, and gold by the end of the year. We really have almost a whole 48 months before everything goes bang (I wasn’t ready anyway, procrastination rules!).

Unfortunately, we need to do some IT planning and budgeting for the new year, and we probably should have some thoughts going out 36 months (after that, see the first paragraph). As I discussed in a prior blog, the reporting, BI/CPM/EPM, and analytics efforts are the strongest priority, followed by rational short-term cost-savings efforts. All organizations must see where they are heading and keep as much water bailed out of the corporate boat as possible. Easy call, job done!

Then again, a horrifying thought occurred to me: what if one of these initiatives should fail? (See my nightmares in prior blog posts on Data and Analytics.) I am not saying I’m the Mad Hatter and the CEO is the Red Queen, but my head is feeling a bit loosely attached at the moment. Management cannot afford a failed project in this environment, and neither can the CIO in any company (remember, CIO = Career Is Over).

The best way to ensure successful project delivery (and guarantee my ringside lawn chair and six-pack at Armageddon in 2012) lies in building on best practice and solid technical architecture.  For example, the most effective architecture is to use a layer of indirection between the CPM application (like Planning & Budgeting) and the source data systems (ERP, custom transactional).  This layer of indirection would be for data staging, allowing transfer to and from fixed layouts for simplified initial installation and maintenance.  In addition, this staging area would be used for data cleansing and rationalization operations to prevent polluting CPM cubes with uncontrolled errors and changes.  In terms of best practice, libraries and tools should be used in all circumstances to encapsulate knowledge, rather than custom procedures or manual operations.  Another best practice is to get procedural control of the Excel and Access jungle of wild and woolly data which stands ready to crash any implementation and cause failure and embarrassment to the IT staff (and former CIO).  When systems fail, it is usually a failure of confidence in the validity or timeliness of the information, whether presented by dashboard or simple report.
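To make the staging idea concrete, here is a minimal sketch (purely illustrative; the field names and cleansing rules are my own assumptions, not from any specific CPM product) of extracts landing in a fixed-layout staging step that cleanses rows before anything touches the cube:

```python
def cleanse_row(row):
    """Normalize one staged row; return None to quarantine it."""
    account = row.get("account", "").strip().upper()
    if not account:
        return None  # no account code: quarantine rather than pollute the cube
    try:
        amount = round(float(row.get("amount", "0")), 2)
    except ValueError:
        return None  # unparseable amount: quarantine for review
    return {"account": account,
            "period": row.get("period", "").strip(),
            "amount": amount}

def stage_and_cleanse(source_rows):
    """Split extracted rows into cube-bound clean rows and rejects for review."""
    clean, rejects = [], []
    for row in source_rows:
        cleaned = cleanse_row(row)
        if cleaned:
            clean.append(cleaned)
        else:
            rejects.append(row)
    return clean, rejects

# Hypothetical extract from an ERP source, in the fixed staging layout.
extract = [
    {"account": " 4000 ", "period": "2009-01", "amount": "1250.456"},
    {"account": "", "period": "2009-01", "amount": "99"},
    {"account": "5100", "period": "2009-01", "amount": "n/a"},
]
clean, rejects = stage_and_cleanse(extract)
print(len(clean), len(rejects))  # 1 2
```

The point of the indirection is visible in the split: the cube only ever sees the clean pile, while the rejects stay in staging where someone can fix them on purpose.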

CPM, EPM, and Analytics convey incredibly refined information, and organizations are making decisions of significant consequence, restructuring and investing, based on that information.  The information and decisions are only as good as the underlying data going into them.  So skimping on the proper implementation can put the CIO’s paycheck at serious risk (Ouch!).

IBM Announces Certification in Cloud Computing ... Huh?

IBM first announced a competency center in Cloud Computing, then a Certification, over the past couple of weeks.  Well, I guess the old Druid Priests of Mainframes should recognize the resurrection of their old God TimeSharing.  Here we are, back and rested from the Breast of Gaia, Greener than Green (Drum Roll Please…….): Cloud Computing!  (Cloud Computing quickly adjusts his costume makeover to hide Ye Olde TimeSharing’s wrinkled roots.)  Yes! Here I am: fresh, new, exciting, Web 2.0, Chrome Ready!  With me are the only guys (Big Smile from IBM!) who can Certify and Consult in My Mysteries…. IBM!

The more things change, the more they stay the same, but this pushes the Great Hype Engine to a new high (or low… ha ha ha).  I can understand IBM wanting to jump on the Cloud Computing bandwagon, but are we really ready for a certification?  No one is really sure what is in the Cloud, or how it operates, but IBM is ready to lock it and load it.  Yep, they are Certifiable! (Ha ha ha!)  While one can admire the desire to define and take a stand on Cloud Computing, this is one topic that requires a bit more “cook time” before full-scale avarice takes hold.

Cloud Computing is too “cloudy” and “amorphous” to define today.  While expertise and advice are required, there needs to be more independent vetting and best-of-breed component-level competition.  Full solution demo platforms need to be put together to elicit ideas and act as pilots.  Case studies and early adopters need to spring from these efforts before an organization bets the farm on a Cloud solution.  The existing ERP platforms did not come into being overnight, and Cloud solutions share an element of that interdependency and complexity (Rome was not built in a day!).  All of the elements of backup, disaster recoverability, auditability, service-level assurance, and security need to be in place before there can be a total buy-in to the platform.  The reputation of Cloud Computing does hang in the balance; all that is required is one high-visibility failure to set things back for potentially years (especially given the current macro environment).

Above all at this stage, a certain level of independence is required for evaluation and construction of possible solutions.  Evolution mandates independent competition (Nature Red in Tooth and Claw, Cage Fighting, Yes!).  Maturity brings vendor ecosystems and the all-consuming Application Stack, but not yet.  More innovation is required; we may not even have heard of the start-up that could win this game.

The Fog Has Engulfed Us Captain! What Do We Do?

The current business environment reminds me of being socked in by a fog bank within minutes, after being on a pleasant summer sail.  The entire episode puts the pucker-factor meter in the red zone.  One minute clear sun and a nice breeze, the next you can’t see your hand in front of your face.  Your other senses become more acute — suddenly you hear the splash of the waves on the rocks you cannot see (funny, I didn’t hear that a minute ago).  The engines of power boats are closer, seeming to come at you from every quarter (PT109, how bad can it be?).

As you sit in the cockpit with your canned-air fog horn and US Coast Guard approved paddle, you think that the portable marine radio you bought will not save your sorry carcass (at least you can get the Coast Guard to retrieve your drowned body as you go down).  You kick yourself for not buying that radar instead of the case of wine as a boating accessory (in fact, you think of downing some of that right now to ease your passing).  What you would not give for just a little visibility.

That’s what running a business feels like right now (makes you want to puke, doesn’t it, what fun).  My Kingdom for some Visibility!  Sure, you can see what the others are doing; cut a few heads here, shut a facility there.  Is that the right thing to do?  Are you killing your future seed corn or bailing the water which will sink the company?  Ugh!  In this case, you really wish your company’s reporting could be that radar to tell you where and where not to go (sure wish I got that CPM package rather than that sales meeting in Napa Valley).  With dashboards, planning and budgeting, consolidation, and operational BI, I would have a much better sense of what to feed and what to kill to take advantage of my competitors coming out of this economic fog (Aye Captain! In the Bay of the Blind, the One-Eyed Man is Admiral!).  Wishing and regrets won’t get you much, and capital investment at this point seems to be a dirty word (Yep, there it is on George Carlin’s list).

In the case of my sailing experience, the way I escaped the fog and fear was to dig out the depth finder the former owner left behind and the charts I bought because it seemed like a good idea at the time.  I then proceeded to steer the sailboat in circles, matching the readings on the depth finder against the depth readings on the chart, based on my dead reckoning of my location (you reckon wrong, you’re dead).  Needless to say it worked; the fog cleared, and I was within a quarter mile of where I should have been (Cool!).  Just straightening out existing corporate reports and cleaning existing data is the equivalent of using the depth finder and charts already on hand (Yes! I know the difference between capital and expense).  In fact, that effort usually saves money by eliminating old unused reports (Oh, I feel so green!).

In any case, take a solid first step by getting those state-of-the-art visibility tools of BI/CPM/EPM when the current problems clear or things become so dire as to require dry dock repairs.  That way, the pucker meter won’t be buried in the red the next time this happens, and it will.

Image courtesy of Herbert Knosowski, AP

From the Cloud to the Bunker, the Cold Splash of Reality

Cue the movie Aliens: “…we’re screwed man, it’s over, it’s over!  They’re going to come in here and they are going to…! Get a grip, Hudson!”.  That is what things feel like here at the moment.  We are just welding up the armor around the bunker, waiting for the Credit Crisis Aliens to get in and decimate IT with their acid blood and ability to plant parasites in our chests.  I guess we do need to get a grip and figure out what to do to shift gears for a new reality.

Anyone want to travel to the C-Suite (Alien Central) to request budget for Web 2.0, Cloud Computing, Chrome, or Green initiatives? (Just leave your dog tags and gear here, soldier; it will make it easier for us to split it up among ourselves.)  The whole thing makes me chuckle as I weld another piece of steel up over my door.  The first book I go for in situations like these, given my experience and training, is George Orwell’s “1984”.  Doublethink spin is the order of the day.  Green Computing becomes the High-Energy, Aggressive Server and License Rationalization Savings Initiative.  Cloud Computing becomes the Radical Infrastructure Outsourcing and Savings Program.  Web 2.0 becomes the Intensive Customer Acquisition and Support Cost Reduction Program by Having Them Do All of the Backoffice Work.  Everyone admit it: you’ve seen names like these before; look at the name of any Congressional bill, they use the same playbook.

Cynicism aside, the world has changed.  IT needs to focus on providing solid data and tools to aid in planning and budgeting for the company to move forward given the new reality.  Tactical cost-savings initiatives need to be put on the table to keep staff occupied in a productive manner.  This is the time to consolidate that server farm, outsource network configuration and maintenance, eliminate under-utilized software, and rationalize/outsource maintenance of the PC hardware base.  Each of these is a steel plate welded on the doors to keep the Aliens at bay.

Continue low-cost planning initiatives in new technology — all things pass, and this too shall pass in time.  IT needs to be ready to move forward without skipping a beat, and keeping this focus will help morale as well.  New technology is the source of most of the major productivity gains and cost savings of the last 20 years.  So the organization as a whole needs to stay tuned in to any opportunities coming over the horizon.

Plus, think of the fun of watching the trade press and the vendors being chased and harvested by the Aliens; it could not happen to a better group.  I cannot wait for the shift in editorial priority and ad focus.  Get your copies of “Aliens” and “1984” ready for reference!

Cloud Killer App: Looking for Love in All the Wrong Places

The new darling of the technical media and every product company, Cloud Computing, is searching for its Killer Application.  That seems to be the topic of every article and PR announcement.  Every show and seminar claims to have previews or insights into this great new Holy Grail. This Grail is the software that will launch the Cloud Computing platform to prominence and make everybody billions.  Really! Whatever they are smoking, can I get some too?  What totally scares me is being “one” with Larry Ellison.  How did I ever get into this philosophical state?

During prehistoric times as a college student, a professor of mine returned a paper I submitted with a simple comment: “If this is the solution, what was the problem again?”.  The professor gave me the Stalinistic “opportunity” to resubmit the paper with either the same or (hint, hint) a modified solution (wrong choice: Gulag for you).  Believing he was the south side of a north-bound mule, I knew there was a trick to this situation.  Disassembling the paper logical thread by logical thread revealed he was right; the solution the paper proposed did not map to the original case study problem, and an all-night typewriter-based rewrite was in order (I hate when that happens!).

Pardon the rambling dementia, but we have the same situation here: Cloud Computing does not necessarily lead to a new Killer Application.  Logically, Cloud Computing will lead to a new range of hardware, not software, innovation.  Cloud Computing presents the opportunity not to be enslaved to a classic server-based data center or even a PC.  It will supercharge mobile computing via advanced cellphones and drive further mobile gadget innovation.  Cloud Computing drives pervasive computing; that is its Killer Application.

Image courtesy of King Megatrip

Web 2.0: Rumors of My Death Are Greatly Exaggerated

Sometimes it is a good idea to step back and think after reading the breathless reporting on the Great Left Coast Technology Shows, TechCrunch 50 and Demo Fall. What is most interesting is some of the ensuing analysis.  For example, this piece basically says Web 2.0 is dead, because the offered Web 2.0 innovation was yet another photo site, friend network, etc.  Even Web 2.0’s death is old news.  Back in November 2006, Web 2.0 was already being pronounced dead, to be superseded by Web 3.0 (ugh!!! I haven’t got Web 2.0 straight yet).

What is going on?  How can I even remotely look intelligent as a technologist going for budget or capital to work with Web 2.0 technology?  Dead, not dead, no wait, it is Web 3.0.  This would make anybody think the IT profession as a whole was psychotic for even suggesting a value proposition incorporating Web 2.0 technology, within or without the company.

Perhaps a different view would help put all of the noise in perspective.  After recently reading “Engines That Move Markets: Technology Investing from Railroads to the Internet and Beyond” by Alasdair Nairn, one can apply the lessons learned from past cycles of technology adoption to Web 2.0.  While technologies such as railroads, electric lighting, and automobiles are dissimilar, they all tend to follow the same cyclic steps.  One of those early steps is the rise of copycats or “me-too-ism”.  Everybody wants to jump on that gravy train with biscuit wheels and hopes to tap into the investment cash stream moving into the new technology.  By gauging where you stand in the cycle, you will know when and where to invest.  So in this case Web 2.0 is not dead; it is merely signaling a move to the next stage.

The next stage will be corporate and organizational adoption, not necessarily the next great consumer Web site.  The consumer space has been the lead innovation ground for the Internet, with corporate and organizational use trailing.  So the consumer space is moving to the later consolidation phase for Web 2.0, while the corporate and organization space is beginning innovation and adoption.  Just the announcements by IBM for a social collaboration lab and Oracle for their Beehive initiative show the value of using this cyclic model as a lens for evaluation.

Now is really the time for corporations and organizations to begin to consider adoption of Web 2.0 technology with implementation studies and pilot programs.  The potential productivity gains and first-mover benefits will be huge for those who can begin the cultural changes necessary.  Because the technology drives more of a cultural and organizational change than a true technological change, there is little benefit to waiting for the technology to be “perfected”.  Instead, the organization’s culture needs to adapt to best practice in collaboration and analytics-driven evolution, and where people are concerned, it takes time to adapt and assimilate.

Image courtesy of gapingvoid.com

Chrome: Apple Looks Better All The Time

I should be biting my tongue, but the pain exploding in my brain from thinking this prevents me from doing anything further to my anatomy.  As one who escaped IBM’s totalitarian regime of the 1980s (run the Apple 1984 Super Bowl commercial), I cannot believe I want to return, even if Steve Jobs is cool and IBM was not.  Chrome is what is sending me there.

Does anybody think of the poor slobs shoveling coal in the bowels of IT support when they think up a new browser or (shudder!) yet another toolbar?  These unsung heroes are just turning the corner on the Safari onslaught — every user with an iPod (99.999998% approx.) had this disease-ridden Typhoid Mary installed on their PC auto-magically (thank you for the opt-out, Apple, not).  At least Chrome is “voluntary” at this point, requiring a mouse click for download, but given Google’s track record with their Toolbar, it is sure to be foisted on every unsuspecting PC in short order.  I can’t wait.

The best part about all of these revolutionary browsers is playing malware shell games with their developers: “We fixed some bugs, but we are not going to tell you which ones (Ha Ha Ha).”  Nothing personal, but what happened to “Don’t Be Evil”?  It is an oxymoron; name one marketing/advertising entity with morals (it started with Josef Goebbels and has been downhill ever since).

This week’s Economist has a much more interesting insight in its technology section. The bulk of the world will be accessing the Internet through their cell phones, based on cost, penetration, and true ubiquity.  This is the platform of the future and the one most in need of innovation and development (the greatest good for the greatest number, I always say).  Putting all of the resources of the Internet in the hands of the poor and repressed, truly flattening the world as put forward by Friedman, seems so right; squabbling over the desktops of the rich developed world seems so Evil (well, trivial and venal in any case).

I am not a Luddite (argh! I am having an existential moment); Chrome does have value beyond firing up the trade press and blog traffic (oops, did I say Chrome in my blog too?).  It legitimately tries to move the user experience up a level, deriving an informational level of interface instead of gratuitous data groveling at a list level.  More research needs to move in this direction as data volumes increase to the absurd.  One question we discussed: would cartoon character representation assist C-level executives’ understanding?  The answer is, of course, Yes! Only The Family Guy could illuminate those fixtures.

To Structure Or Not To Structure: That Is The Question

To structure or not to structure: that is the question: whether it is nobler in the mind to suffer the slings and arrows of metadata, ontology, and sixth canonical normal form, or to take up arms against 30 years of data structure dogma and piety, and by opposing them convert to Web 2.0 search technology (and potentially ruin my career, the remaining shards anyway).  Lately, I have felt as torn as Hamlet: stay with my data heritage, or end it all with radical Web 2.0 abandon.

As one who came out of the late 1970s as a DBA (database administrator), I religiously put flat files and hierarchical DBMSs (database management systems, IMS specifically) to the sword, evangelizing the purity of the CODASYL model and teleprocessing systems.  Naturally, I was put to the sword in turn by Codd, Date, and relational DBMSs.  Later, we fought back with object-oriented databases, but being older and wiser, detente reigned.  The only good data was analyzed and structured data, fourth normal form (sixth is extreme) at a minimum, all carefully placed in some DBMS so it could be transacted, searched, and reported. Ultimately, this drive to structured data has led to Business Intelligence (BI, an oxymoron, like military intelligence), Corporate Performance Management (CPM), and Executive Dashboards (picture the Elmo dashboard toys you strap to the baby’s crib: spin it, ring it, beep it, ha ha ha).

Like Galileo, unfortunately, I tasted some forbidden fruit, and it has haunted me for years.  I was first tempted by the BLOB construct, which allowed unstructured data to be put in a database container without the DBMS caring what it was; properly tagged, the unsearchable could be found, but it was still labor intensive.  My second taste came from being one of the sorry set of individuals to develop on Apple’s Newton platform (great haiku, bad handwriting recognition).  The development platform and runtime were a rich object soup, giving incredible flexibility as to what constituted data and instruction.  Now, I am severely tempted by HTML and Search in the guise of Web 2.0 (tie me to the stake and light me up, I confess).

Building out Enterprise Data Models for the average corporation or, even more difficult, Biomedical Data Stores for life sciences is an extremely labor-intensive, frustrating, and often futile endeavor.  The difficulty (cost, time) is directly correlated to the need for precise metadata and ontology.  Deriving, documenting, and retrofitting are massive efforts (and definitely not for the ADHD among us, who me?).  All of the investment is up front, before the first benefit can be realized (real scary, career-wise).  However, this is the “right”, dogmatic, safe way to handle data.

This is why our data stores are embarrassing data dumps (landfills, complete with dozers and sea gulls).  It is the difficulty and cost of proper classification and maintenance of data in a structured environment that feeds this end.  Think of it as data entropy, devolving to the most basic disorganized state.  If this basic unstructured state is where data is going, why not just leave it in its “natural” state?  Use human cognitive effort and Web 2.0 tools to promote the best and most useful data to the top of the heap, and let the stuff of dubious integrity drop and disappear into the gravel at the bottom of the big data (fish) tank.  Rather than spending all that investment up front before the first benefit, the process would be one of steady refinement over time.
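As a toy illustration of that data-entropy-plus-curation idea (the item names, decay rate, and cutoff below are invented for the sketch), score items by human use, decay the score with age, and let anything below a floor sink into the gravel:

```python
def rank_items(items, decay=0.5, floor=1.0):
    """items maps name -> (votes, age_in_periods); return survivors, best first."""
    # Score = usage votes, halved for each period of neglect (illustrative rule).
    scored = {name: votes * (decay ** age) for name, (votes, age) in items.items()}
    survivors = [name for name, score in scored.items() if score >= floor]
    return sorted(survivors, key=lambda name: -scored[name])

corpus = {
    "q4_forecast.xls": (40, 1),    # heavily used, fairly recent
    "old_region_map.doc": (3, 4),  # little use, aging: sinks out of sight
    "price_list.html": (8, 0),     # fresh and referenced
}
print(rank_items(corpus))  # ['q4_forecast.xls', 'price_list.html']
```

No up-front modeling, just steady refinement: the useful data floats, the dubious data quietly disappears.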

The raw data permeating the Web is greater than any structured data store and seems infinite in type and variety.  Like the ocean, people dip into it for what they require and what interests them, with ever-increasing success.  The rate of evolution of the supporting technology is astronomical. If we could put half the effort into molecule discovery that we put into Britney Spears’ antics, the world would be a much better place.

Web 2.0: Like Prego Spaghetti Sauce “It’s In There!”

It's in there!

Web 2.0 is giving me flashbacks to an old TV commercial for Prego spaghetti sauce: “Tomatoes, in there! Garlic, in there! Carrots, in there! Half of Italy, in there!…”  It seemed no matter what you asked for, it was in that bottle of sauce.  Being a sauce, how could you really tell what was in there, or if it was really needed?  Plus, the tomatoes colored everything red, so who knows?  Now we have another bottle of technical sauce here called Web 2.0; it’s in there!  It’s colored all Internet, so how can you tell what is really in there, or if it is really needed?

Good question; it seems like every vendor says they’re on the bottle’s list of ingredients, in fact the most important one.  It would be funny if it were not so pathetic.  Unfortunately, the smell here is not a nice bubbling spaghetti sauce; it is closer to a warm crock of….., you get the concept.  Every vendor out there seems to believe companies will blindly buy anything labeled Web 2.0. Rather, CIOs are more apt to remember the Internet bubble and where that approach got them the last time.

What is required is more definition of what Web 2.0 is, and why we in IT need to move in that direction.  To get that basic understanding, we need to break out that old spaghetti sauce pan again to boil off all the fancy analysis and obsequious technology.  Lo and behold! What remains is a simple concept: the inmates are now in control of the asylum.  Users of the Internet have turned the tables on the big players in the space; they are no longer happy being spoon-fed from a portal. The denizens want to hunt it on their own terms, see it their own way, and save it and dispose of it as they please.  If you stand in their way, this mob of Internet hunter-gatherers will crush you with the loss of their eyeballs (poor Yahoo, poor eBay, happy Facebook, happy iPhone).

If this basic principle is followed like a lodestone, much of what is occurring in the Internet space becomes more illuminating, and the proper path forward (with supporting technology) is a great deal clearer to discern.  For example, the winning companies embrace openness and external developers.  There is no way their internal staff alone can create, and their site push, enough content and functionality to stay on top.  The Tao of a top site is to be one with the masses; following is cool, attempting to push is uncool.  Allowing users to mash up specialty widgets into cool personal discoveries is winning; monetization will ultimately follow.

By this point, you are thinking: how is all this ethereal philosophic spew helping me?  I need to get something together that can be called Web 2.0 or my IT existence is at risk!  Do not worry, Grasshopper (I’m showing my ’70s again, rats!), I’ll put forward a corporate-friendly straw man.  If SharePoint is used to enable a project, process, or department, it is so Web 1.0 (boring!).  If we put the entire corporation up on SharePoint, acting like a corporate Facebook, we are getting there.  If we template it such that we now have ubiquitous collaboration, optimizing and moving our corporate intellectual property (IP) at light speed, much nicer.  But for ultimate coolness, we need to commit heresy and wire a Google search appliance in, after adding all of our corporate content to the pile: documents, presentations, everything.  Then the cherry on top: flatten key databases to HTML and toss them in.  Now, with proper organizational change management (Yes, Billy! You can run with scissors, points down please), employees can use all of the power contained in Web 2.0 to maximize unstructured corporate data for speed and profit.  Mangiare! Mangiare!
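The “flatten key databases to HTML” step is simpler than it sounds. Here is a rough sketch (using an in-memory SQLite table as a stand-in; the table and column names are hypothetical, and a real run would point at the actual corporate source and write pages somewhere the search appliance can crawl):

```python
import sqlite3

def table_to_html(conn, table):
    """Render every row of a table as one minimal, crawlable HTML page."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    header = "".join(f"<th>{c}</th>" for c in cols)
    body = "".join(
        "<tr>" + "".join(f"<td>{v}</td>" for v in row) + "</tr>" for row in cur
    )
    return f"<html><body><table><tr>{header}</tr>{body}</table></body></html>"

# Hypothetical "key database" standing in for the real corporate source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
page = table_to_html(conn, "customers")
print("Acme" in page and "<th>name</th>" in page)  # True
```

Once the rows are plain HTML, the search appliance treats them like any other document in the pile, and the structured data joins the unstructured free-for-all.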

Data, The Ugly Stepsister of Web 2.0

The basket of technology comprising Web 2.0 is a wonderful thing, worthy of all of the press and commentary it receives, but what really scares me is the state of data in this new world.  Data sits in the basement of this wonderful technology edifice: ugly, dirty, surrounded by squalor, and chained in place.  It is much more fun to just buy the next storage array (disk is cheap, infinite, what power bill?) than it is to grind through it, clean it up, validate it, and ensure proper governance and ontology.
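As a minimal sketch of what that grind-through-and-validate work might look like (the sample values and checks are illustrative assumptions, not any particular shop’s rules), profile a raw field before anyone puts Web 2.0 glass on top of it:

```python
def profile_field(values):
    """Report how dirty a raw field is: nulls, duplicates, numeric share."""
    non_null = [v for v in values if v not in (None, "", "N/A")]
    return {
        "total": len(values),
        "nulls": len(values) - len(non_null),
        "duplicates": len(non_null) - len(set(non_null)),
        "numeric": sum(1 for v in non_null
                       if str(v).replace(".", "", 1).isdigit()),
    }

# A typical basement-dwelling column: blanks, repeats, and a rogue word.
sample = ["100", "100", "", "N/A", "12.5", "twelve"]
print(profile_field(sample))
# {'total': 6, 'nulls': 2, 'duplicates': 1, 'numeric': 3}
```

A cheap profile like this at least tells you how big the beast in the basement is before a project commits to linking and synchronizing on top of it.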

What is Web 2.0 for, if not to expose more content? And data is the ultimate content.  Knowing what is hiding in the basement, there are going to be a lot of embarrassed organizations (Lucy, you got some ‘splaining to do!).  Imagine how difficult it is going to be to link and synchronize content and data in the Web 2.0 environment.  Imagine explaining the project delays and failures of Web 2.0 initiatives when the beast in the basement gets a grip on them.

Normally, the technology will be blamed.  Nobody wants to admit they store the corporate crown jewels in the local landfill.  Nobody will buy the new products fast enough.  The server farms being built to support Cloud Computing will sit spinning and melting Arctic Ice in vain (Microsoft’s container-based approach is cool).  This could seriously impact the market capitalization of our top tech giants Microsoft, Oracle, Google, Amazon.  Oh no! It could crash the stock market and bring on tech and financial Armageddon given our weakened state!  Even worse, my own career is at stake!  The devil with them, they are all rolling in money, I could starve!

Now that I have my inner chimp back in the box, we need to put together a mitigation strategy that allows for steady, phased improvement of the data situation in tandem with Web 2.0 initiatives.  It is too much to expect anybody to clean up the toxic data dump in one sitting, and we cannot tag Web 2.0 with the entire bill from years of neglect (just toss it in the basement, no one goes there).  If we do not ask IT to own up to the issue and instead allow projects to fail, senior management (fade to The Office) will assume the technology is at fault and will not allocate the resources needed to make this key technological transition.