It’s the End of the World As We Know It!

The holidays are great for watching “End of the World” shows on the History Channel. They were a great comfort, actually almost encouraging, because all of the prophecies target 2012. “The Bible Code II,” “The Mayan Prophecies,” and the big 2012 special, a compendium of end-of-the-world scenarios covering Nostradamus to obscure German prophets, all agree that 2012 is the big one (December 21, to be exact!). What a relief! The rest of the news reports are trending toward canned goods, shotguns, and gold by the end of the year. We really have almost a whole 48 months before everything goes bang (I wasn’t ready anyway; procrastination rules!).

Unfortunately, we need to do some IT planning and budgeting for the new year, and we probably should have some thoughts going out 36 months (after that, see the first paragraph). As I discussed in a prior blog post, the reporting, BI/CPM/EPM, and analytics efforts are the strongest priority, followed by rational short-term cost-savings efforts. All organizations must see where they are heading and keep as much water bailed out of the corporate boat as possible. Easy call, job done!

Then again, a horrifying thought occurred to me: what if one of these initiatives should fail? (See my nightmares in prior blog posts on data and analytics.) I am not saying I’m the Mad Hatter and the CEO is the Red Queen, but my head is feeling a bit loosely attached at the moment. Management cannot afford a failed project in this environment, and neither can the CIO of any company (remember, CIO = Career Is Over).

The best way to ensure successful project delivery (and guarantee my ringside lawn chair and six-pack at Armageddon in 2012) lies in building on best practice and solid technical architecture. For example, the most effective architecture uses a layer of indirection between the CPM application (such as Planning & Budgeting) and the source data systems (ERP, custom transactional). This layer of indirection serves as a data staging area, allowing transfer to and from fixed layouts for simplified initial installation and maintenance. In addition, the staging area can host data cleansing and rationalization operations that keep uncontrolled errors and changes from polluting the CPM cubes. In terms of best practice, libraries and tools should be used in all circumstances to encapsulate knowledge, rather than custom procedures or manual operations. Another best practice is to get procedural control of the Excel and Access jungle of wild and woolly data, which stands ready to crash any implementation and cause failure and embarrassment for the IT staff (and former CIO). When systems fail, it is usually a failure of confidence in the validity or timeliness of the information, whether presented by dashboard or simple report.
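
To make the staging idea concrete, here is a minimal sketch in Python, using SQLite as a stand-in for the staging database. The table layout, sample rows, and cleansing rules are illustrative assumptions, not a prescription for any particular CPM or ERP product.

```python
import sqlite3

# Stand-in staging database; in practice this is a dedicated schema sitting
# between the ERP extracts and the CPM load process.
conn = sqlite3.connect(":memory:")

# Fixed-layout staging table: the ERP side always writes the same columns,
# so source-system changes don't ripple directly into the CPM cubes.
conn.execute("""
    CREATE TABLE stg_gl_actuals (
        entity     TEXT,
        account    TEXT,
        period     TEXT,
        amount_raw TEXT  -- loaded as text, validated during cleansing
    )
""")

# Simulated raw extract, including the kind of noise staging should catch.
conn.executemany(
    "INSERT INTO stg_gl_actuals VALUES (?, ?, ?, ?)",
    [
        ("US01", "4000", "2009-01", "125000.00"),
        ("us01 ", "4000", "2009-01", "12,500.00"),  # stray case/space, comma
        ("US02", "9999", "2009-01", "n/a"),         # unmapped account, bad amount
    ],
)

def cleanse(row):
    """Rationalize one staged row; return None to quarantine it."""
    entity, account, period, amount_raw = row
    entity = entity.strip().upper()
    try:
        amount = float(amount_raw.replace(",", ""))
    except ValueError:
        return None  # quarantine: amount is not a number
    if account == "9999":
        return None  # quarantine: account not mapped to the CPM chart
    return (entity, account, period, amount)

clean, quarantined = [], []
for row in conn.execute("SELECT * FROM stg_gl_actuals"):
    fixed = cleanse(row)
    if fixed is not None:
        clean.append(fixed)
    else:
        quarantined.append(row)  # held for review, not loaded into the cube

print("load to CPM:", clean)
print("quarantined:", quarantined)
```

The point of the fixed layout is that the ERP side and the CPM side only ever agree on the staging table, so either end can change without breaking the other.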

CPM, EPM, and analytics systems distill and convey incredibly refined information, and organizations are making decisions of significant consequence, restructuring and investing, based on it. The information and the decisions are only as good as the underlying data feeding them. So skimping on a proper implementation can put the CIO’s paycheck at serious risk (ouch!).

10 things you can do right now to improve your Enterprise Business Intelligence (BI) capabilities

  1. Document examples of manualytics (manual analytics) activities to expose hidden fixed costs. Any BI investment initiative needs executive support and budget, so you need to make a case for the investment and show the business an ROI (return on investment). Documenting what the current manualytics activity actually costs highlights the hidden overhead of the current way of doing business and builds consensus for improvement (see the cost sketch after this list).
  2. Identify manualytics processes to be moved to production and automated.
  3. Raise awareness of data as a corporate asset.
  4. Enlist and cultivate a C-level executive sponsor for your Enterprise BI effort.
  5. When the business asks a question that is difficult to answer, keep track of the level of effort expended to generate the information. How many analysts with spreadsheets are compiling information manually? When the answers’ accuracy is questioned, how much more time is spent proving the numbers are correct?
  6. Develop and document metadata wherever possible. Build metadata requirements gathering into your SDLC: create and standardize a process for capturing table and column definitions and business logic in a standard format. Get tribal knowledge documented so that the business can continue to operate if people leave or move on (see the metadata sketch after this list).
  7. Data governance: develop a committee charged with managing the data and IT assets of the organization.
  8. Create or assign data stewards for each source system to agree on service-level agreements and resolve data quality issues.
  9. Work to centralize your reference data. Business hierarchies like department and product need to be centralized and agreed upon by all stakeholders; this is a task your data governance committee can drive.
  10. Don’t boil the ocean. Look for candidate pilot projects with a narrow scope that deliver quick wins to the business (90 days max).
  11. Work toward tool standardization – many organizations own one of each BI tool – work to standardize on one or two.
  12. Build a Center of Excellence around BI and ETL – work to centralize your internal expertise for BI and ETL.
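
To make item 1 concrete, here is a back-of-the-envelope cost sketch in Python. Every figure in it is a hypothetical placeholder; the exercise only works when you substitute numbers documented from your own organization.

```python
# Hypothetical inputs -- replace with figures gathered from your own shop.
analysts            = 6     # people pulled into each manual fire drill
hours_per_question  = 16    # hours each analyst spends per question
questions_per_month = 4     # business questions answered manually
loaded_rate         = 75.0  # fully loaded cost per analyst-hour (USD)
rework_factor       = 1.5   # extra effort when the numbers are challenged

monthly_cost = (analysts * hours_per_question * questions_per_month
                * loaded_rate * rework_factor)
annual_cost = 12 * monthly_cost

print(f"Hidden manualytics cost: ${monthly_cost:,.0f}/month, "
      f"${annual_cost:,.0f}/year")
```

Even modest assumptions tend to produce an annual figure large enough to anchor an ROI conversation.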
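And for item 6, capturing metadata doesn’t have to wait for a fancy repository. This sketch, using only the Python standard library, shows one minimal shape for a data dictionary; the field names and sample entries are illustrative assumptions, not a standard.

```python
import csv
from dataclasses import dataclass, asdict

@dataclass
class ColumnDef:
    """One row of the metadata dictionary captured during the SDLC."""
    table: str
    column: str
    definition: str       # plain-English business meaning
    business_logic: str   # derivations, valid values, gotchas
    steward: str          # who to ask when the definition is disputed

# Tribal knowledge written down instead of living in someone's head.
entries = [
    ColumnDef("GL_ACTUALS", "AMOUNT",
              "Posted amount in local currency",
              "Credits stored as negatives; excludes intercompany eliminations",
              "J. Smith, Finance"),
    ColumnDef("GL_ACTUALS", "PERIOD",
              "Fiscal period in YYYY-MM form",
              "Fiscal year starts in April, so 2009-01 means April 2009",
              "J. Smith, Finance"),
]

# A shared CSV is a perfectly good first home for a data dictionary.
with open("data_dictionary.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(entries[0]).keys()))
    writer.writeheader()
    writer.writerows(asdict(e) for e in entries)
```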

Well, we ended up with twelve items, any one of which could fill a book or white paper and may be the subject of a future post.

As we work with different organizations, similar themes emerge. Every organization is different, and your road to BI maturity will not look exactly like another company’s. Sometimes it pays to have a fresh set of eyes survey your current state and get you started on the right foot.

What is Manualytics and why should I care?

Manualytics (manual analytics) is the labor-intensive, manual process of creating information. It involves finding, loading, correlating, and consolidating data into spreadsheets to answer a particular question.

The business leadership asks a question, which sends a number of analysts off to sift through the silos, usually armed with spreadsheets.

The answers from different departments don’t agree. That spawns a second round of analysts with spreadsheets trying to prove whose numbers are correct.

The business is left with an answer they don’t trust which leads to decision making with inordinate risk.

Does this sound familiar? Can you give examples in your organization that resemble this process? You are not alone. Manual analytics exists in virtually all organizations because the business can create questions faster than IT can provide answers.

The problems with Manualytics include:

  • It is inefficient and labor-intensive
  • It produces inconsistent results and can compound errors
  • It buries complex business logic in spreadsheets
  • It introduces uncertainty and confusion
  • It engenders mistrust of the data
  • It leads to risky decisions
  • It spawns second and third layers of analysis
  • It leaves data untraceable from target back to source (compliance, anyone?)

One of the largest problems is that Manualytics is a hidden overhead cost in many organizations. If manual spreadsheets (or any desktop system) are run every month and are business critical, then they need to be productionalized and automated, as in the sketch below.
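
As a minimal illustration of what “productionalized” can mean, here is a sketch of a monthly consolidation job in Python. The folder layout and column names are hypothetical; the point is that the same logic runs identically every month, unattended.

```python
import csv
import glob
from collections import defaultdict

# Hypothetical layout: each department drops a CSV with columns
# account,amount into a landing folder once a month.
totals = defaultdict(float)
for path in glob.glob("landing/*.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["account"]] += float(row["amount"])

# One consolidated report, produced identically every run -- no analyst
# retyping numbers, no competing versions of the answer.
with open("consolidated_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["account", "total_amount"])
    for account in sorted(totals):
        writer.writerow([account, f"{totals[account]:.2f}"])
```

Hung off a scheduler (cron or Windows Task Scheduler), this replaces the recurring fire drill with a repeatable production job whose output can be traced back to its source files.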

If you don’t have a program to improve your BI capabilities and limit or reduce the amount of manual analytics, then you are treading water at best.

This siloed architecture and these manualytics activities describe what we call a typical BI current state. We see this situation, to varying degrees and in one form or another, in virtually all organizations.

So how do we break out of this cycle?

How do we position our systems and people to obtain actionable information?

How do we overcome our siloed architectures and maximize our long term IT investments?