If you are a healthcare provider or payer organization contemplating an initial implementation of a Business Intelligence (BI) Analytics system, there are several areas to keep in mind as you plan your program. The following key components appear in every successful BI Analytics program. And the sooner you can bring focus and attention to these critical areas, the sooner you will improve your own chances for success.
First, a small note on terminology: We often hear the term “business intelligence” used as the overarching label for these now-familiar graphical dashboard UIs that enable direct, end-user access to actionable metrics, insightful analytics and the rich, multi-dimensional substantiating data that underlie the top-level presentation. Interactive drill-down; rules-based transformation of data, derivation of facts, and classification of events; hierarchical navigation; and role-based access to progressively disaggregated and increasingly identified data sets are some of the commonly implemented capabilities. However, we have found that most client organizations will re-cast the “BI” moniker to a more subject- or mission-related name for the system and its intended objectives. “Clinical Intelligence”, “Quality Analytics” and “Financial Performance Management” are examples. These names are chosen to apply more directly to the mission, focus and objectives of the specific subject area(s) for which the system is being designed and deployed. But these systems often carry similar expectations with regard to the data and the end-user capabilities that will be present, and the following principles apply equally to all of them.
Key Technical Components
Often, the first questions the organization asks focus on the technical components: the platforms, tools and applications that will form the eventual solution. Which ETL tools should we use? Which cube design tools are best for our needs? How do we integrate with standard vocabulary services? What do we use to design and deploy our dashboards? These “how” choices should follow consideration of the “what”: the needs the system must serve.
The key technical macro-components for a BI analytics system will invariably include one form or another of each of the following capabilities:
The primary components for data receipt and uptake, most frequently receiving raw transaction data from upstream (source) operational and transactional (OLTP) systems, such as an electronic medical record (EMR) system, surgery scheduling system, billing and reimbursement system, or claims processing system. The portfolio of source systems can be diverse and extensive, and can include the messaging traffic that travels between and synchronizes these individual systems. Primary data that is captured and exchanged can range from standard, discrete data elements, to less structured collections (e.g. documents), to diverse binary formats originating from virtually any point of care device type. The primary purpose of these components is to capture the raw data (and/or meta-data) elements that reflect the domain-specific (e.g. clinical, financial) operational or event context in which the primary data originated, propagating this data downstream for consumption in an analytic context.
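To make the uptake step concrete, here is a minimal sketch in Python of extracting a few fields from a simplified HL7 v2-style pipe-delimited message, the kind of traffic that commonly flows between the source systems described above. The message content, MRN and event are hypothetical, and a real interface engine handles far more (encoding rules, escape sequences, repeating segments, acknowledgments).

```python
# Simplified HL7 v2-style message: segments separated by carriage
# returns, fields separated by pipes. Contents are hypothetical.
msg = ("MSH|^~\\&|EMR|HOSP|DW|HOSP|202401011200||ADT^A01|123|P|2.5\r"
       "PID|1||MRN12345||DOE^JANE")

# Index each segment by its three-letter identifier, split into fields.
segments = {line.split("|")[0]: line.split("|") for line in msg.split("\r")}

patient_id = segments["PID"][3]   # patient identifier field
event_type = segments["MSH"][8]   # message/event type field
```

A production pipeline would hand this parsing to an integration engine or an HL7 library rather than raw string splitting, but the shape of the task is the same: capture the raw elements and their event context, then propagate them downstream.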
Data obtained from a mission-focused transactional system will almost invariably need to be computed, mapped, translated, combined, aggregated, aligned or otherwise transformed to enable its consumption for more analytic (OLAP) applications. Mapping the raw source data elements onto a standard taxonomy and aligning them with a designated ontology or other conceptual model of the subject domain can enhance the stability and extend the useful life of your data. Various forms of derived data elements will arise, including various asymmetric relationships between elements existing in a data lineage. This serves to enrich the raw source data, both increasing its relevance and improving its consume-ability for the anticipated (and often unanticipated) analytic contexts.
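A toy Python sketch of this transformation step: mapping a hypothetical local lab code onto a standard code and deriving a new element from the raw values. The local codes, mapping table and reference range here are illustrative assumptions, not a real vocabulary service.

```python
# Hypothetical map from local source codes to standard codes
# (the target codes shown are LOINC-style examples).
LOCAL_TO_STANDARD = {
    "LAB-GLU": "2345-7",     # glucose
    "LAB-HBA1C": "4548-4",   # hemoglobin A1c
}

def transform(record: dict) -> dict:
    """Translate a raw transactional record into an analytics-ready row."""
    out = dict(record)
    # Map the raw source code onto the standard taxonomy.
    out["standard_code"] = LOCAL_TO_STANDARD.get(record["local_code"], "UNMAPPED")
    # Derived element: flag results above a (hypothetical) reference range.
    out["abnormal"] = record["value"] > record["ref_high"]
    return out

row = transform({"local_code": "LAB-GLU", "value": 140, "ref_high": 99})
```

The point of the sketch is the separation of concerns: the raw element survives unchanged, while the mapped and derived elements enrich it for analytic (OLAP) consumption.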
Pipelining raw data elements through various “enrichment engines” can increase their value and usefulness to a broader set of audiences. Mapping atomic-level clinical procedure encodings to higher levels in a hierarchy; assigning cases to service lines and resolving overlap conflicts to support differing analytic objectives; and linking isolated events to a network of terms in a standard vocabulary are all examples that can improve the consume-ability of individual or collections of data elements.
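One such enrichment step, sketched in Python: assigning a case to a service line via a hierarchy, with a simple priority rule to resolve overlap conflicts. The codes, candidate service lines and priorities are hypothetical placeholders for whatever grouper logic an organization actually adopts.

```python
# Hypothetical hierarchy: atomic procedure code -> candidate
# service lines (a code may legitimately map to more than one).
HIERARCHY = {
    "27447": ["Orthopedics", "Surgery"],
    "33533": ["Cardiac Surgery", "Surgery"],
}

# Overlap resolution: lower number wins. Different analytic
# objectives might use different priority tables.
PRIORITY = {"Cardiac Surgery": 0, "Orthopedics": 1, "Surgery": 2}

def assign_service_line(code: str) -> str:
    """Roll an atomic code up to a single service line."""
    candidates = HIERARCHY.get(code, [])
    if not candidates:
        return "Unassigned"
    return min(candidates, key=lambda s: PRIORITY.get(s, 99))
```

Chaining several such engines over the raw elements is what turns isolated transactions into data a broad set of audiences can actually consume.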
The volumes of data gathered and organized into an enterprise-scale data warehouse (EDW) or other integrated repository will require a spectrum of storage approaches to meet the competing demands of user consumption patterns and dynamics. The demands of some data consumers will require operational data stores (ODS) that combine data from multiple sources, and deliver it to the point of consumption in a timely manner, using various high-throughput staging strategies to align key elements and minimize the latency between the occurrence of the primary event and the availability of the combined data in the ODS. EDWs integrate (and often rationalize) information from multiple sources (and potentially multiple ODSs) often according to a comprehensive universal data model that reflects the primary data entities and their relationships, facilitating and enhancing the future consumption of the data in a wide variety of unanticipated use cases. Data cubes can emphasize and optimize the dimensional characteristics of the data entities, and facilitate the hierarchical segmentation, navigation and exploration of the captured subject domain.
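The dimensional view a cube provides can be illustrated with a few lines of Python: aggregating a fact table along two dimensions, with roll-ups precomputed for hierarchical navigation. The fact rows and dimensions are invented for illustration.

```python
from collections import defaultdict

# Hypothetical fact rows: (service_line, month, charge).
facts = [
    ("Orthopedics", "2024-01", 1200.0),
    ("Orthopedics", "2024-01", 800.0),
    ("Cardiology",  "2024-02", 2500.0),
]

# Accumulate cell totals plus roll-ups along each dimension,
# mimicking what a cube precomputes for drill-down/roll-up.
cube = defaultdict(float)
for service_line, month, charge in facts:
    cube[(service_line, month)] += charge   # atomic cell
    cube[(service_line, "ALL")] += charge   # roll-up over months
    cube[("ALL", month)] += charge          # roll-up over service lines
```

A real OLAP engine handles many dimensions, hierarchies and sparse storage, but the navigation experience it supports is exactly this: ask for any slice at any level and get an answer without rescanning the facts.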
Unstructured data elements or collections, including binary data types (e.g. images, voltage tracings, video, audio), present different storage requirements, often including explicit separation of indexing and other meta-data from the primary data.
The delivery of data is complete when the end-consumer has gained access to the desired data sets, using the required retrieval, analytics and presentation tools or applications, under the proper controls. The user experience might include dashboards or other interactive graphical data displays and navigational UIs; rule-driven alerts to notify critical parties of specific conditions or escalating events; analytic and predictive models for exploring hypotheses or projecting “what if” scenarios; and communication tools for distributing information to, and corresponding with, other stakeholders or trading partners on the underlying issues reflected in the information being consumed, and even for tracking their resolution.
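The rule-driven alerting mentioned above can be sketched in a few lines of Python. The rule names, metrics and thresholds here are hypothetical; a production system would pull rules from configuration and route notifications to the critical parties.

```python
# Hypothetical threshold rules an organization might configure.
RULES = [
    {"name": "ED wait time", "metric": "ed_wait_min", "threshold": 60},
    {"name": "Denial rate",  "metric": "denial_pct",  "threshold": 12.5},
]

def evaluate(metrics: dict) -> list:
    """Return alert messages for every rule breached by the metrics."""
    alerts = []
    for rule in RULES:
        value = metrics.get(rule["metric"])
        if value is not None and value > rule["threshold"]:
            alerts.append(f"{rule['name']}: {value} exceeds {rule['threshold']}")
    return alerts

alerts = evaluate({"ed_wait_min": 75, "denial_pct": 8.0})
```

The same evaluation loop can feed dashboards, escalation workflows or the issue-tracking communication tools described above.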
These are the recurring, primary, top-level building blocks. My next blog will delve into the program components you can apply to drive the definition, design and implementation of your analytics system.
Are you positioned for success?
Successful implementation of BI analytics requires more than a careful selection of technology platforms, tools and applications. The selection of technical components will ideally follow the definition of the organization’s needs for these capabilities. The program components outlined next time will offer a start on the journey to proactive embedded analytics, driving the desired improvement throughout your enterprise.