Six Core Principles for Transforming Healthcare – People, Technology, Data & Analytics

Digital technologies are changing the landscape of healthcare service delivery and raising patient expectations on where, when and how they engage with providers – and payers. Leading organizations are responding to these challenges and opportunities by implementing patient-centric communications and analytical tools and changing how they deliver core services – transforming their business models, operations and the patient experience in the process. To understand the legitimate potential offered by these tools, we need to unpack the buzzwords and examine the benefits and risks of specific digital capabilities – and then consider what they enable in a healthcare service delivery setting.

The following six core principles should be at the heart of every digital transformation initiative, large or small. While we have found these primary drivers to be applicable across various industry settings, here we outline their specific relevance to Healthcare.

1. Business Driven. Many digital technology initiatives in healthcare are driven by one or more core elements of the Triple Aim:

· Improve the health of populations – this principle is driving virtually every organization to identify and track populations of high-risk, over-utilizing patients; establish agreed-upon outcomes goals for defined segments and strata with similar characteristics or needs; and measure the impact of care plans tailored for each individual patient;

· Reduce the per-capita costs of care – value-based reimbursement programs and other risk-based arrangements are focusing attention on both clinical outcomes and financial results – driving the need for self-service analytics for patients, providers and payers – to measure the actual costs of care delivery for each patient;

· Improve the patient’s experience of receiving health care services – increasing transparency and coordinating patient-focused care across an expanding set of partners and providers helps to deliver the right care at the right time in the right setting – increasing patient satisfaction and improving compliance with care plans.

All the above elements are driving the need for better integration of primary service delivery processes and the resulting data streams – motivating an increasing availability of business intelligence (BI) and analytics capabilities and an omni-channel communication platform across the entire enterprise value chain. Digital technologies must be part of every aspect of the overall business-level strategy.

How are you anticipating the needs for and incorporating the capabilities of digital devices and data streams into your business execution and communications strategies?

2. Data is a Core Asset. Organizations that define, measure and adjust their operations using diverse and relevant data sets will realize many performance advantages – to the benefit of all stakeholders.

· Assembling Good Data – capturing enterprise information in digital format – and verifying the quality of those data sets against defined standards for completeness, accuracy and veracity – is an absolute foundation for preparing and enabling digital transformation. The core data systems for the execution of primary transactions and analysis of results must be credible and trustworthy – and this is only achieved – like any relationship – over a period of consistent behavior and positive results.

· Not a Simple Task – for many, this is a major challenge and a significant hurdle to overcome. Most operations are dependent upon data sets that originate in multiple legacy source systems – many of which are too narrowly focused or too closely aligned with aging or inflexible business applications. Understanding the actual contents of these older systems is challenging – envisioning their utility and engineering their transformation for novel purposes represents the “heavy lifting” of data integration. These efforts are difficult to quantify based on a direct ROI – and they are very often on the critical path to deploying and making effective use of newer digital technologies. However, opening these core assets to more transparent use by diverse participants will very often yield unanticipated benefits.

· Incremental Strategy – many organizations will not be able to re-architect their data systems from the ground up – in these cases, a more incremental approach is much more viable. Most organizations will begin with a more focused implementation, building the data supply lines to capture and move data from core operational sources into a data warehouse or set of data stores optimized for BI and analytics.

· Managing Data as an Asset – proactive data governance that designates authoritative sources, establishes and enforces quality criteria, defines and assigns roles and responsibilities for managing defined data sets, and facilitates the use of data for various purposes is a critical aspect of any successful implementation.

· Anticipating Scale – the incorporation of so-called “big data” is also growing in importance in healthcare. The volumes, variety and value of these expanding and emerging data sets are driving further elaboration of the data flows, validation criteria, storage approaches and dissemination paths for novel use cases and analytical applications.

3. Actionable Analytics. Digital interactions – whether improved access to diverse data sources or primary transactions – are most valuable when self-service users can make timely and informed decisions and take appropriate actions based on what the data is indicating.

As the scope, diversity and ubiquity of digital devices continue to grow, the capture and dissemination of data will spread – and more users will be better informed about both the specific details and the broader context of their operating choices.

· Patients can access their care plans – staying completely up to date on their responsibilities for medications, lab results, diet, exercise and follow-up appointments – and can monitor their overall progress toward agreed-upon clinical goals;

· Providers can access populations – and can stratify sub-segments of their panels according to clinical risk and compliance – tailoring their communications and interventions to keep patients on-track with their outcomes goals;

· Payers can review patient populations and provider networks – identifying attributed patient groups against value-based performance goals and profiling provider effectiveness in meeting clinical and financial goals on risk contracts and alternative payment models.

All these capabilities empower the various user groups to more clearly understand and localize the issues and factors underlying excellent or poor performance – and focus the reinforcing or remedial actions to the benefit of all stakeholders.

4. Patient-Centered Experience. A key driver and a widely recognized benefit of the increasing availability of digital technologies is their ability to both stimulate demand and meet the rising expectations of patients for convenient access to all forms of healthcare information and services through their hand-held or wearable devices.

· Ubiquity – the emergence of the “connected anywhere, information everywhere” operating experience has given patients greater power and influence in engaging and steering their relationship with providers. So-called “activated patients” are more equipped to make informed choices and take the initiative to research their conditions, identify and understand their care alternatives, communicate and coordinate with care providers, exchange stories and find support from other patients in shared-need communities, set agreed-upon goals for their care with their providers, and measure their results.

· Flexibility – providers can no longer hold fast to rigid or single-stream operating models – imposing their internal structures, processes and workflows onto patients from the inside out. For digitally-enabled patients, the care experience is becoming much more of a self-directed journey. Providers who recognize this reorientation to facilitate the “Patient Journey” and unbundle and organize their delivery of services according to this revised model will realize greater patient satisfaction with their own care experiences, better compliance with care plans, and improved outcomes – both clinical and financial.

· Adaptability – similarly, payers are coming under increasing pressure to unbundle and adapt to the disaggregating needs and demands of their patients (members). Patients are seeking customized configurations of benefits packages that are more cost-effective and focused on their specific anticipated needs for services. These trends will continue to play out as more patients enter the individual market for health insurance products and payers are forced to adapt and devise new benefits plans.

5. Agile Technologies, Agile Processes. Agility must be a core value throughout the transformation effort – it must pervade every aspect of envisioning, defining, designing and implementing solutions in this continually evolving setting. The unbundling of service components and their flexible deployment and execution on-demand to patients and other users will create new challenges for providers.

· Feedback and Response – having an agile structure will enable more responsive delivery models – and capturing data at each point of interaction and each touch point along the Patient Journey enables near-real-time analysis of service delivery, care compliance, and their impact on outcomes. It allows feeding of detailed care experience data back into processes and workflows to enable greater personalization, better communication, and more accurate and effective segmentation for population analytics.

The commitment to an agile operation carries additional demands and benefits that must be considered as part of the transformation strategy:

· De-Coupling – tightly coupled databases, applications, custom code, execution logic and various other technical components can complicate the process of revising or enhancing services and their operation – an agile architecture will mandate an explicit de-coupling and un-bundling of tightly-bound components.

· Rapid Application Development – the technical environment and the operational culture must encourage and enable experimentation – where minimally vetted ideas can be prototyped and evaluated – facilitating an ongoing and in some sense relentless exploration of new areas for improvement or innovation.

· Infrastructure – the cloud explicitly provisions a clearly defined, precisely tuned and proactively managed capacity for service delivery and data access – ready to activate (or deactivate) as demand ramps up and down. This responsive, adaptive provisioning of computing capacity increases both the effectiveness and the efficiency of business operations – and the satisfaction of stakeholders.

· Cloud Orchestration – these unbundling and decoupling features combine to enable and facilitate a more agile operation. The execution model for the primary data sources and system services becomes one of flexible activation and deactivation of cloud-deployed capacities at a more granular level – tuned to the needs and demands of the external users rather than the constraints of internal operations.

6. Security & Access Control – the increased openness of these services demands more rigorous and reliable levels of security – including data security, application security, data encryption, compliance with regulations, and more informative monitoring of the ongoing state of the systems. Threats to on-line computing resources continue to rise as the incidence of hacking, data stealing, and denial of service attacks increases in number and sophistication. Added attention to risk management, strict adherence to appropriate security standards, and regular audits must be part of any such initiative.

The increasing availability of digital technologies is reinforcing expectations of timeliness, flexibility and convenience with patients, care givers, providers and payers in an evolving ecosystem of service delivery and information exchange. The relentless focus on quality and outcomes, cost control, value creation, and satisfaction will continue to drive innovation in service delivery across an expanding and diversifying network of healthcare industry participants. Organizations and individuals that respond and adapt will realize distinct advantages in both clinical and financial performance.

Assumptions Are a Necessary Evil

In over 27 years, I have never experienced a major problem on a systems implementation that did not begin with an assumption.

“Of course they can do it; they have a ton of experience.”
“Of course the development servers are being backed up.”
“Of course the new system can do that; it’s a tier 1 ERP – how can it not do it?”
“Of course there’s a compatible upgrade path; the vendor’s web site said so.”

Yeah, well, not always.

Fear the statement that begins, “Of course…”. A handy web dictionary defines an assumption as “a thing that is accepted as true or as certain to happen, without proof.”

So, assumptions are bad and should be eliminated. If you get rid of all assumptions, then you are good to go, right?

Yeah, well, not always.

Why? Because eliminating all assumptions takes time. It takes a lot of time and costs a ton of money.

Consider a project to select a new ERP system. A well-architected project that includes a good process and the right level of participation from the right people generally takes six months for an average mid-sized manufacturer. If you hit that schedule, you have made a lot of assumptions, whether you know it or not. Why? Because if you tried to eliminate every possible assumption, that same selection project would take years, if it could be finished at all.

The pace of change within your technology environment – not to mention your business and the tools you are considering – turns a nicely bounded selection project into a fruitless attempt to match your knowledge and certainty to things that are constantly evolving. There would be no end point in that scenario. By the time you had eliminated all assumptions, the people and technology would have evolved out from underneath all your hard-won knowledge.

So, we have a conundrum: if you make assumptions, you will screw up; yet if you don’t make assumptions, you cannot proceed. Your options appear to be limited. Certainly, there are situations that require eliminating all assumptions – I’m thinking here of building a space shuttle. But if you aren’t shooting for the moon with your project, what do you do?

You must make assumptions to move forward, while balancing them against overall risk. You may never get to the point where you make assumptions your ally, but you can at least reach a cautious neutrality with them.

EDGEWATER EXPERT’S CORNER: Diving into the Deeper End of SQL – Part 1

SQL is something of a funny language insofar as almost every developer I have ever met seems to believe they are “fluent” in it, but the fact of the matter is that most developers just wade around in the shallows and never really dive into the deep end. Instead, from time to time we get pushed into the deep end, learning additional bits and pieces and expanding our vocabulary simply to keep from drowning.

The real challenge here is that there are several dialects of SQL and multiple SQL-based procedural languages (e.g., PL/SQL, T-SQL, Watcom-SQL, PL/pgSQL, NZPLSQL), and not everything you learn in one dialect is implemented the same way in the others. In 1986, the ANSI/ISO SQL standard was created with the objective of SQL interoperability across RDBMS products. Unfortunately, since the inception of this standard and through every subsequent revision (8 in all), there are still no database vendors that adhere strictly to it. Individual vendors instead choose to add their own extensions to the language to provide additional functionality. Some of these extensions come full circle and get folded into later versions of the standard; others remain product specific.

Something of a long-winded introduction, but necessary for what I want to discuss. Over the coming months I will be posting some write-ups on the deeper end of SQL, discussing topics aimed at expanding our SQL vocabularies. Today, I want to talk about window functions. These were introduced as part of the 2003 revision to the ANSI/ISO SQL standard. Window functions are probably one of the most powerful extensions to the SQL language ever introduced, and most developers – yes, even the ones who consider themselves fluent in SQL – have never even heard of them. The short definition of a window function is a function that performs a calculation or aggregation across a set of rows within a partition of a dataset having something in common. Something of a lackluster definition, you say? I agree, but before you click away, take a peek at the examples below and I am sure you’ll find something useful.

For starters, I would like to explain what a “window” of data is. Simply put, a window of data is a group of rows in a table or query with common partitionable attributes shared across rows. In the table below, I have highlighted 5 distinct windows of data. The windows in this example are based on a partition by department. In general, data windows can be created with virtually any foreign key that repeats in a dataset, or any other repeating value in a dataset. [Image]

Example 1: Ranked List Function – In this example, using the RANK function, I will create a ranked list of employees in each department by salary. Probably not the most exciting example, but think about alternate methods of doing the same without the RANK function, and the simple query below gets really ugly – quick. [Image]
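A minimal sketch of such a query – assuming a hypothetical EMP table with NAME, DEPARTMENT_ID and SALARY columns:

SELECT department_id,
       name,
       salary,
       -- Rank employees within each department window, highest salary first.
       RANK() OVER (PARTITION BY department_id ORDER BY salary DESC) AS salary_rank
FROM   emp;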

Example 2: Dense Ranked List Function – Similar to the RANK function, but the DENSE_RANK value is the same for members of the window having the same salary value. [Image]
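The same sketch with DENSE_RANK, against the same hypothetical EMP table:

SELECT department_id,
       name,
       salary,
       -- Equal salaries get the same rank, and the next distinct salary gets
       -- the next consecutive rank (no gaps, unlike RANK).
       DENSE_RANK() OVER (PARTITION BY department_id ORDER BY salary DESC) AS salary_rank
FROM   emp;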

Example 3: FIRST and LAST Functions – Using the first and last functions, we can easily get the MIN and MAX salary values for the department window and include them with our ranked list. Yup, you are sitting on one row in the window and looking back to the first row and forward to the last row of the same window, all at the same time! No cursors needed! [Image]
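In Oracle, one common way to write this is with the FIRST_VALUE and LAST_VALUE analytic functions – a minimal sketch against the same hypothetical EMP table:

SELECT department_id,
       name,
       salary,
       RANK() OVER (PARTITION BY department_id ORDER BY salary DESC) AS salary_rank,
       -- With the window ordered by salary descending, the first row is the MAX...
       FIRST_VALUE(salary) OVER (PARTITION BY department_id
                                 ORDER BY salary DESC) AS max_salary,
       -- ...and the last row is the MIN. The explicit frame clause is needed
       -- because the default frame stops at the current row.
       LAST_VALUE(salary) OVER (PARTITION BY department_id
                                ORDER BY salary DESC
                                ROWS BETWEEN UNBOUNDED PRECEDING
                                         AND UNBOUNDED FOLLOWING) AS min_salary
FROM   emp;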

Example 4: LEAD and LAG Functions – These two are without a doubt a couple of the most amazing functions you will ever use. The LAG function allows us to be sitting on one row in a data window and then look back at any previous row in the window of data. Conversely, the LEAD function allows us to be sitting on one row in a data window and then look forward at any upcoming row in the window of data.

Syntax:

LAG (value_expression [,offset] [,default]) OVER ([query_partition_clause] order_by_clause)

LEAD (value_expression [,offset] [,default]) OVER ([query_partition_clause] order_by_clause)

In the illustration below, from within the context of the data window, I am looking up at the previous record and down at the next record and presenting that data as part of the current record. To look further ahead or behind in the same data window, simply change the value of the offset parameter. Prior to the introduction of these functions, mimicking the same functionality without a cursor was essentially impossible – and now, with a single simple line of code, I can look up or down at other records from the current record. Just too darn cool! [Image]
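A minimal sketch of that illustration, again against the hypothetical EMP table:

SELECT department_id,
       name,
       salary,
       -- Look back one row in the department window (NULL on the first row).
       LAG(salary, 1)  OVER (PARTITION BY department_id ORDER BY salary DESC) AS prev_salary,
       -- Look ahead one row in the department window (NULL on the last row).
       LEAD(salary, 1) OVER (PARTITION BY department_id ORDER BY salary DESC) AS next_salary
FROM   emp;

Changing the offset from 1 to 2 would look two rows back or ahead within the same window.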

Example 5: LEAD and LAG Functions – Just another example of what you can do with the LEAD and LAG functions to get you thinking. In this example, our billing system has a customer credit limit table where, for each customer, a single record is active and historical data is preserved in inactive records. We want to add this table to our data warehouse, but bring it in as a type-2 dimension, and we need to end-date and key all the records as part of the process. We could write a cursor and loop through the records multiple times to calculate the end dates and then post them to the data warehouse – or, using the LEAD function, we can calculate each end date based on the create date of the next record in the window (see the sketch after the illustrations below). The two illustrations depict the data in the source (billing system), then in the target data warehouse table. All of this with just a dozen lines of SQL using window functions – how many lines of code would this take without them?

Data in source billing system. [Image]

Transformed data for load to data warehouse as T-2 dimension. [Image]
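A minimal sketch of that transformation – assuming a hypothetical CUSTOMER_CREDIT_LIMIT table with CUSTOMER_ID, CREDIT_LIMIT and CREATE_DATE columns (the far-future end date is illustrative):

SELECT customer_id,
       credit_limit,
       create_date AS effective_date,
       -- End-date each record one day before the next record's create date;
       -- the latest record in each customer window gets a far-future end date.
       LEAD(create_date - 1, 1, DATE '9999-12-31')
           OVER (PARTITION BY customer_id ORDER BY create_date) AS end_date,
       -- The record with no following row in its window is the active one.
       CASE WHEN LEAD(create_date, 1)
                 OVER (PARTITION BY customer_id ORDER BY create_date) IS NULL
            THEN 'Y' ELSE 'N'
       END AS active_flag
FROM   customer_credit_limit;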

Example 6: LISTAGG Function – The LISTAGG function allows us to return the values of a column from multiple rows as a single delimited column, aka a “multi-valued field” – remember PICK or Revelation? [Image]
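A minimal sketch against the same hypothetical EMP table – one ordered, comma-separated list of employee names per department:

SELECT department_id,
       -- Collapse each department's employee names into a single delimited value.
       LISTAGG(name, ', ') WITHIN GROUP (ORDER BY name) AS employee_names
FROM   emp
GROUP  BY department_id;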

One closing note: all of the examples shown were in Oracle, but the equivalent functionality also exists in Microsoft SQL Server, IBM DB2, Netezza and PostgreSQL.

So what do you think? Ready to dive into the deep end and try some of this? At Edgewater Consulting, we have over 25 years of successful database and data warehouse implementations behind us, so if you’re still wading in the kiddie pool – or worse yet, swimming with the sharks – give us a call and we can provide you with a complimentary consultation with one of our database experts. To learn more about our consulting services, download our new Digital Transformation Guide.

EDGEWATER EXPERT’S CORNER: The Pros and Cons of Exposing Data Warehouse Content via Object Models

So you’re the one that’s responsible for your company’s enterprise reporting environment. Over the years, you have succeeded in building out a very stable and yet constantly expanding and diversifying data warehouse, a solid end-user reporting platform, great analytics and flashy corporate dashboards. You’ve done all the “heavy lifting” associated with integrating data from literally dozens of source systems into a single cohesive environment that has become the go-to source for any reporting needs.

Within your EDW, there are mashup entities that exist nowhere else in the corporate domain and now you are informed that some of the warehouse content you have created will be needed as source data for a new customer service site your company is creating.

So what options do you have to accommodate this? The two most common approaches that come to mind are: a) generating extracts to feed to the subscribing application on a scheduled basis; or b) just give the application development team direct access to the EDW tables and views. Both methods have no shortage of pros and cons.

  • Extract Generation – Have the application development team identify the data they want up front and, as a post-process to your nightly ETL run cycles, dump the data to the OS, leaving consumption up to the subscribing apps.
Pros:
  • A dedicated extract is a single daily/nightly operation that will not impact other subscribers to the warehouse.
  • Application developers will not be generating ad hoc queries that could negatively impact performance for other subscribing users’ reporting operations and analytics activity.
Cons:
  • You’re uncomfortable publishing secure content to a downstream application environment that may not have the same stringent user-level security measures in place as the EDW has.
  • Generating extracts containing large amounts of content may not be the most efficient method for delivering needed information to subscribing applications.
  • Nightly dumps or extracts will only contain EDW data that was available at the time the extracts were generated and will not contain the near-real-time content that is constantly being fed to the EDW – and that users will likely expect.
  • Direct Access – Give the subscribing application developers access to exposed EDW content directly so they can query tables and views for the content they want as they need it.
Pros:
  • It’s up to the application development team to get what they need, how they need it and when they need it.
  • More efficient than nightly extracts, as the downstream applications will only pull data as needed.
  • Near-real-time warehouse content will be available for timely consumption by the applications.
Cons:
  • You’re uncomfortable exposing secure content to application developers that may not have the same stringent user-level security measures in place as the EDW has.
  • Application developers will be generating ad hoc queries that could negatively impact performance for other subscribing users’ reporting operations and analytics activity.

While both of the above options have merits, they also have a number of inherent limitations – with data security being at the top of the list. Neither of these approaches enforces the database-level security that is already implemented explicitly in the EDW – side-stepping this existing capability will force application developers to either reinvent that wheel or implement some broader, but generally less stringent, application-level security model.

There is another option, though, one we seldom consider as warehouse developers. How about exposing an object model that represents specific EDW content consistently and explicitly to any subscribing applications? You may need to put on your OLTP hat for this one, but hear me out.

The subscribing application development team would be responsible for identifying the specific objects (collections) they wish to consume and would access these objects through a secured procedural interface. On the surface, this approach may sound like you and your team will get stuck writing a bunch of very specific custom procedures, but if you take a step back and think it through, the reality is that your team can create an exposed catalog of rather generic procedures, all requiring input parameters, including user tokens – so the EDW security model remains in charge of exactly which data is returned to which users on each retrieval.
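To make the pattern concrete, here is a minimal PL/SQL sketch of one such generic “Get” method – the view and security-package names (CUSTOMER_PROFILE_V, EDW_SECURITY) are hypothetical, and a real catalog would contain many procedures following this same shape:

CREATE OR REPLACE PROCEDURE get_customer_profile (
    p_user_token  IN  VARCHAR2,        -- caller's token; the EDW security model decides what is visible
    p_customer_id IN  NUMBER,          -- required filtering criteria
    p_page_number IN  NUMBER DEFAULT 1,
    p_page_size   IN  NUMBER DEFAULT 100,
    p_results     OUT SYS_REFCURSOR    -- the requested object collection
) AS
BEGIN
    -- Validate the request against the existing EDW security model first.
    IF NOT edw_security.is_authorized(p_user_token, 'CUSTOMER_PROFILE') THEN
        RAISE_APPLICATION_ERROR(-20001, 'User is not authorized for this content.');
    END IF;

    -- Return only the requested page of rows to keep network traffic in check.
    OPEN p_results FOR
        SELECT *
        FROM  (SELECT cp.*,
                      ROW_NUMBER() OVER (ORDER BY cp.customer_id) AS rn
               FROM   customer_profile_v cp
               WHERE  cp.customer_id = p_customer_id)
        WHERE  rn BETWEEN (p_page_number - 1) * p_page_size + 1
                      AND p_page_number * p_page_size;
END get_customer_profile;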

The benefits of this approach are numerous, including:

  • Data Security – All requests leverage the existing EDW security model via a user token parameter for every “Get” method.
  • Data Latency – Data being delivered by this interface is as current as it is in the EDW so there are no latency issues as would be expected with extracted data sets.
  • Predefined Get Methods – No ad hoc or application-based SQL being sent to the EDW. Only procedures generated and/or approved by the EDW team will be hitting the database.
  • Content Control – Only the content that is requested is delivered. All Get methods returning non-static data will require input parameter values for any required filtering criteria – all requests can be validated.
  • Data Page Control – Subscribing applications will be responsible not only for identifying what rows they want via input parameters, but also for how many rows per page they receive – keeping network traffic in check.
  • EDW Transaction Logging – An EDW transaction log can be implemented with autonomous logging that records every incoming request, the accompanying input parameters, the number of rows returned and the duration it took for the transaction to run. This can aid performance tuning for the actual request behaviors from subscribing applications.
  • Object Reuse – Creation of a generic exposed object catalog will allow other applications to leverage the same consistent set of objects providing continuity of data and interface across all subscribing applications.
  • Nested and N Object Retrieval – Creation of single Get methods that can return multiple and/or nested objects in a single database call.
  • Physical Database Objects – All consumable objects are physically instantiated in the database as user-defined types based on native database data types or other user-defined types.
  • Backend Compatibility – It makes no difference what type of shop you are – Oracle, Microsoft, IBM, PostgreSQL or some other mainstream RDBMS – conceptually, the approach is the same.
  • Application Compatibility – This approach is compatible with both Java and .NET IDEs, as well as other application development platforms.
  • Reduced Data Duplication – Because data is directly published to subscribing applications, there is no need for subscribers to store that detail content in their transactional database, just key value references.

There are also a few cons that need to be weighed when considering this path:

  • EDW Table Locks – The warehouse ETL needs to be constructed so that tables that are publishing to the object model are not exclusively locked during load operations, eliminating brown-out situations for subscribing applications.
  • Persistent Surrogate Keys – EDW tables that are publishing data to subscribing applications via the object model will need to have persistent surrogate primary keys so that subscribing applications can locally store key values obtained from the publisher and leverage the same key values in future operations.
  • Application Connection/Session Pooling – Each application connection (session) to the EDW will need to be established based on an EDW user for security to persist to the object model, so no pooling of open connections.
  • Reduced Data Duplication – This is a double-edged sword in this context, because subscribing applications will not be storing all EDW content locally. As a result, there may be limitations to the reporting operations of subscribing applications. However, the subscribing applications can also be downstream publishers of data to the same EDW and can report from there. Additionally, at the risk of convoluting this particular point, I would also note that “Set” methods can be created which would allow the subscribing application(s) to publish relevant content directly back to the EDW, eliminating the need for batch loading back to the EDW from those applications. Probably a topic for another day, but I wanted to put it out there.

So, does that sound like something that you may just want to explore? For more information on this or any of our offerings, please do not hesitate to reach out to us at makewaves@edgewater.com. Thanks!

Our IASA 2017 Key Takeaways

We’d like to offer a big thank you to IASA for a great event. More than 1,000 attendees came together in Orlando for four days of interactive meetings and presentations focused on the transformation taking place in the Insurance industry.

The 2017 IASA Conference lived up to the hype as the most talked-about and best-attended insurance conference of the year. As I walked around the show floor, attendees and exhibitors were engaging in discussions that centered on the challenges the insurance industry is facing today. Topics ranged from the industry being at a critical tipping point and the need for transformation, to heightened customer expectations and the concept of defining your customer personas to understand who they are rather than trying to give them what the industry feels is right, to the disruption that InsurTech is driving.

Beyond Technology, Becoming Customer Obsessed

I had the pleasure of speaking to many business professionals throughout the event. We discussed a variety of topics, but the common thread across all our conversations was the evolution that is taking place in insurance and the steps required to transform your business to survive. While fear of change today is tough, irrelevance tomorrow…is worse.

Many of my conversations centered on how to start building a transformation strategy. Most companies make the mistake of focusing more on the digital than the transformational. This makes the strategy technology-focused rather than business driven. Digital transformation is a business initiative where technology plays an enabling role. We recommend using technology to support, not guide, your strategy.

The challenge is how to redefine the customer journey using technology as an enabler of change. It’s a continuous process – strategy and execution combined. Too often one comes without the other. Too often we think about the technology first, then figure out how to make it work. So, which steps should you take first for the biggest impact?

At Edgewater, we recommend you start by putting consumer engagement at the center of your transformation strategy – your customer personas. Then, you’ll know the right questions to ask: “How will we personalize our products and services for the consumer? How can we unify all of our touchpoints to create a better experience? How should we extract insights from the data we are collecting to deliver future value?” Finally, you’ll see that it doesn’t have to be all or nothing.

Every day we are helping insurers become customer obsessed by building a strategy that focuses on what their customers want – a strategy beyond technology.

A few insights from this year’s event

First, we loved hearing from leading insurers about their own goals for digital business transformation and the role we at Edgewater can play in being a strategic partner to help them accelerate their journey. We have a deep understanding of the business and technology trends impacting the industry as well as the all-important consumer trends that are driving the need for change.

The next was the buzz around the shift to digital. Digital channels, devices and experiences are now disrupting the insurance industry. While technology leaders tend to be more familiar with the changes that are taking place, business leaders are increasingly eager to understand how they can capitalize on these emerging trends to gain a competitive advantage. Customers are demanding new ways to engage with insurers and expecting a more personalized experience. We work with executives every day to create and implement their digital transformation strategies, providing innovative ways for companies to interact with their customers and deliver value-added services.

And finally, at the event we received positive feedback on our recently published Insurers Guide for Digital Transformation. The guide is a starting point for how leaders should help their companies create a customer engagement strategy. Many of you, whether leaders in business or technology, are eager for more information on how to get started with building your transformation strategy. We look forward to working with you as you start or continue your transformation.

If you missed us at the show, you can visit our website to see how we are helping companies begin their transformation.

Empowering digital transformation together at IASA 2017

Our Edgewater Insurance team is packing their bags and is excited to participate in IASA 2017, June 4-7 in Orlando, Florida. We’re proud to, once again, participate in a forum that brings together so many varied professionals in the insurance industry who are passionate about being prepared for the unprecedented change that is sweeping the industry. We look forward to meeting you there to show how our deep expertise, delivering solutions built on trusted technologies, can help you transform your business to become more competitive in a digital world.

Come and see how our experienced team of consultants can help your organization drive change

In industry, technology has often served as a catalyst for modernization, but within insurance we need to do more to understand the consumer to drive change. More than any other opportunity today, CEOs are focused on how to leverage digital technologies within their companies. But there’s still a lot of noise about what digital transformation even means. At Edgewater, we have a unique perspective based on our 25 years of working with insurance carriers. Our consulting team has spent many years working in the industry, all the way from producers to adjusters, and in vendor management. We have a deep understanding of the business and technology trends impacting the industry as well as the all-important consumer trends. We know that transformation in any company can start big or, of course, it can start small – from creating entirely new business models to remaking just one small business process in a way that delights your customers, changes their engagement model, or improves your speed to market.

We work with executives every day to create and implement their digital transformation strategies. At this event, we will be discussing how digital transformation needs to be at the top of each insurance carrier’s business strategy, as the enabler that will bring together consumer, producer, and carrier. Attendees can experience first-hand, through our interactive solution showcase, how technology innovations are sweeping the industry and how insurance carriers are progressing in their efforts to digitize. You will be able to explore solutions across many functional areas, including creating a unified experience for the consumer, enabling the producer to engage and add value, and learning to act on new insights by analyzing transaction and behavior data to create more personalized products and services.

But wait, you don’t have to wait

Get a sneak peek at the strategies we’ll be sharing at the event by downloading our Digital Transformation Quick Start Guide for Insurance Carriers at http://info.edgewater-consulting.com/insuranceguide. The guide is a starting point for how leaders should help their companies create and execute a customer engagement strategy. The Quick Start Guide will help you understand:

  • What Digital Transformation is and what it is not
  • How producers should be using technology to connect with customers
  • How updating your web presence can improve how you engage with customers

See you there!

If you are planning to be at the event, visit our booth #1110 to meet our team and learn more about Edgewater’s solutions and consulting services for the Insurance industry. We’re excited to help you get started on your digital transformation journey.

Digital Transformation Starts with… Exploring the Possibilities. Here’s how

You can learn a lot about what digital transformation is by first understanding what it is not. Digital transformation is not about taking an existing business process and simply making it digital – going paperless, as an example. Remaking manual processes reduces cost and increases productivity – no question – but the impact of these changes is not exactly transformative. At some point, you’ve squeezed as much efficiency as you can out of your current methods, to the point where additional change has limited incremental value.

Digital transformation starts with the idea that you are going to fundamentally change an existing business model. This concept can seem large and ill-defined. Many executives struggle with where to even start. Half of the top six major barriers to digital transformation, according to CIO Insight, are directly related to a hazy vision for success: 1) no sense of urgency, 2) no vision for future uses, and 3) fuzzy business case.

It isn’t a big leap to imagine how Disney might be using the geolocation and transaction data from its MagicBand bracelets to learn more about our preferences and activities in the park so they could better personalize our experience.

This MagicBand, as an example, immediately generates new expectations from customers that laggards in the industry have a hard time matching quickly.

At Edgewater, we worked with Spartan Chemical to create an innovative mobile application to drive customer loyalty. Spartan manufactures chemicals for cleaning and custodial services. They set themselves apart by working with us to build a mobile app that allows their customers to inspect, report on, and take pictures of the offices and warehouses they clean, so that Spartan can easily identify, and help the customer order, the correct cleaning products.

Once you’ve defined your vision and decided where you will start, you should assess your landscape and determine the personas you will target with this new capability, product, or service.

At Edgewater, we help you create a digital transformation roadmap to define and implement strategy based on best practices in your industry.

To learn more:

The Seven Core Principles of Digital Transformation

Digital Transformation

Digital Transformation has become a hot buzzword recently, adopted by Microsoft as the overarching theme for their cloud-based business apps and the subject of many studies from McKinsey & Company, Gartner and other research firms.

I wanted to share some of our approach and lessons learned working with companies in different industries such as Insurance and Manufacturing on their digital transformation initiatives.

A transformation does not happen overnight. It is a long and sometimes painful process that, to be honest, never really ends. The rate of innovation and change is increasing, and new business and customer needs will constantly emerge.

Therefore, our approach is very much grounded in the concepts of agility: the right foundation, built with change in mind. In such an approach, it is not always beneficial to try to document every future requirement to see how to accommodate it; it is better to build a very strong foundation and an agile, open framework that can be easily adapted.

A good way to judge your current agility level is to perform a Digital Agility Gap test. For the small, medium-sized and large changes the business has requested in the last year, what is the gap between when the business would like to see the change made and when your organization was able to deploy it? The larger the gap, the more acute the need for a comprehensive digital transformation.

[Image: Digital Agility Gap test]

The following 7 core principles should drive every digital transformation initiative, large or small:

  • Business Driven. This may sound obvious, but all digital initiatives need to have a business reasoning and a business sponsor. Technology can be a game changer, but very often the digital channel needs to be part of an omni-channel approach. eCommerce can augment retail stores or distribution channels but will not replace them for a long while. Digital must be part of the overall business and market strategy. The new role of Chief Digital Officer is a great example of how organizations integrate digital as a business channel, with broad responsibilities and a chair at the executive table. The digital aspect needs to be part of every major organizational strategy, not a separate one. For example: you are launching a new product – how will you design it, support the manufacturing/supply chain, and market, sell and support the product using digital means?
  • Data is King. Having enterprise information available in digital format, with a single source of the truth, is the absolute foundation of a digital transformation. Without “good data,” the effect of garbage in, garbage out will produce inconsistent results and systems people can’t trust. This is usually the hardest part for many companies, as organizational data may reside in many legacy systems, too intimately tied to old business applications. It is also hard work – hard to understand and hard to put a direct ROI on. It is not glamorous and will not be visible to most people. In lieu of a complete data re-architecture, most organizations start with master data management and a data warehouse / operational data marts to get around the limitations of the various systems where data is actually stored. The imperative is to know what the single source of the truth is and to abstract the details through a data access layer and services. The emerging area of Big Data allows capturing and processing ever-larger amounts of data, especially related to customer interactions. Data flows, validation and storage need to be looked at again, with new vision into what data is captured and how it is stored, processed and managed.
  • Actionable Analytics. Many organizations have invested heavily in Business Intelligence and use decision support systems to run analysis and produce reports. The expanding scope of data capture and processing now allows analytics to serve as actionable triggers for real-time decisions and other systems. For example, your website’s ability to make customer-specific product recommendations can be the result of a real-time process that analyzes the customer and what similar customers have bought, executes an RFM (recency, frequency, monetary) analysis to assign a tier to the customer, and derives relevant offers (see the SQL sketch after this list). Marketing campaigns can target prospects based on predictive analytics, and so on. Closed-loop analysis is critical for understanding the impact of decisions or campaigns. The ability to see the connection between an offer or search campaign and the revenue it generated is the foundation of future investment decisions.
  • Customer Centricity. One of the main drivers and benefits of the digital transformation is the ability to meet the new world of customer expectations and needs. Customers want access to information and the ability to take action and interact anytime, anyplace, from any device. The new digital experience maps to the customer lifecycle, journey or buying flow, and data is collected at every point of interaction to feed personalization, targeting and marketing. When done correctly, an intelligent user experience will improve engagement, loyalty and conversion. In designing a new digital user experience, we usually recommend mapping the user interactions across all touch points and focusing on finding common needs rather than taking a “persona”-driven approach; personas, in our experience, are too generic and lead to oversimplification of the model.
  • Agility in Technology and Process. Agility is at the heart of our approach; without it, you would go through a transformation every few years. It is broader than just IT and impacts many business and operational processes. A few key concepts of planning for agility:
    • De-coupling. A large part of what makes changes hard is the intertwined nature of most IT environments: proprietary databases, older applications without outside interfaces, hard-coded database calls, heavily customized but dated applications, and so on. The solution is to de-couple the elements and create a modular, service-oriented architecture. Data should be separated from logic, services and user interaction, allowing each tier to grow and evolve without requiring a complete system rewrite. For example, the biggest driver of transformation in the last few years has been the user experience and the need to support users on various mobile devices. A de-coupled architecture allows a UX overhaul using the same services and backend.
    • Agile / Rapid application development. Application development needs to be able to create prototypes and test ideas on a regular basis. For that to happen, the process of defining, designing, implementing and testing software has to be more responsive to business needs. Whether following Agile Methodology principles or just a more iterative version of traditional models, application development has to be able to quickly show business users what they would get, and adopt a minimum viable product approach to releasing software. An emerging model of continuous delivery allows faster, automated deployment of software when it is ready.
    • Cloud and Infrastructure agility. The emergence of cloud services is making agile environments much easier to implement. From an infrastructure perspective, you no longer need to invest in hardware resources for your worst-case load scenario. The ability to get just as much computing resource as needed, on demand, and to scale in a matter of minutes makes platforms like AWS and Azure very appealing. Many applications now offer only cloud-based versions, and even the large players like Microsoft and Oracle are now pressuring all customers to get on the cloud versions of their applications. The ability to easily plug a cloud application into the environment is the ideal of agility. With a common security and authentication layer, the modern corporate application landscape is comprised of many different cloud applications, each available to users based on their role and integrated to a degree that makes the user experience as seamless as possible.
    • In addition to the environment, software and infrastructure, organizational processes have to be more flexible too. Change management needs to become a process that enables change, not one that stops it.
  • Process Automation. With the new landscape comprised of so many different and independent applications, process automation that leverages the open interfaces of those applications is becoming critical. Traditional Business Process Management applications are now morphing into cloud orchestration, with the ability to create processes that span multiple applications and are managed and updated by business users without IT involvement.
  • Security. Last but not least, the open, flexible nature of the future landscape described here requires new levels of security that should be an integral part of all facets of the environment: data security and encryption, services security, security in application design. All layers and components have to consider the rising threats of hacking, data theft and denial of service, which are more prevalent than ever. We see this as the primary concern for companies looking to adopt a more digital and agile environment, and a large emphasis on risk management, security standards and audits should be a primary component of any digital transformation initiative.
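As promised under Actionable Analytics, here is a minimal SQL sketch of that RFM tiering – assuming a hypothetical ORDERS table with CUSTOMER_ID, ORDER_DATE and ORDER_TOTAL columns, with tier 1 being the best in each dimension:

SELECT customer_id,
       -- Recency: the most recently active customers land in tier 1.
       NTILE(5) OVER (ORDER BY last_order_date DESC) AS recency_tier,
       -- Frequency: the most frequent buyers land in tier 1.
       NTILE(5) OVER (ORDER BY order_count DESC)     AS frequency_tier,
       -- Monetary: the biggest spenders land in tier 1.
       NTILE(5) OVER (ORDER BY total_spend DESC)     AS monetary_tier
FROM  (SELECT customer_id,
              MAX(order_date)  AS last_order_date,
              COUNT(*)         AS order_count,
              SUM(order_total) AS total_spend
       FROM   orders
       GROUP  BY customer_id);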

Thoughts on the Future of SharePoint

At a recent event, Microsoft outlined their plans for the future of SharePoint, mostly as part of the Office 365 family. It was exciting to see SharePoint coming back to the forefront. After a few years in which Microsoft’s plans for the product were not very clear (no on-prem future; oh, sorry, yes, an on-prem future, but with a hybrid focus; let’s call it Sites; let’s stop supporting external sites; etc.), the fog is starting to clear.

SharePoint is now being smartly positioned as the place where your Office 365 experience should start. It was long positioned as such for company intranets and users’ default homepages – it is a portal platform, after all. It has a new responsive look, and the content highlights sites you’ve recently visited or interacted with – a benefit of the Office Graph.

[Image: the new SharePoint homepage]

Speaking of the Office Graph – love it or hate being tracked – it is the foundation over which all new Office 365 applications are built, and new APIs will allow developers to take advantage of it in building applications; it should extend into Dynamics 365 in the future as well.

The new homepage is also responsive, using a new overall look and an underlying technology called the SharePoint Framework. I’ll touch on all of these later, but let me just say: it’s about time. Nothing made SharePoint look older and more out of pace than the clunky experience on mobile. Now all spiffed up, it will offer a modern, mobile-first approach throughout.

The full 2-hour presentation + demos

New Features to get excited about:

As I’ve said, it looks like the flood gates are suddenly opening, and after a relatively long stretch of minor updates, we can expect a deluge of new things in the next few months. Here are the ones we are eagerly awaiting:

First-class mobile experience + apps: some of these, like the new SharePoint homepage and the iOS app, are already available. Apps for Android and Windows Mobile are coming soon.

[Image: the SharePoint mobile app]

As part of the new mobile-first user experience overhaul, a more modern and responsive look is coming to SharePoint sites, lists and libraries.

[Image: the modern team site look]

To enable these new interfaces (which until now required using an external JS framework like Bootstrap), Microsoft is introducing the new SharePoint Framework. Built in JS and HTML5, it will support responsive design and allow for the creation of richer user experiences and apps that run in the browser. Details are yet to be fully released, but expect it to be the MS version of the popular Angular.JS framework.

[Image: the SharePoint Framework]

Office 365 Groups will be extended into SharePoint. It has long been a source of confusion as to the different types of groups and where they appear. Microsoft is working to extend Office 365 Groups into Yammer and now into SharePoint, so that an Office 365 Group will have a team site and vice versa. IMO, it is a much better solution for storing files and collaboration than doing it through OneDrive, as is currently done. For more on groups: https://sway.com/G_yV0w-GadIB1aA2

Intelligence and analytics. A new analytics dashboard is available in central admin, with a much broader and more visually appealing interface. Now if only this could be made available to every site owner…

[Image: the new analytics dashboard]

https://blogs.office.com/2016/03/15/new-reporting-portal-in-the-office-365-admin-center/

Feature packs: for on-prem customers, Microsoft will be issuing regular feature packs that add functionality previously released for Office 365.

One more thing we are excited about is the upcoming release of Dynamics 365 and the promised common data model and services across all 365 environments. That will allow new levels of integration and automation of processes across the Office 365 platform from end to end.

Can’t wait!

Digital Insurance – The Myth of the Online Buyer

The insurance industry is currently dealing with digital disruption – and by disruption I’m talking about the change in consumers and their habits, what I call The New Face of Insurance.

The myth that the insurance consumer is not ready for the digital world must be dispelled. According to “The surprising facts about who shops online and on mobile” (Business Insider, 2015):

  • One in four shoppers is actually over the age of 55
  • Millennials make up the largest portion of online shoppers in terms of dollars spent and yet they earn the least

According to Gartner, 43% of our industry revenue will come directly from digital markets by 2020. Now think about that in our current captive and broker world.

LIMRA says that:

  • 74% of insurance customers want to do research online and educate themselves before they even think about talking to an agent
  • 25% of those people will even buy online, right there and then

Sadly, that’s really not available in our industry.

We went from captive agents to independent agents, and now we’re moving to more of an “I want to be my own agent” model.

An example of this is a UK company by the name of Beagle Street. They’re attacking the old ways of doing things, attacking the old financial advisers. And what they’re saying is “come and buy online.” So where do we go with this?

Is this just about a digital strategy and a digital footprint – a website redesign? It’s way more than that. It’s about continually evolving to make it easier for consumers to do business with you. You need to go where your consumers are – you can no longer expect your consumers to come to you.

It’s about looking at multi-channel distribution: embracing your agents, embracing online, and embracing the education that people are looking for. Just think about the customer service improvements from being able to reach out to customers through social media when there is a catastrophe.

We’ve been invited to speak on this topic at insurance conferences a lot recently, and we’ve done a short video as well. If you’d like to learn more, contact us.