In this brief demo, you’ll see how to leverage your interactive dashboard to drill down by department, by pay period, by job code, by employee, or by any productivity metric you gather, to support and enhance your day-to-day labor management activities.
There is little doubt that Usage Based Insurance (UBI), a.k.a. Telematics, is a hot topic in the U.S. insurance market. A recent survey from Strategy Meets Action found that while only 18 P&C insurers have an active UBI program in more than one state, 70% of insurers surveyed are in some stage of planning, piloting, or implementing UBI programs.
A carrier cannot venture into this space without considering the data implications. Usage Based Insurance, whatever its flavor, involves placing a device in a vehicle and recording information about driving behavior. Typical data points collected include: vehicle identifier, time of day, acceleration, deceleration (i.e. braking), cornering, location, and miles driven. This data can then be paired with publicly available data to identify road type and weather conditions.
Now consider a 20-mile morning commute that takes the driver 35 minutes. If the nine data points noted above are collected every minute, that 20-mile commute generates 315 data points (about 16 data points per mile driven). If the average vehicle is driven 1,000 miles in a month, it generates 16,000 data points each month, or 192,000 data points each year. Now consider what happens if a carrier enrolls even 1,000 vehicles in a pilot UBI program. Within a year, the carrier must accommodate the transmission and storage of over 190 million data points. Progressive Insurance, the leader in UBI in the U.S. market, has been gathering data for 15 years and has collected over 5 billion miles of driving data.
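The arithmetic can be reproduced in a few lines. This is a rough sketch using the article’s own assumptions: nine data points sampled once per minute, 1,000 miles per vehicle per month, and a 1,000-vehicle pilot.

```python
# Back-of-the-envelope sketch of UBI data volumes.
DATA_POINTS_PER_SAMPLE = 9     # vehicle id, time of day, acceleration, braking, ...
COMMUTE_MINUTES = 35           # a 20-mile morning commute
COMMUTE_MILES = 20
MILES_PER_MONTH = 1000
ENROLLED_VEHICLES = 1000

points_per_commute = DATA_POINTS_PER_SAMPLE * COMMUTE_MINUTES        # 315
points_per_mile = points_per_commute / COMMUTE_MILES                 # ~16
points_per_vehicle_month = round(points_per_mile) * MILES_PER_MONTH  # 16,000
points_per_vehicle_year = points_per_vehicle_month * 12              # 192,000
pilot_points_per_year = points_per_vehicle_year * ENROLLED_VEHICLES

print(f"{pilot_points_per_year:,}")  # 192,000,000
```

Even with generous rounding, a modest pilot lands near the 190-million-point figure in the text.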
Even more critically, the carrier must find a way to interpret and derive meaningful information from this raw driving data. The UBI device won’t magically spit out a result that tells the carrier whether the driving behavior is risky or not. The carrier must take this raw data and develop a model that will allow the carrier to score the driving behavior in some way. That score can then be applied within rating algorithms to reward drivers who demonstrate safe driving behaviors. As with all modeling exercises, the more data used to construct the model, the more reliable the results.
While data transmission and storage costs are relatively inexpensive, these are still daunting numbers, especially for small and mid-sized carriers. How can they embrace the changes that UBI is bringing to the market?
From a pragmatic perspective, these smaller carriers will need to partner with experts in data management and predictive modeling. They will need to leverage external expertise to help them successfully gather and integrate UBI data into their organizations’ decision making processes.
In the longer term, credible third-party solutions are likely to emerge, allowing a carrier to purchase an individual’s driving score in much the same way that a credit score is purchased today. Until then, carriers need to make smart investments, leveraging the capabilities of trusted partners to keep pace with market changes.
We were recently invited to present internally at a prominent health care payer network about the rapidly changing role and importance of web analytics. Gone are the good old days when it was enough to just run a log analyzer or place a simple tag to collect all the information needed about the interactions a customer has with you. Analysis used to be limited in scope, focusing on a handful of parameters, such as bounce rates and conversion rates, that could be optimized by tweaking checkout flows and improving usability.
Not that conversion rate optimization is less important today, but as customer interactions focus less and less on the company website alone, the new critical need is to build a coherent picture of overall customer behavior across all touch points. Instead of leaving us to infer customer thoughts and concerns from their clickstreams, many customers now openly express needs and problems through social media.
This goes beyond “cross channel marketing” into the new area Forrester and others are now calling Customer Intelligence (CI). Similar to the way business data evolved from simple reporting into Business Intelligence (BI), as customer data gets more complex and varied, putting everything together and drawing conclusions and trends from it will need to employ similar methods and tools.
This is primarily a mindset change from the somewhat passive “analytics” to the broader and much more active role of managing and providing customer intelligence.
Expectations of Web Analytics professionals and systems are changing as well, from cyclical analysis and response to providing on-demand, immediate intelligence about both individual and aggregate customer needs and problems. In some companies this has evolved into a real “command center” with 24/7 monitoring and interaction tools to listen, interact, and respond to customer needs.
There are a few challenges that mark this transition:
- Quantity: The quantity of interaction points is exploding due to social media, online videos and mobile devices.
- Traceability: It is very hard to identify users across various media. Mapping a web user to a Facebook account or twitter feed is not always possible.
- Immediacy: There is an overwhelming need and expectation for immediate response.
Here is a conceptual diagram of this new reality illustrating all the new interaction points being consolidated into the central Customer Intelligence and the introduction of the analytical services that can be used to optimize the user experience.
These analytical services can work on both an individual and aggregate level:
- Individual: If we can aggregate customer data and interactions from different channels, we can dramatically improve segmentation, give sales and customer service professionals better insight when interacting with the customer, and power services that target offers or content in real time based on a user’s past interests and behavior.
- Collective intelligence: By looking at customer activity across all channels we can:
- Optimize targeting through the different channels and our investment in them
- Improve recommendations
- Identify trends
- Identify problems / issues / sentiment changes and address them quickly.
To start implementing Customer Intelligence, the process now looks much like implementing a BI solution:
- Expand use of social listening and data capturing tools and store their data
- Adjust data models to accommodate multiple user identifiers, channels, devices etc.
- Redefine KPIs
- Define and implement analytical services
- Adjust reporting and analytics:
  - Real time
  - Dashboard level
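The “multiple user identifiers, channels, devices” item above can be sketched as a minimal data model. This is an illustrative sketch only; the names (`CustomerProfile`, `Identifier`) and channels are assumptions, not any vendor’s schema.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Identifier:
    channel: str  # e.g. "web", "twitter", "facebook", "mobile"
    value: str    # cookie id, handle, account id, device id

@dataclass
class CustomerProfile:
    customer_id: str
    identifiers: set = field(default_factory=set)

    def link(self, channel: str, value: str) -> None:
        """Attach another channel identity to the same customer."""
        self.identifiers.add(Identifier(channel, value))

    def matches(self, channel: str, value: str) -> bool:
        """Check whether an incoming identity maps to this customer."""
        return Identifier(channel, value) in self.identifiers

# Usage: one customer known by a web cookie and a Twitter handle.
profile = CustomerProfile("cust-001")
profile.link("web", "cookie-8f3a")
profile.link("twitter", "@jane_doe")
print(profile.matches("twitter", "@jane_doe"))  # True
```

The point of the structure is that channel identities hang off one customer key, so reports can aggregate across channels instead of counting the same person several times.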
The Web Analytics vendors are starting to step up and offer tools and support for Customer Intelligence. In upcoming posts we’ll look into WebTrends, Omniture, Google and IBM to see how their offerings stack up and the type of solutions they support.
Part one of this topic addressed leveraging social media to improve customer satisfaction. This is the initial step towards a broader goal to create a robust Customer Intelligence framework that allows P/C insurers to listen, connect, analyze, respond and market to customers in a much more proactive and targeted way.
Customer Intelligence is the process of collecting relevant and timely information about customers and prospects, consolidating the data from all the different sources into a cohesive structure, and providing the sales, service, and marketing functions with tools that can leverage this intelligence. The sources of this data include not only the obvious ones, such as a carrier’s Customer Service Center and Policy or Claims Admin systems, but also the Agent, Marketing Surveys, Telematics, and Social Media, including Twitter and Facebook – all mashed up to produce a Balanced Scorecard and Predictive Analytics.
Most CRM systems need to be updated so that user profiles can store more than email and phone number: Facebook name, Twitter handle, and other social identifiers. With social listening and response management connected to your CRM, a social inquiry can be viewed in context and the activity recorded for future interactions, available to Customer Service Reps or even Agency personnel. This level of social customer intelligence will differentiate companies that do it right, becoming a key element of a carrier’s business strategy.
A fully integrated Customer Intelligence platform provides benefits such as:
- A single integrated interface to many social media outlets
- The ability to manage multiple writing companies
- The ability to create and track cases, contacts, accounts, and leads from real-time conversations
- Management of marketing campaigns and tracking of social media marketing ROI
- Cues for CSRs on upsell and cross-sell opportunities
A carrier should determine the Key Performance Indicators (KPIs) that matter most to its business goals, then view the appropriate data in graphical dashboards to track the effectiveness of its efforts. It’s important to tie those KPIs to their influence on customer behaviors such as loyalty and increased sales. But carriers must also be careful not to misread positive or negative changes, and should fully understand the reasons for success or failure. Reacting to a sales increase with more online advertising in certain media outlets may not produce the desired results if the real driver was the upsell and cross-sell efforts of CSRs.
What I Learned at Health Connect Partners Surgery Conference 2012: Most Hospitals Still Can’t Tell Which Surgeries Turn a Profit
As I strolled around the Hyatt Regency at the Arch in downtown St. Louis amongst many of my colleagues in surgery and hospital administration, I realized I was experiencing déjà vu. Not the kind where you know you’ve been somewhere before. The kind where you know you’ve said the same thing before. Except, it wasn’t déjà vu. I really was having many of the same conversations I had a year ago at the same conference, except this time there was a bit more urgency in the voices of the attendees. It’s discouraging to hear that most large hospitals STILL can’t tell you which surgeries make or lose money! Which surgeons have high utilization linked to high quality? What the impact of SSIs (surgical site infections) is on ALOS (average length of stay)? Why there are eight orthopedic surgeons, nine different implant vendors, and 10 different total hip implant options on the shelves? It’s encouraging, though, to hear people FINALLY admit that their current information systems DO NOT provide the integrated data they need to analyze these problems and address them with consistency, confidence, and in real time.
Let’s start with the discouraging part. When asked if their current reporting and analytic needs were being met, I got a lot of the same uninformed, disconnected responses: “yeah, we have a decision support department”; “yeah, we have Epic so we’re using Clarity”; “oh, we just use <insert limited, niche data reporting tool here>”. I don’t get too upset, because I understand that in the world of surgery there are very few organizations that have truly integrated data. Therefore, they don’t know what they don’t know. They’ve never seen materials, reimbursement, billing, staffing, quality, and operational data all in one place. They’ve never been given consistent answers to their data questions. Let’s be honest, though – the priorities are utilization, turnover, and volume. Very little time is left to consider the opportunities to drastically lower costs, improve quality, and increase growth by integrating data. It’s just not in their vernacular. I’m confident, though, that these same people are now, more than ever, being tasked with finding ways to lower costs and improve quality – not just because of healthcare reform, but because of tightening budgets, stringent payers, stressed staff, and more demanding patients. Sooner or later they’ll start asking for the data needed to make these decisions – and when they don’t get the answers they want, the light will quickly flip on.
Now for the encouraging part – some people have already started asking for the data. These folks can finally admit they don’t have the information systems needed to bring operational, financial, clinical and quality data together. They have siloed systems – they know it, I know it, and they’re starting to learn that there isn’t some panacea off-the-shelf product that they can buy that will give this to them. They know that they spend way too much time and money on people who simply run around collecting data and doing very little in the way of analyzing or acting on it.
So – what now?! For most of the attendees, it’s back to the same ol’ manual reporting, paper chasing, data crunching, spreadsheet hell. Stale data, static reports, yawn, boring, seen this movie a thousand times. Others are just starting to crack the door open on the possibility of getting help with their disconnected data. And a very few are out ahead of everyone else because they are already building integrated data solutions that provide significant ROIs. For these folks, gone are the days of asking for static, snapshot-in-time reports – they have a self-service approach to data consumption in real time and are “data driven” in all facets of their organization. These are the providers that have everyone from the CEO down screaming, “SHOW ME THE DATA!”; and are the ones I want to partner with in the journey to lower cost, higher quality healthcare. I just hope the others find a way to catch up, and soon!
It’s not even the reporting tool for which your clinicians have been asking!
I have attended between four and eight patient safety and quality healthcare conferences a year for the past five years. Personally, I enjoy the opportunities to learn from what others are doing in the space. My expertise lies at the intersection of quality and technology; therefore, it’s what I’m eager to discuss at these events. I am most interested in understanding how health systems are addressing the burgeoning financial burden of reporting more (both internal and external compliance and regulatory mandates) with less (from tightening budgets and, quite honestly, allocating resources to the wrong places for the wrong reasons).
Let me be frank: there is job security in health care analysts, “report writers,” and decision support staff. They continue to plug away at reports, churn out dated spreadsheets, and present static, stale data without context or much value to the decision makers they serve. In my opinion, patient safety and quality departments are the worst culprits of this waste and inefficiency.
When I walk around these conferences and ask people, “How are you reporting your quality measures across the litany of applications, vendors, and care settings at your institution?,” you want to know the most frequent answer I get? “Oh, we have Epic (Clarity),” “Oh, we have McKesson (HBI),” or “Oh, we have a decision support staff that does that.” I literally have to hold back a combination of emotions – amusement (because I’m so frustrated) and frustration (because all I can do is laugh). I’ll poke holes in just one example: if you have Epic and use Clarity to report, here is what you have to look forward to, straight from the mouth of a former Epic technical consultant:
“It is impossible to use Epic “out of the box” because the tables in Clarity must be joined together to present meaningful data. That may mean (probably will mean) a significant runtime burden because of the processing required. Unless you defer this burden to an overnight process (ETL) the end users will experience significant wait times as their report proceeds to execute these joins. Further, they will wait every time the report runs. Bear in mind that this applies to all of the reports that Epic provides. All of them are based directly on Clarity. Clarity is not a data warehouse. It is merely a relational version of the Chronicles data structures, and as such, is tied closely to the Chronicles architecture rather than a reporting structure. Report customers require de-normalized data marts for simplicity, and you need star schema behind them for performance and code re-use.”
Translation that healthcare people will understand: Clarity only reports data in Epic. Clarity is not the best solution for providing users with fast query and report responses. There are better solutions (data marts) that provide faster reporting and allow for integration across systems. Patient safety and quality people know that you need to get data out of more than just your EMR to report quality measures. So why do so many of you think an EMR reporting tool is your answer?
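The consultant’s point about data marts can be illustrated with a toy star schema: a fact table of surgical cases surrounded by small dimension tables, so a reporting query is one simple star join instead of a runtime tangle of normalized tables. Every table and column name below is invented for illustration; none of this is an Epic/Clarity structure.

```python
import sqlite3

# Build a tiny in-memory star schema: one fact table, two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_provider (provider_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE fact_case (
        provider_key INTEGER REFERENCES dim_provider(provider_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        los_days     REAL,      -- length of stay
        ssi_flag     INTEGER    -- surgical site infection (0/1)
    );
    INSERT INTO dim_provider VALUES (1, 'Dr. A'), (2, 'Dr. B');
    INSERT INTO dim_date VALUES (202401, '2024-01');
    INSERT INTO fact_case VALUES (1, 202401, 3.0, 0), (1, 202401, 9.0, 1),
                                 (2, 202401, 4.0, 0);
""")

# A quality report stays a simple star join: fact -> dimension, then aggregate.
rows = conn.execute("""
    SELECT p.name, AVG(f.los_days), SUM(f.ssi_flag)
    FROM fact_case f JOIN dim_provider p USING (provider_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('Dr. A', 6.0, 1), ('Dr. B', 4.0, 0)]
```

The same denormalized fact table can be loaded overnight from the EMR and from non-EMR systems alike, which is exactly the integration the quote says Clarity alone does not give you.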
There is a growing sense of urgency at the highest levels in large health systems to start holding quality departments accountable for the operational dollars they continue to waste on non-value added data crunching, report creation, and spreadsheets. Don’t believe me? Ask yourself, “Does my quality team spend more time collecting data and creating reports/spreadsheets or interacting with the organization to improve quality and, consequently, the data?”
Be honest with yourself. At best, the ratio is 70% of an FTE on collection and 30% on analysis and action. So – get your people out of the basement, out from behind their computer screens, and put them to work. And by work, I mean acting on data and improving quality, not just reporting it.
Lured by the promise of big data benefits, many organizations are leveraging cheap storage to hoard vast amounts of structured and unstructured data. Without a clear framework for big data governance and use, businesses run the risk of becoming paralyzed under an unorganized jumble of data, much of which has become stale and past its expiration date. Stale data is toxic to your business – it could lead you into taking the wrong action based on data that is no longer relevant.
You know there’s valuable stuff in there, but the thought of wading through all THAT to find it stops you dead in your tracks. There goes your goal of business process improvement, which, according to a recent Informatica survey, most businesses cite as their number one Big Data initiative goal.
Just as the individual hoarder often requires a professional organizer to help them pare the hoard and institute acquisition and retention rules for preventing hoard-induced paralysis in the future, organizations should seek outside help when they find themselves unable to turn their data hoard into actionable information.
An effective big data strategy needs to include the following components:
- An appropriate toolset for analyzing big data and making it actionable by the right people. Avoid building an ivory tower big data bureaucracy, and remember, insight has to turn into action.
- A clear and flexible framework, such as social master data management, for integrating big data with enterprise applications, one that can quickly leverage new sources of information about your customers and your market.
- Information lifecycle management rules and practices, so that insight and action will be taken based on relevant, as opposed to stale information.
- Consideration of how the enterprise application portfolio might need to be refined to maximize the availability and relevance of big data. In today’s world, that will involve grappling with the flow of information between cloud and internally hosted applications as well.
- A comprehensive data security framework that defines who is entitled to use, change, and delete the data, along with encryption requirements and any required upgrades in network security.
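The information-lifecycle component above can be sketched as a simple retention rule that keeps stale records out of analysis. This is a minimal sketch; the 90-day window and the record shape are illustrative assumptions, not a recommendation for any particular domain.

```python
from datetime import datetime, timedelta, timezone

# Records older than the retention window are treated as stale
# and excluded before any insight or action is derived from them.
RETENTION = timedelta(days=90)

def fresh(records, now=None):
    """Return only records whose timestamp falls within the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["ts"] <= RETENTION]

# Usage with a fixed "now" so the example is deterministic.
now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "ts": datetime(2024, 5, 20, tzinfo=timezone.utc)},  # recent
    {"id": 2, "ts": datetime(2023, 1, 1, tzinfo=timezone.utc)},   # stale
]
print([r["id"] for r in fresh(records, now)])  # [1]
```

In practice the rule would live in the data platform (archival jobs, view filters) rather than application code, but the principle is the same: age is checked before data is used, not after a bad decision.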
Get the picture? Your big data strategy isn’t just a data strategy. It has to be a comprehensive technology-process-people strategy.
All of these elements should, of course, be considered when building your big data business case and estimating return on investment.
So, you’ve decided to go with Epic or Centricity or Cerner for your organization’s EMR.
Good, the first tough decision is out of the way. If you’re a medium to large healthcare organization, you likely allocated a few million to a few hundred million dollars to your implementation over five to ten years. I will acknowledge that this is a significant investment, probably one of the largest in your organization’s history (aside from a new expansion, though these implementations can easily surpass the cost of building a new hospital). But I will argue: does that really mean the other initiatives you’ve been working on should suddenly be put on hold, take a back seat, or even cease to exist? Absolutely not. The significant majority of healthcare organizations (save a few top performers) are already years, almost a decade, behind the rest of the world in adopting technology to improve the way healthcare is delivered. How do I know this? Well, you tell me: what other industry continues to publicly make 100,000 mistakes a year? Okay, glad we now agree. So, are you really going to argue that being single-threaded, with a narrow focus on a new system implementation, is the only thing your organization can be committed to? If your answer is yes, I have some Cher cassette tapes, a transistor radio, a mullet, and some knee highs that should suit you well in your outdated mentality.
An EMR implementation is a game-changer. Every single one of your clinical workflows will be adjusted, electronic documentation will become the standard, and clinicians will be held accountable like never before for their interaction with the new system. Yes, it depends on what modules you buy – Surgery, IP, OP, scheduling, billing, and the list goes on. But for those of us in the data integration world, trying every day to convince healthcare leaders that turning data into information should be top of mind, this boils down to one basic principle – you have added yet another source of data to your already complex, disparate application landscape. Is it a larger data source than most? Yes. But does this mean you treat it any differently when considering its impact on the larger need for real time, accurate integrated enterprise data analysis? No. Very much no. Does it also mean that your people are suddenly ready to embrace this new technology and leverage all of its benefits? Probably not. Why? Because an EMR, contrary to popular belief, is not a panacea for the personal accountability and data problems in healthcare:
- If you want to analyze any of the data from your EMR you still need to pull it into an enterprise data model with a solid master data foundation and structure to accommodate a lot more data than will just come from the system (how about materials management, imaging, research, quality, risk?)
- And please don’t tell me your EMR is also your data warehouse because then you’re in much worse shape than I thought…
- You’re not suddenly reporting in real time. It will still take you way too long to produce those quality reports, service line dashboards, or <insert report name here>. Yes, there is a real-time feed available from the EMR back-end database, but that doesn’t change the fact that manual processes are still required to transform some of this information, so a sound data quality and data governance strategy is critical BEFORE deploying such a huge, new system.
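The first bullet above, pulling EMR data into an enterprise model with a master data foundation, can be sketched roughly. The master index and all system names and ids below are made up for illustration; a real master data management layer is far more involved (matching, survivorship, stewardship).

```python
# Toy master index: each (source system, local id) pair resolves to
# one enterprise master id, so records from different systems can be joined.
master_index = {
    ("emr", "E-100"): "M-1",
    ("materials", "MM-7"): "M-1",
}

emr_cases = [{"sys": "emr", "id": "E-100", "procedure": "total hip"}]
supply_costs = [{"sys": "materials", "id": "MM-7", "implant_cost": 4200}]

def to_master(rec):
    """Resolve a source-system record to its enterprise master id."""
    return master_index[(rec["sys"], rec["id"])]

# Fold records from both systems into one view per master id.
merged = {}
for rec in emr_cases + supply_costs:
    merged.setdefault(to_master(rec), {}).update(
        {k: v for k, v in rec.items() if k not in ("sys", "id")}
    )
print(merged)  # {'M-1': {'procedure': 'total hip', 'implant_cost': 4200}}
```

Only after this kind of resolution can a question like “what did this procedure actually cost?” be answered across clinical and materials systems.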
The list goes on. If you want to hear more, I’m armed to the teeth with examples of why an EMR implementation should be just that, a focused implementation. Yes it will require more resources, time and commitment, but don’t lose sight of the fact that there are plenty more things you needed to do with your data before the EMR came, and the same will be the case once your frenzied EMR-centric mentality is gone.
Independent research firm Forrester recently released their annual “Forrester Wave: Web Analytics, Q4 2011” report naming Adobe, IBM, comScore, and WebTrends as the current leaders of the web analytics industry. AT Internet and Google Analytics were also included as “strong performers” while Yahoo Analytics took 7th place as the lone wolf in the “contender” category.
Not surprisingly, Adobe SiteCatalyst and IBM Coremetrics stood out with the top two scores overall, but WebTrends Analytics 10 and comScore Digital Analytix showed major strengths as well. Unica NetInsight, another offering from IBM, did not make the list because of its inevitable fate: to be merged with Coremetrics. In 2010, IBM acquired both Unica and Coremetrics. The Forrester report states, “IBM is incorporating the complementary and notable features of Unica NetInsight into a merged web analytics solution based on the Coremetrics platform.”
IT’S UNFORTUNATE: Large amounts of money are spent on new hires, yet little is left for employee and data improvement
I recently had an Executive Director of a Cancer Institute tell me,
“At this time, we plan to use simple spreadsheets for our database. We are committing more than $500,000 for investment in personnel to start our translational laboratory this year. I hope we can subsist with simple spreadsheet use for our pilot studies.”
This sentiment immediately followed a detailed discussion, one that I’m very familiar with, concerning disparate researchers’ databases and how organizations’ needs remain unsatisfied, suffering from lack of integrated data.
Just so we’re all on the same page, let me make sure I understand this situation correctly –
- You are currently using “simple spreadsheets” to assist researchers with all things data. You’ve astutely noticed that these stale methods don’t meet your needs, and you agreed to a meeting with Edgewater because you’ve heard positive success stories from other cancer centers.
- You just spent more than half a million dollars on fresh staff for a new translational lab.
- You are now budget-constrained because of this arrangement and want these new hires to use “simple spreadsheets” to do their new job… the same ineffective and inefficient spreadsheets, of course, that caused the initial trouble.
Did I understand all that correctly? I didn’t grow up in the ’60s, so I’ll continue to pass on what he’s smoking.
So who wins with this strategy, you ask? No one!
It’s unfortunate for the researchers because they continue to rely on an antiquated approach for data collection and analysis that will continue to plague this organization for years to come.
How many opportunities will be overlooked because a researcher becomes overwhelmed by his data?
It’s unfortunate for the organization because it’s nearly impossible to scale volumes (data aggregation, analysis, more clinical trials, more federal/state grant submissions, etc.) with such a fragmented approach. How much IP will walk out of the door for these organizations on those simple spreadsheets?
It’s unfortunate for the brand because it can’t market or advertise any advances, operationally or clinically, that will attract new patients.
It’s unfortunate for the patients because medicine as an industry collectively suffers when:
- Surgeons under the same roof don’t realize they have perfect candidates for clinical trials they’re unaware of, and so never notify their counterpart researchers.
- Executives continue to suffer budget declines from lower patient volumes and less additional revenue from industries partnering with cancer centers that have their act together.
- Researchers under a single roof don’t know what each other are doing.
As in the picture above, “more” doesn’t necessarily mean “better.” Ancillary personnel and sheets of data don’t necessarily equate to a better outcome. Why continue to add more, knowing that this won’t solve the problem? Why infect more new hires with the same sick system? Why addition instead of introspection?
So, just as I told him in my response, I look forward to hearing from you in about 12-18 months; that’s roughly the amount of time it took the last dozen clients to call Edgewater back to save them from themselves.