Usage Based Insurance and Big Data – What is a Carrier to Do?

There is little doubt that Usage Based Insurance (UBI), also known as Telematics, is a hot topic in the U.S. insurance market. A recent survey from Strategy Meets Action found that while only 18 P&C insurers have an active UBI program in more than one state, 70% of insurers surveyed are in some stage of planning, piloting, or implementing UBI programs.

A carrier cannot venture into this space without considering the data implications. Usage Based Insurance, whatever its flavor, involves placing a device in a vehicle and recording information about driving behavior. Typical data points collected include: vehicle identifier, time of day, acceleration, deceleration (i.e. braking), cornering, location, and miles driven. This data can then be paired with publicly available data to identify road type and weather conditions.

Now consider a 20-mile morning commute that takes the driver 35 minutes. If the nine data points noted above are collected every minute, that commute generates 315 data points (about 16 per mile driven). If the average vehicle is driven 1,000 miles in a month, it generates roughly 16,000 data points each month, or 192,000 each year. Now consider what happens if a carrier enrolls even 1,000 vehicles in a pilot UBI program: within a year, the carrier must accommodate the transmission and storage of over 190 million data points. Progressive Insurance, the leader in UBI in the U.S. market, has been gathering data for 15 years and has collected over 5 billion miles of driving data.
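For readers who want to sanity-check the arithmetic, here is a minimal sketch in Python (the per-minute sampling rate, nine-point payload, and 1,000 miles per month are the assumptions stated above):

```python
# Back-of-the-envelope UBI data volume, using the assumptions from the text.
SAMPLES_PER_MINUTE = 1
POINTS_PER_SAMPLE = 9    # vehicle id, time, accel, decel, cornering, location, miles, road type, weather

commute_minutes = 35
commute_miles = 20
commute_points = commute_minutes * SAMPLES_PER_MINUTE * POINTS_PER_SAMPLE  # 315
points_per_mile = commute_points / commute_miles                           # 15.75, call it 16

miles_per_month = 1000
monthly_points = round(points_per_mile) * miles_per_month                  # 16,000
yearly_points = monthly_points * 12                                        # 192,000

fleet_size = 1000
fleet_yearly_points = yearly_points * fleet_size                           # 192 million
print(commute_points, monthly_points, yearly_points, fleet_yearly_points)
```
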

Even more critically, the carrier must find a way to interpret and derive meaningful information from this raw driving data. The UBI device won’t magically spit out a result that tells the carrier whether the driving behavior is risky or not. The carrier must take this raw data and develop a model that will allow the carrier to score the driving behavior in some way. That score can then be applied within rating algorithms to reward drivers who demonstrate safe driving behaviors. As with all modeling exercises, the more data used to construct the model, the more reliable the results.
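As a purely illustrative sketch of what "scoring driving behavior" might look like, the toy function below penalizes harsh acceleration and braking events per 100 miles driven. The thresholds, weights, and field names are invented for illustration; a real carrier model would be actuarially derived from the full data set.

```python
# Hypothetical scoring sketch. Thresholds, penalty weight, and field names
# are illustrative assumptions, not an actual carrier's model.
HARSH_BRAKE_MPS2 = -3.0   # assumed harsh-braking threshold (m/s^2)
HARSH_ACCEL_MPS2 = 3.0    # assumed harsh-acceleration threshold (m/s^2)

def driving_score(samples, miles):
    """Return a 0-100 score, penalizing harsh events per 100 miles driven."""
    harsh = sum(1 for s in samples
                if s["accel"] >= HARSH_ACCEL_MPS2 or s["accel"] <= HARSH_BRAKE_MPS2)
    events_per_100mi = harsh / miles * 100
    return max(0.0, 100.0 - 5.0 * events_per_100mi)  # 5-point penalty per event, assumed

samples = [{"accel": 0.5}, {"accel": -3.5}, {"accel": 3.2}, {"accel": -1.0}]
print(driving_score(samples, miles=50))  # 2 harsh events in 50 mi -> 4 per 100 mi -> 80.0
```

A score like this could then feed a rating algorithm as a discount tier; the modeling caveat in the text applies, since the reliability of any such score depends on the volume of data behind it.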

While data transmission and storage costs are relatively inexpensive, these are still daunting numbers, especially for small and mid-sized carriers. How can they embrace the changes that UBI is bringing to the market?

From a pragmatic perspective, these smaller carriers will need to partner with experts in data management and predictive modeling. They will need to leverage external expertise to help them successfully gather and integrate UBI data into their organizations’ decision making processes.

In the longer term, credible 3rd party solutions are likely to emerge, allowing a carrier to purchase an individual’s driving score in much the same way that credit score is purchased today. Until then, carriers need to make smart investments, leveraging the capabilities of trusted partners to allow them to keep pace with market changes.

What I learned at HFMA’s Revenue Cycle Conference at Gillette Stadium

(…while the Patriots prepared to get their butts kicked)

Right from Jonathan Bush, the co-founder and CEO of athenahealth and the keynote speaker: “Make hospitals focus on what they’re good at. Everything else? Seek help!” I can help define “everything else.” For now, I will keep it generally confined to the world of healthcare data, because I would argue more time, money, and effort is wasted on getting good data than on almost any other activity in a hospital.

If you are a Chief Quality Officer, or Chief Medical Informatics Officer, or Chief Information Officer – what would you rather spend your budget on?

Your analysts collecting data – plugging away, constantly, all day, into a spreadsheet?

Outcomes: Stale data in a static spreadsheet…that probably needs to be double/triple-checked…that probably is different than what the other department/analyst from down the hall gave you…that you probably wouldn’t bet your house on is accurate.

Or your analysts analyzing data and catalyzing improvement with front line leaders?

Outcomes: Real time data in a dynamic, flexible multi-dimensional reporting environment…that can roll up to the enterprise level…and drill down to the hospital → unit → provider → patient level.

Here’s a hint – this isn’t a trick question. Yet, for some reason, as you read this, you’re still spending more money on analysts reporting stale, static, inaccurate data than you are on analysts armed with real time data to improve the likelihood of higher quality and patient satisfaction scores and improved operational efficiency.

The majority of the speakers at this year’s HFMA Revenue Cycle conference seemed to accept that providers are NOT good at collecting and analyzing data, or using it as an asset to their advantage. They also seemed to align well with other speakers I’ve heard recently at HIT conferences. If you’re like 99% of your colleagues in this industry, you probably don’t understand your data either. So do what Jonathan Bush said and GET HELP!

Are you “ACO IT-Ready”?

First things first: I believe the push for accountable care is here to stay. I do not think it is a fad that will come and go as many other attempts at healthcare reform have. Having said that, I also strongly believe that very few organizations are positioned to start realizing the benefits of this reform any time soon. It’s not for lack of trying, as many organizations are already recognized as Pioneer ACOs. But the hard part is not being established as an ACO – it’s proving you’re reducing costs and improving quality for targeted patient populations.

The first step begins January 1, 2013, when some ACOs will be required to start reporting quality measures – for instance, the CMS Shared Savings Program requires both the one-sided and two-sided models to report 33 quality measures. Notice I said “reporting.” For the first year, it’s “pay for reporting.” Years 2 and 3 are when the rubber really meets the road and it becomes “pay for performance”: don’t just show me you are trying to reduce costs and improve quality – actually reduce and improve, or realize the consequences.

With ACOs come reporting requirements. We in healthcare are used to reporting requirements. And for those of us willing to acknowledge it publicly, more reporting means more waste. Why? Because there is job security in paying people to run around and find data…and to eventually do very little with it other than plug it into a spreadsheet, post it to a SharePoint site, email it to someone else, or, well, you get my drift. Regardless of your view on these new requirements, they’re here to stay. So the $64,000 question is: are you ready to start reporting?

There is a wide range of both functional and technical requirements that healthcare providers and payers will need to address as they start operating as an ACO. Many of the early and emerging ACOs have started the journey from a baseline of targeted patient panels toward the optimized management of a population.

Here are seven questions you must be able to answer and report on DAY 1:

  1. Can you define and identify your targeted patient populations?
  2. Are you able to measure the financial and quality performance and risks of these patient panels and populations?
    1. Can you quickly, easily and consistently report quality and financial measures by Physician, Location, Service, or Diagnosis?
  3. Can you baseline your expenditures and costs associated with various targeted patient populations?
    1. How will you benchmark your “before ACO” and “after ACO” costs?
  4. Can you accurately monitor the participation, performance and accountability of the ACO participants involved in coordinated, collaborative patient care?
  5. Will you be able to pinpoint where and when the quality of care begins to drift, so as to quickly intervene with care redesign improvements to limit the impacts on patients and non-reimbursable costs?
    1. Are you able to detect “patient leakage” and give your organization the information to manage it? (Patient leakage is when a patient you are treating as an ACO for a bundled payment leaves the network for their care.)
      1. Is a particular provider/provider group sending patients outside of the ACO?  If so, is it for a justified reason?
      2. Does the hospital need to address a capacity issue?
  6. Can you reconcile your internal costs of care with bundled reimbursements from payers?
  7. Are you positioned for population health management and achieving the Triple Aim on a continuing basis?
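Question 6, for instance, boils down to joining internal cost records to bundled payments by episode and flagging the variance. A minimal sketch (episode IDs and dollar figures are invented for illustration):

```python
# Reconcile internal episode costs against bundled reimbursements.
# All identifiers and amounts below are made up for illustration.
internal_costs = {          # episode_id -> total internal cost of care
    "EP-001": 18_500.0,
    "EP-002": 24_900.0,
}
bundled_payments = {        # episode_id -> payer's bundled reimbursement
    "EP-001": 21_000.0,
    "EP-002": 23_000.0,
}

margins = {}
for ep in sorted(internal_costs):
    margin = bundled_payments[ep] - internal_costs[ep]
    margins[ep] = margin
    flag = "OK" if margin >= 0 else "LOSS"
    print(f"{ep}: margin {margin:+,.0f} [{flag}]")
```

The hard part, of course, is not the loop but getting clean, consistently keyed cost and payment data into those two structures in the first place.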

In order to answer these questions you must have a highly integrated data infrastructure. It seems I’m not the only one who agrees with this tactical first step:

  • The Cleveland Clinic Journal of Medicine agrees: it listed “technical and informatics support to manage individual and population data” as one of its five core competencies required to be an ACO.
  • Presbyterian Healthcare Services (PHS) has been a Pioneer ACO for over a year. Tracy Brewer, the lead project manager, was recently asked by Becker’s Hospital Review, “What goals did you set as an ACO in the beginning of the year and how have you worked to achieve them?” Her answer: “One of the major ones [goals] was updating our administrative and IT infrastructure. We had to make sure we had all the operational pieces in place to function as an ACO. We also completed some work on our IT infrastructure so that once we received the claims data from CMS, we could begin analysis and really get value from it.”

The ACO quality measures require data from a number of different data sources. Be honest with me and with yourselves: how confident are you that your organization is ready? Is your data integrated? Do you have consistent definitions for Provider, Patient, Diagnosis, Procedure, and Service? If you do, great; you don’t have much company. If you don’t, rest assured there are organizations that have been doing data integration for nearly two decades that can help you answer the questions above, as well as many more related to this new thing they call Accountable Care.

Are you Paralyzed by a Hoard of Big Data?

Lured by the promise of big data benefits, many organizations are leveraging cheap storage to hoard vast amounts of structured and unstructured data. Without a clear framework for big data governance and use, businesses run the risk of becoming paralyzed under an unorganized jumble of data, much of which has become stale and past its expiration date. Stale data is toxic to your business – it could lead you into taking the wrong action based on data that is no longer relevant.

You know there’s valuable stuff in there, but the thought of wading through all THAT to find it stops you dead in your tracks. There goes your goal of business process improvement, which, according to a recent Informatica survey, most businesses cite as their number one big data initiative goal.

Just as the individual hoarder often requires a professional organizer to help them pare the hoard and institute acquisition and retention rules for preventing hoard-induced paralysis in the future, organizations should seek outside help when they find themselves unable to turn their data hoard into actionable information.

An effective big data strategy needs to include the following components:

  1. An appropriate toolset for analyzing big data and making it actionable by the right people. Avoid building an ivory tower big data bureaucracy, and remember, insight has to turn into action.
  2. A clear and flexible framework, such as social master data management, for integrating big data with enterprise applications, one that can quickly leverage new sources of information about your customers and your market.
  3. Information lifecycle management rules and practices, so that insights and actions are based on relevant, as opposed to stale, information.
  4. Consideration of how the enterprise application portfolio might need to be refined to maximize the availability and relevance of big data. In today’s world, that will involve grappling with the flow of information between cloud and internally hosted applications as well.
  5. A comprehensive data security framework that defines who is entitled to use, change, and delete the data, along with encryption requirements and any needed upgrades in network security.
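Point 3 is the easiest to operationalize: stamp every record with its load date and filter on a retention window before any analysis. A minimal sketch, assuming an arbitrary 90-day window:

```python
from datetime import date, timedelta

RETENTION = timedelta(days=90)  # assumed retention window; a real policy varies by data class

def fresh_only(records, today):
    """Keep only records still inside the retention window."""
    return [r for r in records if today - r["loaded"] <= RETENTION]

records = [
    {"id": 1, "loaded": date(2012, 1, 10)},   # stale: well past the window
    {"id": 2, "loaded": date(2012, 5, 1)},    # fresh
]
print(fresh_only(records, today=date(2012, 5, 15)))  # only record 2 survives
```

In practice the rule set would be richer (archive rather than discard, different windows per data class), but the principle is the same: make staleness a first-class, enforceable attribute rather than something each analyst eyeballs.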

Get the picture? Your big data strategy isn’t just a data strategy. It has to be a comprehensive technology-process-people strategy.

All of these elements should, of course, be considered when building your big data business case and estimating return on investment.

BIG DATA in Healthcare? Not quite yet…

Let’s be honest with ourselves. First –

“who thinks the healthcare industry is ready for Big Data?”

Me either…

Ok, second question,

“who thinks providers can tackle Big Data on their own without the help of healthcare IT consulting firms?”

Better yet,

“can your organization?”

“Big data” seems to be yet another catchphrase that has caught many in healthcare by surprise. They’re surprised for the same reason I am, which was recently summed up for me by a VP of Enterprise Informatics at a 10-hospital health system: “How can we be talking about managing big data when very few [providers] embrace true enterprise information management principles and can’t even manage to implement tools like enterprise data warehouses for our existing data?” Most people in healthcare who have come from telecommunications, banking, retail, and other industries that embraced big data long ago agree the industry still has a long way to go. In addition, vendors like Informatica, which have a proven track record of helping industries manage big data with their technology solutions, have yet to see significant traction with their tools in healthcare. There are plenty of other things that need to be done before the benefits of managing big data come to fruition.

Have we been here before? Didn’t we previously think that EMRs were somehow going to transform the industry and “make everything simpler” to document, report from, and analyze? We now know that isn’t the case, but it should be noted that EMRs will eventually help with these initiatives IF providers have an enterprise data strategy and infrastructure in place to integrate EMR data with all the other data that litters their information landscape AND they have the right people to leverage enterprise data.

The same can be said of big data. Developing a technical foundation that can store and manage big data is relatively easy compared to the time and effort needed to leverage and capitalize on it once you have it. For the significant majority of the industry, the focus right now should be on realizing returns – lower costs and improved quality – from integrating small samples of data across applications, workflows, care settings, and entities. The number of opportunities for improvement in the existing data landscape with demonstrable value should be top priority to mobilize stakeholders to action. Big data will have to wait…for now.

Big Data + Small Process Thinking = Disappointing Results

Big data is in the news this week. In a recent Forbes article describing the hidden opportunities of big data, Albert Pimentel, Chief Sales and Marketing Officer at Seagate, quoted Mark Dean, an IBM Fellow and director of the Almaden Research Center, as saying, “Computation is not the hard part anymore.” As with most big technology transformations, one of the hardest parts is getting the process and people part right.

Big data has the potential to position businesses to outperform their competitors, as described in a recent McKinsey article that dubs big data the next frontier for innovation, competition, and productivity. As businesses race to implement big data technology, there are some serious business process transformations that need to take place to fully leverage the investment in any big data initiative.

In the Big Data-driven approach to business transformation, the most important business processes are those that relate to Customer Experience Management across all fronts:

  • Manage customer loyalty
  • Manage customer value
  • Manage customer relationships
  • Manage customer feedback

These processes cross the more traditional high-level process silos of “Manage Sales,” “Manage Marketing,” and “Manage Customer Service,” which were usually organized along departmental lines.

What actions will be taken based on the actionable intelligence that big data provides? Initiatives across departmental silos must be closely orchestrated or the customer experience will become chaotic and confusing. Marketing campaigns have to be coordinated with activities across all customer-facing roles in the organization. Effective enterprise program management is critical to this coordination. Marketing has to be thought of less as a department and more as a shared business responsibility.

When trying to leverage big data, it’s important to step back and answer critical questions before moving forward on multiple fronts:

  • What strategies and processes do you use to influence customer behavior on your website, in your retail outlets, and at virtual and real-time events? Are they working synergistically, or are they at cross purposes?
  • What change management principles do you apply to shift customer attitudes toward your company, your employees, and your products? Are you fully leveraging the power of third-party change agents, or only applying traditional, direct influence measures?
  • Are your processes too rigid to allow you to be a world-class, big data-driven organization? Should you concentrate on defining broad-stroke strategies instead?

At the end of the day, the most successful businesses will be those that harness the power of big data and big process thinking to outrun the competition.

Healthcare Analytics – A Proven Return on Investment: So What’s Taking So Long?

So what do you get when you keep all your billing data in one place, your OR management data in another, materials management in another, outcomes and quality in another, and time and labor in yet another? The answer is…over 90% of the operating rooms in America!

That’s right: the significant majority of operating rooms DO NOT have an integrated data infrastructure. In the simplest terms, that means the average OR Director/Administrator CAN’T answer questions like, “Of all orthopedic surgeons performing surgery within your organization (single or multi-facility), which surgeon performs total knee replacements with the lowest case duration, the fewest staff, the lowest rates of complication, infection, and readmission, the lowest material and implant cost, and the highest rate of reimbursement?” In other words, they can’t tell you who their highest quality, most profitable, least risky, least costly, best performing surgeon is in their highest revenue surgical specialty. Yes, I’m telling you that they can’t distinguish the good from the bad, the ugly from the, well, uglier, and the 2.5-star from the 5-star. Are you still wondering why there is such a strong push for transparency of healthcare data to the average consumer?

You’re sitting there asking yourself, “Why can’t they answer questions like who’s the least costly, most profitable, and highest quality surgeon?” The answer is simple: application-oriented analysis. Hospitals have yet to realize the benefits of healthcare analytics – that is, the ability to analyze information that comes from multiple sources in one location, instead of trying to coordinate each individual system analyst and have them hand their spreadsheet off to the other analyst, who then adds in her data and massages it just right to hand it off to the next guy, and then…ugh, you get the point.

If vendors like McKesson, Cerner, and Epic could make revenue off of sharing data and “playing well with others,” they would, but right now they don’t. They make their money off of deploying their own individual solutions that may or may not integrate well with other applications like imaging, labs, pharmacy, and electronic documentation. They will all tell you that their systems integrate, but only once you’ve signed their contract do you read that, most of the time, integration requires their own expertise to build interfaces, so you’ll need to pay for one of their consultants to come do that for you. Just ask anyone who has McKesson Nursing Documentation how long it takes to upgrade the system, or how easy it is to integrate with their OR Management system so floor nurses can have the data they need on their computer screen when the patient arrives directly from surgery. Out-of-the-box integrated functionality? Easy-to-read, well-documented interface specifications that a DBA or programmer could script to? Apple plug-and-play convenience? Not now, not in healthcare.
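The integrated alternative is conceptually mundane: join the siloed systems on a shared case key and aggregate by surgeon. A toy sketch with invented fields and figures (real OR and billing extracts have far messier keys and schemas):

```python
# Illustrative only: join OR-management and billing extracts on a case key,
# then aggregate per surgeon. All names and numbers are made up.
or_system = [   # extract from the OR management system
    {"case": "C1", "surgeon": "Dr. A", "minutes": 95},
    {"case": "C2", "surgeon": "Dr. B", "minutes": 120},
    {"case": "C3", "surgeon": "Dr. A", "minutes": 90},
]
billing = {     # extract from the billing system, keyed by case
    "C1": {"cost": 11_000, "reimbursed": 14_000},
    "C2": {"cost": 15_500, "reimbursed": 14_500},
    "C3": {"cost": 10_500, "reimbursed": 14_200},
}

by_surgeon = {}
for row in or_system:
    b = billing[row["case"]]
    s = by_surgeon.setdefault(row["surgeon"], {"cases": 0, "minutes": 0, "margin": 0})
    s["cases"] += 1
    s["minutes"] += row["minutes"]
    s["margin"] += b["reimbursed"] - b["cost"]

for surgeon, s in sorted(by_surgeon.items()):
    print(surgeon, s["cases"], "avg min:", s["minutes"] // s["cases"], "margin:", s["margin"])
```

Add outcomes, materials, and labor feeds to the same join and the “who is my best surgeon” question above becomes a query instead of a month of spreadsheet relay races.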

Don’t get too upset, though; there are plenty of opportunities to fix this broken system. First, understand that organizations such as Edgewater Technology have built ways to integrate data from multiple systems, and guess what – we integrated 5 OR systems in a 7-hospital system, and they saved $5M within the first 12 months of using the solution, realizing an ROI four times their original cost. Can it be done? We proved it can. So what is taking so long for others to realize the same level of cost savings, quality improvement, and operational efficiency? I don’t know – you tell me. But don’t give me the “it’s not on our list of top priorities this year” or the “patient satisfaction and quality mandates are consuming all our resources” or the “we’re too busy with meaningful use” excuses. Why? Because all of these would be achievable objectives if you could first understand and make sense of the data you’re collecting outside the myopic lens you’ve been looking through for the past 30 years. Wake up! This isn’t rocket science; we’re trying to do now what Gordon Gekko and Wall Street bankers were doing in the ’80s – data warehousing, business intelligence, or whatever other flashy words you want to call it. Plain and simple, it’s integrating data to make better sense of your business operations. And until we start running healthcare like it’s a business, we’re going to continue to sacrifice quality for volume. Are you still wondering why Medicare is broke?

“Get Real” with your CRM solution

A couple years back, Gartner Group released a CRM research study that predicted “through 2006, more than 50 percent of all CRM implementations will be viewed as failures from a customer’s point of view….”

Sad to say, but things are not much better today! Retaining and enhancing customer relationships remains a top-5 business issue on the Gartner CIO Agenda. It is critical, especially in today’s economy, that companies continue to invest in managing their most valuable asset: their customers. According to AMR Research, companies are investing in CRM to the tune of $14 billion a year.

Is my CRM Solution Successful?

So, you have made your investment and implemented your CRM tool. Are you part of the 35% of successful deployments, or sadly the 65% that have fallen short?

If there is any doubt where your company falls, consider the following questions…

  • Are you still limited in your ability to grow your top tier accounts?
  • Are your national accounts meeting their revenue commitments?
  • Are your people using Excel and Outlook to fill gaps that your CRM tool is not meeting?
  • Can you track and measure your sales team performance? Are they closing activities and, more importantly, deals?
  • Can you assess the effectiveness of your marketing spend and campaigns? A CRM worth its salt should have great marketing capability. Your CRM should seamlessly integrate with commercial-grade marketing vendors, like email marketers and direct mail vendors. These vendors provide essential functionality, including email tracking and spam prevention, MS Word mail merge and direct marketing capabilities, and campaign budget management. The bad news is that CRM solutions rarely offer this level of marketing functionality out of the box.
  • Is your CRM system a data silo? Is your customer data not feeding other enterprise tools, or are you not interfacing with transaction data to measure projected revenue against actual revenue?

If you answered yes to any of those questions, your implementation may be heading in the wrong direction. Now that you realize the hard reality, how do you right the ship?

  • Perhaps you drank the ASP/SaaS-model Kool-Aid. The cloud is a great and cost-effective solution…however, “success not software” does not mean “no risk.” While the promise of speed to market with limited CapEx outlay is attractive to everyone, the shortcuts you take early in your deployment will cost you double down the road. I have spoken with many people who have struggled to scale their CRM solution to meet the changing needs of their business and have found it cost prohibitive to deploy across the enterprise.
  • You assumed that CRM success comes out of the box. CRM software is usually designed for many types of business, with packages built for the lowest common denominator. To make the software work for you, no matter what the vendor tells you, there will be configuration and some customization to meet the unique needs of your industry. A Product screen for an insurance company will not look like the Product screen for a hospitality company.
  • You did not focus on the critical factors that drive user adoption, such as a familiar interface, standardized processes, organizational readiness, change management, and ease of use.

Get on the Road to Recovery

Determining that your CRM implementation has problems may be instant or it may materialize over time. Regardless, you must analyze the situation and determine your rescue strategy. So, what can you do now?

Treat your CRM system as the critical enterprise solution that it is. You would not implement a new accounting system without first understanding accounting practices across the organization, identifying integration points, and dealing with change management. It is important to note that the success factors for CRM implementations rest largely outside the scope of the software itself. To establish the best implementation strategy, you must:

  • Identify the organizational impact early on and build your change management strategy and tools accordingly.
  • Get executive buy-in on the new strategy and include stakeholders from across the organization (Marketing, Sales, Finance, Customer Service, IT).
  • Pay attention to data quality, specifically data governance practices and procedures. There is nothing more detrimental to user adoption than bad data.
  • Standardize AND SUPPORT your business processes across the enterprise
  • Highlight benefits to the sales team often (e.g., up-sell, cross-sell, single system, clean data, 360° view of the customer)
  • Understand your industry-specific CRM needs and ensure you can configure screens and workflows to match
  • Address the complexities of integrating into your enterprise architecture with multiple legacy data silos and even offline processes
  • Provide the right customer intelligence with dashboards and robust reporting to the right people, at the right time, to effect change

Carefully consider the points above and invest in CRM tool experience when designing and implementing your solution. Don’t fall into the trap of trying to learn a new tool on the job! There are many functions and features available within the package, and outside it through the open-source collaborative environment. Bending your business processes to fit your software will hurt user adoption.

Look for a CRM partner with a broad and proven resume of integration and enterprise information system implementations. If you treat your CRM implementation like the critical enterprise application it is, you are more likely to be part of the 35% of CRM implementations that succeed!

Picis Exchange Global Customer Conference – “It’s All About the Data”

The Picis Exchange Global Customer Conference went off without a hitch last week in Miami. The main information sessions were categorized by the four areas of a hospital Picis specializes in: Anesthesia and Critical Care, Emergency Department, Perioperative Services, and Revenue Management Solutions (via its acquisition of LYNX Medical Systems). I was able to attend a number of sessions, network with both the company and its customers, and hear what the top priorities for this diverse group are over the next few years. As I reviewed my notes this weekend, thinking back to all the conversations I had with OR Directors, Quality Compliance Managers, Clinical Analysts, Billing and Coding Auditors, Anesthesiologists, and IS/IT Directors, one theme emerged – it’s all about the data!

The most frequent discussions centered on a few major challenges that the healthcare industry, not just Picis clients, must deal with in the coming months and years. These challenges vary in complexity and in their impact on the 5 P’s [Patients, Providers, Physicians, Payers, and Pharmaceutical Manufacturers]. The Picis customers who collect, analyze, present, and distribute data most efficiently and effectively in response to the following challenges will position themselves as stable players in an increasingly turbulent industry:

  • Meaningful Use – “What data must I show to demonstrate I’m a meaningful user of healthcare IT to realize the greatest number of financial incentives available? How can I get away from free-text narrative documentation and start collecting discrete data in anticipation of the newly announced HIMSS Analytics expanded criteria?”
  • Quality & Regulatory Compliance – “How can I improve my quality metrics such as Core Measures and keep them consistently high over time? How can I reduce the amount of time it takes for me to report my data? How can I improve my data collection, analysis, and presentation to enable decision makers with actionable data?”
  • ICD-9 to ICD-10 Conversion – “What data and processes must I have in place to demonstrate use of ICD-10 before the looming deadline? Is my technical landscape integrated and robust enough to handle the dramatic increase in ICD-10 codes? Does my user community understand the implications of the changes associated with this conversion?”
  • Resource Productivity – “How can I reduce the amount of time my staff spends chasing paper, manually abstracting charts, and analyzing free-text narrative documentation? What percentage of these processes can I automate so my staff is focused on value-added tasks?”
  • Revenue Cycle Improvement & Cost Transparency – “How can I integrate my clinical, operational, and financial data sets to understand where my opportunities are for enhanced revenue? How can I standardize these as best practices? Can I cut costs by reducing inventory on hand and redundant vendor/supply contracts, or by improving resource utilization and provider productivity? How will this impact patient volume? Am I prepared for healthcare reform’s call for transparency?”

All of these challenges, although unique, have fundamental components in common that must be established before any progress is made. Each requires processes that standardize the collection of data to ensure accuracy and consistency, so users can “trust the data.” A “single version of the truth” is essential; without it, your hospital will remain a collection of siloed pockets of expertise living in Excel spreadsheets and Access databases (best case), or paper charts and scanned documents (worst case), laboriously re-validated at every step in the information lifecycle.

Picis did a wonderful job of reinforcing its commitment to its customer base. It promised improved product features, more intuitive user interfaces, an enhanced user community for collaboration and idea sharing, and more opportunities for training. Fundamentally, Picis is a strong player in a market that seems ripe for consolidation, and its potential for growth is very high. Yet Picis will always be just that: a product company. The healthcare industry no doubt needs strong products such as Picis to drive critical operations, collect the data necessary for improved decision making, and transition from paper to automation. But Picis acknowledged, through its evolving collaboration with partners such as Edgewater Technology that understand both the technical landscape and the clinical domain, that the true spark for change will come when people and processes align with these products more effectively. That combination is the foundation for a heightened level of care: an integrated data strategy that delivers superior patient outcomes from every dollar spent.

Analyzing Clinical Documentation Requires Discrete Data

How many of your patients’ paper medical charts look something like this? How many similar piles are on the front desk of the OR? The PACU manager’s office? The scheduling department? Your office?

I know: it’s not pretty, it’s barely legible, it’s written freehand, it’s clunky, it’s outdated. It’s like hearing your favorite song on an 8-track or cassette tape; it’s simply a thing of the past. Oh, and it takes a lot of time, which means it costs a lot of money.

Doctors spend a lot of time and money going to school to become experts on the human body – that’s who I want taking care of me. Unfortunately, they are burdened by a system that requires them to write specific phrases, terms, and codes just to get paid, essentially becoming experts in a set of reimbursement business rules – that’s not who I want taking care of me. Healthcare is an industry whose core infrastructure – its backbone of information centered on diagnosis, procedure, and other treatment and care delivery codes – is broken. Why? Because all of that information is currently written down, not electronic!

I’m prepared to help fix a broken system. I have personally seen over 100 different ways for a physician to write down their observations after a routine visit with a patient – differences in phrasing, penmanship/legibility, abbreviations (only officially “accepted” abbreviations, though), and interpretation. The same goes for an appendectomy, blood work, an MRI, and an annual physical. This is unacceptable. The important information that a physician records must be entered as discrete data elements directly into a computer. This means that each piece of data has its own field – sorry, circulating nurses who love free-text “case notes” sections at the end of surgery – the era of free-text and narrative documentation is over. Do you know how much time and money can be saved by avoiding the endless paper chasing and manual chart abstraction? Neither do I, but I know it’s a lot!
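To make the contrast concrete, here is a minimal sketch of the same observation captured two ways: as a free-text note versus as discrete, typed fields. The field names below are purely illustrative assumptions, not a standard clinical schema.

```python
from dataclasses import dataclass, asdict

# Free-text narrative: the whole observation is one opaque string.
# Querying or reporting on it requires manual chart abstraction.
free_text_note = "Pt c/o abd pain x3d, afebrile, WBC 11.2, appy suspected"

# Discrete capture: each clinical fact gets its own typed field, so it
# can be validated on entry and queried or reported automatically.
# (Field names are hypothetical, for illustration only.)
@dataclass
class EncounterNote:
    chief_complaint: str
    symptom_duration_days: int
    febrile: bool
    wbc_k_per_ul: float
    suspected_diagnosis: str

note = EncounterNote(
    chief_complaint="abdominal pain",
    symptom_duration_days=3,
    febrile=False,
    wbc_k_per_ul=11.2,
    suspected_diagnosis="appendicitis",
)

# A discrete record feeds reporting directly -- no paper chasing.
print(asdict(note)["wbc_k_per_ul"])  # prints 11.2
```

The point of the sketch: once every fact lives in its own field, the questions hospitals struggle with (Core Measures, ICD-10 coding, productivity reporting) become simple queries instead of manual abstraction exercises.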

How do you fix it? I’m not going to lie and tell you it’s easy. Governance helps. You can guarantee that surgeons, anesthesiologists, hospitalists, specialists, and the rest will all have their needs and comforts…and opinions. “If you want to perform surgery at this facility, you need to document your information discretely, electronically, consistently, and in a timely fashion.” Physicians are used to writing things down; it’s familiar, it’s comfortable, it’s home cooking. To change that comfortable behavior you must emphasize the benefits: they will spend less time documenting, they will have faster clinical decision support, automated and timely reporting capabilities, near-real-time feedback on their performance, benchmarks against best standards, and opportunities for improvement. Doctors can appreciate an investment in an evidence-based approach. To automate the collection, reporting, and analysis of the mountain of information collected every day, on every patient, in every part of the hospital, that information must be entered discretely. Otherwise you waste more time and money than your competitor who just went all electronic. Do you really want to control costs and get paid faster? Stop using paper and join the 21st century!