Straight Through Processing: BUILDING A SOLID FOUNDATION

Some great feedback on my previous post – “Straight Through Processing & Underwriting  –  The Starting Point”.

Many thanks to those who responded with your comments and feedback.

In that post I stated that “STP helps to drive efficiency and consistency” throughout the life cycle of all policies, and that insurers need to start with their underwriting process.  Many of you posed the question “What about the new business process as well as workflow processing — why are these not part of the starting point as well?”  A+ – Congratulations – THEY ARE!!

Fact: Underwriting is a set of guidelines that determines the eligibility of a client to receive an insurer’s product.  New business rules determine if the client meets the product’s prerequisites as defined by the insurer.  One differs from the other, yet both complement each other in the final decision process.

Fact: New business coming in the door for an insurance carrier starts the business process for new policies.  It only makes sense, then, that the starting point for building a solid STP business model resides in both underwriting and new business processing.  They support and complement each other, hence the need for both to be worked concurrently.  Supporting these two functions is a much-needed workflow management process and system.  All three business functions form the foundation you need to continue to build your STP process.

Working extensively with our clients to forge a path to a full-fledged STP business model, we begin building this foundation by reviewing and dissecting both their underwriting and new business processes.  We also begin the much-needed workflow analysis and build process.

Underwriting Guideline Review

The focus here is on the underwriting guidelines that support each product within a given product portfolio.  We review both manual and automated processes.  We determine which guidelines are consistent across the portfolio and which are unique to a specific product within it.  Once we have a clear understanding and have the details, we DOCUMENT!  Documenting this process is no easy task and is often what deters insurers from undertaking this review.

New Business Rules Review

As with the underwriting guidelines, the focus here is on the new business rule sets that support each product within the portfolio.  This tends to be more grueling, since each product in the portfolio is unique.  We look first for common business rules across all products in the portfolio, then determine which rules are unique by product.  Again, once we have a clear understanding and the details, we DOCUMENT!

Workflow Review

Often overlooked in the early stage by insurers is the need for a detailed review and possible re-alignment of workflow procedures.  Workflow is what moves your policies through the new business and underwriting process and is critical to the success of your STP initiative.  This step must occur concurrently with the underwriting and new business reviews.

Figure 1 below depicts the business process model that can be followed to develop that solid core foundation needed for your STP.

[Figure 1: STP workflow]

With that stated, I hope this helps to address the questions and comments from my previous post.  Now I must end by asking this question – if you have your core foundation, what is your next step in the STP process?  From my perspective, the next step that I take with my clients is ……. to be continued….

Straight Through Processing & Underwriting ~ THE STARTING POINT

Over the past several years, insurance carriers have engaged more and more in Straight Through Processing (STP) initiatives.  I see many different areas where STP can play a significant role for carriers:

  • Underwriting
  • New Business
  • Billing and Collections
  • Policyholder Services

Early adopters homed in on the imaging of paper documents as the starting point for STP automation.  The focus should be less about imaging documents and more about capturing data in electronic format, then using that data throughout the life cycle of a policy, from underwriting through the claims process.

As I work more and more with our clients on these initiatives I often hear the following:

“Our STP initiative is focused on the  automation of our new business process, utilizing electronic policy data from our agents/agencies to feed directly into our Policy Administration System”. 

I do not completely disagree with this statement; in fact, I am in favor of it.  But I do not believe that this is the best starting point.  The underwriting business process and the underwriters are where STP needs to start in order to drive efficiency and consistency throughout the organization.

Why do I believe this?

For one key reason – the ultimate objective of STP is the ease of doing business with agents and policyholders.  This starts with new business, which is rooted in the underwriting process.  New business cannot exist without underwriting, so why start with new business?  You must start with the foundation of new business – Underwriting.

The goal for carriers, then, is the optimization of the underwriting process, because it sets the foundation for the issuing or declining of policies.  A key objective of insurance carriers is to consistently work with their agents and brokers to give them the products they need, AND to streamline the issuing process to be more efficient and cost effective.

Too often in my analysis of current underwriting practices at P&C and LH&A carriers, I see underwriters handling every policy that an agent or agency submits.  I often ask, “Are you adding any value when you touch it?”  If the answer is “NO”, then I say automate!  If you start with a goal of automating 25% of the underwriting process and an average underwriter handles 50 policies a day, that equates to roughly 12 policies taken off the hands of each underwriter.  These policies go through the process with little or no human intervention – the ultimate goal of STP.

Now, you may be asking, “What do I automate and when?”

Great question and one that has been asked many times over.  You begin with four developmental points:

  1. Develop or purchase a “Business Rules Engine” to add and support your underwriting rules.  Start small, with a select group of easy-to-automate products, such as Term Life or Auto insurance;
  2. Incorporate the ease of electronic submission;
  3. Establish or improve your agent or agency interfaces for faster and more efficient uploads and downloads of electronic data and;
  4. Create a real-time policy decision process, and put it in the hands of the agents (accept, pend or decline).
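To make the rules-engine idea from point 1 and the accept/pend/decline decision from point 4 concrete, here is a minimal sketch in Python.  The product, rule names, severities, and thresholds are all hypothetical illustrations, not actual underwriting guidelines:

```python
# Illustrative sketch of a tiny underwriting rules engine. The product,
# fields, severities, and thresholds below are hypothetical examples.

def evaluate(application, rules):
    """Return 'accept', 'pend', or 'decline' plus the rules that fired."""
    failures = []
    for rule in rules:
        value = application.get(rule["field"])
        if value is None or not rule["test"](value):
            failures.append((rule["name"], rule["severity"]))
    if any(sev == "hard" for _, sev in failures):
        return "decline", failures        # a hard rule failed
    if failures:
        return "pend", failures           # soft failures route to an underwriter
    return "accept", failures             # straight-through issue

# Rules for an easy-to-automate product such as Term Life (illustrative).
TERM_LIFE_RULES = [
    {"name": "age_within_band", "field": "age", "severity": "hard",
     "test": lambda v: 18 <= v <= 65},
    {"name": "face_amount_cap", "field": "face_amount", "severity": "soft",
     "test": lambda v: v <= 500_000},
    {"name": "non_smoker", "field": "smoker", "severity": "soft",
     "test": lambda v: v is False},
]

decision, fired = evaluate({"age": 40, "face_amount": 250_000, "smoker": False},
                           TERM_LIFE_RULES)
print(decision)  # accept
```

Because each rule is just a named test with a severity, the same `evaluate` loop serves every product; only the rule list changes per product in the portfolio.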

Figure 1 below depicts the process of using an Underwriting Rules Engine as the driver for the new business STP process.  

[Figure 1: Underwriting Rules Engine driving the new business STP process]

The underwriting engine houses all the underwriting rules for the products you have selected.  The engine then impacts four critical areas in your STP process:

  1. Supports on-site policy issue by allowing underwriting rules to be downloaded to an agent or business rep’s laptop;
  2. Is utilized by your administration system for further evaluation of a policy;
  3. Supports one central location where underwriting rules are stored, and;
  4. Allows the business owners and not IT to update the rules engine in a fast and efficient way, leading to a quicker turnaround on new or updated products.
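One common way to achieve point 4, business-owner updates without an IT release, is to keep the rules as data rather than code.  A minimal sketch, assuming a hypothetical JSON schema of per-field min/max bands (the schema, product name, and values are all illustrative):

```python
# Illustrative: storing underwriting rules as data (JSON) so business owners
# can change thresholds without a code release. Schema and values are hypothetical.
import json

rules_json = """
{
  "term_life": [
    {"name": "age_within_band", "field": "age", "min": 18, "max": 65},
    {"name": "face_amount_cap", "field": "face_amount", "max": 500000}
  ]
}
"""

def load_rules(raw):
    """Parse the rule document into per-product (name, field, lo, hi) checks."""
    parsed = json.loads(raw)
    return {
        product: [
            (r["name"], r["field"],
             r.get("min", float("-inf")), r.get("max", float("inf")))
            for r in rule_list
        ]
        for product, rule_list in parsed.items()
    }

def passes(application, product, rules):
    """True when every band check for the product is satisfied."""
    return all(lo <= application.get(field, float("nan")) <= hi
               for _, field, lo, hi in rules[product])

rules = load_rules(rules_json)
print(passes({"age": 40, "face_amount": 250000}, "term_life", rules))  # True
```

In a real deployment the JSON document would live in the one central rule store (point 3) and be downloaded to agent laptops (point 1), so every consumer evaluates the same rules.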

In helping clients move forward with their STP initiatives I will consistently start with a detailed analysis of their underwriting process and build the foundation from the underwriting perspective. 

Where are YOU in your STP process and is underwriting part of it?
 

Claims Outsourcing Strategy – Managing a Smooth and Seamless Transition

Part 3

The focus of this three-part series is to provide insight into managing a smooth and seamless transition for outsourcing claims business processes.  Part 1 concentrates on the upfront gathering of current and future requirements, the Request For Information (“RFI”) and Request For Proposal (“RFP”) process, the selection of the “right” vendor, and a brief on contract negotiation.  Part 2 focuses on the development, testing, and conversion that take place between both organizations, and some of the pitfalls to avoid.  Part 3 will focus on maintaining a productive, long-term partnership with your vendor.

One of the biggest pitfalls in an outsourcing arrangement is the absence of a true partnership between the client and the vendor.  In its place is a customer-vendor relationship, in which the customer is looking for one deal while the vendor is looking for another.  This arrangement creates a lack of trust between the parties that will eventually sour the relationship, because it becomes disconnected from the true business needs and requirements.

Secondary to the partnership, but equally important, is the communication between the parties.  Communication ensures business interests are aligned and understood.  Lack of communication throughout the life of the relationship creates tensions that will hinder future value creation.  Effective and continuous communication ensures both companies are responsive, deal with facts rather than assumptions, keep all stakeholders in the loop, and make decisions in the spirit of a partnership.  Successful outsourcing arrangements, those that last for years, put in place a joint planning process between the client and vendor.  Regularly scheduled joint planning meetings every six to nine months ensure that both the client and vendor monitor the health of their relationship.  Continuously reviewing the strengths, weaknesses and opportunities of the relationship, agreeing upon recommendations, and putting those recommendations into action continually improves the relationship.

Most often seen in healthy outsourcing relationships is an effective governance methodology or framework.  Both parties must agree early on to operate in a collaborative environment, as noted above.  In the absence of a governance structure, the implications can be devastating: unclear roles and responsibilities between the client and vendor, challenges that linger unresolved, problems not addressed in a timely fashion, and unmet expectations.

Conclusion

The worth of an outsourcing agreement is generated when both companies strike a mutual agreement that forms the foundation for a long-term partnership.  When both parties buy into these steps and avoid the pitfalls noted, the framework and foundation have been set for a long-term, successful partnership.  By doing so, both parties have put in place the tools, design, contracts, and methodologies that will ensure success.  Failure comes when any one of the steps is short-changed, missed, or misunderstood by either or both parties.  When your company makes the strategic decision to outsource, make sure you make the same decision to be successful by employing these best practices.

Claims Outsourcing Strategy – Managing a Smooth and Seamless Transition

Part 2

In Part 1, we focused on creating the core structural foundation of a Claims Outsourcing Strategy:

  • gathering the business and technical requirements
  • creating a detailed Request For Proposal (“RFP”)
  • selecting the correct claims outsourcing vendor
  • contract negotiations

In Part 2, we will concentrate on the development, testing and conversion that take place between both companies, identifying some of the pitfalls that can be avoided during this stage.  In Part 3, we will wrap up with a focus on maintaining a strong, healthy partnership.

Development Phase

Prior to the development phase, you should ask yourself, as well as your staff, the following question:

“What is the best strategic approach that both companies should utilize to obtain optimal success during the development phase?”

Without a key strategy agreed upon by you and your vendor, many unforeseen obstacles will soon be on your doorstep, obstacles that you’ll need to juggle and resolve.

One pitfall many organizations fail to identify prior to the start of the development phase is the common “siloed development approach”. Your company works against a set of functional and process requirements specific to your own system to support the outsourcing project.  Your vendor works against another set of functional and process requirements specific to their system.  There is no known intersection of the functional or process requirements between the two companies. By no means is this ever considered the correct strategic approach to take.

[Figure: Joint Functional and Process Requirements]

By strategically working together prior to the development phase, both organizations will find the intersection or “Joint Functional and Process Requirements”, so all three pieces of the puzzle can come together and be managed accordingly.  If you skip these requirements, the puzzle is not complete and managing this process will become a nightmare.

Testing Phase

As with the development phase, a joint strategic approach for testing should be developed and agreed upon before testing begins.  Many organizations fall into one of the biggest known pitfalls – an unclear definition of the different phases of testing.  Ask your IT and business staff, as well as your vendor, this question:

“What is considered unit testing, system testing, User Acceptance Testing (“UAT”) and Integration testing?”

I promise you this: you will not receive a unified answer from your staff or vendor as to what each phase represents.  Clearly identify in a Testing Strategy Plan what each phase’s primary purpose is, who is responsible for executing the phase within and between both companies, and what the measurement of success is for each phase.

Without a doubt, you can carry the integration concept from the development phase into the testing phase.  Developing test plans to support this process should include resources from both organizations.  Agreement on what is to be tested, and by whom, will lead to optimal testing results.  This joint planning should take place early on so that both companies know what is expected of them during the joint testing effort and have their resources allocated when it comes time to start.

Conversion Planning/Execution

Conversion is one area where we consistently see that both companies have not spent the time or effort needed to succeed.  Conversion of data and files to and from both companies is just as important as the development and testing efforts.  Yet most companies spend less time understanding the data and files needed to make the transition and the outsourcing arrangement successful.

Time and effort must be allocated to analyze and enhance the data to support the outsourcing initiative.  This includes putting a team in place to focus on:

  1. Identifying and measuring the data in your system(s) today — determine the quality of the data, or the lack thereof.
  2. Identifying a set of data quality rules and targets that must be met prior to the conversion of the data.
  3. Designing and implementing data quality improvement processes where needed, making the data ready for the conversion.
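As a sketch of steps 1 and 2, the measurement of data quality against agreed targets can itself be automated.  The record layout, the rules, and the 98% target below are hypothetical examples, not requirements from any vendor:

```python
# Illustrative data-quality assessment for conversion readiness. Field names,
# rules, and the 98% pass-rate target are hypothetical examples.
import re

CLAIMS = [
    {"claim_id": "C-1001", "loss_date": "2023-04-01", "paid": 1200.00},
    {"claim_id": "C-1002", "loss_date": "",           "paid": 350.50},
    {"claim_id": "",       "loss_date": "2023-05-17", "paid": -10.00},
]

RULES = {
    "claim_id_present":  lambda r: bool(r["claim_id"]),
    "loss_date_iso":     lambda r: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", r["loss_date"])),
    "paid_non_negative": lambda r: r["paid"] >= 0,
}

TARGET = 0.98  # minimum pass rate per rule before conversion may begin

def assess(records, rules, target):
    """Measure each rule's pass rate and flag rules below the target."""
    report = {}
    for name, check in rules.items():
        rate = sum(1 for r in records if check(r)) / len(records)
        report[name] = (round(rate, 3), rate >= target)
    return report

for rule, (rate, ok) in assess(CLAIMS, RULES, TARGET).items():
    print(f"{rule}: {rate:.1%} {'PASS' if ok else 'NEEDS REMEDIATION'}")
```

Rules that miss the target feed directly into step 3: each failing rule names a remediation process that must run before the conversion begins.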

If the correct analysis and time are not spent upfront to understand the data, the files, and the conversion planning, the project will come to a screeching halt.  Prevent this by investing in a detailed Data and Data Quality Assessment.  Figure 1 below depicts, at a high level, the data process life cycle that companies must follow in order to understand, scrub and enhance their data in preparation for a successful conversion to the vendor system(s).

[Figure 1: Data analysis and enhancement life cycle]

Claims Outsourcing Strategy – Managing a Smooth and Seamless Transition

Part 1

The focus of this three-part series is for the reader to gain insight into and knowledge about managing a smooth and seamless transition for outsourcing claims business processes.  Part 1 concentrates on the upfront gathering of current and future requirements, the Request For Information (“RFI”) and Request For Proposal (“RFP”) process, the selection of the “right” vendor, and a brief on contract negotiation.  Part 2 will focus on the development, testing, and conversion that take place between both organizations, and some of the pitfalls to avoid.  Part 3 will focus on maintaining the long-term partnership with your vendor.

Outsourcing, for many Insurance and Financial Services organizations, is viewed as a strategic tool for building a more productive, cost-effective, and profitable business.  The key benefit of outsourcing a segment of a business, or an entire business unit, is that it enables the organization to focus its resources on its core competencies.

When the strategic decision to outsource has been made, there are plans and best practices organizations must follow in order to guarantee a successful, smooth and seamless transition.  Forgoing any of these steps could lead to a potential disaster, and oftentimes a long, drawn-out process.

Detailed Business, Functional, and Technical Requirements

All too often I’ve seen organizations fail at one of the most important steps in the process: creating a complete set of claims business and functional requirements, documenting well-defined workflows, and detailing technical and infrastructure reviews and requirements.  Organizations need to allocate the right amount of time and resources upfront to gather and document these requirements and workflows.  Unwillingness to do so, or short-changing the process, will lead to a long and expensive outsourcing transition.

You may be asking yourself, “Why is this step so important?”

The answer — the detailed documentation sets a strong foundation for all activities and processes that follow, both within your organization and eventually with the outsourcing organization you select.  The goal here is to achieve a clear, detailed, and descriptive understanding of your claims business.  As daunting as this may be, the benefits of having your claims processes and workflows clearly defined outweigh the many pitfalls you will encounter should you not execute this step.

Selection of the Outsourcing Organization That Fits Your Needs

Finding the right organization to administer your claims business is critical to the success of your outsourcing strategy.  But how does one find which organization is the “RIGHT” fit for their business?  By creating a high level Request For Information (“RFI”) followed by a detailed Request For Proposal (“RFP”).

If your organization takes the time upfront to detail the business, functional and technical requirements, then your RFI and RFP are nearly complete.  Generating the RFI and RFP is a matter of taking the information documented and formatting it appropriately.  The RFI is then used to narrow your search from a long list of possible outsourcing organizations to a handful of potentially qualified options.  Once you have narrowed your search, the RFP becomes your driver.  It helps to further narrow the selection process to 2 or 3 organizations.

Figure 1 — RFI / RFP Process

Contract Negotiations

Once the outsourcing organization is selected the contract negotiations begin.  The outcome of this step is to finalize a contract that both organizations can agree to.  The focus here is to define who is responsible for what, and when.  Items that are imperative to the contract are:

  • Service Level Agreements (“SLA”)
  • Vendor modifications and who pays for them
  • Escalation procedures for disputes and interpretation of the contract
  • Cost basis per transaction

Contract negotiations are difficult and time consuming.  Done right, they set the stage for a successful transition of business and for a long-term arrangement between your company and the organization you selected to handle your business.

[Figure 2: Phase 1 Tactical Roadmap]

Legacy Modernization: Don’t forge a Jackson Pollock

Well here I am again and “here he goes”, I hear you say, as you brace yourself for some off-the-wall tenuous link.  Well, this time it is even more fun: Jackson Pollock, creator of No. 5, the world’s most expensive piece of art, is back alive and kicking, creating wonderful deployment diagrams at an enterprise near you.

Don’t believe me?  Quick, have a little play with this great web app and see if you can create the chart that you already have in Visio; you may be surprised.  That’s right, just move the mouse around and off you go: you are an online abstract artist, or an individual with a serious issue – an ever-evolving and morphing enterprise that seems to simply grow.

Alright, let us get serious here for a second.

Over the past decade, or even decades, we have heard from all the brightest and smartest in the world that legacy applications were dead, that your mainframe was going to keel over and draw its last breath, and that there was nothing you could do about it but go buy the latest technology and sunset.  How did that work out?  Oh yeah, what happened was we all went around adding more and more applications to our architecture, without the real stability or payback to ever do the sunset.

Well done to the brightest and smartest: not only was their advice totally wrong and unfounded, it came with a heavy price tag – an enterprise more convoluted and costly to run than before, with so many places for change that time to market, the holy grail, became longer and more cumbersome.

I have ranted enough……for now.

So what can we do?  Well, the answer comes with integration; it comes with technology and innovation.  I know, I know, I said we tried the latest technology and it failed – but we are not talking about wholesale replacement, we are talking about the use of technology to remove the need to sunset – ah ha!!!

Sure, there are applications that make sense to re-platform or re-write; if your app changes heavily and would be better served with the ability to extend and integrate readily, then yes, let’s take that COBOL app and create the whiz-bang .NET app.  However, there are many others where it makes more sense to simply integrate – why try to port a 20-year-old admin system that runs perfectly fine?  Why not take 5 or 6 of them and represent them on the web with one integrated, low-maintenance front end?  Can it be done?  Of course it can; it was just that we were blinkered by the “sunsetters” – and I do not mean late-afternoon autumn drinks on the porch, which would be a good thing right now.

So what about Mr. Pollock? Expensive art, expensive deployment model……let’s go with Single View – the minimalist approach…..plus it is a lot easier to explain to guests you have over.

Image courtesy of the National Portrait Gallery (http://www.npg.si.edu/)

Business Intelligence: Avoiding “Operation Mincemeat”

What?  Operation what?  That’s right, Mincemeat, and I am not referring to scrumptious pies at Christmas…..no, I am referring to the reality of misinformation.  What was a successful operation and asset to the Allies has become a minefield for today’s enterprise, and a thirst, or more likely a downright need, for the ability to utilize our valuable data in a meaningful manner.

Let me digress for a second……I do not think we all need to be reminded of what Business Intelligence can do for us today – the facts and the benefits are plain and clear – take your data and make it actionable. Remove the idea of reporting on data and realize the vision of using data….

So one wonders why we have not all embarked upon our voyage of discovery aboard the great ship “B.I. Enlightenment.” And when we start to walk up the boarding ramp, waving goodbye to the stale data of yesteryear and the meaningless seventy characters of green bar reports we never understood anyway, we spare a moment to think “are there any icebergs in this sea?”

Of course, when one makes that pause there is a realization — what are you really gaining intelligence into?  For many enterprises, our data is split across several systems and platforms, some of which are real-time in nature, others of which may be a week behind the times.

From what I have seen, many people are requesting more information on the tools – Which is best of breed?  What do I get out of the box?  Can my analysts use it?  Can my dog understand it as he brings me my Sunday morning paper?  How quickly can you show me my data in action?  These questions can all be answered, the “wow factor” of BI can take precedence, and the definition of KPIs ensues full steam ahead for some.

What is missing?  I appear to be on the first landing, but I do not remember taking the first flight of stairs…….well, let’s flip back to “Operation Mincemeat”, or, as it should now be known, “Is my data ready for Intelligence?”  A critical step that must be considered lies not in the value of BI but in the readiness of your data.  Misinformation was wonderful in the 1940s, but it has no place in the business arena.

When your data becomes an actionable entity, you must be able to rely on its accuracy and ease of access.  Reporting on reports was the way of the past – “That looks great, Bill, but can you cross-reference that with data store ‘x’, as sometimes we can be a little stale”.  The true key to embarking on Business Intelligence is understanding where you are today in your data maturity and, most importantly, how you get where you need to be – reliable, reusable, actionable information.

So what does it all mean?  It means, assess where you are before you set sail……the voyage is glorious and the sights not to be missed but make sure you have a ticket for the right ship.