Daily Matching and Merging of Guest Data to Make the Guest Unique

In the hospitality industry, storing information about guests and their stays is critical to tailoring offers to each guest. Information about the guest and the stay is stored in a system called the property management system (PMS) at the hotel franchise’s corporate headquarters. The property management system stores information such as the following for each guest (this is a sample list of attributes, not a complete list):

  • Prefix (Mr., Mrs., Miss, Dr., etc.)
  • Full name
  • Company name
  • Physical address
  • One or more e-mail addresses
  • One or more telephone numbers
  • Gender

The property management system stores information such as the following for each reservation that a guest makes (this is a sample list of attributes, not a complete list):

  • Booked date
  • Arrival date
  • Departure date
  • Folio status (booked, checked-in, checked-out, cancelled, no show, etc.)
  • Property where the guest will stay
  • How the reservation was made
  • Promotion response code (this is the code that is captured if the reservation was made after receiving promotional e-mail such as providing a discount if a reservation is booked in the next 5 days)
  • Room type booked
  • Length of stay
  • Method of payment
  • Was the reservation an advance purchase

Property management systems can be set up with a centralized or decentralized data storage architecture. Let’s discuss what each of these means for storing guest profile and stay data.

Centralized PMS Data Repository:

With this type of storage architecture, all of the guest profile and guest stay data is stored in one location. Therefore, all of the properties in a particular franchise access the guest profile and stay data from a single location. What does this mean? It means that this system:

  • Reduces the amount of duplicate data entered as searching for a guest can be done against a single source
  • Records all guest stay data to a single guest profile record
  • Eliminates the need to have the data cleansed by a data processing company

Decentralized PMS Data Repository:

With this type of storage architecture, the guest profile and guest stay data is stored in multiple locations. Each franchise property has its own storage, and the data from each of these sources rolls up to a single repository located at the franchise headquarters. This type of operating environment makes it very difficult to provide for the daily matching and merging of guest data to make each guest unique. Therefore, there is a definite need to have the data cleansed, and the duplicates removed, by a data processing company. This can be done on a daily, weekly, or monthly basis. However, frequent cleansing may not be very cost effective. So, what are the potential causes of the duplicated data?

  • There is no central database of guest profile data
  • There is no visibility into guest stays at other properties
  • Lack of a single unique identifier for the guest

While there are three main causes for duplicate data, there are also guest data anomalies/concerns that can contribute to the duplicate and inaccurate data. Here are some of the anomalies/concerns:

  • Guest data inconsistently entered across properties
  • Guest data from external sources (Hotwire, Priceline, etc.)
    • Not providing guest personal data (i.e. generic address, generic email)
  • Administrators at a company may be making the reservations for a group of people with different names and email addresses
  • Front desk staff are not updating the correct guest profile information
  • Data quality issues:
    • Guest name example: first name was blank and last name was Hotel Beds (Wholesale reservation channel)
    • Incorrect guest address, which impacts guest match rate
    • Email address of hotel property entered instead of that of the guest, with insufficient guest profile information
  • Operational Data Collection Issues:
    • The field ‘Email’ is not required to be completed
    • Mobile phone numbers are not stored in the PMS as properly masked numeric fields
  • Guest data cleanup initiative is required

As one can see, a decentralized PMS architecture tends to have challenges around the uniqueness of guest profile data. The rest of this blog will focus on processes for making a guest unique in an operating environment that has a decentralized PMS.

The following section depicts an example solution, created with Microsoft Dynamics CRM 2011, for making a guest unique in a decentralized PMS environment. This solution includes the following components and processes:

  • PMS Database Tables: these are the tables in the PMS System that store the guest profile information such as name, address, e-mail, etc. and the reservations made by each guest
  • Guest Profile:
    • Definition: this is the record in CRM that stores basic information about the user and is a cumulative rollup of certain guest stay information
    • The basic information stored is:
      • First name
      • Last name
      • Physical address
      • Zip code
      • Telephone number(s)
      • E-mail address(es)
      • Opt-in
      • Gender
    • The cumulative rollup information stored is the following:
      • What was the method used to book the last reservation?
      • What is the frequent method of booking reservations?
      • What is the total room nights stayed across all reservations?
      • What is the total sum of revenue spent?
      • What is the total number of stays?
      • What was the last date stayed?
      • Is the guest in-house or does he/she have a pending reservation?
      • What was the average daily rate paid by the guest?
      • What is the average length of stay?
  • Guest Stay:
    • Definition: this is the record in CRM that stores information about each guest stay. A guest stay record is created anytime a guest books a reservation at a particular franchise property. Therefore, multiple guest stay records can be associated with a single guest profile record
    • The information captured on the guest stay form is:
      • The date the reservation was booked
      • The date the guest plans on arriving
      • The date the guest plans on departing
      • The franchise property where the guest will be staying
      • The folio status (booked, checked-in, checked-out, cancelled, no show, etc.)
      • The type of room booked
      • The average daily rate for each particular stay
      • The length of stay for each particular stay
      • The method of payment used to book the reservation
      • Was the reservation an advance purchase?
  • Now that we know what the components are, let’s discuss how these components are used in the following three processes:
    • Daily loading
    • Cleansing
    • Reconciliation
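
The cumulative rollup fields on the guest profile can all be derived from a guest’s stay records. Here is a minimal sketch in Python, assuming simplified stay records with illustrative field names (`length_of_stay`, `rate`, `departure`) rather than the actual CRM schema:

```python
from datetime import date

def rollup(stays):
    """Compute the cumulative guest-profile rollups from a list of stays.

    Assumes at least one stay; field names are illustrative only.
    """
    nights = sum(s["length_of_stay"] for s in stays)
    revenue = sum(s["rate"] * s["length_of_stay"] for s in stays)
    return {
        "total_room_nights": nights,
        "total_revenue": revenue,
        "total_stays": len(stays),
        "last_date_stayed": max(s["departure"] for s in stays),
        # Average daily rate = total revenue divided by total nights
        "average_daily_rate": revenue / nights if nights else 0,
        "average_length_of_stay": nights / len(stays),
    }

stays = [
    {"length_of_stay": 2, "rate": 100.0, "departure": date(2013, 1, 5)},
    {"length_of_stay": 3, "rate": 120.0, "departure": date(2013, 1, 20)},
]
profile = rollup(stays)
```

In a production solution these rollups would be recalculated whenever a new guest stay is associated with the profile.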

Figure 1: Guest Data LifeCycle


Daily Loading Process:

The first process we will discuss is the daily loading of the reservations. For each reservation loaded into Dynamics CRM, a guest profile record is created along with a corresponding guest stay record that is associated with the guest profile record. For example, if John Smith makes two reservations (regardless of the reservation channel) for two rooms, he will have two guest profile records and two guest stay records in the PMS. Each guest stay record will be associated with one guest profile record. Since the data is stored in this manner in the PMS, it is loaded from the PMS into Dynamics CRM in the same way. This means that every time a guest makes a reservation, a guest profile record and an associated guest stay record are created for that reservation, even if the guest stays multiple times at a franchise property within a short period such as a month. This is referred to as un-cleansed data.
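
To make the “un-cleansed” shape of the daily load concrete, here is a minimal sketch assuming simple in-memory lists rather than the actual PMS or CRM tables; every reservation produces a fresh profile/stay pair with no matching attempted:

```python
import uuid

def load_reservation(reservation, profiles, stays):
    """Load one reservation: always create a NEW profile plus one linked stay.

    No duplicate search happens here, which is why the result is un-cleansed.
    """
    profile_id = str(uuid.uuid4())
    profiles.append({"id": profile_id, **reservation["guest"]})
    stays.append({"profile_id": profile_id, **reservation["stay"]})

profiles, stays = [], []
# John Smith books two rooms: two profiles and two stays are created.
for _ in range(2):
    load_reservation(
        {"guest": {"first": "John", "last": "Smith"},
         "stay": {"property": "Downtown", "nights": 2}},
        profiles, stays)
```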

Cleansing Process:

On a regularly scheduled basis, such as weekly or monthly, the guest profile data from the PMS is compiled into a file and sent to a data processing company. The data processing company checks for duplicates based on the following criteria:

  • First name
  • Last name
  • E-mail address
  • Physical address:
    • Street 1
    • Street 2
    • City
    • State
    • Zip code
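
A simplified sketch of a duplicate check on these criteria is shown below; the field names and the “most complete record wins” rule are assumptions for illustration, not the data processing company’s actual algorithm:

```python
MATCH_FIELDS = ("first_name", "last_name", "email",
                "street1", "street2", "city", "state", "zip")

def normalize(value):
    # Case- and whitespace-insensitive comparison of field values
    return (value or "").strip().lower()

def dedupe_key(profile):
    # Match criteria from the cleansing process above
    return tuple(normalize(profile.get(f)) for f in MATCH_FIELDS)

def completeness(profile):
    # "Most complete information": count non-empty fields
    return sum(1 for v in profile.values() if normalize(v))

def dedupe(profiles):
    best = {}
    for p in profiles:
        k = dedupe_key(p)
        if k not in best or completeness(p) > completeness(best[k]):
            best[k] = p
    return list(best.values())

dupes = [
    {"first_name": "John", "last_name": "Smith", "email": "john@x.com",
     "street1": "1 Main St", "street2": "", "city": "Dallas",
     "state": "TX", "zip": "75201", "phone": ""},
    {"first_name": "JOHN", "last_name": "smith", "email": "John@X.com",
     "street1": "1 Main St", "street2": "", "city": "Dallas",
     "state": "TX", "zip": "75201", "phone": "555-1234"},
]
unique = dedupe(dupes)  # the record with the phone number survives
```

Real cleansing services typically also apply fuzzy matching (nicknames, typos, address standardization), which this exact-match sketch deliberately omits.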

Reconciliation process:

(This can be done on a weekly or monthly basis. For this discussion, the reconciliation process will run monthly, at the end of each month.)

  • Step 1: Once the duplicate records are removed and the cleansed file is returned, the new guest profile data is loaded into a staging database, where all of the cleansed guest profile records are associated with their guest stay records. For example, suppose a file containing un-cleansed guest profile records for the entire month of January was sent to the data processing company. If there were two John Smiths in the un-cleansed file, then, based on the above criteria, the two matching records would be compared and the one with the most complete information would be kept. With the returned cleansed file containing only the one John Smith guest profile, the next step is to associate all the guest stays that John Smith had for the month of January with his cleansed guest profile record. For example, if John Smith stayed five times in the month of January, then all five stay records would now be associated with his cleansed guest profile record.
    In a nutshell, when the un-cleansed file is sent to the data processing company at the end of each month, the data being cleansed is for the month that just ended. Once the cleansed file is returned with unique guests for the month, the guest stays for each of those guests for that particular month are associated with each unique guest profile record.
  • Step 2: Once step 1 is completed, then the identified unique guest profiles with their associated guest stays for the previous month are loaded into Dynamics CRM 2011.  This is performed in two separate steps:
    • First the unique guest profiles are loaded
    • Then the guests’ stays are loaded and associated to the guest profile records
  • Step 3: Once step 2 is completed, the un-cleansed guest profile and guest stay data loaded in the previous month is purged. This process deletes only the un-cleansed guest profile and guest stay data; the cleansed guest profile and guest stay data will remain in Dynamics CRM permanently.
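
Step 1’s association of cleansed profiles with their guest stays can be sketched as re-pointing each stay’s profile reference, assuming the staging database yields a mapping from each un-cleansed profile ID to its surviving cleansed profile ID (names here are illustrative):

```python
def reconcile(cleansed_map, stays):
    """Re-point every guest stay at its surviving cleansed profile.

    cleansed_map: {uncleansed_profile_id: cleansed_profile_id}
    stays: list of dicts, each holding a 'profile_id'
    """
    for stay in stays:
        stay["profile_id"] = cleansed_map[stay["profile_id"]]
    return stays

# Two un-cleansed John Smith profiles (P1, P2) were merged into cleansed C1,
# so all of his January stays end up attached to C1.
cleansed_map = {"P1": "C1", "P2": "C1"}
stays = [{"profile_id": "P1", "nights": 2},
         {"profile_id": "P2", "nights": 3}]
reconcile(cleansed_map, stays)
```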

As one can see, when a PMS has a decentralized storage architecture, there are some processes that need to be put in place to make a guest unique.

Trigger Based Marketing with Dynamics CRM 2011


Creating and maintaining profitable customers is the main aim of business. Therefore, customer satisfaction leading to profit is the central goal of hospitality marketing. In the hospitality industry, the marketing department tends to be responsible for both Business-to-Consumer (B2C) and Business-to-Business (B2B) marketing.  The marketing team needs detailed data on prior leisure and business guests to successfully target marketing campaigns to the appropriate audiences.

The main purpose of this article is to discuss B2C marketing and, in particular, a process called trigger-based marketing in Microsoft Dynamics CRM 2011.

What is trigger-based marketing? For the purposes of this article, trigger-based marketing is defined as consumer profile or stay data meeting the criteria of a marketing list for an active campaign. This active campaign then processes an e-mail blast to the recipients of the marketing list on a scheduled and automated basis.


Assume the requirement is to create a process that allows for the automation of trigger-based campaigns, which, in this case, means guests satisfying particular marketing criteria. This requirement of automating trigger-based campaigns is dependent on daily matching and merging of guest data to make the guest unique (the process of ensuring a guest is unique is discussed in a separate blog). The following are potential trigger-based campaigns:

  • Driven by check-ins – “Welcome”
  • Driven by check-outs – “Thanks for Staying”
  • Reactivation
    • “We miss you!”
    • Requires guest stay history
  • In-house guest campaigns
    • Loyalty related
    • Requires guest stay history

So, the next question becomes, “Once the triggered campaign requirement is satisfied, how does the e-mail blast execute on a scheduled basis?” To satisfy this requirement, a third-party marketing integration product such as ExactTarget, CoreMotives, or ClickDimensions can be used. These are just some of the third-party marketing integration products for CRM 2011; there are additional products out there. Most of these third-party marketing add-ons tend to have Application Programming Interfaces (APIs) available that can be programmed against. In this particular instance, ExactTarget and the ExactTarget API were used to achieve the requirement of executing the e-mail blast on a determined schedule.


The processes developed contain a combination of manual and automated processes to complete the triggered e-mail process.

Manual Processes

  • STEP 1: Marketer contacts advertising company to create art design
  • STEP 2: Approved art design is converted to HTML format
  • STEP 3: Receive the completed HTML from the advertising company
  • STEP 4: Prepare website for marketing campaign with website design company
  • STEP 5: Create an e-mail for each triggered e-mail process in ExactTarget
  • STEP 6: Create a dynamic marketing list for each triggered e-mail scenario
  • STEP 7: Create a campaign for each triggered e-mail scenario
  • STEP 8: Create an ExactTarget automated send record for each triggered e-mail scenario
  • STEP 9: Create an application configuration record for each triggered campaign

Automated Processes

  • STEP 10: A custom application runs at a scheduled time on a nightly basis and performs a lookup against active campaign configuration records. Deactivated campaign configuration records are skipped. While this application runs nightly by default, it uses the configuration record for each active triggered campaign to determine the appropriate schedule (daily, weekly, or monthly) for that campaign. For daily processing, the custom application runs every night at a given time. For weekly processing, the custom application checks the current day of the week against the list of days of the week selected in the configuration record; if the current day of the week matches any of the selected days, it runs. For monthly processing, the custom application runs on the date selected in the configuration record, so it runs on the exact same day each month. The custom application is set up and scheduled using the Windows Task Scheduler.
  • STEP 11a: Create a copy of the dynamic marketing list as a static list
  • STEP 11b: Compare the new static list created in the step above to the static control group list and remove the contacts matching in both lists from the static marketing list, which is then used as the list to process the e-mail blast
  • STEP 11c: Create a recipient record for each daily email blast and then the recipient record will be associated to the marketing list attached to the campaign
  • STEP 11d: Associate the recipient records to the appropriate send record designated for each campaign
  • STEP 12: Results from the email blasts will be processed as ExactTarget response activity type records directly into CRM against the guest profile record
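
The schedule check described in STEP 10 can be sketched as follows; the configuration field names (`frequency`, `weekdays`, `day_of_month`, `active`) are assumptions for illustration, not the actual configuration record schema:

```python
from datetime import date

def should_run(config, today=None):
    """Decide whether an active triggered-campaign configuration fires today."""
    today = today or date.today()
    if not config.get("active", True):
        # Deactivated configuration records are skipped entirely
        return False
    freq = config["frequency"]
    if freq == "daily":
        return True
    if freq == "weekly":
        # date.weekday(): Monday == 0 ... Sunday == 6
        return today.weekday() in config["weekdays"]
    if freq == "monthly":
        return today.day == config["day_of_month"]
    return False
```

The nightly job would evaluate `should_run` for each active configuration record and only process the e-mail blast for campaigns whose schedule matches the current date.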

CONCLUSION

In conclusion, while this is only one solution for meeting a requirement to process triggered campaigns using Dynamics CRM 2011 and ExactTarget, other solutions can be created using Dynamics CRM 2011 and other third-party marketing integration products.

Paying Too Much for Custom Application Implementation

Face it. Even if you have a team of entry-level coders implementing custom application software, you’re probably still paying too much.

Here’s what I mean:

You already pay upfront for foolproof design and detailed requirements. If you leverage more technology to implement your application, rather than spending more on coders, your ROI can go up significantly.

In order for entry-level coders to implement software, they need extra-detailed designs. Such designs typically must be detailed enough that a coder can simply repeat patterns and fill in blanks from reasonably structured requirements. Coders make mistakes, have misunderstandings and other costly failures, and take months to complete the work (assuming nothing changes in the requirements during that time).

But, again…   if you have requirements and designs that are already sufficiently structured and detailed… how much more effort is it to get a computer to repeat the patterns and fill in the blanks instead?   Leveraging technology through code generation can help a lot.

Code generation becomes a much less expensive option in cases like that because:

  • There’s dramatically less human error and misunderstanding.
  • Generators can do the work of a team of offshored implementers in moments… and repeat the performance over and over again at the whim of business analysts.
  • Quality Assurance gets much easier…  it’s just a matter of testing each pattern, rather than each detail.  (and while you’re at it, you can generate unit tests as well.)
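
As a toy illustration of “repeating the patterns and filling in the blanks,” here is a sketch that generates database schema DDL from a small data dictionary using Python’s built-in string templating; real projects would more likely use T4, XSLT, or a dedicated generator:

```python
from string import Template

# One pattern, tested once; the data dictionary fills in the blanks.
TABLE_TEMPLATE = Template("CREATE TABLE $table (\n$columns\n);")

def generate_schema(data_dictionary):
    """data_dictionary: {table_name: [(column_name, sql_type), ...]}"""
    statements = []
    for table, columns in data_dictionary.items():
        cols = ",\n".join(f"    {name} {sqltype}" for name, sqltype in columns)
        statements.append(TABLE_TEMPLATE.substitute(table=table, columns=cols))
    return "\n\n".join(statements)

# Hypothetical data dictionary entry, for illustration only
dd = {"GuestProfile": [("FirstName", "NVARCHAR(50)"),
                       ("LastName", "NVARCHAR(50)")]}
ddl = generate_schema(dd)
```

When the data dictionary changes, regenerating the schema is a moment’s work instead of a hand-editing pass, which is the core of the cost argument above.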

Code generation is not perfect: it requires very experienced developers to architect and implement an intelligent code generation solution. Naturally, such solutions tend to require experienced people to maintain (because in sufficiently dynamic systems, there will always be implementation pattern changes). There’s also the one-off stuff that just doesn’t make sense to generate (but that all has to be done anyway).

Actual savings will vary (and in some cases may not be realized until a later iteration of the application), but typically depend on how large your metadata (data dictionary) is, how well it is structured, and how well your designs lend themselves to code generation. If you plan for code generation early on, you’ll probably get more out of the experience. Trying to retrofit generation can definitely be done (been there, done that, too), but it can be painful.

Projects I’ve worked on that used code generation happened to focus generation techniques mostly on database and data access layer components and/or UI. Within those components, we were able to achieve 75-80% generated code in the target assemblies. This meant that from a data dictionary, we were able to generate, for example, all of our database schema and most of our stored procedures, in one case. In that case, for every item in our data dictionary, we estimated that we were generating about 250 lines of compilable, tested code. In our data dictionary of about 170 items, that translated into over 400,000 lines of code.

By contrast, projects where code generation was not used generally took longer to build, especially in cases where the data dictionaries changed during the development process. There’s no solid apples-to-apples comparison, but consider hand-writing about 300,000 lines of UI code while the requirements are changing. Trying to nail down every detail (and change) by hand was a painstaking process, and the changes forced us to adjust the QA cycle accordingly, as well.

Code generation is not a new concept. There are tons of tools out there, as demonstrated by this comparison of a number of them on Wikipedia. Interestingly, some of the best tools for code generation can be as simple as XSL transforms (which opens the tool set up even more). Code generation may also already be built into your favorite dev tools. For example, Microsoft’s Visual Studio has had a code generation utility known as T4 built into it for the past few versions now. That’s just scratching the surface.

So it’s true: code generation is not for every project, but any project that has a large data dictionary (that might need to change midstream) is an immediate candidate in my mind. It’s especially great for user interfaces, database schemas and access layers, and even a lot of transform code, among others.

It’s definitely a thought worth considering.

Are you really listening to your customers?

If the pressure to obtain and implement Customer Relationship Management software is any indication, companies are recognizing the increasing importance of customer knowledge. Indeed, customer insights can lead companies to their best opportunities for growth far more accurately than that marketing presentation in the boardroom. The increasingly reluctant-to-spend customer needs to be better understood, because company growth depends on it. The challenge is that customer interactions are not typically structured information that is easily analyzed and acted upon; they are increasingly emails, phone conversations, web-based chat support, and other unstructured information.

Outbound direct mail or telemarketing is simply not getting results for marketing departments. The focus needs to shift to creating a great customer experience on the inbound approach as an alternative. Doesn’t everyone enjoy doing business with a company that makes it easy to find and obtain what you are looking for? You don’t have to look far for proof of this idea. No longer able to differentiate on brand reputation, leading companies instead are focusing on customer experience (the all-important feelings that customers develop about a company and its products or services across all touch points) as the key opportunity to break from their competition. Evidence of this new emphasis is found in the emergence of the “Chief Customer Officer (CCO)” role across the Fortune 1000 community. Companies such as United Airlines, Samsung and Chrysler have all recently announced chief customer officers as part of their executive suites.

The first challenge faced by these newly minted executives is customer experience management (CEM): the practice of actively listening to customers, analyzing what they are saying to make better business decisions, and measuring the impact of those decisions to drive organizational performance and loyalty. Enter a new technology to address all of the unstructured information that comes from customer interactions: text analytics. Text analytics is specialized software that annotates and restructures text into a form suitable for data mining. Text mining grew out of data mining, a statistically rooted approach to classification, clustering, and the derivation of association rules.

Fortunately, there is much to be learned about how to handle unstructured data from two decades of struggling with similar problems in the structured data world. We now know that as needs change and evolve, organizations will require the flexibility to integrate the most appropriate text processing technologies to extract desired information. They must enable users to apply time-tested analytical approaches that can be modified or expanded upon as understanding of issues and opportunities emerges from the data itself. For example, a call center should be able to apply a multi-dimensional analysis (i.e., “slice and dice”) to call center logs and email text to assess trends, root causes, and relationships between issues, people, time to resolution, etc. Organizations should have the infrastructure, storage, and user interfaces to process and efficiently explore large volumes of data. And they need to easily leverage their existing BI and data warehousing (DW) tools, presently used only for structured data analyses, to analyze unstructured data alongside structured data.
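
To make the “slice and dice” idea concrete, here is a deliberately tiny sketch that tags free-text call center logs with issue categories and cross-tabulates them by agent; the keyword lists are illustrative stand-ins for real text analytics software:

```python
from collections import Counter

# Hypothetical keyword taxonomy; a real system would use trained models,
# entity extraction, and a far richer category scheme.
ISSUE_KEYWORDS = {
    "billing": ["invoice", "charge", "refund"],
    "login": ["password", "locked", "sign in"],
}

def tag_issues(text):
    """Annotate one free-text log with the issue categories it mentions."""
    text = text.lower()
    return [issue for issue, words in ISSUE_KEYWORDS.items()
            if any(w in text for w in words)]

def crosstab(logs):
    """logs: list of (agent, free_text) pairs.

    Returns a Counter keyed by (agent, issue), i.e. one 'slice' of the data.
    """
    counts = Counter()
    for agent, text in logs:
        for issue in tag_issues(text):
            counts[(agent, issue)] += 1
    return counts

logs = [("ann", "Double charge on my invoice this month"),
        ("bob", "Password reset failed and account locked")]
counts = crosstab(logs)
```

The same tagged records could be loaded into an existing BI or DW tool so the unstructured feedback can be analyzed alongside structured data, as described above.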

When text analytics are implemented against unstructured customer information, Customer Experience Management will drive significant, quantifiable benefits for the enterprise. In the most effective approaches to CEM, companies use text analytics to collect and analyze intelligence from all of the varied sources of feedback available inside and beyond the enterprise. They grow more intimate with their customers and more agilely adopt informed improvements. The focus is a real-time feedback loop that will result in a continual, systematic capability for measuring and improving customer experience.

The real magic always lives in the intersection of key technologies. Using text analytics to identify opportunities and trends from your customers then requires action, such as cross-selling or up-selling, generally implemented using automated workflows during the customer interaction. The faster and more smoothly the customer transaction occurs, the more it helps ensure “positive” feelings about the customer experience. A carefully architected solution implementation will drive this all-important synergy for outstanding competitive results, and for happy customers seeking out your company. The new mantra for marketing: listen to your customers and make them happy.