Augmented Reality – What a great idea!

First a confession is in order – I’m not a big fan of cell phone cameras. In the corporate world, they are sometimes banned or considered a nuisance. Around the water cooler, you hear that cell phone cameras are terrific for documenting car accidents, especially when you aren’t at fault. However, an exciting use for cell phone cameras has emerged from Europe – augmented reality. If you are unfamiliar with the concept of augmented reality, think of the Terminator movies. When the robot looks at a person, scene or object, a set of facts – augmented information – is presented as a layer on top of the picture. For fighter pilots, the heads-up display seen while looking forward out of the canopy is another good example of augmented reality.

The idea that you can point your cell phone camera at the scene in front of you and, through a “reality” browser, immediately see overlays of information about shops, restaurants and other facts is exciting and potentially game-changing for tourism, advertising, and mobile browsing in general. Using location-based services for cell phones, especially smartphones with built-in GPS, the software creates a “layer” of information on top of the picture. In fact, the company driving this capability, sprxmobile, has a product called Layar that overlays real-time digital information on top of reality through the camera of a cell phone. Their web site lists 87 available layers across many verticals, including real estate, healthcare, transportation, tourism, entertainment, weather, schools and universities, and local search and directory services. Today, the software is limited to the Android operating system used in Google-oriented cell phones, but hopefully the idea will grab mainstream attention and move to the other major smartphone operating systems.
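
To make the idea concrete, here is a minimal sketch – not Layar’s actual API; the function names, coordinates and field-of-view figure are illustrative assumptions – of what a reality browser has to do under the hood: take the phone’s GPS fix and compass heading, compute the bearing to a nearby point of interest, and decide where its label belongs in the camera’s view.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the device to a point of interest, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def overlay_x(device_heading, poi_bearing, fov_deg=60, screen_width=480):
    """Horizontal screen position for a POI label, or None if it falls outside the camera's field of view."""
    offset = (poi_bearing - device_heading + 540) % 360 - 180   # signed angle, -180..180
    if abs(offset) > fov_deg / 2:
        return None
    return int((offset / fov_deg + 0.5) * screen_width)

# Hypothetical example: a phone at a street corner facing roughly north-east,
# and a restaurant listing a block away pulled from some points-of-interest layer.
phone_lat, phone_lon, heading = 40.7484, -73.9857, 45.0
poi = {"name": "Cafe Roma", "lat": 40.7495, "lon": -73.9845}

b = bearing_deg(phone_lat, phone_lon, poi["lat"], poi["lon"])
x = overlay_x(heading, b)
print(f'{poi["name"]}: bearing {b:.1f} deg, draw label at x={x}' if x is not None
      else f'{poi["name"]} is outside the current view')
```

Real AR browsers add distance filtering, altitude and tilt compensation, but the heart of the overlay is this simple projection of geo-tagged data into the camera’s field of view.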

Clearly, adding this reality-layer service to browsers has broad applications beyond cell phones; however, several immediate applications for mobile users come to mind. Standing in front of a house for sale, pointing the camera at the home and seeing the price, number of bedrooms and so on would be great. Even better would be the ability to compare that augmented information with a snapshot of a home up the street. The application could capture the location-based information from the cell phone along with the picture and enhance the search experience. Think about how digital photography already grabs GPS coordinates to add information automatically or to post location data to Flickr, for example.

Augmented reality may be the “killer” application for smartphones beyond the obvious contact and calendar management. The ability to layer actionable information onto wherever you happen to be standing could revolutionize personal computing as well. The ease of adding this type of service to a browser demonstrates the power of both web services and mash-ups. My hope is that it doesn’t simply add more advertising to our world, but instead eases traveling, shopping, and navigating universities and large sporting venues, and brings us actionable information in real time. It is an exciting technology that needs to be nurtured and adopted for mainstream cell phones.

The Magic of Mash-ups: Co-browsing

What is co-browsing?

Co-browsing lets multiple users work together in their respective browsers through what look like shared screens and communicate via telepresence, including video and audio. The impact of this technology is enormous as companies become more virtual and the need for serious collaboration grows in order to stay competitive in tough times. To be able to share, interact and see the body language of your collaborator in real time, without extraordinary downloads to your PC or expensive third-party solutions, could simply change the way we work. This innovation comes not from Google or Yahoo but from IBM, in a proof-of-concept project called Blue Spruce – a Web browser application platform that allows simultaneous multiuser interactions enabled by AJAX and other standard technologies through the Web browser.

The Blue Spruce project is IBM’s solution to the classic one-window, one-user limitation of current Web browsers. The application is a mash-up that combines Web conferencing with voice, video and other data forms to let people share content – including existing Web widgets – at the same time. Two different users, located anywhere, can move their respective mouse pointers around the screen in the browser to click and make changes on the shared application, with the platform enabling concurrent interactions through the browser without disruption. Despite the appearance, the co-browsers aren’t actually sharing content. Each collaborator obtains the Web page through the Blue Spruce client; only the mouse-driven “events” are sent to the Blue Spruce Co-Web Server. The idea is that no matter where in the world the two users are, each computer works from its own locally cached content and simply reacts to the shared events.
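
To illustrate the “share events, not screens” idea – this is only a toy sketch, not IBM’s actual Blue Spruce protocol or Co-Web Server API – the relay below lets each participant keep its own copy of the page and simply fans mouse and annotation events out to everyone else.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class CoBrowseSession:
    """Toy event relay: content is fetched by each client; only events are shared."""
    clients: Dict[str, Callable[[dict], None]] = field(default_factory=dict)
    log: List[dict] = field(default_factory=list)

    def join(self, user: str, on_event: Callable[[dict], None]) -> None:
        self.clients[user] = on_event

    def publish(self, user: str, event: dict) -> None:
        """Relay one user's event (mouse move, click, edit) to everyone else."""
        stamped = {"from": user, **event}
        self.log.append(stamped)                      # simple audit trail
        for name, deliver in self.clients.items():
            if name != user:                          # the originator already sees its own action
                deliver(stamped)

# Two hypothetical collaborators; each "renders" the events it receives from the other.
session = CoBrowseSession()
session.join("radiologist", lambda e: print("radiologist sees:", e))
session.join("oncologist",  lambda e: print("oncologist sees: ", e))

session.publish("radiologist", {"type": "annotate", "x": 120, "y": 80, "note": "spot on CT slice 14"})
session.publish("oncologist",  {"type": "zoom", "factor": 2.0})
```

In a real system the relay would be a server speaking to AJAX clients, but the design point is the same: shipping small events scales far better than shipping screens.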

The applications for co-browsing collaboration are numerous, especially for knowledge workers. In healthcare, IBM has used Blue Spruce to create an online “radiology theatre” product, currently at the prototype stage, which allows teams of medical experts to “simultaneously discuss and review patients’ medical test data using a Web browser.” The project is being run in collaboration with the Brigham and Women’s Hospital of Boston.  According to IBM, it has created a secure Web site that allows select medical experts at Brigham and Women’s Hospital to access and collaborate on data such as CT scans, MRIs, EKGs and other medical tests. Each medical expert can “talk and be seen through live streaming audio/video through their standard web connection, and have the ability to whiteboard over the Web page as well as input information to the patient’s record.” Basically it is a secure multimedia experience running inside a single browser window, using Blue Spruce as the platform.

It is important to note that Blue Spruce is not a typical “fat client” or downloaded application; it is a fully browser-based application development platform, currently in development, that is being built on open Web standards. The main feature of Blue Spruce is that it allows a combination of different Web components – data mash-ups, high-definition video, audio and graphics – to run simultaneously on the same browser page. The Radiology Theatre app requires only a standard Web browser, so there is nothing for the end users – in this case, doctors – to download.

This is how IBM describes the way the new online radiology theatre will work:

 “A group of doctors can log into a secure Web site at the same time to review and analyze a patient’s recent battery of tests. For instance, a radiologist could use her mouse to circle an area on the CT scan of a lung that needs a closer look. Then using the mouse she could zoom into that scan to enlarge the view for all to see. An expert on lung cancer could use his mouse to show how the spot had changed from the last scan. And then, a pathologist could talk about patient treatments based on spots of that size depending on age and prior health history, paging through clinical data accessible on the site.”

“The theatre allows all these experts to discuss, tag and share information simultaneously, rather than paging through stacks of papers, calling physicians to discuss scan results and then charting the results. This collaborative consultation brings together the personal data, the experts and the clinical data in one physical, visual theatre.” 

This technology significantly advances telemedicine and brings key healthcare expertise within reach of rural medicine.

Perhaps the biggest potential benefit of the online radiology theatre is that it will enable experts from all over the world to consult on cases. The ability for multiple users to “co-browse” means they can interact in the browser in real time and see each other’s changes. Of course, since this is medical data, there are significant privacy implications in using the Internet to collaborate. The time and cost savings from collaboration are important, but better and faster decision making is the key.

The need for inexpensive and minimally invasive techniques for real collaboration over the Internet is real and the backlog of potential applications is fun to consider.  Imagine reviewing your health care or insurance claims with a live person (and their reactions) at the insurance company to reduce cycle time, or collaborating on new product engineering drawings from the U.S. with your Chinese manufacturer.  Imagine the potential for teaching or training with key experts and a worldwide audience using a live whiteboard. Finally, imagine not paying big monthly fees for basic meeting collaboration needs on a daily basis.  Blue Spruce is really a technology to keep an eye on.

Knowledge-based computing: next generation of business intelligence?

Pablo Picasso once said, “Computers are useless. They only give you answers.” The truth is that computers have to work very hard to provide answers to what appear to be simple questions. While we are buried in terabytes, petabytes and exabytes of data, answers and information can be very hard to come by, especially the information necessary for serious business decisions. Data must be viewed in the context of a subject area to become information, and analytic techniques must be applied to information in order to create knowledge worth acting on. The challenge is getting data into context within a subject area and applying the right analytic techniques to get “real” answers.

Enter Wolfram Alpha, an “answer” engine. Once touted as the next generation of search engine, this web application combines free-form natural language input – i.e., simple questions – with dynamically computed results. Behind the scenes, a series of supercomputers provides linguistic analysis (context for both the question and the answer), ten terabytes of curated data that is constantly being updated, dynamic computation using 50,000 types of algorithms and equations, and computed presentation with 5,000+ types of visual and tabular output. Sound impressive? It could easily be a glimpse of the next generation of business intelligence and decision-support systems.

Wolfram Alpha lets you input a query that requires data analysis or computation, and it delivers the results for you. Its “curated” data is specially prepared for computation – data that has been hand-selected by experts working with Wolfram, who take steps to make sure the raw data is tagged semantically and presented unambiguously and precisely enough to be used for accurate computation. Alpha demonstrates the real power of metadata – data about data – and the importance of semantic tags for putting data into the context necessary for providing knowledge and, thus, answers.
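
As a toy illustration of why that curation matters – the fact table and figures below are illustrative, not Wolfram’s data or pipeline – notice how semantic tags (entity, property, unit, year) let a program compute a brand-new answer instead of merely retrieving a stored page.

```python
# A hypothetical, hand-curated fact table: every value carries semantic tags
# (entity, property, unit, year) so it can be computed over, not just displayed.
CURATED = [
    {"entity": "United States", "property": "gdp", "value": 14.4e12, "unit": "USD", "year": 2008},
    {"entity": "Germany",       "property": "gdp", "value": 3.7e12,  "unit": "USD", "year": 2008},
]

def lookup(entity: str, prop: str) -> dict:
    return next(f for f in CURATED if f["entity"] == entity and f["property"] == prop)

def compare(prop: str, a: str, b: str) -> str:
    """Compute a new answer (a ratio) that no single 'page' in the table contains."""
    fa, fb = lookup(a, prop), lookup(b, prop)
    assert fa["unit"] == fb["unit"] and fa["year"] == fb["year"]   # the tags make the comparison safe
    return f'{a} {prop.upper()} is {fa["value"] / fb["value"]:.1f}x {b} ({fa["year"]}, {fa["unit"]})'

print(compare("gdp", "United States", "Germany"))
# -> United States GDP is 3.9x Germany (2008, USD)
```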

Wolfram Alpha is not a search engine, according to Wolfram Research co-founder Theodore Gray, and it is not a replacement for Google. He says that Alpha is very, very different from a search engine. “Search engines are like reference librarians,” Gray explained. “Reference librarians are good at finding the book you might need, but they’re useless at interpreting the information for you.” Alpha takes reams of raw information and performs computations using that data, producing pages of new information that have never existed on the Internet. “Search engines can’t find an answer for you that a Web page doesn’t have,” Gray explained.

“It’s been a dream of many people for a long time to have a computer that can answer questions,” said Gray. “A lot of people may think of a search engine as that, but if you think about it, what search engines do is an extremely limited subset of that sort of thing.” Examples of how Alpha can be used today range from solving difficult math equations to doing genetic analysis, examining the historic earnings of public companies, comparing the gross domestic products of different countries, and even measuring the caloric content of a meal you plan to make. You can find out what day of the week it was on your birthday, or see the average temperature in your area going back days, months or years.

Wolfram Alpha would make an “ultimate” business intelligence application by computing over an enterprise data warehouse once the data was properly “curated.” The ability to create knowledge from data – particularly actionable answers – is what business executives really expect, not prettier presentations. The only questions left for Alpha are:

  1. Who can curate your data for you?
  2. How quickly can you see Alpha running over your data?

It’s time to clean out the Junk Drawer!

In the December 23rd issue of CIO magazine, there is a great article on “The Case for Enterprise Architects” by Kim S. Nash. Clearly, this type of article catches my eye because I am an Enterprise Architect, but it is important to note that in tough economic times, the role of the Enterprise Architect becomes more valuable. Instead of simply slashing staff to reduce costs, an Enterprise Architect can help the CIO save money by “cleaning out the junk drawer.” The average corporation has tens and sometimes hundreds of business applications, databases, one-time-use programs and other junk cluttering up its environment and, more importantly, wasting valuable maintenance dollars and personnel resources.

The Enterprise Architect can look at this mess and begin to organize it with an eye to reducing complexity and gaining better alignment with business needs. The process is called an application rationalization or, you’ll love this, an App-Rat. It isn’t a complicated process to develop an inventory of all of the good stuff and the junk, but the real skill comes in the analysis process. With the inventory in hand, the EA then maps which applications deserve continued investment, which need less investment (stop paying maintenance, for example), which need to be retired and where applications are simply missing. While it sounds like a simple process, it can become difficult in organizations that have grown by merger and acquisition to track down the information and get it organized for the decision-making process. It is clearly worth the effort when the latest statistics show that over 70% of an IT budget is spent on maintenance of existing applications. The big benefit is simply freeing up some of those maintenance dollars to retain key personnel and to spend on new, better-aligned applications.
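
As a sketch of what the analysis step might look like in practice – the applications, scores and thresholds below are hypothetical, not drawn from the CIO article – the EA’s map boils down to scoring each inventoried application on business value and technical health and assigning a disposition.

```python
from typing import Dict, List

# Hypothetical inventory rows: annual maintenance cost plus 1-5 scores
# gathered from business owners (value) and from IT (technical health).
INVENTORY = [
    {"app": "Order Portal",      "maint_cost": 250_000, "business_value": 5, "tech_health": 4},
    {"app": "Legacy HR Extract", "maint_cost":  90_000, "business_value": 2, "tech_health": 1},
    {"app": "Plant Scheduler",   "maint_cost": 180_000, "business_value": 4, "tech_health": 2},
]

def disposition(row: Dict) -> str:
    """Map an application onto the classic value/health quadrants."""
    if row["business_value"] >= 4 and row["tech_health"] >= 3:
        return "invest"
    if row["business_value"] >= 4:
        return "modernize"           # valuable but fragile: re-platform or rewrite
    if row["tech_health"] >= 3:
        return "tolerate"            # healthy but low value: stop new investment
    return "retire"

def rationalize(inventory: List[Dict]) -> None:
    savings = 0
    for row in sorted(inventory, key=lambda r: r["maint_cost"], reverse=True):
        verdict = disposition(row)
        if verdict == "retire":
            savings += row["maint_cost"]
        print(f'{row["app"]:<20} ${row["maint_cost"]:>9,}  -> {verdict}')
    print(f"Potential maintenance savings: ${savings:,}")

rationalize(INVENTORY)
```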

Cleaning out the junk drawer of business applications and databases can sometimes be humorous. Invariably, there are one or more applications that have become what I like to call “black boxes.” Data goes into the black box and the right answer comes out. However, no one in the IT department knows how the application was built, and they would only maintain it under extreme duress. Programmers know how difficult it can be to follow the breadcrumbs left by a long-departed developer. The truth is that these applications represent very real risks to organizations and a dangerous hidden cost if they break. The skill set necessary to properly maintain these applications may not exist in enough depth in the current IT staff. The application rationalization process can identify these applications and the related skill gaps, and lay out a road map to remove this hidden risk and cost.

It’s clearly time to clean out the junk drawer by bringing in the enterprise architect to organize, simplify and help with your budget pressures. One of the big benefits is that it demonstrates to your organization that IT can be focused on making sound investments and cares about managing costs. It will give your IT staff a sense of accomplishment as they improve the alignment of IT with the goals of the business. Finally, it will reduce complexity in an area of the organization – IT – that struggles to cope with constant complexity.

Are you really listening to your customers?

If the pressure to obtain and implement Customer Relationship Management software is any indication, companies are recognizing the increasing importance of customer knowledge. Indeed, customer insights can lead companies to their best opportunities for growth far more accurately than that marketing presentation in the boardroom. The increasingly reluctant-to-spend customer needs to be better understood, because company growth depends on it. The challenge is that customer interactions are not typically structured information that is easily analyzed and acted upon; increasingly they are emails, phone conversations, web-based chat support and other unstructured information.

Outbound direct mail and telemarketing are simply not getting results for marketing departments. The focus needs to shift instead to creating a great customer experience on the inbound side. Doesn’t everyone enjoy doing business with a company that makes it easy to find and obtain what you are looking for? You don’t have to look far for proof of this idea. No longer able to differentiate on brand reputation, leading companies instead are focusing on customer experience – the all-important feelings that customers develop about a company and its products or services across all touch points – as the key opportunity to break away from their competition. Evidence of this new emphasis is found in the emergence of the Chief Customer Officer (CCO) role across the Fortune 1000 community. Companies such as United Airlines, Samsung and Chrysler have all recently announced chief customer officers as part of their executive suites.

The first challenge faced by these newly minted executives is customer experience management (CEM) – the practice of actively listening to customers, analyzing what they are saying to make better business decisions, and measuring the impact of those decisions to drive organizational performance and loyalty. Enter a new technology to address all of the unstructured information that comes from customer interactions – text analytics. Text analytics is specialized software that annotates and restructures text into a form suitable for data mining. Text mining grew out of data mining, a statistically rooted approach to classification, clustering, and the derivation of association rules. Fortunately, there is much to be learned about how to handle unstructured data from two decades of struggling with similar problems in the structured data world.

We now know that as needs change and evolve, organizations will require the flexibility to integrate the most appropriate text processing technologies to extract the desired information. They must enable users to apply time-tested analytical approaches that can be modified or expanded as understanding of issues and opportunities emerges from the data itself. For example, a call center should be able to apply multi-dimensional analysis (i.e., “slice and dice”) to call center logs and email text to assess trends, root causes, and relationships between issues, people, time to resolution, and so on. Organizations should have the infrastructure, storage, and user interfaces to process and efficiently explore large volumes of data. And they need to easily leverage their existing BI and data warehousing (DW) tools, presently used only for structured data analyses, to analyze unstructured data alongside structured data.
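
As a toy example of that annotate-and-restructure step – real text analytics products use far richer natural-language processing than the keyword rules assumed here – free-text call-center notes can be turned into records that ordinary BI tools can then slice and dice.

```python
from collections import Counter

# Hypothetical call-center log entries: free text plus a date.
LOGS = [
    {"date": "2009-06-01", "text": "Customer waited 40 minutes on hold, very frustrated about a billing error"},
    {"date": "2009-06-02", "text": "Shipment arrived damaged, wants a refund"},
    {"date": "2009-06-08", "text": "Another billing error on the June invoice, considering cancelling"},
]

RULES = {                       # crude keyword annotator standing in for real NLP
    "billing":  ["billing", "invoice", "charge"],
    "shipping": ["shipment", "damaged", "delivery"],
    "churn":    ["cancel", "cancelling", "refund"],
}

def annotate(entry: dict) -> dict:
    """Restructure one free-text note into a record suitable for mining."""
    text = entry["text"].lower()
    issues = [issue for issue, words in RULES.items() if any(w in text for w in words)]
    return {"month": entry["date"][:7], "issues": issues}

records = [annotate(e) for e in LOGS]
by_issue = Counter(issue for r in records for issue in r["issues"])
print(by_issue)                 # e.g. Counter({'billing': 2, 'churn': 2, 'shipping': 1})
```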

When text analytics is implemented against unstructured customer information, Customer Experience Management will drive significant, quantifiable benefits for the enterprise. In the most effective approaches to CEM, companies use text analytics to collect and analyze intelligence from all of the varied sources of feedback available inside and beyond the enterprise. They grow more intimate with their customers and adopt informed improvements more nimbly. The focus is a real-time feedback loop that results in a continual, systematic capability for measuring and improving the customer experience.

The real magic always lives in the intersection of key technologies. Using text analytics to identify opportunities and trends from your customers then requires action – cross-selling or up-selling, generally implemented through automated workflows during the customer interaction. The faster and smoother the transaction, the more “positive” the feelings attached to the customer experience. A carefully architected solution will drive this all-important synergy for outstanding competitive results – and happy customers seeking out your company. The new mantra for marketing: listen to your customers and make them happy.

Cloud Computing: Where is the Killer App?

As an avid reader, I have read too many articles lately about how the bleak economy was going to drive more IT teams to use cloud computing. The real question: what are the proper applications for Cloud Computing? For the more conservative IT leader, there must be a starting point that isn’t throwing one of your mission-critical applications into the cloud.

One of the best applications of cloud computing that I have seen implemented recently is content management software. One of the challenges with content management is that it is difficult to predict the ultimate storage needs. If the implementation is very successful, the storage needs start small and quickly zoom into hundreds of gigabytes as users learn to store spreadsheets, drawings, video and other key corporate documents. Open source content management software can be deployed quickly on cloud computing servers, and the cost of storage will ramp up in line with actual usage. Instead of guessing what the processor and storage needs will be, the IT leader can simply start the implementation and the cloud computing environment will scale as needed. My suggestion is to combine wiki, content management and Web 2.0 project management tools running in the cloud for your next major software implementation project or large corporate project.

A second “killer” application for cloud computing is software development and testing. One of the headaches and major costs of software development is the infamous need for multiple environments for developing and testing. This need is compounded when your development team is using Agile development methodologies and the testing department is dealing with daily builds. The cloud computing environment provides a low-cost means of quickly “spinning up” a development environment and multiple test environments. This use of the cloud is especially beneficial for web-based development and for testing load balancing on high-traffic web sites. The ability to “move up” on processor speeds, number of processors, memory and storage helps establish real baselines for when the software project moves to actual production, versus the traditional SWAG approach. The best part is that once the development is complete, the cloud computing environment can be scaled back to maintenance mode and there is no unused hardware waiting for re-deployment.
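
As a sketch of how “spinning up” and tearing down a short-lived test environment might look today – assuming an AWS account with the boto3 SDK configured; the image ID and instance size below are placeholders, not recommendations – the point is that the environment exists only for as long as the build needs it.

```python
import boto3

ec2 = boto3.resource("ec2")   # assumes AWS credentials and a default region are configured

def spin_up_test_env(count: int = 2):
    """Launch short-lived test instances for today's build; the IDs below are placeholders."""
    return ec2.create_instances(
        ImageId="ami-0123456789abcdef0",      # hypothetical image with the build baked in
        InstanceType="t3.medium",
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": "nightly-test"}],
        }],
    )

def tear_down(instances) -> None:
    """No idle hardware waiting for redeployment: terminate when testing is done."""
    for inst in instances:
        inst.terminate()

instances = spin_up_test_env()
# ... run the nightly build and load tests against these hosts ...
tear_down(instances)
```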

The third “killer” application is data migration. Typically, an IT leader will have large processing and storage needs for a short time in order to rapidly migrate data from an older application to a new one. Before the cloud, companies would rent hardware, use it briefly and ship it back to the vendor. The issue was guessing the necessary CPU power and storage to meet the time constraints of the dreaded cut-over date. The scalability of the cloud computing environment reduces the hardware cost of data migrations and allows the flexibility to quickly add processors on that all-important weekend. There is simply no hardware to dispose of when the migration is complete. Now that is a “killer” application, in my humble opinion. By the way, cloud computing would be an excellent choice for re-platforming an application, too, especially if the goal is to make the application scalable.
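
To illustrate the “add processors for the cut-over weekend” point with a minimal sketch – the extract and load functions below are stand-ins for whatever the legacy and target systems actually expose – the same migration code simply runs with a wider worker pool as the deadline approaches.

```python
from concurrent.futures import ThreadPoolExecutor

def extract_chunk(chunk_id: int) -> list:
    """Stand-in for pulling a batch of rows from the legacy application."""
    return [f"legacy-row-{chunk_id}-{i}" for i in range(1000)]

def transform_and_load(rows: list) -> int:
    """Stand-in for cleansing rows and writing them to the new system."""
    return len(rows)

def migrate(total_chunks: int, workers: int) -> int:
    """Scale the migration by widening the pool -- the cloud analogue of adding processors."""
    migrated = 0
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for loaded in pool.map(lambda c: transform_and_load(extract_chunk(c)), range(total_chunks)):
            migrated += loaded
    return migrated

print(migrate(total_chunks=50, workers=4))    # weekday trial run
print(migrate(total_chunks=50, workers=16))   # cut-over weekend: same code, more capacity
```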

In summary, if your IT team has a short term hardware need, then carefully consider cloud computing as a cost effective alternative. In the process, you might discover your “killer app” for cloud computing.

ICD-10: Apocalypse or Advantage?

With humanity coming up fast on 2012, the media is counting down to this mysterious – some even call it apocalyptic – date that ancient Mayan societies were anticipating thousands of years ago. However, the really interesting date in healthcare will arrive one year earlier. In 2011, per the mandate of Senate Bill 628, the United States will move from the ICD-9 coding system to ICD-10, a much more complex scheme for classifying diseases that reflects recent advances in disease detection and treatment via biomedical informatics, genetic research and international data-sharing. For healthcare payers and providers that have used the ICD-9 coding system for submitting and paying healthcare claims for the last 30 years, it could be apocalyptic without proper planning and execution. Conservative estimates put the cost of switching to ICD-10 at $1.5 to $3 billion for the healthcare industry as a whole and nearly $70,000 for each doctor’s practice.

Since 1900, regulators of the U.S. health care system have endeavored to give care providers a systematic way to classify diseases so that care processes could be standardized and appropriate payments made. Like many of the world’s developed health care systems, the United States follows the World Health Organization’s (WHO) International Statistical Classification of Diseases and Related Health Problems (ICD) code standard that is typically used internationally to classify morbidity and mortality data for vital health statistics tracking and in the U.S. for health insurance claim reimbursement. In 2011, technically, healthcare providers and payers will be moving from ICD-9-CM to ICD-10-CM and ICD-10-PCS.  To meet this federal mandate, it will be essential that information systems used by U.S. health plans, physicians and hospitals, ambulatory providers and allied health professionals also become ICD-10 compliant. The scale of this effort for healthcare IT professionals could rival the Y2K problem and needs immediate planning.

The challenge is that the U.S. adoption of ICD-10 will undoubtedly require a major overhaul of the nation’s medical coding system, because the current ICD-9 codes are deeply embedded in the coding, reporting and reimbursement analysis performed today. In everyday terms, the ICD-9 codes were placed in the middle of a room and healthcare IT systems were built around them. The transition will require a massive wave of system reviews, new medical coding or extensive updates to existing software, and changes to many system interfaces. Because of the complex structure of ICD-10 codes, implementing and testing the changes in Electronic Medical Records (EMRs), billing systems, reporting packages, and decision and analytical systems will require more effort than simply testing data fields – it will involve installing new code sets, training coders, re-mapping interfaces and recreating the reports and extracts used by everyone who touches diagnosis codes. In short, ICD-10 implementation has the potential to be so invasive that it could touch nearly all of the operational systems and procedures of the core payer administration process and the provider revenue cycle.
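
As a toy illustration of the remediation problem – the crosswalk below is a tiny invented subset, not the official CMS General Equivalence Mappings – ICD-9 codes often fan out to several more specific ICD-10 codes, so every interface that stores a diagnosis code needs conversion logic plus human review of the ambiguous cases.

```python
# Tiny illustrative crosswalk (NOT the official CMS GEMs): many ICD-9 codes
# fan out to several more specific ICD-10 codes.
CROSSWALK = {
    "250.00": ["E11.9"],                       # type 2 diabetes, uncomplicated
    "813.42": ["S52.501A", "S52.502A"],        # distal radius fracture: right vs. left arm
}

def convert(icd9: str) -> dict:
    """Return candidate ICD-10 codes and flag anything that needs coder review."""
    candidates = CROSSWALK.get(icd9, [])
    return {
        "icd9": icd9,
        "icd10_candidates": candidates,
        "needs_review": len(candidates) != 1,   # unmapped or one-to-many: a human must decide
    }

for claim_code in ["250.00", "813.42", "799.9"]:
    print(convert(claim_code))
```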

A small percentage of healthcare organizations, maybe 10 to 15 percent, will use ICD-10 compliance as a way to gain competitive advantage – to further their market agendas, business models and clinical capabilities. By making use of the new code set, these innovators will seek to derive strategic value from the remediation effort instead of procrastinating or trying to avoid the costs. An example will be health plans that seek to manage costs at a more granular level and implement pay-for-performance programs for their healthcare providers. In addition, ICD-10 offers an opportunity to develop new business partnerships, create new care procedures, and change business models to grow overall revenue streams. Healthcare organizations looking for these new business opportunities will employ ICD-10 as a marketing differentiator to create a more competitive market position.

There are three key areas for healthcare organizations wanting to convert regulatory compliance into strategic advantage with ICD-10 remediation:

  1. Information and Data Opportunities – Healthcare entities that are early adopters of ICD-10 will be in a position to partner with their peers and constituents to improve data capture, cleansing and analytics. This could lead to the development of advanced analytical capabilities such as physician score cards, insightful drug and pharmaceutical research, and improved disease and medical management support programs, all of which create competitive advantage.
  2. Personal Health Records Opportunities – Using ICD-10 codes, innovative healthcare entities will have access to information at a level of detail never before available, making regional and personal health records (PHRs) more achievable for the provider and member communities. Organizations that align themselves appropriately can provide a service that will differentiate them in the marketplace.
  3. Clinical Documentation Excellence Program – Developing and implementing a Clinical Documentation Excellence (CDE) program is a critical component of organizational preparedness to respond to future regulatory changes because there could be an ICD-11 on the horizon.

Healthcare organizations need to understand the financial impact that ICD-10 will have on their bottom line and begin the operational readiness assessments, gap analyses and process improvement plans that facilitate accurate and appropriate reimbursement. Without action, a healthcare organization can expect to endure “data fog” as the industry moves from one code set to another. Now is the time to choose: gain the advantage, or procrastinate and face the coming code apocalypse.