Share More: a framework for enhancing collaboration

In a great study published last year, McKinsey & Company showed how companies that use social and collaborative technologies extensively (“networked companies” in their terminology) outperform traditional companies. They called it “Web 2.0 finds its payday”.

So if you work for a networked company – congratulations. Now, if your company is part of the vast majority of companies struggling through some form of collaboration but not seeing enough benefit, how do you get to the payoff stage?

In the following series of posts, I’ll try to offer a methodology and examples for how to do just that: elevate the level of collaboration and create a fully networked organization, one step at a time.

We call this process Share More.

The premise is simple: for each business area or function, find a real-world business challenge where collaboration can make a difference. Implement it. Move to the next one.

Creating the overall framework is like creating an association wheel with the term “Share” in the middle:

Sharing can be with just a few team members or with the whole company. It can be internal or external. If you stop and think about all the interactions you have in a week, which ones cause you the most pain and take the most time? Could these interactions be made simpler using technology? Could you Share More?

The first Share More solution I’d like to address is process and workflow solutions.

Share Process

Process and form automation is all about tracking and control. The real dramatic change is in giving managers and administrators visibility into every step, with a log of every change and update. It can also speed the process up and save the effort of retyping information into other systems, initiating emails, or filing paper into physical files.

We’ve worked with a large hospitality organization to automate all HR- and Payroll-related forms using InfoPath and SharePoint, and learned a lot of valuable lessons that apply to many process automation efforts:

  • Strongly enforce data integrity: Most forms are created to collect data that will eventually be fed into another system. Therefore, data input must come from the same source system it will end up in. Values and choices have to be restricted to valid combinations, and open text fields kept to a minimum. The cleaner the data, the less trouble it will cause down the road.
  • Know how organizational and reporting hierarchy is maintained: While you may know what system holds the organizational reporting structure, knowing that it’s 100% accurate and maintained up to date is a lot harder. Since some forms require sending confidential information like salary for approval, a wrong reporting relationship can compromise important information. Consider masking personal or confidential information if it is not essential for the approval requested (the data, encrypted, can still be part of the form).
  • Don’t over customize: like our beloved tax code, approval workflows can get extremely complicated and convoluted as organizational politics that evolved over the years created special cases and more exceptions than rules. Codifying these special cases is expensive and prone to change. Consider it an opportunity to streamline and simplify the rules.
  • Augment with stronger 3rd-party tools: While core systems like SharePoint contain a built-in (and free) workflow mechanism, it is limited in control, flexibility, scalability, and manageability out of the box. 3rd-party tools like Nintex and K2 BlackPoint provide added flexibility and scalability. For a price.
  • Plan for version deployment: Forms and processes will change. How will updates be deployed without interfering with flows and processes already running?
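To make the data-integrity point concrete, here is a minimal sketch of validating form input against values pulled from the system the data will eventually feed. All field names, values, and lookup tables are invented for illustration; a real implementation would load them from the target HR/Payroll system rather than hard-coding them.

```python
# Hypothetical allowed values, as they would be pulled from the source system.
ALLOWED_VALUES = {
    "department": {"HR", "Payroll", "Operations"},
    "pay_grade": {"G1", "G2", "G3"},
}

# Not every pay grade exists in every department: restrict to valid combinations.
VALID_COMBINATIONS = {
    ("HR", "G1"), ("HR", "G2"),
    ("Payroll", "G2"), ("Payroll", "G3"),
    ("Operations", "G1"), ("Operations", "G3"),
}

def validate_form(form: dict) -> list:
    """Return a list of validation errors (empty means the form is clean)."""
    errors = []
    for field, allowed in ALLOWED_VALUES.items():
        if form.get(field) not in allowed:
            errors.append(f"{field}: {form.get(field)!r} is not a valid value")
    combo = (form.get("department"), form.get("pay_grade"))
    if not errors and combo not in VALID_COMBINATIONS:
        errors.append(f"invalid combination: {combo}")
    return errors

print(validate_form({"department": "HR", "pay_grade": "G3"}))
```

The point of the sketch is that every restriction lives in data pulled from the target system, so the form can never collect a value the downstream system would reject.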

In future posts I’ll explore other opportunities for Sharing More, including Sharing Insight and Sharing Responsibly, and we’ll look into specific opportunities for collaboration and sharing in insurance and healthcare.

Adobe, IBM, WebTrends, and comScore named leaders in Web Analytics

Independent research firm Forrester recently released its annual “Forrester Wave: Web Analytics, Q4 2011” report, naming Adobe, IBM, comScore, and WebTrends the current leaders of the web analytics industry. AT Internet and Google Analytics were also included as “strong performers,” while Yahoo Analytics took 7th place as the lone wolf in the “contender” category.

Not surprisingly, Adobe SiteCatalyst and IBM Coremetrics stood out with the top two scores overall, but WebTrends Analytics 10 and comScore Digital Analytix showed major strengths as well. Unica NetInsight, another offering from IBM, did not make the list because of its inevitable fate: to be merged with Coremetrics. In 2010, IBM acquired both Unica and Coremetrics. The Forrester report states, “IBM is incorporating the complementary and notable features of Unica NetInsight into a merged web analytics solution based on the Coremetrics platform.”

The full report can be downloaded from Adobe or WebTrends and will likely show up on other vendor sites soon.

Keeping it Fresh: The 6 Pillars of Web Content Governance

Content. It is the bane of existence for web marketing managers everywhere. As soon as a new site is up and running, the content starts getting old and inaccurate by the minute. Chasing business owners to revise, update, or write new content is a constant struggle. To make it worse, many areas may not have an owner at all.

Fancy CMS platforms were supposed to solve all that with content expiration dates and distributed ownership, but the tools themselves are just the means. People still need to use them.

That is where Web Content Governance comes in.

Web Content Governance is the overall approach to the way content is created, managed and maintained, intended to ensure consistency, accuracy, relevance and compliance. It generally comprises six main components: the Process, Structure, Policies, Ownership and Standards that govern content, and the Systems that are used to enable, enforce and automate them.

The details of each component vary between companies but generally include the following:

  • Process
    • Creation
    • Updates
    • Retention / expiration
    • Archiving
    • Workflows:
      • Editorial review
      • Legal review
      • Brand Review
      • Publishing
  • Structure
    • Content classification
    • Media types
    • Taxonomy and Metadata
    • Hierarchy and inheritance
  • Policies
    • Legal
    • Security
    • Data collection
    • E-mail
  • Ownership
    • Roles
    • Permissions
    • Escalation
  • Standards
    • Brand Guidelines
    • Content guidelines
    • Accessibility
    • Legal
    • Copyrights
  • Systems
    • Content Management System (CMS)
    • Digital Asset Management (DAM)
    • Document Management
    • Business Process Management (BPM)

A few tips and tricks

  1. Assign a bad cop: a senior enough executive who will act as the enforcer.
  2. Build a team of champions: department or area champions who are familiar enough with the tools and can provide a knowledge and communication channel to the different business units and groups. The team should meet on a regular basis.
  3. Use automation. The ability to set content expiration is a great way to ensure all content is looked at (however briefly) regularly.
  4. Don’t relinquish control over the last step. Someone from the centralized web/marketing team should still review every page before it is published.
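As a concrete illustration of tip 3 (automation), here is a minimal sketch of an expiration scan over a content inventory. The page list and the `review_by` field are hypothetical; a real CMS would expose this through its own expiration mechanism or API, and would route the notification through a workflow rather than a print statement.

```python
from datetime import date

# Hypothetical content inventory; in practice this comes from the CMS.
pages = [
    {"url": "/about", "owner": "comms", "review_by": date(2011, 6, 1)},
    {"url": "/products", "owner": "marketing", "review_by": date(2012, 3, 1)},
]

def overdue(pages, today):
    """Return every page whose scheduled review date has passed."""
    return [p for p in pages if p["review_by"] < today]

for page in overdue(pages, date(2011, 12, 1)):
    print(f"{page['url']} is overdue for review (owner: {page['owner']})")
```

Even a scan this simple guarantees that every page gets looked at (however briefly) on a regular cycle, which is the whole point of the tip.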

Is Legacy Modernization Just Procrastination?

There is no doubt that replacement of core systems for insurers has been very popular over the past six years or so.  With advancements in technology enabling vendors to provide solutions that are configurable and more easily maintained, with “plug and play” technologies that can be upgraded by less technical resources, insurers are taking advantage and moving into new lines of business and new territories, expanding their footprint.  It allows many small and mid-size insurers to better compete with the leviathans who once staved off competition thanks to their enormous IT staffs.

But many of these insurers have been in business for scores of years, and have successfully relied on their older technology.  Does the advancement in technology along with ubiquitous connectivity mean that the mainframes and older technology systems just have to go?  Does just refacing the green screens with new web-based user interfaces mean that the carriers that do so are just procrastinating and putting off the inevitable?

A recent blog post in Tech Digest posed that question, to which I would reply, “Why?  If it ain’t broke, don’t fix it.”  In this horrible economy, many people who need a bigger house aren’t dumping the one they have and buying another; they simply add on.  The core systems within a carrier are very similar.  If the system you have now works well for its use and you want to expand into new lines, you don’t need to rip out that old system and pay for an expensive funeral; just add on and integrate.  This will start your company down the path to more flexibility, which can be supported by a system specifically designed to bring all your information into one place – Policy360, based on CRM.

Utilizing a system designed to bring data together from multiple sources allows you to keep your existing technology, leverage the capabilities of new systems, and present and manage that information in a much more accessible and user friendly manner.

Is plastic surgery on your legacy systems really just putting off the inevitable?  Or is presenting a fresh look that sees into the future allowing you to keep costs down while expanding service and capabilities?

Multi-Touch Attribution Campaign Tracking with WebTrends

This article is a follow-up to the webinar.

All web analytics platforms have some way of tracking marketing campaign performance, usually out of the box or with a little bit of setup. Generally they all do a pretty good job of this and provide the key reports needed to make important business decisions about which campaigns to invest more money in, which to reduce spending on, and which to get rid of altogether. But often these decisions are made without insight into the whole picture. Why? Simply because most campaign reports are set up in the industry-standard way: attributing all conversions to the last, or most recent, campaign clicked. This has long been the industry standard, but it is time for a change, as this method ignores the fact that people often go through multiple campaigns before converting.

So what other attribution options are there? And why wouldn’t I want to attribute conversion credit to the most recent campaign? – There are typically 3 options for campaign attribution:

  1. Last Touch (Most recent campaign)
  2. First Touch (Original campaign)
  3. Multi-touch (All campaign touches)

Technically there are two options for multi-touch attribution. One option is to give full credit to all campaign touches and the other option is to give partial credit to each touch. For example, if 3 different campaign touches resulted in a sale of $30 you could credit each touch with $10. But for the purposes of this article we will focus on the full credit option. As for the question “why wouldn’t I want to attribute conversion credit to the most recent campaign?” – this is not really the right question to ask. The better question to ask is, “Do I have the best possible insight into the performance of my marketing campaigns?” The answer to that question is almost always “no” if you are only analyzing a single attribution method. So rather than replacing industry standard last touch reports, adding first touch and multi-touch to your arsenal of reports is the best course of action.
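The $30-across-three-touches example above can be sketched in a few lines, showing both multi-touch options side by side. The campaign IDs are invented for illustration:

```python
def partial_credit(touches, revenue):
    """Split conversion revenue evenly across all campaign touches."""
    share = revenue / len(touches)
    return {t: share for t in touches}

def full_credit(touches, revenue):
    """Give every campaign touch full credit for the conversion."""
    return {t: revenue for t in touches}

touches = ["email_32", "ppc_17", "banner_05"]
print(partial_credit(touches, 30.0))  # each touch credited $10
print(full_credit(touches, 30.0))     # each touch credited $30
```

Note that under full credit the per-campaign totals sum to more than actual revenue, which is why a full-credit report complements, rather than replaces, the last-touch report.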

Fortunately for WebTrends users, there has been a great method for gaining insight into all campaign touches for quite some time although a little work up front is necessary to gain the full power of this. If you are already doing basic campaign tracking within WebTrends then the visitor history table is already turned on and with minimal effort you can set up two new custom reports which report on the first touch campaign and all campaign touches respectively. To do this you need to make use of two features of the visitor history table and create two new custom dimensions, one based on WT.vr.fc (the fc stands for “first campaign”) and another based on WT.vr.ac (the ac stands for “all campaigns”). Once you have the dimensions set up you create custom reports using those dimensions and whichever metrics you want applied. To make things easier, copy the existing campaign ID report and just change the dimension to base the report on.

The “first touch” report ends up looking nearly identical to the existing campaign ID report but the rows of data will be different since the revenue and other conversion credit is applied to the first campaign that referred the conversion as opposed to the last.

Standard Campaign ID Report Sample
First Touch Campaign ID Sample

The “all touches” report is where you’ll notice more differences. You will see some or many (depending on the date range you have selected) rows of data that have multiple campaign IDs separated by semicolons. To view only the data that contains multiple campaign touches, just filter the report by a semicolon.

Multi-Touch Campaign ID Report Sample
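The semicolon convention lends itself to simple scripting once the report is exported. A rough sketch, with illustrative rows whose revenue figures echo the campaign 32 totals discussed in this article:

```python
# Each row of the "all touches" report: (semicolon-separated campaign IDs, revenue).
# Rows and figures are illustrative, not real report data.
rows = [
    ("32", 20000),
    ("17;32", 30000),
    ("05;17;32", 32036),
    ("17", 5000),
]

# Rows where more than one campaign touched the conversion
# (the "filter by semicolon" trick).
multi_touch_rows = [r for r in rows if ";" in r[0]]

# Total revenue at least partially referred by campaign 32.
campaign_32_total = sum(rev for ids, rev in rows if "32" in ids.split(";"))
print(campaign_32_total)  # 82036
```

Splitting on the semicolon (rather than substring matching) matters: a naive `"32" in ids` check would wrongly match a campaign ID like "132".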

So what do you do with this information? What does it all mean?
Spending some time with this new data will likely reveal patterns you never had insight into before. For example, you may notice that certain campaigns appear to perform poorly according to your traditional last-touch reports, but the same campaign’s performance as a first touch is much better, or vice versa. Since the first-touch report is so similar to the out-of-the-box campaign ID report, it is fairly straightforward; the only difference is that the first touch gets the credit. The all-touches report is more complicated, though. What I find most useful about this report is the ability to determine a campaign’s total reach and compare it to its absolute reach.  Take, for example, campaign ID 32. In the above screenshots you will notice that this campaign ID has $63,441 attributed to it as a last-touch campaign, $35,839 attributed to it as a first-touch campaign, and $82,036 attributed to it when you search for it in the all-touches report (see fig. 4 below). What this data is telling us in this particular case is that:

  • $63,441 in revenue was most recently referred by campaign 32
  • Only $35,839 in revenue was initially referred by campaign 32
  • But overall campaign 32 at least partially referred $82,036 in revenue

As you can see, there can be very significant differences in campaign performance depending on how you look at the data. Taking the easy way out and looking only at a single attribution method can lead to less than fully-informed decisions being made about your campaigns. What if you were relying solely on first-touch reports in this example? That could lead you to reduce your budget on campaign 32 when in reality it was performing much better than your first-touch report told you.

Multi-Touch Report Filtered by Campaign ID 32

Ok, so all that is well and good, but manually analyzing campaign IDs one at a time is a lot of work! Yes, it certainly is with the methods I just provided as examples. But there is a much better way to approach this. Taking things a step further, we can export each of these reports and combine them in Excel, using the campaign IDs as our key values. What we want to end up with is something like the following, which will allow us to analyze first, last, and multi-touch all within a single interface.

Multi-Touch Reporting in Excel Sample
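The merge step can be prototyped in a few lines before building the spreadsheet. The revenue numbers below echo the campaign 32 figures above, and the second campaign is invented; in practice the three dictionaries would be loaded from the exported WebTrends reports rather than typed in:

```python
# Revenue by campaign ID from each exported report (illustrative values).
last_touch = {"32": 63441, "17": 12000}
first_touch = {"32": 35839, "17": 15500}
all_touches = {"32": 82036, "17": 18200}

def merge_attribution(last, first, multi):
    """Join the three attribution views on campaign ID, one row per campaign."""
    rows = []
    for cid in sorted(set(last) | set(first) | set(multi)):
        rows.append({
            "campaign_id": cid,
            "last_touch": last.get(cid, 0),
            "first_touch": first.get(cid, 0),
            "all_touches": multi.get(cid, 0),
        })
    return rows

for row in merge_attribution(last_touch, first_touch, all_touches):
    print(row)
```

This is exactly the Excel lookup-by-key approach, just automated; the resulting rows can be written back out to CSV for the side-by-side analysis described above.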

In part two of this article I’ll show you how to set this all up in WebTrends. But for now, follow the steps discussed in this article to get these super handy reports in place so you’ll be ready for the next part.

Paying Too Much for Custom Application Implementation

Face it. Even if you have a team of entry-level coders implementing custom application software, you’re probably still paying too much.

Here’s what I mean:

You already pay upfront for foolproof design and detailed requirements.  If you leverage more technology to implement your application, rather than spending more on coders, your ROI can go up significantly.

In order for entry-level coders to implement software, they need extra-detailed designs.  Such designs typically must be detailed enough that a coder can simply repeat patterns and fill in blanks from reasonably structured requirements.  Coders make mistakes, have misunderstandings and other costly failures, and take months to complete the work (assuming nothing in the requirements changes during that time).

But, again… if you have requirements and designs that are already sufficiently structured and detailed, how much more effort is it to get a computer to repeat the patterns and fill in the blanks instead?  Leveraging technology through code generation can help a lot.

Code generation becomes a much less expensive option in cases like that because:

  • There’s dramatically less human error and misunderstanding.
  • Generators can do the work of a team of offshored implementers in moments… and repeat the performance over and over again at the whim of business analysts.
  • Quality Assurance gets much easier…  it’s just a matter of testing each pattern, rather than each detail.  (and while you’re at it, you can generate unit tests as well.)

Code generation is not perfect: it requires very experienced developers to architect and implement an intelligent code generation solution. Naturally, such solutions also tend to require experienced people to maintain (because in sufficiently dynamic systems, there will always be implementation pattern changes).  There’s also the one-off stuff that just doesn’t make sense to generate… but that all has to be done anyway.

Actual savings will vary (and in some cases may not be realized until a later iteration of the application), but typically depend on how large your metadata (data dictionary) is, how well it is structured, and how well your designs lend themselves to code generation.  If you plan for code generation early on, you’ll probably get more out of the experience.  Retrofitting generation can definitely be done (been there, done that, too), but it can be painful.

Projects I’ve worked on that used code generation happened to focus generation techniques mostly on database and data access layer components and/or UI.  Within those components, we were able to achieve 75-80% generated code in the target assemblies.  This meant that from a data dictionary, we were able to generate, for example, all of our database schema and most of our stored procedures, in one case.  In that case, for every item in our data dictionary, we estimated that we were generating about 250 lines of compilable, tested code.  In our data dictionary of about 170 items, that translated into over 400,000 lines of  code.
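As a toy illustration of the approach (not the actual generators from these projects), here is a sketch that emits a stored procedure per data-dictionary item from a template. The dictionary entries and the SQL shape are invented; real generators such as T4 or XSL transforms work from much richer metadata:

```python
from string import Template

# Hypothetical data dictionary: one entry per entity.
DATA_DICTIONARY = [
    {"table": "Customer", "key": "CustomerId"},
    {"table": "Policy", "key": "PolicyId"},
]

# The pattern to repeat: a lookup-by-key stored procedure.
PROC_TEMPLATE = Template(
    "CREATE PROCEDURE Get${table}ById @${key} INT AS\n"
    "    SELECT * FROM ${table} WHERE ${key} = @${key};\n"
)

def generate_procs(items):
    """Fill in the template's blanks once per data-dictionary item."""
    return [PROC_TEMPLATE.substitute(item) for item in items]

for proc in generate_procs(DATA_DICTIONARY):
    print(proc)
```

The economics described above fall out of this shape: a pattern is written and tested once, and a dictionary change regenerates every affected artifact in moments.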

By contrast, projects where code generation was not used generally took longer to build, especially in cases where the data dictionaries changed during the development process.  There’s no solid apples-to-apples comparison, but consider hand-writing about 300,000 lines of UI code while the requirements are changing.  Trying to nail down every detail (and change) by hand was a painstaking process, and the changes forced us to adjust the QA cycle accordingly, as well.

Code generation is not a new concept.  There are TONs of tools out there, as demonstrated by this comparison of a number of them on Wikipedia.  Interestingly, some of the best tools for code generation can be as simple as XSL transforms (which opens the tool set up even more).  Code generation may also already be built into your favorite dev tools.  For example, Microsoft’s Visual Studio has had a code generation utility known as T4 built in for the past few versions now.   That’s just scratching the surface.

So it’s true…  Code generation is not for every project, but any project that has a large data dictionary (that might need to be changed mid stream) is an immediate candidate in my mind.  It’s especially great for User Interfaces, Database schemas and access layers, and even a lot of transform code, among others.

It’s definitely a thought worth considering.

Rise of the networked Enterprise – Web 2.0 finds its payday

McKinsey & Company recently published their yearly study of Web 2.0 adoption in the enterprise, as they have done for the last few years. In addition to the interesting data and continual growth of use, they tried to use statistical analysis to correlate the level of use and adoption with company business performance.

The results, while far from being statistically conclusive, do show that companies that have extensively adopted Web 2.0 and collaborative technologies (they prefer using the term “Networked Enterprise” to the traditional Enterprise 2.0) perform better than their less networked peers.

It’s a great validation of what many of us practitioners in the field see as obvious. More information sharing, transparency and collaboration increase knowledge dissemination and empower better-informed decisions. Taking these approaches out to customers and partners can only have a positive effect.

Few things I found noteworthy in the results:

  • The ownership of internal collaboration at 61% of responding companies was in IT, not the business or corporate communications. This leads in many cases to tool-based discussions and decisions, rather than a focus on how these tools can best serve business and employee needs. Overall, lack of ownership is still one of the biggest problems we are seeing. One of the most important steps a company can take in promoting the importance of collaboration is assigning clear ownership.
  • The biggest benefits come when companies use collaboration technologies both internally and externally. Business processes are complex and span multiple stakeholders, and companies that are able to automate and refine these processes and interactions see returns. This is very encouraging.
  • Success and adoption come from putting Web 2.0 technologies “in the line of business”. If collaboration tools are not an additional tool or task but the place where the work is done, they will be used. If documents are stored only in SharePoint rather than in file shares, reports uploaded instead of emailed, etc., everyone will get used to it quickly.
  • Social Networking was the most used Web 2.0 feature, at 40% adoption. The term Social Networking itself is problematic, as it can describe many different types of interactions, from Facebook to SharePoint “colleagues,” but there is no doubt that the immense popularity of these tools outside the enterprise is having an impact, at least on what people think the priorities should be.

What’s ahead?

So how will social technologies evolve in 2011? It seems the trend of adopting successful consumer tools and bringing them into the fold will continue. The gap is still huge, and for most companies even a reasonable level of sharing is still in the future, but some likely candidates include:

  • Full adoption and usage of smartphones as working and collaboration tools, not just email.
  • Location, à la Foursquare
  • Collaborative editing with Office 2010

Your Company’s Social Debut

Planning Your Company’s Debut or Strategy in the Social Media Sphere

Corporations have long been regarded by the law as having “legal personality,” which means they have rights, privileges, responsibilities, and protections just like humans (with some differences, like marriage).   It should come as no surprise, then, that they’re acting more and more like humans – now they’re relaxing with friends and socializing! As communication gets easier through digital technology, humans are now able to interact with corporate personalities.  And these personalities are just beginning to awaken to the new freedoms they can find in the digital landscape.

If you’re like me, and I bet you are, you are both human and also part of bringing business personalities to the social scene. In this capacity, I recently attended SocialTech2010 in San Jose, CA, right from my desk in NYC.

As the Twitter stream flowed by rapidly with commentary and quotes from the speakers, I watched and listened to advice, case studies and stories from the experts on Social Media for Business. I came away with the recognition that Social Media for business is just like a big networking cocktail party!

Companies aren’t accustomed to acting as social creatures and the adjustment will take some time. We all had to learn social skills growing up; companies can do the same. There are a few things that etiquette would require of a cocktail party attendee and that’s the same strategy the speakers at SocialTech2010 are recommending:  Know who you are, be interactive and respectful, don’t gossip, be a good listener, and don’t be afraid to share yourself.

As businesses gain proficiency in this kind of interacting, they follow an arc towards maturity. Kathleen Malone of Intel outlined the following 5 stages of a Social Media Approach:

1)      Listen: In this stage a company finds out: What are people saying about my Brand and/or my field? Where are they having this discussion? Who are the major players and influencers?  Services like Radian6, which Malone says Intel deployed 18 months ago, make this possible.

2)      Analyze: This is the time to read the room/space, figure out what your angle will be when you eventually do pipe up. Which conversation will you enter? What are your expectations? Why are you going to participate?

3)      Create: This is the stage where the business comes up with something appropriate to say. To participate effectively in the conversation, Malone says your content should be: useful, interesting, human, “snackable” (meaning in bite size pieces, easily consumed), inspiring and should cater to egos and build community.  

4)      Engage: In this stage you go public and enter the conversation, getting your content out there in new ways and/or by participating in the conversations that already exist.

5)      Measure: Your social media approach is not complete without an understanding of how you’re doing. The internet is an amazing forum for measuring how people behave with your content, and you should use a variety of tools to understand the response to your forays. Measuring properly will provide insight on how to proceed, both in the ongoing conversation, and with the business itself.

Both Malone and Brian Ellefritz of SAP outlined the natural evolution of Social Media programs at large companies.  First there are what Ellefritz calls “Grass Roots” efforts, where excited individuals branch out in ways that are unpredictable and non-uniform. He says companies should encourage these exploratory missions; leadership will begin to emerge internally, and informal education will get the ball rolling. Following the “Grass Roots” period, Ellefritz sees “Silos Form.” This may not feel 100% smooth, but it is an important step, as “coopetition” (a kind of cooperating/kind of competing relationship, sort of like a sibling rivalry that spurs each one on) sees different silos jockeying for position. During this step, Ellefritz encourages companies to “invest in leaders, not laggards,” and to get the players from various silos together to learn from each other.  Also, he says, “don’t wait too long for governance.”

The next evolutionary phase in a corporate Social Media Program is “Operationalizing” – where leadership becomes clear, channels become well formed and in alignment with the divisions in your business.  Tools begin to consolidate and more emphasis on measurement and results appears. By this point your business may have headcount devoted to social media, and content should become less problematic, less of a focus, because it’s running more smoothly.  During this stage it’s important to align and integrate silos, and focus on strategy, ownership, metrics and priorities.

After this shift, the next phase is what Ellefritz calls “Lifestyle.” This is when the Social Media program has engaged and competent employees, success is understood, and positive outcomes are frequent. This is a level of Social Media implementation that is fairly rare in today’s scene, though Ellefritz points to Zappos as an example of a company that may be at this level.

.. .. ..

The wonderful thing about participating in social media is that it lets your personality out! For a business that hasn’t previously seen itself as the kind of entity that has a social life, this might seem daunting at first.  That’s why Ellefritz’s evolutionary arc makes so much sense to me. The way I see it, people and businesses want more than ever to get clear on who they are, and who they want to be, in order to present themselves well and to participate in Social Media conversations. The best advice is to be authentic. Just like at cocktail parties, the people you’re conversing with generally know if you’re “full of it” or if you’re being sincere.  Your conversational counterparts like to be complimented, offered nuggets of useful information, and generally considered and included.

For businesses, (and the teams of people that perpetuate them) this will mean really focusing on what the goals are, what opportunities exist to communicate clearly and uniformly around these interests, finding “friends” out there to talk with, and owning up to the inevitable minor mistakes that are so easy to make along the way. Since SM is such a public sphere, the resulting increased level of transparency is going to make businesses change and open up in new ways.

Coachdeb:”RT @MarketingProfs: “When someone says they need a Facebook strategy, a Twitter strategy, I say… Wait! Take it back… What’s your story?” @scobleizer #mptech”

So, armed with the Social Media/networking party analogy and with the stages of approach and evolution path laid out before you – what are you waiting for?  Participate!

Here are 10 tips to consider as you get started:

1)      Go where the fish are – target engagement carefully where the conversation already is.

2)      Social Media is Local. The goal is to be uniform while being decentralized – Intel communicates internally with their 1000 “Registered Social Media Practitioners” with guidelines and trainings (some mandatory). Intel also has their own internal newsletter that aggregates Social Media content – Malone says this makes management comfortable as well as keeps everyone updated.

3)      Have a Content Calendar for the year to coordinate Social Media messaging across channels and people, and to keep it focused on your message. Kathy Malone said at Intel, 2/3 of the content that gets put out falls under the guidelines of their content strategy calendar.

4)      Consider in advance how to manage Social Media Risk. One of the most interesting things Jaime Grenny of SalesForce said at SocialTech2010 is that all their employee training videos on Social Media strategy (and how to use online video for B2B marketing) are up for the public to see on YouTube (here).  This level of transparency lets everyone know what to expect upfront.  Malone outlined a “prevention/detection/response” approach in which 3 teams worked from different angles to mitigate risk on the social media front. And experience teaches: “if you screwed up, fess up”, and be transparent.

5)      If your company is doing moderation of dialogue, consider having a light hand to keep the conversation honest – as Intel puts it, they let the good and the bad in, but moderate the ugly – mostly meaning profanity and non-constructive comments, and they’ve found their audience appreciates it.

6) Build a business case so you know why you're entering Social Media – not only will it legitimize your efforts internally, it will also bring clarity to your message. Will it extend customer service? Will it improve SEO? Can you use it to create brand advocates and champions? Can you collect ideas on where to take your product?

7) Measure in context. As with all web metrics, to understand what's happening you need to understand the context of your data and compare it to a baseline to see trends. Knowing your goals will help you establish that context.

8) People are the platform. Laura Ramos of Xerox encourages us to get our people out there and seen. Show video of your thought leadership. Get your salespeople to share their stories and knowledge with the rest of the company, and make them heroes. Build relationships, and let your existing customers create new business for you. Social Media Marketing is not about reaching many to influence a few but about engaging a few to influence many!

9) Social is relevant. Here are some stats: René Bonvani of Palo Alto Networks says that Facebook has 96% penetration in the enterprise, meaning that only 4 in 100 people aren't using it at work. He also said that only 1% of users post on Facebook, and that people are 69 times more likely to use Facebook chat than to post. Another impressive Bonvani stat: 69% of business buyers use social media to inform purchasing decisions. Whatever the exact numbers, it's clear that with the cost of communication dropping close to $0, we social beings are using the web to communicate more often, with more people, and in smaller, more frequent chunks.

10) Social media has to be part of WHAT you do, not something else you do. Jeremiah Owyang said in his keynote that the only difference between the social site and your business is the URL. In his radical future, websites will be assembled dynamically on the fly based on social profiles; URLs and domains won't matter, because the web will be organized around people and contextual situations. Because of this, ads will become useful content. This is already evident.

So – Get out there and participate!

Edgewater Technology provides strategy, consulting, web metrics, and implementation expertise to help you focus on the best ways your company can engage in these dynamic communities and track your success!

Hidden Impact of the iPad on Your Corporate Website?

Even though the iPad appears to be a device for Apple fans, gadget freaks, and tech-savvy consumers, it is already positioned to have a significant impact on the nature and shape of corporate websites. The adoption and growth rate of specialized content consumption devices such as Smart Phones, Tablets, and Netbooks can no longer be ignored. B2B and especially B2C companies need to ensure that their websites not only "function" on the iPad and similar devices but also provide a good user experience. Just as higher bandwidth and more powerful computers once changed static websites forever, the new content consumption devices are positioned to do the same again. The buzz around the iPad and its successful launch means that any future website planning and upgrades need to keep the new realities in mind.

Perhaps the most talked-about aspect of iPad browsing has been its lack of support for Adobe Flash. Apple CEO Steve Jobs has called Flash buggy, a CPU hog, and a security risk. Whether or not you agree with Jobs, recent trends in website design and development do point toward less Flash usage, for various reasons. HTML5 is being touted as the replacement, and many key companies and websites have already adopted it. Any decision to use Flash, or to continue supporting existing Flash applications, should therefore be weighed carefully. Many major websites like WSJ, NPR, USA Today, and NYT are doing away with Flash, taking a dual approach of providing iPad-optimized apps and rolling out new Flash-less websites.
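In practice, the dual approach usually starts with feature detection rather than browser sniffing. Here is a minimal sketch, assuming an MP4/H.264 video source; the function names are illustrative, not taken from any of the sites mentioned above:

```javascript
// Prefer HTML5 video where the browser supports it; fall back to a
// Flash embed elsewhere. canPlayType returns "", "maybe", or
// "probably" depending on the browser's confidence in the codec.
function supportsHtml5Video(doc) {
  var video = doc.createElement('video');
  return !!(video.canPlayType &&
            video.canPlayType('video/mp4; codecs="avc1.42E01E"'));
}

function pickPlayer(doc) {
  return supportsHtml5Video(doc) ? 'html5' : 'flash';
}
```

On the iPad, Safari reports HTML5 video support, so visitors there never hit the Flash path, while older desktop browsers still get the Flash embed.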

Slide-out menus that require constant cursor presence are still common on corporate websites, and they make for an unfriendly experience with finger-based navigation. Touch interfaces are rapidly gaining in popularity and are no longer limited to Smart Phones and Tablets; some newer laptops support them as well. Another implication of the touch interface is that links that are too small, or too close together, are difficult for users to tap and create a frustrating experience. The ever-shrinking size of buttons and links now needs to be reconsidered, and their placement rethought. Touch is here to stay, and sites need to evolve to ensure they are "touch" friendly.
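One practical way to catch undersized links is to audit their rendered boxes against a minimum touch-target size. A sketch of such an audit helper follows; the 44-pixel threshold reflects Apple's commonly cited guideline, and the names are illustrative:

```javascript
// Flag elements whose rendered box is too small to tap comfortably.
// In a real page, each rect would come from calling
// getBoundingClientRect() on a link or button.
var MIN_TARGET_PX = 44; // roughly Apple's recommended minimum touch target

function isTouchFriendly(rect) {
  return rect.width >= MIN_TARGET_PX && rect.height >= MIN_TARGET_PX;
}

function findTinyTargets(rects) {
  return rects.filter(function (r) { return !isTouchFriendly(r); });
}
```

Running something like this against key navigation elements quickly surfaces the menus and links that need to grow before a site can call itself touch-friendly.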

In the days immediately following Netscape's collapse, Internet Explorer became the king of browsers, and a corporate website needed to be tested with only two or three versions of Internet Explorer. Since then Firefox, Chrome, and Safari have gained significant market share, and any update to the corporate website must once again be tested against a plethora of browsers. Safari has hitched a ride with the iPhone and the iPad and therefore requires special attention to ensure that the corporate website renders and functions well. Browsers, however, no longer hold a monopoly as the sole interface to internet content. The rise of proprietary apps providing a customized experience means the mighty browser now has a challenger. Major media sites like NYT, WSJ, and the BBC have all rolled out iPhone and iPad apps that provide a highly controlled experience outside the browser.

The iPad's larger screen provides an excellent full-page browsing experience. However, a number of sites treat the iPad as a mobile client and serve up the mobile version of the website. In most cases the user can click a full-site link to reach the standard website, but that is cumbersome. Websites now need to differentiate between smaller mobile clients, where a limited mobile version of the website makes sense, and newer, larger content consumption devices like the iPad, where nothing but the full site will do. Needless to say, the iPad and similar devices are creating a new reality, and corporate websites need to take notice or risk looking old and obsolete.
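That differentiation often comes down to a check of the User-Agent string, on the server or the client. The iPad conveniently identifies itself with "iPad" rather than "iPhone", which makes it easy to single out. A minimal sketch, with patterns that are illustrative and far from exhaustive:

```javascript
// Classify a client so small phones get the mobile site while large
// tablets like the iPad (and desktops) get the full site.
function siteVariantFor(userAgent) {
  var ua = String(userAgent);
  if (/iPad/.test(ua)) return 'full';                     // big touch screen
  if (/iPhone|iPod|BlackBerry/.test(ua)) return 'mobile'; // small screens
  if (/Android/.test(ua) && /Mobile/.test(ua)) return 'mobile';
  return 'full';                                          // desktop and others
}
```

The important design choice is the default: when in doubt, serve the full site, since a capable device stuck on a stripped-down mobile page is exactly the frustration described above.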

Top Web Technology and Marketing trends for 2010 part 1 – Social Strategy and Infrastructure

I was at Barnes and Noble over the weekend, and browsing through the business books section I could see only two types of titles: books on the financial collapse and guides to social media marketing. Both are selling well, I hear.

It's good to see that, after some significant doubts, corporate America and small businesses alike are engaging users on social media sites and tweeting away. Unfortunately, what we often get is a completely schizophrenic approach. The corporate website is all law and order – command and control, broadcasting carefully crafted and designed branding messages and product introductions. Then we have the social media Wild West, where anything goes, no rules exist, and chaos reigns. Living with a split personality is hard, and as Nestle recently found out, trying to enforce brand guidelines on Facebook can backfire on you.

As mentioned, there is a bucketload of books that will teach you how to engage with and utilize social media, using it to build personal relationships and add value rather than serving as just another PR outlet.

I think the more urgent task is addressing the challenge of changing the purpose, structure, and utility of public websites to adapt to the new social reality. Frankly, even after six years of "Web 2.0", most sites are still pretty static brochureware, but the social revolution is changing that quickly. Even though not every company will want to retire its website and send users to Facebook instead, as Skittles did for a few months, there is much to gain from trying to marry the two worlds.

The goals of the public website have not really changed: create a positive brand experience, attract and convert new customers, retain existing customers, make it easy to do business with you and provide great service anytime, anywhere. Now adding the social layer on top of that elevates it to a whole new level. It also requires a new and maturing technical infrastructure and tools to manage this experience.

Adding the social layer can take many forms, but done right it will make every website more relevant, accessible, personal, and effective. The tools to manage this new environment are still evolving and maturing, but the next releases in all product categories will include a social integration layer.

Before embarking on the next iteration, every website owner must examine and decide: “How social should the company’s site be?”

Here are some guidelines for different models of social integration:

  1. Divide and conquer: create separate destinations for different types of interaction, but keep them distinct from the main site.
  2. Complete control over the brand experience: build the brand site into a social community.
  3. Co-promotion: link and syndicate content from the site to social media, and promote social media activity on the site.
  4. Aggregation and context: aggregate relevant social media onto the site from multiple sources.
  5. Integrate and connect with social media: create a seamless experience and leverage identity and existing relationships.

Of course, these modes are not mutually exclusive; they can be applied to different parts of the site or adopted in an evolving fashion.

For more on these topics, I’m doing a webinar on 3/31/10 on best practices of social integration and will bring some examples. To register go here.