Digital Web Magazine

The web professional's online magazine of choice.

Microsoft B2B Site Case Study


In: Articles

By Alex Barnett

Published on September 8, 2004

Introduction

In February 2004, Microsoft UK launched Advisor, an extranet targeted at business and technical decision makers in the mid-market.

The end product was the result of over a year of research, planning, design, development and pilot testing.

It was one of the most intensive and thorough design processes I have been involved with. I learned a great deal along the way about how to create an offering that really meets customer needs. It is these learning points I want to share because I believe the design principles can be applied to practically any kind of Web development project.

Background

In late 2002, the Microsoft UK online customer relationship marketing team was briefed by the mid-market audience marketing group. They shared the results of research undertaken to understand how our customer engagement model was meeting the needs of the customer.

In short, the brief indicated that there was a lot of room for improvement, especially in the online space. The research showed that having an online relationship with Microsoft was a top-three priority for our mid-market customers. And, critically, the way Microsoft engages its customers and the quality of its products were the major drivers of customer satisfaction.

The evidence pointed to a simple solution—develop our online offering to complement our offline offerings as part of an improved CRM program to improve customer satisfaction levels.

At this stage, we were tempted to go ahead and make a number of “obvious” assumptions about what an online offering should consist of. This was where discipline and clear process had to kick in. Assumptions are dangerous things.

Process

We agreed on a plan and a process we would work with, putting particular focus on taking the time to get the product right.

The internal customer agreed that we should take time to develop a great product. Rushing this type of project due to internally created deadlines was not going to achieve the customer’s strategic objectives.

So off to work we went.

Research

A research project was commissioned to determine specifically what this online offering should comprise. A number of high-level requirements emerged from the customers' stated needs.

Design

With our Web agency, AKQA, we analyzed the feedback.

Initially, the brief to the agency was to develop ideas based on the feedback and our objectives. They came back to us with strong concepts, including proposed names (we tested many, and customers told us that Advisor was the most meaningful), information architectures, screenshots, designs, layouts and a portfolio of potential services and content we could provide.

We then set up focus groups and tested ideas with the target audience. It was great to learn there were some clear killer apps emerging from within the various services we proposed. The “Call Me” feature and the ability to manage company profile details were popular. We had some surprises too: Some services we had assumed would be valuable were rejected as low priority or gimmicky. Saved searches, personalized bookmarks and forums were dropped like rocks.

We also learned that interactive services, as opposed to volumes of flat content, were what the customers were really after. This was valuable feedback. We had been heading in a content-heavy direction and our customers pulled us back and steered us toward a more function-focused path.

This stage of the process was key to getting Advisor right. It drove our priorities and gave us confidence that what we were developing was something our customers really needed and wanted, not just what we assumed they needed.

Technical

The requirements were driven by our customers and in turn provided the initial input for our functional specification. We worked with the agency to flesh these out in detail and to understand the various internal systems and Web services we'd need to expose and integrate.

We developed use cases, analyzed business processes, requested information from the various internal systems and Web services owners and conducted planning sessions with development teams where we interacted with those systems.

The agency led the way in developing the technical architecture and system design and developing proofs of concept, verifying with technical stakeholders throughout. It was apparent from the beginning that Advisor was going to be a technically complex project to deliver, so to ensure we stayed on track we leveraged the Microsoft Solutions Framework (a flexible and scalable framework to plan, build, and deploy business-driven technology solutions).

UI and Usability

One of the really exciting challenges of Advisor’s development was the need to create a user interface that would provide customers with all the services (applications) and content they would need in a single, cohesive and usable environment and to create a compelling online customer experience that was a pleasure to use. In short, the interface needed to enable, not hinder.

Many of the UI components were already defined for us through the Microsoft Network Project (MNP), an XML-based presentation framework used on Microsoft.com. MNP aims to provide a consistent experience by defining standard design and presentation elements across our sites through a common, managed code base.

Leveraging MNP meant we could focus on the development of the Advisor experience and avoid reinventing wheels.

Key to the success of the UI design process was the investment made in usability testing. We partnered with a usability specialist agency (Bunnyfoot) to conduct two testing sessions with the target audience.

In the first session we provided screenshots and label names for navigation. The design team gained valuable feedback, especially around label naming and IA. At this stage, the issue of terminology was raised and addressed. "Microsoftisms," terms that were unlikely to mean anything to anyone outside Microsoft (like "integrated collaboration" and "close the delta"), were dropped in favor of plain English.

We had to rename all six top-level sections. For example, we changed “Reality” to “Industry Success Stories” and “Equip” to “Events and Training.” Secondary and tertiary labels did better, but still required rethinking. We tweaked and retested, and moved on in the development phase.

The purpose of the second session was to test a working site prior to pilot launch. Again, we picked up on a set of issues, mainly to do with navigation and workflow, that we prioritized according to severity, and retested where significant changes were necessary.

During this session, customers made a number of great suggestions, some of which were easy to implement and made a big difference. One was the observation that when a link was clicked to another site outside of Advisor the user felt lost; they suggested we open a new browser window, even if the link went to another Microsoft site. We had considered this at an earlier stage but were concerned that it broke one of our "consistent usage principles." Practically every customer mentioned this point, so we gave them what they wanted.
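The behavior the customers asked for is simple to implement once you can tell external links apart from internal ones. As a minimal JavaScript sketch (the host name below is hypothetical, chosen purely for illustration), a helper can classify each link's destination, after which external anchors can be opened in a new window by setting their target:

```javascript
// Hypothetical helper: decide whether a link leaves the current site.
// "advisor.example.com" stands in for the real site host.
function isExternalLink(href, siteHost) {
  // Relative URLs (no protocol) always stay within the current site.
  if (!/^https?:\/\//i.test(href)) return false;
  // Extract the host portion of the URL and compare it to our own host.
  const host = href.replace(/^https?:\/\//i, "").split("/")[0].toLowerCase();
  return host !== siteHost.toLowerCase();
}

// Example usage in a page script: give every external anchor a new window.
// (document is only available in a browser environment.)
function markExternalLinks(siteHost) {
  for (const a of document.querySelectorAll("a[href]")) {
    if (isExternalLink(a.getAttribute("href"), siteHost)) {
      a.target = "_blank";
    }
  }
}
```

This keeps the rule in one place, so the "open external links in a new window" policy can be applied or reversed site-wide without touching individual pages.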

Pilot Phase

We piloted the site with over 200 customers, allowing us to learn about all the areas we needed to focus on prior to the full launch. We wanted to fully test a range of dependencies and operational processes, including recruitment processes, marketing communications, back-end offline support, service level agreements, issue escalation and resolution and so on. It's a good thing we did: we had already done a great deal of work in this regard, but realized we needed to re-engineer some other offline processes if we were to go into full launch without causing mayhem. This period allowed us to test these processes under real-world conditions.

Many Web agencies consider this type of testing and preparation to be someone else’s job—and yes, to a great extent it is—but if the Web agency is responsible for delivering its own part, it needs to be acutely aware of all dependencies and risks and make sure the client knows and acts accordingly. The agency team did a great job in this regard, and in my opinion, this is where agencies stand out from the crowd (assuming creative and technical excellence).

An example of the value we derived from pilot phase was learning about the way one of the services offered within Advisor was being used (or not being used, as the case may be). The service in question was “Call Me,” a feature that let customers request a call at a specific date and time from the customer services team.

It was simply not achieving the level of use we expected. This confused and concerned us as the earlier research had shown “Call Me” to be a popular feature that resonated well with customers. Additionally, the usability studies did not highlight any discoverability issues with the feature—in fact, a number of the tasks we tested were planned specifically to ensure that “Call Me” was easy to find and use. There was no evidence of a problem at that stage.

A number of theories were floated around as to why “Call Me” wasn’t getting the usage we anticipated. Someone had the bright idea of asking the customers. The answer came back loud and clear—customers weren’t quite sure what constituted a reasonable reason to request a call. They knew that “Call Me” was not intended to replace existing support contracts (this was made clear in the copy) and were concerned about wasting our time with trivial queries. When we dug further, these “trivial queries” turned out to be exactly the kind of queries “Call Me” was designed to pass on to our customer services team. Many of the queries were actually the beginnings of decision-making processes about purchasing our products.

The insights we gained around “Call Me” allowed us to re-evaluate our marketing of this specific service within our communications at the pilot stage, saving us money later on. The results showed we had cracked the problem—we experienced a 250% increase in the use of “Call Me” in the first week and the new levels were sustained after that.

The Results

The pilot, launched in August 2003, also aimed to help us understand how the site was meeting its original objective of providing an online experience that would meet customer needs as part of an overall CRM program and therefore lift customer satisfaction.

All told, the pilot did very well. We carried out pre- and post-use surveys with the control group and the results were overwhelmingly positive, giving us the confidence to launch in the UK in February 2004. Early signs are very encouraging.

Our work was closely tracked and supported by teams in Redmond. Advisor is rolling out in Germany and France later this year, and globally after that.

The lessons we learned during Advisor’s development are now used across teams and built upon by other projects. These lessons in user-centered design and our general approach can be applied to online projects large and small; in particular, that it’s more important to get it right than deliver early.


Related Topics: E-Marketing, User-Centered Design (UCD)


Alex Barnett is Online Customer Experience Manager at Microsoft UK, living in London and spending way too much time online.
