The Right Way to Do Lean Research

Written by: Laura Klein

StartX, a nonprofit startup accelerator, recently devoted an entire day to the role of design in early-stage companies. One panel included Laura Klein, Todd Zaki-Warfel, Christina Wodtke, and Mike Long.

Each panelist had made their mark on how design is done in start-ups: Laura wrote the influential O’Reilly book on UX for Lean Startups, and Todd penned the bestselling Rosenfeld Media Prototyping book. Christina has been cross-teaching design to entrepreneurs and entrepreneurship to designers at institutions such as California College of the Arts, General Assembly, Copenhagen Institute of Interaction Design, and Stanford. Mike founded an influential Lean UX community in San Francisco.

Although the conversation ranged widely, they kept coming back to research: the heart of the lean build-measure-learn cycle. As the hour-long panel drew to a close, Christina jumped up and scribbled on the board the key themes of the conversation: right questions, right people, right test, right place, right attitude, and right documentation.

Below, Laura Klein expounds on these key themes of lean research. Boxes and Arrows is grateful for her time.

Right questions: Make sure you know what you need to know

Too many people just “do research” or “talk to customers” without having a plan for what they want to learn. What they end up with is a mass of information with no way of parsing it.

Sure, you can learn things just by chatting with your users, but too often what you’ll get is a combination of bug reports, random observations, feature suggestions, and other bits and bobs that will be very difficult to act on.

A better approach is to think about what you’re interested in learning ahead of time and plan the questions that you want to ask. For example, if you need to know about a particular user behavior, come up with a set of questions that is designed to elicit information about that behavior. If you’re interested in learning about the usage of a new feature, ask research participants to show you how they use the feature.

The biggest benefit to planning your research and writing questions ahead of time is that you’ll need to talk to far fewer people to learn something actionable. It will be quicker and easier to learn what you need to know, make a design change, and then test that change, since you will see patterns much more quickly when you ask everyone the same set of questions.
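To make the pattern-spotting concrete, here is a minimal sketch (the questions, answer codes, and script are invented for illustration, not part of Laura's process): when every participant answers the same planned questions, patterns surface as soon as you tally the coded answers.

```python
from collections import Counter

# Hypothetical coded answers: one dict per participant, keyed by the
# same planned questions that were asked of everyone.
responses = [
    {"How do you track expenses?": "spreadsheet", "How often?": "weekly"},
    {"How do you track expenses?": "spreadsheet", "How often?": "monthly"},
    {"How do you track expenses?": "app",         "How often?": "weekly"},
]

# Because the question set is fixed, each question yields a directly
# comparable distribution of answers across participants.
for question in responses[0]:
    counts = Counter(p[question] for p in responses)
    print(question, dict(counts))
```

With only three participants, a clear majority for "spreadsheet" already suggests a pattern worth probing; unplanned, free-form chats rarely produce answers this comparable.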

Right people: Talk to people like your users

Let’s say you’re building a brand new product. You want to get everybody’s opinion about it, right? Wrong! You want to get the opinions of people who might actually use the product, and nobody else.

Why? Well, it’s pretty obvious if you think about it. If you’re building a product for astronauts, you almost certainly don’t want to know whether I like the product. I’m not an astronaut. If you make any changes to your product based on anything I say, there is still no conceivable way that I’m going to buy your product. I am not your user.

Yet, this happens over and over. Founders solicit feedback about their product from friends, family, investors…pretty much anybody they can get their hands on. What they get is a mashup of conflicting advice, none of it from the people who are at all likely to buy the product. And all the time you spend building things for people who aren’t your customer is time you’re not spending building things for people who are your customer.

So, stop wasting your time talking to people who are never going to buy your product.

Right test/methodology: Sometimes prototypes, sometimes Wizard of Oz

Figuring out the right type of test means understanding what you want to learn.

For example, if you want to learn more about your user–their problems, their habits, the context in which they’ll use your product–you’re very likely to do some sort of ethnographic research. You’ll want to run a contextual inquiry or an observational study of some sort.

If, on the other hand, you want to learn about your product–whether it’s usable, whether the features are discoverable, whether parts of it are incredibly confusing–you’ll want to do some sort of usability testing. You might do task-based usability testing, where you give the user specific tasks to perform, or you might try observational testing, where you simply watch people interact with your product.

There is another type of testing that is not quite as well understood, and that’s validation testing. Sometimes I like to call it “finding out if your idea is stupid” testing. This type of testing could take many forms, but the goal is always to validate (or invalidate) an idea or assumption. For example, you might test whether people want a particular feature with a fake door. Or you might learn whether a particular feature is useful with a concierge test. Or you could gauge whether you’re likely to have a big enough market with audience building. Or you could test to see whether your messaging is clear with a five second test.

All of these approaches are useful, but the trick is to pick the right one for your particular stage of product development. A five second test won’t do you any good if what you want to learn is whether your user is primarily mobile. A concierge test doesn’t make sense for many simple consumer applications. Whatever method you use, make sure that the results will give you the insights you need in order to take your product to the next level.

Right place: When do you go onsite?

If you talk to serious researchers, they will often tell you that you’ll never get good data without being in the same room with your subject. You’ll learn so much more being able to see the context in which your participant is using the product, they’ll tell you.

And they’re right. You do learn more. You also spend more. Kind of a lot more, in some cases.

So, what do you do if you don’t have an infinite budget? What do you do if you have users on multiple continents? What do you do if, in short, you are a typical startup trying to make decisions about a product before going out of business? You do what people have been doing since the dawn of time: You compromise.

Part of deciding whether or not to do remote research has to do with the difficulty of the remote research and what you need to learn. For example, it’s much harder at the moment to do remote research on mobile products, not just because there isn’t great screen sharing software but also because mobile products are often used while…well, mobile. If you simply can’t do an in-person observation though, consider doing something like a diary study or tracking behaviors through analytics and then doing a follow-up phone interview with the user.

Other types of research, on the other hand, are pretty trivial to do remotely. Something like straightforward, task-based web usability testing is almost as effective through screensharing as it is in person. In some cases, it can be more effective, because it allows the participant to use her own computer while still allowing you to record the session.

Also, consider if you’re truly choosing between remote testing and in-person testing. If you don’t have the budget to travel to different countries to test international users, you may be choosing between remote testing and no testing at all. I’ll take suboptimal remote testing over nothing any day of the week.

Choosing whether your testing is going to be remote, in person, or in a lab setting all comes down to your individual circumstances. Sure, it would be better if we could do all of our testing in the perfect conditions. But don’t be afraid to take 80% of the benefit for 20% of the cost and time.

Right attitude: Listen, don’t sell

I feel very strongly that the person making product decisions should be the person who is in charge of research. This could mean a designer, a product owner, an entrepreneur, or an engineer. Whatever your title, if you’re responsible for deciding what to make next, you should be the one responsible for understanding your user’s needs.

Unfortunately, people who don’t have a lot of experience with research often struggle with getting feedback. The most common problem I see when entrepreneurs talk to users is the seemingly overwhelming desire to pitch. I get it. You love this idea. You’ve probably spent the last year pitching it to anybody who would listen to you. You’ve been in and out of VC offices, trying to sell them on your brilliant solution.

Now stop it. Research isn’t about selling. It’s about learning. Somehow, you’re going to have to change your mode from “telling people your product is amazing” to “learning more about your user and her needs.”

The other problem I see all the time is defensiveness. I know, I know. It’s hard to just sit there and listen to someone tell you your baby is ugly. But wouldn’t you really rather hear that it’s ugly before you spend several million dollars on building a really ugly baby?

If you open yourself up to the possibility that your idea may be flawed, you have a chance of fixing the things that don’t work. Then your baby will be pretty, and everybody will want to buy it. Ok, the metaphor gets a little creepy, but the point is that you should stop being so defensive.

Right documentation: Record!

You should be taking all of this down. Specifically, you should be recording whatever you can. Obviously, you need to get permission if you’re going to record people, but if that’s at all possible, do it.

The main reason recording is so important is so that you can be more present while interviewing. If you’re not busy writing everything down, you can spend time actually having a conversation with the participant. It makes for a better experience for everybody.

If you can’t get everything on video, or really even if you can, it’s also good to have someone in the room with you taking extensive notes. You’re not going for a transcript, necessarily, but just having somebody record what was said and what was done can be immensely helpful in analyzing the sessions later.

Another important tactic for remembering what was said is the post-session debrief. After conducting the interview or observation, spend 15 minutes with any other observers and write down the top five or ten take-aways. Do it independently. Then, compare notes with the other observers and see if you all learned the same things from the session. You may be surprised at how often other people will have a different understanding of the same interview.
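The debrief comparison can be sketched in a few lines (the takeaway lists here are invented for illustration): treating each observer's independently written takeaways as a set makes shared and divergent learnings explicit.

```python
# Hypothetical top takeaways written down independently by two observers
# after the same research session.
observer_a = {"confused by signup", "loved search", "ignored help link"}
observer_b = {"confused by signup", "wanted mobile app", "ignored help link"}

shared = observer_a & observer_b     # takeaways both observers noted
divergent = observer_a ^ observer_b  # takeaways only one observer noted

print("Agreed on:", sorted(shared))
print("Worth discussing:", sorted(divergent))
```

The divergent set is often the most valuable output of the debrief: it shows exactly where two people walked away from the same interview with different understandings.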

~~

Boxes and Arrows thanks Laura for sharing these insights with our readers! If you want to learn more about fast and effective research, we strongly recommend her book UX for Lean Startups: Faster, Smarter User Experience Research and Design and her talk “Beyond Landing Pages” from the 2013 Lean Startup Conference.

The UX Professionals’ Guide to Working with Agile Scrum Teams

Written by: Aviva Rosenstein

The adoption of Agile software development approaches is on the rise across our industry, which means UX professionals are more likely than ever to support Agile projects. Many UX professionals seem stymied by the challenge of effectively integrating UX within an Agile development framework–but there are others in our field who have encountered the same problems yet are finding effective solutions.

I first encountered Agile Development in 2005, when a team I supported was chosen to help pilot Scrum development methodology at Yahoo! Inc. There are a variety of Agile development approaches in use, but Scrum is currently the most popular: over 70% of software professionals using Agile methodologies employ some variant of the Scrum methodology.

When I left the company three years later, more than 150 teams at that company were using Scrum for developing both infrastructure and product features. In 2009, I moved on to Salesforce.com, where Agile methods (including Scrum) were implemented across their entire research and development organization.

In my experience, when product development is managed with an Agile development approach, user experience professionals are expected to find a way to work within the Agile framework to succeed. But, while team members may be offered training or even certification on Agile development practices, the training rarely discusses best practices for integrating UX design into the development process. And though internal surveys posted by my employers indicated that most employees were satisfied with Agile development practices, some of my UX colleagues privately expressed frustrations with the challenge of delivering a high quality user experience in Agile’s incremental release framework.

So, I decided to interview my UX colleagues for their perspective: What Agile practices were working well for them, and what specific pain points had they identified in the Scrum development process?

I reached out to seventy colleagues and received detailed responses from twenty UX professionals (including interaction designers, user researchers, and visual designers) who were actively supporting Scrum development teams. Many of the problems they reported indicated that both UX professionals and technical staff lacked a shared understanding of each other’s team roles and responsibilities. And other problems stemmed from UX practitioners feeling disconnected from the daily life of the development teams they supported.

Fortunately, for nearly every specific issue an informant raised as a pain point, some other colleague independently described an approach they had used to successfully resolve it with their team. By reading their responses, I learned that effective relationships between UX and technical staff could be created and sustained by actively involving scrum teams in the UX process, by active participation by UX professionals in team activities, and by frequent communication with team members about UX issues.

Here’s a summary of what worked well for UX professionals supporting Agile development teams, as well as some of their common pain points. At the end are recommendations for individual UX contributors, UX managers, Scrum Masters and product owners, based on my colleagues’ responses and my own experiences with Agile development.

Opinions on what’s working

Informants were asked: “Thinking about how you and your UX colleagues are working with your scrum teams, what’s going well?” Their answers included the following themes.

Trust and earned respect

Both designers and user researchers shared techniques for keeping product owners and developers informed and aware of their progress. Their practices included presenting information about their roles to teams, inviting teams to observe user research sessions, and sharing documents to track progress on usability issues.

Being transparent about the UX process helped some respondents foster trust between themselves, their product managers, and the technical staff on their scrum teams.

“I have a close relationship with the product managers I am working with, the dev manager for the scrum team and the developers themselves… I am transparent about my progress and share design iterations to get their feedback and opinions. In return, they trust me and accept the value of my expertise as a designer.”

Respondents also reported creating successful relationships with their product teams by involving their scrum teams in the UX process–especially by collaborating on UX issues with technical staff. Giving all ideas and contributions equal consideration regardless of the role of the originator, inviting all team members to give feedback on designs, and inviting them to participate in user research all helped promote developers’ ownership of design decisions.

“One of my teams has a lot of ideas and contributes a lot to the UX of the product. Sometimes they come up with ideas that I didn’t think of and it greatly improves the product’s usability.”

Be present in the life of the scrum team

Several respondents credited active team participation (beyond UX-specific activities) with their success in building relationships and fostering trust, and with achieving more of their user experience goals. Due to conflicting schedules across multiple teams, some UX professionals were often unable to attend all scrum meetings, but one called out the value of attending scrum meetings at least once a week.

“Being a constant voice in the development lifecycle helps keep the UX vision in line.”  

“Partaking in blitzes & logging bugs helps the team know you’re there and trying to help.”

Co-location was preferred, but those serving remote teams cited use of tools like Skype and IRC to maintain close connections.

“Being available and in the aisle to be a part of the conversations that happen spontaneously.”

“I’m on skype and talk to them all day long”

Being present also made it possible to take advantage of opportunities to educate scrum teams in the moment when they raised relevant questions:

“Gaining interest to discuss UI topics can often go well when the scrum team has a particular question or is unsure on a particular thing.”

Frequent communication outside of standard agile interactions

In addition to participation in regular scrum meetings, UX colleagues shared that meetings specifically devoted to coordinating UX activities with the team were successful in increasing ownership and a shared vision of the design direction.

“Regular design review meetings are helpful in keeping both the scrum teams and researchers in the loop with decisions that seem to change every 2 minutes. Regular check-ins with product owners are also helpful in knowing priorities.”

Types of meetings called out as particularly effective included regular check-ins with product owners for both user research and design, regular design review or design initiative meetings with scrum teams, and weekly meetings with those working specifically on front-end development (or even more frequently when preparing for usability tests).

Opinions on what wasn’t working

Informants were also asked to answer the question: “Thinking about how you and your UX colleagues are working with your scrum teams, what’s NOT working so well?” Answers included the following themes.

Conflicting expectations around quality, fit, and finish

Most of the concerns raised were related in some way to delivering a quality user experience–a key concern for everyone in UX regardless of role. Some people raised issues related to these conflicting expectations, specifically around a perceived lack of commitment to quality by developers and product owners. Perhaps because our view of the product is through the lens of the user experience, UX professionals pay more attention to fit and finish than product owners or other members of a scrum team when judging whether a release is ready for launch. Some informants felt that developers ignored specifications and resisted improvements, or that insufficient team resources were devoted to executing specifications aimed at improving product quality.

“The greatest obstacle is convincing the team to go the final mile to deliver a great experience.”

“The opinion that ‘This is good enough.’ Design implementation always goes to the bottom of the priority for the sake of MVP… Pushed to the next release and stay in the UX bug list forever.”

Lack of holistic planning and prioritization for the user experience

Informants raised concerns about designs being bolted onto existing products incrementally without concern for the overall product experience. Design managers who responded complained that too often they were brought in too late to the process, were left out of the loop on strategic planning, or were not adequately exposed to the product roadmap.

“No time is considered for design of the overall framework. Scrum teams jump into feature releases. They’re building inside out instead of outside in or holistically.”

Unclear expectations about the role of UX on the team

Many frustrations expressed by informants were due to a lack of clarity around the role of UX members on the scrum team. In some cases, people felt as though product owners and technical staff members did not clearly understand the skills of UX practitioners, their overall role in the development process, or the fact that they were a shared resource dividing their attention among multiple scrum teams. In other cases, the expectations held by different members of the scrum team about the timing and relationships between design and development activities appeared to be out of alignment.

Some informants raised concerns about product owners or developers thinking of design only in very tactical terms and not recognizing the value that UX brings to the product ideation process. UX team members expected to be included in developing product strategy, but some reported that they weren’t brought into the process until after requirements were set and coding had started, leading to problems with the overall user experience delivered:

“…when it comes to designing a new product or larger holistic experience, design doesn’t get looped in until after the idea is sold and a launch date is picked and devs start working. Design needs to align earlier and sooner with the business owners to ideate and come up with a great design.”

UX team members may expect ownership of the design of the user interface, including decisions about overall information architecture and interaction models–but this expectation will not necessarily be shared by members of the technical staff on Agile teams, who may perceive the role of the designer as simply “skinning” the user interface. This creates difficulties when developers code elements of the user interface ahead of, or at odds with, UX work and specifications still in progress.

“UX was seen as pixel pushers who made things look nice after the developers built their features”

“Developers build whatever UI they think is appropriate while a designer is working on design iteration or testing.”

Perception of UX as less valued than development

Some informants raised concerns about how UX was valued as part of product development. In particular, one respondent perceived his organization as having a “developer-centric culture” that often dismissed UX input, resulting in usability and utility deficits in the product.

“They valued developers over everyone else, to the expense of everyone else being productive. PMs and Dev managers had an exclusive relationship with a few key developers and worked on product direction and user experience direction without any involvement from the UX team… Needless to say, the product they deliver has a lot of usability problems.”

Perception that technical staff is disinterested in users’ needs

In addition to dismissing the value of UX team members’ contributions, some informants characterized technical team members as lacking empathy with the needs of their end users:

“Sometimes engineers are thinking of the solution without understanding the need. This is understandable, but it is taking a lot of education to get the idea of understanding the need and then building a solution to fulfill that need in the minds of our engineers.”

Being disconnected from the regular activities of the scrum team

Being separated from other team members either by distance or by allocation had a negative impact on UX team members. This included difficulties navigating time differences and exclusion from remote meetings as well as missed opportunities to bond with their scrum teams.

“Difficulty in being aligned with the engineering team which is mostly in India. Timing is difficult. Stand ups are near impossible to join as they are late evening for us in the US.”

Informants who served multiple teams reported difficulties with managing time and workload, and also were concerned about managing their teammates’ expectations about their availability. User researchers, who were often supporting three or more teams, were most likely to report problems with time management and team integration, but this problem could impact any UX team member with responsibilities for more than one scrum team.

“The challenge is that supporting two teams that are both on nearly identical timelines creates a time crunch. Some of that ideal process gets cramped and I end up just getting the basics done, just in time.”

Recommendations

A few of the respondents were generally unhappy with the Agile approach to development and expressed nostalgia for waterfall development. But when I looked closely at their responses, it seemed their dissatisfaction with Agile related to uncertainty about how to integrate UX into their scrum teams’ development processes or their discomfort with discussing technical topics.

Helping new UX team members with time management skills, with improving their estimation of UX work, and with understanding the roles and responsibilities of everyone on the project team may help improve their satisfaction and effectiveness with Agile teamwork.

Although UX managers can begin improving relationships between technical and design staff by offering more training in Agile techniques to UX team members, real change will require participation from product owners, scrum masters, and technical leadership. As one participant wrote:

“The culture and attitudes really have an impact whether UX-scrum team relationships can be successful. It’s a two way effort and it doesn’t work when one of the parties is unwilling.”

The suggestions below are targeted at specific roles in Scrum and on the UX team (product owners, scrum masters, UX managers, interaction designers, and user researchers) as well as at those responsible for the employee on-boarding process (including Agile trainers and coaches). Some recommendations are relevant to more than one role, so they may appear in multiple sections.

Employee onboarding (development as well as UX)

To encourage an atmosphere of trust and understanding between UX and development staff, and clarify the role of UX for everyone on the team, consider:

  1. Explicitly training people to recognize that including specialists on teams may be necessary for some projects or sprints, and to reject the old Agile dogma that openly denigrated specialization.

  2. Including training about UX practices and process in organizational training for developers, scrum masters, and product owners (including the relevant recommendations below).

  3. Setting clear expectations for involving UX in team activities.

  4. Setting expectations early that developers and product owners participate regularly in customer contact opportunities and ideation sessions around user needs (such as design studios).

Scrum masters

To encourage an atmosphere of trust and understanding between UX and development staff and clarify the role of UX for everyone on the team, consider:

  1. Team intros at project kickoff. At the beginning of each project, give each team member a brief chance to introduce themselves and explain what they will be doing and how they need to integrate with other team members. Allow team members to ask questions and clarify answers as needed. If there are serious disconnects between the expectations of different team members, use this time to achieve consensus about the role of everyone on the team.

  2. More in-depth definitions of each role on the team.  Give a member of each discipline a chance to deliver a presentation or talk to the larger team about their skills, their background, their experience, and the tools or techniques they use in their role. This will help developers understand what UX team members do plus help UX team members understand the roles of different members of the technical staff.

  3. Including UX team in synchronous and asynchronous communication channels (such as Skype, IRC or other chat systems.)

  4. Including UX goals and needs in sprint retrospectives.

To create shared team ownership of expectations for fit and finish, consider clarifying the definition of ‘done’ to include UX criteria.

To enhance project planning and prioritization, consider improving estimation for UX efforts by:

  1. Adding knowledge acquisition activities and design exploration work to the product backlog.

  2. Separating design effort on each story from implementation effort in product backlogs.

  3. Experimenting with tools and practices that have been used elsewhere to improve estimation and tracking of UX work across the feature lifecycle or within the context of a particular release, such as story mapping, design spikes, and UX matrices.
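As a hypothetical sketch of the second recommendation (the field names and point values are invented), a backlog entry can carry separate estimates for design effort and implementation effort, so UX work stays visible during sprint planning:

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    story: str
    persona: str         # which user the story serves
    design_points: int   # estimated design/UX effort, tracked separately
    dev_points: int      # estimated implementation effort

items = [
    BacklogItem("Redesign signup flow", "New user",
                design_points=3, dev_points=5),
    BacklogItem("Fix issues from last usability test", "Power user",
                design_points=1, dev_points=2),
]

# Separate totals let the team see UX capacity alongside dev capacity.
total_design = sum(i.design_points for i in items)
total_dev = sum(i.dev_points for i in items)
print(f"Design: {total_design} pts, Implementation: {total_dev} pts")
```

Keeping the two estimates side by side in the backlog is one lightweight way to surface when design work is being squeezed by implementation commitments.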

To foster understanding and empathy for the needs of users, consider hanging appropriate persona posters in the team’s work area or scrum room.

Product owners

To improve holistic planning outcomes, consider:

  1. Drawing on the expertise of design managers and leads. Invite them to participate in early strategy and product ideation sessions.

  2. Identifying and validating core needs of target users before initiating development (and capturing that information in product personas.)

  3. Using prioritized personas to groom the backlog.

To foster understanding and empathy for the needs of users across the team, consider:

  1. Associating user stories with specific personas.

  2. Scheduling and participating in a persona development process if appropriate personas aren’t available.

  3. Encouraging team participation or observation in user research activities. Consider making this participation explicit in the backlog so it doesn’t negatively impact velocity estimations. Opportunities may include joining site visits, speaking to users at events and observing usability studies.

To clarify expectations for fit and finish, consider:

  1. Including UX criteria in the definition of done.

  2. Setting clear UX goals for each sprint.

UX managers

To enable more holistic planning, set expectations with product management and executives for UX participation in product strategy meetings at all levels.

To increase team communication across business areas or large projects, create and support mechanisms for communication about priorities, design themes and patterns, and design efforts in progress.

To enable stronger relationships to form between designers and scrum teams, consider:

  1. Limiting the number of teams each designer supports during any one release.

  2. Improving estimation for UX efforts across business areas by tracking velocities for UX across each area with a UX matrix, or maintaining a master backlog of all UX activities in conjunction with scrum masters. This data will eventually help support your requests for additional headcount.

To clarify the role of UX for everyone on the team, provide regular Agile UX training for new hires. This training should cover:

  1. Known effective tools and practices, including design studios, story mapping, design spikes, RITE studies, and unmoderated usability tests (including click tests, cardsorts and tree testing.)

  2. Techniques for estimating and tracking design work.

  3. Explicit training about the role of UX within the Agile development process and expectations for how UX team members interact with technical staff.

Interaction designers and user researchers

To improve involvement of scrum teams in the UX process, consider:

  1. Inviting all team members to give feedback on design directions and listening to design ideas from everyone on the team, regardless of role. Design studios, product walkthroughs, usability test debriefs and user research data interpretation sessions are all effective ways of soliciting this input.

  2. Inviting teams to participate in user research activities such as joining site visits, speaking to users at events and observing usability studies.

  3. Leveraging opportunities to provide more information about your role and about UX in general whenever a team member asks questions about your work.

To improve relationships and trust with stakeholders and team members, consider:

  1. Increasing your visibility in the life of the scrum team.

  2. Calling meetings outside of the standard agile interactions when necessary.

  3. Providing access to works in progress in a collaborative workspace.

  4. Listing UX issues and tracking their status in a shared document.

To foster understanding and empathy for the needs of users, consider:

  1. Reviewing appropriate design personas with the product owner and scrum team at the start of each release, and assigning priorities to each.

  2. Hanging persona posters in the Scrum room as reminders.

  3. Associating user stories with specific personas.

  4. Including the product owner and scrum team in the persona development process if appropriate personas aren’t available or complete.

Conclusion

The Agile Manifesto was written to promote better ways of developing software, but the twelve principles behind it are relevant to everyone involved in the process of software delivery, not just those who code. Better integration of UX specialists will result in better outcomes for the business and for developers who work with UX.

In the words of Scrum Alliance founder Mike Cohn, “Agile does not at all require individuals to be generalists, but individuals are expected to work together as a team.”

For Scrum and Agile to live up to their full potential, they must address the needs of all team contributors, not just software developers. Giving support and trust to UX contributors will help motivate them to do their best work and leverage more of their skills in the pursuit of excellence.

Integrating UX into the Product Backlog

Written by: Jon Innes

Teams moving to agile often struggle to integrate agile with best practices in user-centered design (UCD) and user experience (UX) in general. Fortunately, using a UX Integration Matrix helps integrate UX and agile by including UX information and requirements right in the product backlog.

While both agile and UX methods share some best practices—like iteration and defining requirements based on stories about users—agile and UX methods evolved for different purposes, supporting different values. Agile methods were developed without consideration for UX best practices. Early agile pioneers were working on in-house IT projects (custom software) or enterprise software [1, 2].

The economics of selling consumer products are different from those of developing software for enterprises: UX matters more for consumer products. Jeff Bezos cares more about whether users can click the button that puts money in his pocket than Larry Ellison cares about any button in Oracle software. Larry makes money even if people can’t use his software; Oracle sells support contracts and professional services to fix things customers don’t like. Amazon and other online businesses can’t operate like that. They have to get the UX right, or they go out of business fast. User experience factors rarely get the same level of consideration when the end user is not the same person as the paying customer [3].

Agile teams and UX problems

I’ve encountered two problems common among agile teams when helping them improve the user experience of their products or services:

  1. UX work is frequently overlooked during the release and sprint planning efforts.
  2. Teams often fail to measure the UX impact of their iterative efforts.

These two problems become more serious when combined.

When UX work goes undone and the impact is not measured, the team doing the work has no idea what is going on. The feedback loop is broken. Both agile and UX methods emphasize iteration, but productive iteration requires good feedback loops.

You can conduct development iterations (the focus of agile) or design iterations (the focus of UX), but if you fail to measure the impact of the iteration, you won’t see the real benefits of an iterative process. You will have no real idea if your offering is any closer to meeting the needs of the end user. The User Experience Integration Matrix (UXI Matrix) addresses these problems by tying UX to the project backlog.

Integrating UX into the product backlog

Scrum (one of the most popular variants of agile) advocates that you create a Product Backlog, a collection of stories that describe user needs. The team iteratively refines the requirements, from rough (and often ill-defined) to more specific. This is achieved using stories from a user’s perspective, which have conditions of satisfaction associated with them. This concept is adapted from UCD’s scenario-based design methods. In my view, this is far better than traditional approaches to documenting requirements, which are often detached from users’ goals.

Various agile gurus [4, 5] discuss how to break down requirements from high-level stories to user needs targeted at specific users. Even if your team follows Jeff Patton’s story mapping method [6] (which I highly recommend) to create structured hierarchical representations, you’ll often find you want to analyze the stories by different factors as you groom your backlog.

I’ve worked with teams who want to analyze stories the following ways:

  1. Order of dependency or workflow (self-service password reset depends on user registration)
  2. Criticality (which stories must be done so customers pay us next month)
  3. Effort to complete (show me all the epics)
  4. Related stories (find requirements with related UI patterns)
  5. Role or persona (show me all the stories that impact persona X)
  6. UX impact (which stories have a high impact on the UX, so we can focus on those)

If the project is small (both in number of team members and number of stories) you might be able to get away with rearranging story cards on the wall. However, in my experience, things inevitably get more complex. You often want to consider multiple factors when reviewing the backlog. A UXI Matrix helps you track and view these multiple factors.

Creating a UX Integration Matrix

The UXI Matrix is a simple, flexible tool that extends the concept of the product backlog to include UX factors not normally tracked by agile teams. To create a UX Integration Matrix, you add several UX-related data points to your user stories:

  • Persona (possible values: a persona’s name): Identifies the persona a user story applies to.
  • UX complexity (1 to 5, or Fibonacci numbers if you’re into that sort of thing): Estimates design effort separate from implementation effort.
  • Story verified (Y/N): Is this story fiction or fact? Is it based on user research or customer input?
  • Design complete (Y/N): Is the design coherent enough for development to be able to code it (or estimate how long it would take to code)?
  • Task completion rate (0 to 100%): The percentage of users who have been observed to complete the story without any assistance.
  • Staffing (a resource’s name): Who owns the design, at whatever level of fidelity is agreed to.

With these columns added, your product backlog begins to look something like the spreadsheet in figure 1.



Figure 1: UX Integration Matrix, a simplified example


Advantages to using the UXI Matrix

The UXI Matrix helps teams integrate UX best practices and user-centered design by inserting UX at every level of the agile process.

Groom the backlog

During release and sprint planning you can sort, group, and filter user stories in Excel by:

  • Themes or epics, or anything else you want to add via new columns
  • A primary persona, a set of personas, or the number of personas associated (more = higher)
  • Stories you’ve verified via user research, or ones you still need to verify
  • Stories you’ve completed but need to refine based on metrics

You can also export stories into a format that can be used in information architecture work, and explore related UX deliverables like personas, mockups, and research findings via hyperlinks to them.

Note the summary rows near the bottom of the example in Figure 1. Those values can help you prioritize new and existing stories based on various UX factors.
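To make the grooming queries concrete, here is a minimal sketch in Python of a backlog carrying UXI Matrix columns. The stories, personas, and values are invented for illustration; in practice this data would live in your Excel matrix or backlog tool:

```python
# A toy backlog where each row carries UXI Matrix columns.
# All stories, personas, and values below are invented for illustration.
backlog = [
    {"story": "User registers an account", "persona": "New Visitor",
     "ux_complexity": 3, "story_verified": True, "design_complete": True},
    {"story": "User resets password", "persona": "Returning User",
     "ux_complexity": 2, "story_verified": False, "design_complete": False},
    {"story": "Admin exports usage report", "persona": "Administrator",
     "ux_complexity": 5, "story_verified": True, "design_complete": False},
]

# Stories still needing user research before they can be trusted.
unverified = [row["story"] for row in backlog if not row["story_verified"]]

# Stories the design team still owes to development.
designs_outstanding = [row["story"] for row in backlog if not row["design_complete"]]

# Total estimated design effort per persona, a rough prioritization signal.
effort_by_persona = {}
for row in backlog:
    effort_by_persona[row["persona"]] = (
        effort_by_persona.get(row["persona"], 0) + row["ux_complexity"]
    )

print(unverified)
print(designs_outstanding)
print(effort_by_persona)
```

The same filters map directly onto Excel sort and filter operations; the point is that once UX columns sit next to the stories, every grooming question becomes a one-line query.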

Reduce design overhead

Perhaps my experience is unusual, but even when I’ve worked on teams as small as seven people, we still had trouble identifying redundant user stories or personas. My heuristic is that if a story shares several personas with another story in a multi-user system, then that story may be a duplicate. Grouping by themes can also help here.

Another useful heuristic is that if a persona shares a large list of user stories with another persona, the two should likely be merged. Most of the time, personas that do exactly the same things with a product can use the same design, unless of course they have very different skills, which becomes evident when reviewing the personas or conducting the user research that all good personas are based on (in my view).
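That merge heuristic can be expressed mechanically: treat each persona as the set of stories it appears in, and flag pairs whose story sets overlap heavily. A rough sketch in Python follows; the personas, stories, and the 0.8 similarity threshold are invented for illustration:

```python
from itertools import combinations

# Invented mapping of personas to the user stories they are associated with.
persona_stories = {
    "Admin Amy": {"reset password", "manage users", "export report"},
    "Ops Oscar": {"reset password", "manage users", "export report"},
    "Visitor Vic": {"register account", "browse catalog"},
}

def merge_candidates(personas, threshold=0.8):
    """Flag persona pairs whose story sets overlap heavily (Jaccard similarity)."""
    candidates = []
    for a, b in combinations(personas, 2):
        overlap = len(personas[a] & personas[b]) / len(personas[a] | personas[b])
        if overlap >= threshold:
            candidates.append((a, b))
    return candidates

print(merge_candidates(persona_stories))
```

A flagged pair is only a prompt to review the personas side by side, not an automatic merge; as noted above, differing skill levels can justify keeping near-identical personas apart.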

Facilitate collaboration

Another major benefit of the UXI Matrix format is you can share it with remote team members.

Listing assigned staff provides visibility into who’s doing what (see the columns under the heading Staffing). Team members can figure out who’s working on related stories and check on what’s complete, especially if you create a hyperlink to the design or research materials right there in the matrix.

For example, if there’s a Y in the Design Complete column, you can click the hyperlink on Y to review the design mockup. I’ve worked with teams who like to track review states here instead of just Y or N: work in progress (WIP), draft (D), reviewed (R), etc.

Track user involvement and other UX metrics

The UX Integration Matrix also helps track and share key UX metrics. One key metric is team contact with real end users. For example, if you’ve talked to real users to validate a persona, how many did you speak with? Another good metric is how many users have been involved in testing the design (via usability tests, A/B tests, or otherwise).

You can also capture objective, quantitative UX metrics like task completion rates, click/conversion rates, and satisfaction rates by persona. It makes it easier to convince the team to revisit previous designs when metrics show users cannot use a proposed design, or are unsatisfied with the current product or service. It can also be useful to track satisfaction by user story (or story specific stats from multivariate testing) in a column right next to the story.

Review UX metrics during sprint retrospectives

Scrum-style reviews and retrospectives are critical to improving both the design and team performance. However, few teams consider UX metrics as part of the retrospective. During these meetings, it’s helpful to have UX metrics next to the stories you are reviewing with the team.

I ask the team to answer these UX-related questions during the retrospective:

  1. Did we meet our UX goals for the sprint?
    1. Does our user research show that what we built is useful and usable?
    2. Are users satisfied with the new functionality (new stories)?
    3. Are users likely to promote our product or service (overall) to others?
    4. Do we have product usage metrics (via site analytics) that meet our expectations?
  2. Is there existing functionality (stories) that needs to be refined before we add new stuff?
  3. What user research should we do to prioritize stories going forward?
  4. Do we have enough staff with UX skills to meet our objectives?
  5. Going forward, should we:
    1. Work a sprint ahead to ensure we validate UX assumptions?
    2. Do a spike to refine a critical part of the UX?
    3. Refocus UX work on select user stories or personas?
    4. Improve our feedback mechanisms to capture factors we are missing today?


Annotated example

Figure 2: The UX Integration Matrix inserts key user experience activities and context into the product backlog, adjacent to user stories.

Improving your team’s design decisions

Once you start tracking in the UX Integration Matrix, it becomes easier to have informed discussions during reviews and retrospectives. I use the UXI Matrix to set UX goals with the team, help prioritize stories in the backlog, track UX work in progress, and help answer the classic agile question of “what does done mean,” not just for the entire product or service, but for individual stories.

I’d be curious to hear from others who would like to share their experiences with variations of the above or similar methods. On the other hand, if you’re an agile person who thinks all this is very non-agile, I’ll ask: can you really prove your method creates a better UX without this stuff?

Start using the UXI Matrix in your next sprint. Download and customize this Excel template to get started:

[1] Larman, Craig. Agile and Iterative Development: A Manager’s Guide. Pearson Education, 2004.

[2] Sutherland, Jeff. “Agile Development: Lessons learned from the first scrum”. Cutter Agile Project Management Advisory Service, Executive Update, Vol. 5, No. 20, 2004: http://www.scrumalliance.org/resources/35.

[3] Cagan, Martin. Inspired: How To Create Products Customers Love. SVPG Press, 2010.

[4] Cohn, Mike. User Stories Applied: For Agile Software Development. Addison-Wesley Professional, 2004.

[5] Cockburn, Alistair. Agile Software Development. Addison-Wesley Professional, 2001.

[6] Patton, Jeff; “It’s All In How You Slice It”, Better Software, January 2005: http://www.agileproductdesign.com/writing/how_you_slice_it.pdf.

Case study of agile and UCD working together

Written by: James Kelway

Large-scale websites require groups of specialists to design and develop a product that will be a commercial success. Developing a completely new site requires several teams to collaborate, and this can be difficult, particularly when different teams are working with different methods.

This case study shows how the ComputerWeekly user experience team integrated with an agile development group. It’s important to note the methods we used do not guarantee getting the job done. People make or break any project. Finding and retaining good people is the most important ingredient for success.

The brief

In 2008, we were tasked with resurrecting a tired, old, and ineffective site. It was badly out of date, and the information architecture failed both users and search engines.

The old computerweekly.com

Our goals were:

  1. Make content visible and easy to find
  2. Create an enjoyable and valuable user experience so users would return
  3. Increase page impressions to bring in ad revenue
  4. Allow site staff to present more rich media content
  5. Give the site more personality and interactivity

The UX team created personas from ethnographic studies, online surveys, and in-depth interviews with users. The data gave us a clear idea of users’ needs and wants. We also gleaned data from analytics that told us where users engaged and where bounce rates were highest.

At this point the development team maintained the site with an agile process. They created features for the new site in parallel with ongoing site maintenance, but this work sat outside the normal maintenance sprints. The new site was treated as an entirely new project, with a separate budget and a longer-term schedule.

Boundary Spanner

As the User Experience team gathered data, key team members were recruited. The diagram below shows the key team members needed to produce this large-scale site, their specific concerns, and their methodologies.

Boundary Spanner

To bring these teams and disparate elements together requires a launch manager or ‘boundary spanner’. Rizal Sebastian wrote about boundary spanners in Design Issues in 2005.1 The boundary spanner needs to be aware of the individual issues the project faces. They need not know every detail, but they must understand the cultural context of the collaborative environment.

Do people get on with each other? Are communication lines clear? Are there any personality clashes on the team? To throw developers, interface designers, business analysts, SEO experts, and usability specialists together and expect them all to gel is optimistic at best. It also requires that those people devote all their time to just one project, which is unrealistic in a large operation where several projects are underway simultaneously.

The boundary spanner is more than a project manager, because the user, not the project, is at the heart of their concerns. They are responsible for delivering and continually developing a quality product, not just monitoring features on a checklist. The user is at the center of all decisions on the design and development of the site. This is the only way to ensure the user will be heard throughout product development: employ somebody who listens to user voices and never forgets what they said. The boundary spanner must also ensure that SEO and business requirements are satisfied and that a well-defined strategy is in place. They own the vision and communicate it clearly until the whole team understands.

The boundary spanner’s strength is that they are core to the product rather than to a department or team known for a particular skillset (like a UX team, for example). In many cases this is a product manager, but in our case it was the website editor who was responsible for synchronizing the teams.

Defining a process

To support this user-focused approach, the IAs produced a set of deliverables that ensured the launch manager’s vision could be realized and developed.

Defining a process

Diagramming the process using a concept model engaged key stakeholders with the project by communicating the vision of what the UX team would achieve with the speed and clarity of an elevator pitch.

Information gathering

A content audit revealed that broken links, redundant content, and poor usability plagued the site. It also revealed how much content became lost the second it moved off the home page. The high-value research papers were impossible to find.

Thirty interviews, 20 ethnographic studies, and 950 responses to an online survey informed six personas. With the content audit, personas, and business objectives (what we wanted users to do on our site), we began creating the taxonomy.

Analytics

During this project we were very fortunate to work alongside the web analytics manager who provided insight into user behavior, including high bounce rates from visitors arriving from search engines. He also provided a scorecard that showed where the site failed in terms of traffic and user engagement. We knew what users were searching for, and pretty quickly could tell why they were not finding content we knew we had.

Analytics screen showing overlay on the new website

By looking at web metrics we came to understand usage patterns and the popular and unpopular areas of the site. The depth of information enabled us to formulate a plan quickly.

Persona driven taxonomy

As we knew our user base was made up of industry experts, we also knew their vocabulary related to specific areas of their markets.

The taxonomy was created by gathering as many industry sources as possible (websites, journals, white papers), breaking these down into unique elements, and clustering those elements together to form categories for our content.

The interface used to manage the CW taxonomy

The CW taxonomy formed the basis of the navigation, content categorization, and highlighted areas for future development. It also ensured our search results served up related content in context.

Search results displayed contextual related content

We defined four main areas by looking at the community of users.

ComputerWeekly Concept

News was an obvious requirement, defined by users’ particular areas of interest within the sector. The need for knowledge was evident, so we created an in-depth research area where case studies and white papers could be easily accessed. Tools and services, RSS, email news alerts, and newsletters reflected users’ need to be kept up to date and in tune with their specializations.

Finally, although the CW community was secretive and did not divulge information amongst their peers, they were very interested in expert opinions. This need gave rise to much more integrated blog postings.

Interface development

The navigation scheme defined the elements of the page the users would use to move to other areas in the site. It clarified the naming of those items on the page.

Sitemap

We then considered the prioritization of information and content for each page. This facilitated the production of wireframes that represented the culmination of all our research, showed the interface and interactions for elements on the page, and served as a working document that changed as we iterated the design.

Core and Paths

We used Are Halland’s method for ‘designing findability inside out.’2

  • Prioritize and design the core – Satisfy user goals using prioritized content and functionality.
  • Design inward paths to the core – Consider how users will arrive at the page from search engine results, facets, menus, search, RSS, aggregation, email, etc.
  • Offer relevant outward paths from the core – Ensure that the site delivers both user and business goals through clear calls to action and completion of interaction tasks.

For CW.com, we focused on the cores for the home page, a channel-level home page, and a news article page, and looked at key content such as the lead news story, the editor’s picks, and the best from the web aggregated from external sources. The key functionality and supporting content also had to be included and prioritized on the page.

Next we considered the inward paths, which are the channels that our users are likely to utilize to arrive at the page.

Inward paths

Inward paths included search engines, blogs, bookmarks, syndication, aggregators, RSS, and email subscriptions. Considering inward paths helped us focus on the marketing channels we needed to drive users to the relevant type of page. It also focused us on the keywords and themes users are likely to use, and helped us optimize pages for search and online marketing campaigns.

Finally we designed the outward paths that helped users complete online tasks for our business objectives.

Outward paths

These outward paths include:

  • Newsletter sign-up
  • Inline links to related articles to drive page consumption
  • Sharing, printing or emailing of news articles
  • Related content types such as video or audio
  • Stimulating community participation in forums or blogs
  • Contextual navigation to aggregated content or the editors’ best bets
  • Subscription to an RSS feed

Prioritizing the Content

Once the wireframes had been approved, the content was organized so the most commercially valuable and user focused content was pushed to the top of the page. As the design went through user testing, certain elements changed, as with any iterative process, but through team collaboration, the solution remained true to the initial vision from concept to design delivery.

The development cycle

The wireframes were handed over to creative, and they began designing the interface and graphic elements. The development group released some functional elements to the old website before the relaunch.

Widget

These agile methods allowed the old site to feel the benefits of the new widgets. However, because the design changed so radically, we still had to release the new site in an old-style ‘big-bang’ manner. This is perhaps where agile has its problems as a methodology for new launches: its focus on many small releases makes it a great tool for implementing incremental changes, but not for launching a completely new site.

As the flat HTML pages were passed to the team, the SEO requirements were defined and built into the site. By the time the site launched, it had been through four major rounds of testing.

Development Timeline

A holistic solution

Providing user experience deliverables like the concept map and wireframes ensured more comprehensive requirements were delivered to the design and development team. This approach addressed marketing, editorial, sales, and business requirements next to the needs and wants of the user. The vision was aligned with an achievable delivery from the IT team that ensured we delivered the site we wanted to give the user.

The new computerweekly.com

The core IA work enabled the new site to be future-focused and versatile. The structure and design of good sites should be able to adapt to change.

User-centered design and agile can work alongside each other, but what matters more is having people who can tie together all the loose strands of a website design and development cycle. The concept map, wireframes, and the IA strategy document that recorded the rationale behind the design decisions helped ensure the product vision was communicated correctly to the development team, so the product developed through their agile processes stayed in line with the overall vision.

1 http://www.mitpressjournals.org/doi/abs/10.1162/0747936053103020?cookieSet=1&journalCode=desi

2 http://www.slideshare.net/aregh/core-and-paths-designing-findability-from-the-inside-and-out

Bringing User Centered Design to the Agile Environment

Written by: Anthony Colfelt

When the exciting opportunity to work in a post-bubble dot-com startup arose, I jumped to take it. I had the luxury of doing things exactly as I thought right, and for a while it was truly fantastic. I built a team with a dedicated user researcher, an information architect, and interaction and visual designers; we even made a guerrilla usability lab and held regular test sessions.

Unfortunately, the enthusiasm I had for my new job waned after six months, when an executive was appointed Head of Product Development who insisted he knew SCRUM1 better than anybody. As the Creative Director, I deferred authority to him to develop the product as he saw fit. I had worked with SCRUM before, done training with Ken Schwaber (author and co-founder of the Agile Alliance), and knew a few things from experience about how to achieve some success integrating a design team within SCRUM. This required the design team to work a “Sprint” (month-long iteration) ahead of the development team. But the new executive insisted that SCRUM had to be done by the book, which meant all activities had to be included within the same Sprint, including design.

Requirements came from the imagination of the Head of Product Development; design was rushed and ill-conceived as a result of time pressure; development was equally rushed and hacked together, or worse, unfinished. The end-of-Sprint debriefing meetings reliably consisted of a dressing-down of the entire team by the executives (since nobody had delivered what they’d committed to; i.e., they had tried to do too much, or had not done enough). Each Sprint consisted of trying to fix the mess from the Sprint before, or brushing it under the carpet and developing something unstable atop the code-garbage. Morale languished, the product stank, good staff began to leave… it was horrible.

This is an extreme example of SCRUM gone bad. I am not anti-Agile, although I’ve been bitten a few times and feel trepidation when I hear someone singing its praises without having much experience with it. Over the last eight years, I’ve seen Agile badly implemented far more often than well (and yes, it can be done well, too). The result is mediocre product released in as much time as it would have taken a good team to release great product using a waterfall approach. In this article, I will describe Agile and attempt to illuminate a potential minefield for those who are swept up in the fervor of this development trend and want to jump in headlong. Then I will present how practices within User Centered Design (UCD) can mitigate the inherent risks of Agile, and how these may be integrated within Agile development approaches.

Where did Agile come from?

Envisioned by a group of developers, Agile is an iterative development approach that takes small steps toward defining a product or service. At the end of each step, we have something built that we could release to the market if we chose to; it can therefore assure some speed to market where waterfall methods usually fail. Agile prefers to work out how to build something as we go, rather than do a waterfall-style deep dive into specification only to find we can’t build parts of the spec for some reason, e.g. a misjudgment of feasibility, a misjudgment of time to build, or changing requirements.

A group of developers including Kent Beck, Martin Fowler, and Ken Schwaber got together to synthesize what they had discovered were the most effective ways to develop software, and the Agile Alliance was born. It released a manifesto2 to describe its tenets and how it differs from waterfall methods.

Agile can be thought of as a risk-management strategy. Often developers are approached directly by a client who does not know what a user experience designer, information architect, or user interface designer is. People in such roles usually interpret what clients want and translate it into some kind of specification for developers. Without them, it’s down to the developer to work out and build what the customer wants. Because Agile requires a lot of engagement with the client (i.e. at the end of every iteration, which can be as short as a week), it mitigates the risk of going too far toward creating something the client doesn’t want. As such, it is a coping mechanism for a client’s shifting requirements during development as they begin to articulate what they want. To quote the Agile Manifesto’s principles: “Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.”

Why do people rave about it?

At the heart of what makes Agile attractive is the possibility of a quicker return on investment for development effort, because we can release software earlier than we would have otherwise. In the short term, this is typically borne out. In the long term it can be too, though only when the team hasn’t fallen victim to temptation (more on that later). Agile is also good at generating momentum, because the iterations act as a drumbeat to which the team marches toward manageable deadlines. The regular “push” to finish a sprint ensures that things move along swiftly. Agile is also good at avoiding feature bloat, by encouraging developers to do only what is necessary to meet requirements.

Because it emphasizes face-to-face contact for a multidisciplinary team, Agile tends to encourage contribution from different perspectives. This is generally a positive influence on pragmatism, innovation, and speed of issue resolution. The team is empowered to decide how requirements should best be met.

The Minefield

In and of itself, Agile does a good job of flexing to the winds of change. But one has to ask whether it was devised to treat a symptom of a larger cause: the business doesn’t know what it wants. While Agile enables the development team to better cope with this, it doesn’t solve the problem, and in most cases it creates new problems.

Mine 1: An unclear role for design

In the best cases, when a business approaches developers to build some software, some of those developers may have design skills. But that’s not a particularly common scenario. Many developers have also had bad experiences with designers who don’t know what they’re doing. It took a long time for the design profession to come to grips with designing for complex systems, and there is still a deficit of expertise in this field. “Business people and developers must work together daily throughout the project” is another principle of Agile. Where does the designer fit into the frame?

Mine 2: The requirements gathering process is not defined

Agile accommodates design activities from the perspective of a developer. It tends to shoe-horn these activities into a view of the world where requirements fall from the sky (from a business or customer who is assumed to be all-knowing) and takes for granted that those requirements are appropriate.

According to Ken Schwaber, SCRUM is intended to be a holistic management methodology and leaves space for activities other than programming within the framework of iterative cycles.1 But when organizations adopt SCRUM, too often the good parts of a waterfall process, like research and forming a high-level blueprint for the overall design, become the proverbial baby thrown out with the documentation bathwater. As the Agile Manifesto says, “Working software over comprehensive documentation.”2 Many latch onto this and refuse to do any documentation that might outline a vision, even in a rudimentary sense.

Mine 3: Pressure to cut corners

Implementations of Agile that put design activities in the same iteration as development ensure that designs are achievable in code. But they also put tremendous pressure on the experience design team to ‘feed the development machine’ in time for it to implement their vision. This can and does lead to impulsive design. So, what’s wrong with that? Nothing, if you’re not adhering to user-centric principles, which suggest you should test ideas with end users before committing them to code.

Some assert that there are plenty of examples of best-practice interfaces out there to copy. So why reinvent the wheel? Surely we can save time that way? Sometimes they’re right, but how will we know which best-practice interface works best in the context of the user’s goals, with no time to test with the user? And how can we innovate by copying what already exists? Before Google reinvented internet search, other search engines assumed a status quo that required the user to learn how to form proper search queries. It was institutional knowledge among those search engines that this was how searching was done, and customers simply had to learn to use it. Most people’s search results were poor at best. Then Google came along and realized what is now obvious: people just want to find what they’re looking for, not learn how to drive a search engine first. I’m not suggesting the other search engines could not have done what Google did sooner, but I am pointing the finger at a mentality that made them miss the opportunity. Interestingly, Google is not known for its designers. It’s mainly a development house, but plenty of those developers can clearly put a design hat on too.

There is absolutely nothing wrong with using Agile to produce results quickly; that is, provided you don’t intend to release them on your poor, unsuspecting users without some usability testing. Just don’t be fooled into thinking this will save you a lot of time if you want your new product to be right, because you will have to iterate to arrive at an appropriate solution. Alan Cooper has argued that this creates a kind of ‘scar tissue’, where code that has to be changed or modified leaves a ‘scar’ that makes the foundations of the program unsound.4

Mine 4: The temptation to call it “good enough”

Invariably, when we have release-ready working code at the end of each cycle, even if it’s sub-optimal, there’s a strong temptation to release it because we can. Agile condones releasing whatever we have so long as it works. Sometimes that means doing what we can get away with, not what is ultimately best for the user. Equally, if we do decide that a feature isn’t right yet, its amendments get fed back into the requirements backlog, where temptation strikes again. Should we spend time in our next iteration on a feature that we’ve already got a version of? Or shall we develop something new instead? Too often, the rework gets left in favor of exciting new stuff. And so we go on, building a product full of features that don’t quite meet the bar.

Typical Agile Development

Mine 5: Insufficient risk-free conceptual exploration time

Iteration “zero” (i.e. a planning and design iteration prior to the first development iteration) can be used for this and other planning activities. However, depending on how long this iteration is, the level of rigor applied to exploration may be insufficient. Some Agile practitioners argue that a working example of a solution, exposed to the market, is the best way to validate whether it is the right one. This ‘suck it and see’ approach bypasses an activity called “concepting.” Concept activities dedicate time to sketching different solutions at a high level and validating them in the rough with users before digging into detailed design or code. “Suck it and see” would have us just build it, launch it, and see if it flies. That way, we’ve wasted time building something we will probably have to take apart or rebuild. The counter-argument is that if it took as long to build as it would have to research and design before laying down a line of code, then we break even. In practice this is a stretch, because development itself usually does take longer than well-managed design research and conceptual exploration. Also, there has to be some level of design regardless of which methodology is used, and this adds days to the timeline.

Mine 6: Brand Damage

Let’s say, for argument’s sake, that design and research take the same amount of time as development. In the worst case, we completely miss the mark with the un-researched, undesigned solution and have to start all over again. Then we’re back to the same total duration after developing it a second time, but there’s no guarantee we’ll get the solution right the second time either. All the while, we’ve repeatedly foisted a botched product design on our users and adversely affected our brand. Many companies succeed on the back of their reputation for producing consistently appropriate products and services. When a company releases a flawed product or service, its image in the customer’s mind (i.e. its brand) is tarnished. Brand damage takes far longer to mend than to make. Software creators who fall victim to the temptation of “good enough” and fail to innovate through conceptual exploration put their company’s revenues at risk. In a competitive market, repeated failure to meet user needs well leads to serious brand and, subsequently, financial repercussions, as other companies who do get it right take the business.

Agile is good for refining, not defining.

If you have an existing product that you want to develop to the next level, then Agile in its truest sense works because you have a base upon which to improve. This means that if you know what your requirements are and these have been properly informed with user research, comparative analysis, business objectives, and analysis of what content you have and what you can technically achieve, then Agile alone can work well.

But spending money on software development without a plan of what to build is like asking a construction crew to erect a tower with no blueprint. Some level of planning is necessary to avoid a Frankenstein’s monster stitched together from each individual’s perspective on the best design solution.

User Centered Design

UCD requires iteration: design, test with users, refine, test with users again, refine… repeat till it’s right. This is where Agile and UCD can work brilliantly together. Agile really is about presuming you’ll need to change things, and that’s a good thing when it comes to refinement.

Uncovering requirements to form a strategy

User Centered Design (UCD) is not only about answering requirements; it also includes defining them. When we practice UCD end-to-end, we pretend we know little: little about what the solution to a problem should be, and little about what the problem actually is, because assumptions close us off to new possibilities. We prefer to let some design research create a viewpoint and then form a hypothesis as to what we might build. In this regard, we cross into the realm of product managers, producers, program managers, business analysts, and the like, trampling toes with gay abandon and meeting resistance all around. Confined to defining the boring old business need (as distinct from the user or customer need), these folks would prefer we constrain our UCD work to usability testing of designs that meet the requirements they set out. They’d prefer we stick to just helping with development… and if we can do that quicker using Agile? Wahey!

Typical UCD Waterfall

Is it always appropriate to do extensive research before starting design? That’s a good question, and one that Jared Spool’s Market Maturity Framework5 helps answer. Sometimes, just getting something off the ground, regardless of how precisely we meet users’ needs with it, is all we can afford to do. Once we graduate out of this "Raw Iron" stage into "Checklist Battles", focused on getting the right features, and then beyond, research is a core ingredient in putting our feet in the right place.

After researching what the user and business require, we can establish the “Strategy” tier of Jesse James Garrett’s Elements of User Experience,3 which underpins everything we do during the project. Do this well, and you really shouldn’t come up with something that’s fundamentally wrong. Agile doesn’t account for this beyond a planning phase (i.e. iteration zero), which may well define a strategy of sorts. But does it really define the correct strategy? Surely that’s created through careful consideration of three things:

  1. Empathetic qualitative research that uncovers the user’s context, needs, goals, and attitudes, i.e. user requirements. Cooper suggests that the customer doesn’t know what they want, and advocates a role of interaction designer as requirements planner.4 This would avert building to the wrong requirements in the first place, but the time to do it must come into the development lifecycle somewhere. It involves talking to users, preferably visiting them in their own environments, to create experience models and user personas.
  2. A thorough appreciation of what else in the big wide world exists in terms of products, features and technology that can be emulated somehow (not necessarily addressing a similar situation to ours).
  3. A clear articulation of the business problem, objectives, success measures, and constraints. Business people sitting in a room discussing what they think should be done must be informed by all these things if the right strategy is to emerge. Agile doesn’t preclude that kind of consideration, but it doesn’t mandate it either.

JJG's Elements of UE

Concept Development

If we manage to build something usable and reasonably intuitive without research or strategy, did we succeed? Most MP3 players fit the bill, but none took off like the Apple iPod. Leaving interface usability aside, the iPod had a service concept behind it, which included digitizing, replenishing, and managing your entire music library with iTunes. This was part of the iPod concept from the outset and, in combination with good marketing and design, continues to eclipse the competition over seven years later. But that concept needed to be sketched and iterated at some point. If we don’t explicitly build this into our Agile methodology, we can miss that thinking time.

Holistic Design Concept

The best of both worlds

UCD can be too documentation-heavy, isolated, and risky, while Agile needs help with defining requirements and concept development. How can Agile and user-centric principles work together? First, let’s understand what works well in Agile and not so well in user-centered design. The work that user-centered design calls the ‘design’ phase can produce buckets of documentation which nobody reads, describing interfaces specified in isolation which may not be feasibly coded in the time allotted to them. So detailed design is best done in conjunction with the development team, and in a way where the resulting interfaces can be tweaked as you go.

Best of Both Worlds

A shared vision of the interaction fundamentals

In good software development, a conceptual interaction model, thought through beforehand, outlines how the user navigates the system, performs tasks, and uses tools in generic terms: not each and every navigation label, task, or tool, but rather the interface and interaction patterns that will persist. This produces something rudimentary to test with users, to see if we got the big picture right. Following this roadmap, sketched on the back of research and concepting prior to development activity, ensures consistency and cohesiveness when each component is coded separately later. In many cases, the concept will need iterating to accommodate lessons from the journey, but we’ll at least have some indication of direction at a macro scale. Then, in the midst of Agile iterations, working out the details alongside our developer brethren, a level of expertise and experience is required of the designer, because what we design will be built before we’ve had a chance to second-guess ourselves. Domain knowledge and an understanding of interface paradigms that work are also a big help. But to build new projects from scratch without a shared vision is a mistake.

Risky interfaces, ones that are new or significant improvements on what has been seen before, are best tackled as design-only activities in a sprint prior to the one in which they will be developed (i.e. do involve developers, but don’t try to produce code). This circumvents the pressure to deliver something before proper thought, reflection, and user testing, and ensures you’re not wasting time and effort. Sometimes most of the product will be done this way, and that’s fine so long as developers and designers are still working together and talking every day. The first development iterations are an important time for the developers to lay the architectural foundations based on the vision. Designers should use this time to get a jump on any high-priority tricky interfaces, so the development team isn’t left waiting for something meaty to start on when it comes time to build features.

Most important to success, the business needs to accept that some things won’t be right the first time around and commit to iterating them prior to release, i.e. not be led into the temptation to release something that isn’t right yet.

Conclusion

In summary, dogmatic attitudes about either of these approaches should be avoided if they are to be combined. Remember, Agile does not mandate how to define concepts or overall design direction, but it is a great way to execute on solid design research and well-laid plans. UCD needs to be flexible enough to respond to the reality on the ground when the implementation team encounters issues that mandate a different design solution. Document only what is needed to get the message across, and co-locate if at all possible, because cross-disciplinary collaboration and face-to-face communication are vital. Working a sprint ahead of the development team gives the design team enough time to test and iterate. If these rules of engagement are followed, the two approaches can work very well together.

Notes:
1. Agile Software Development with SCRUM by Ken Schwaber and Mike Beedle

2. Manifesto for Agile Software Development

3. The Elements of User Experience by Jesse James Garrett

4. Extreme Programming Vs. Interaction Design. Interview with Kent Beck and Alan Cooper

5. The Market Maturity Framework is Still Important – Jared Spool