Bringing User Centered Design to the Agile Environment

When the exciting opportunity to work at a post-bubble dot-com startup arose, I jumped to take it. I had the luxury of doing things exactly as I thought right, and for a while it was truly fantastic. I built a team with a dedicated user researcher, an information architect, and interaction and visual designers; we even set up a guerrilla usability lab and held regular test sessions.

Unfortunately, the enthusiasm I had for my new job waned after six months, when an executive was appointed Head of Product Development who insisted he knew SCRUM [1] better than anybody. As the Creative Director, I deferred to him to develop the product as he saw fit. I had worked with SCRUM before, done training with Ken Schwaber (author of Agile Software Development with SCRUM [1] and co-founder of the Agile Alliance), and knew a few things from experience about how to achieve some success integrating a design team within SCRUM. That success required the design team to work a “Sprint” (a month-long iteration) ahead of the development team. But the new executive insisted that SCRUM had to be done by the book, which meant that all activities, including design, had to happen within the same Sprint.

Requirements came from the imagination of the Head of Product Development; design was rushed and ill-conceived as a result of time pressure; development was equally rushed and hacked together, or worse, unfinished. The end-of-Sprint debriefing meetings reliably consisted of a dressing down of the entire team by the executives, since nobody had delivered what they’d committed to: they had either tried to do too much or not done enough. Each Sprint consisted of trying to fix the mess from the Sprint before, or brushing it under the carpet and developing something unstable atop the code-garbage. Morale languished, the product stank, good staff began to leave… it was horrible.

This is an extreme example of SCRUM gone bad. I am not anti-Agile, although I’ve been bitten a few times and feel trepidation when I hear someone singing its praises without having much experience with it. Over the last eight years, I’ve seen Agile implemented badly far more often than well (and yes, it can be done well, too). The result is a mediocre product released in as much time as it would have taken a good team to release a great product using a waterfall approach. In this article, I will describe Agile and attempt to illuminate a potential minefield for those who are swept up in the fervor of this development trend and want to jump in headlong. Then I will present how practices within User Centered Design (UCD) can mitigate the inherent risks of Agile and how these may be integrated within Agile development approaches.

Where did Agile come from?

Envisioned by a group of developers, Agile is an iterative development approach that takes small steps toward defining a product or service. At the end of each step we have something built that we could release to the market if we chose to, so it can deliver speed to market where waterfall methods usually fail. Agile prefers to work out how to build something as we go, rather than doing a waterfall-style deep dive into specification only to find we can’t build parts of the spec for some reason, e.g. a misjudgment of feasibility, a misjudgment of time to build, or changing requirements.

A group of developers including Kent Beck, Martin Fowler and Ken Schwaber got together to synthesize what they had found to be the most effective ways to develop software, and the Agile Alliance was born. It released a manifesto [2] describing its tenets and how the approach differs from waterfall methods.

Agile can be thought of as a risk-management strategy. Often developers are approached directly by a client who does not know what a user experience designer, information architect or user interface designer is. Roles such as these usually interpret what clients want and translate it into some kind of specification for developers. Without this role, it’s down to the developer to work out and build what the customer wants. Because Agile requires a lot of engagement with the client (at the end of every iteration, which can be as short as a week), it mitigates the risk of going too far toward creating something the client doesn’t want. As such, it is a coping mechanism for a client’s shifting requirements during development as they begin to articulate what they want. To quote the Agile Manifesto’s principles: “Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.”

Why do people rave about it?

At the heart of what makes Agile attractive is the possibility of quicker return on investment for development effort, because we can release software earlier than we would have otherwise. In the short term, this is typically borne out. In the long term it can be too, though only when the team hasn’t fallen victim to temptation (more on that later).  Agile is also good at generating momentum because the iterations act as a drumbeat to which the team marches toward manageable deadlines. The regular "push" to finish a sprint ensures that things move along swiftly. Agile is also good at avoiding feature bloat by encouraging developers to do only what is necessary to meet requirements.

Because it emphasizes face-to-face contact for a multidisciplinary team, Agile tends to encourage contribution from different perspectives. This is generally a positive influence on pragmatism, innovation and speed of issue resolution. The team is empowered to make decisions about how requirements should best be met.

The Minefield

In and of itself, Agile does a good job of flexing to the winds of change. But one has to ask whether it was devised to treat a symptom of a larger cause: the business doesn’t know what it wants. While Agile enables the development team to cope with this better, it doesn’t solve the problem, and in most cases it creates new problems.

Mine 1: An unclear role for design

In the best cases, when a business approaches developers to build some software, some of those developers may have design skills. But that’s not a particularly common scenario. Many developers have also had bad experiences with designers who don’t know what they’re doing. It took a long time for the design profession to come to grips with designing for complex systems, and there is still a deficit of expertise in this field. “Business people and developers must work together daily throughout the project” is another principle of Agile. Where does the designer fit into the frame?

Mine 2: The requirements gathering process is not defined

Agile accommodates design activities from the perspective of a developer. It tends to shoe-horn these activities into a view of the world where requirements fall from the sky (from a business or customer assumed to be all-knowing) and takes for granted that they are appropriate.

According to Ken Schwaber, SCRUM is intended to be a holistic management methodology and leaves space for activities other than programming to occur within the framework of iterative cycles. But when organizations adopt SCRUM, too often the good parts of a waterfall process, like research and forming a high-level blueprint for the overall design, become the proverbial baby thrown out with the documentation bathwater. As the Agile Manifesto says, “Working software over comprehensive documentation.” [2] Many latch onto this and don’t want to produce any documentation that might outline a vision, even in a rudimentary sense.

Mine 3: Pressure to cut corners

Implementations of Agile that put design activities in the same iteration as the development of those designs do ensure the designs are achievable in code. But they also put tremendous pressure on the experience design team to ‘feed the development machine’ in enough time for developers to implement the vision. This can and does lead to impulsive design. So, what’s wrong with that? Well, nothing, if you’re not adhering to user-centered principles, which suggest you should test ideas with end users before committing them to code.

Some assert that there are plenty of examples of best-practice interfaces out there to copy. So why reinvent the wheel? Surely we can save time that way? Sometimes they’re right, but how will we know which best-practice interface works best in the context of the user’s goals, with no time to test with the user? And how can we innovate by copying what already exists? Before Google reinvented internet search, other search engines assumed a status quo that required the user to learn how to form proper search queries. It was institutional knowledge among those search engines that this was how searching was done and customers simply had to learn it. Most people’s search results were poor at best. Then Google came along and realized what is now obvious: people just want to find what they’re looking for, not learn how to drive a search engine first. I’m not suggesting the other search engines could not have done what Google did sooner, but I am pointing the finger at a mentality that meant they missed the opportunity. Interestingly, Google is not known for its designers. It’s mainly a development house, but lots of those developers can clearly put a design hat on too.

There is absolutely nothing wrong with using Agile to produce results quickly; that is, as long as you don’t intend to release them on your poor, unsuspecting users without some usability testing. Just don’t be fooled that this is going to save you a lot of time if you want your new product to be right, because you will have to iterate to arrive at an appropriate solution. Alan Cooper has argued that this creates a kind of ‘scar tissue’, where code that has to be changed or modified leaves a ‘scar’ that makes the foundations of the program unsound. [4]

Mine 4: The temptation to call it “good enough”

Invariably, when we have release-ready working code at the end of each cycle, even if it’s sub-optimal, there’s a strong temptation to release it because we can. Agile condones releasing whatever we have so long as it works. Sometimes that means doing what we can get away with, not what is ultimately best for the user. Equally, if we do decide that a feature isn’t right yet, its amendments get fed back into the requirements backlog, where temptation strikes again. Should we spend time in our next iteration on a feature we’ve already got a version of, or shall we develop something new instead? Too often, the rework gets left in favor of exciting new stuff. And so on we go, building a product full of features that don’t quite meet the bar.

Typical Agile Development

Mine 5: Insufficient risk-free conceptual exploration time

Iteration “zero” (i.e. a planning and design iteration prior to the first development iteration) can be used for conceptual exploration and other planning activities. However, depending on how long this iteration is, the level of rigor applied to exploration may be insufficient. An argument used by some Agile practitioners asserts that a working example of a solution, exposed to the market, is the best way to validate whether it is the right one. This ‘suck it and see’ approach bypasses an activity called “concepting.” Concept activities dedicate time to sketching different solutions at a high level and validating them in the rough with users before digging into detailed design or code. “Suck it and see” would have us just build it, launch it and see if it flies. If it doesn’t, we’ve wasted time building something we will probably have to take apart or rebuild. The counter-argument is: if it took as long to build as it would have taken to research and design before laying down a line of code, then we break even. In practice this is a stretch, because development itself usually takes longer than well-managed design research and conceptual exploration. Also, there has to be some level of design regardless of which methodology is used, and this adds days to the timeline.

Mine 6: Brand Damage

Let’s say, for argument’s sake, that design and research take the same amount of time as development. In the worst case, we completely miss the mark with the unresearched, undesigned solution and have to start all over again. Then we’re back to the same total duration after developing it a second time, but there’s no guarantee we’ll get the solution right the second time either. All the while we’ve repeatedly foisted a botched product design on our users and adversely affected our brand. Many companies succeed on the back of their reputation for producing consistently appropriate products and services. When a company releases a flawed product or service, its image in the customer’s mind (i.e. its brand) is tarnished. Brand damage takes far longer to mend than it does to make. Software creators that fall victim to the temptation of “good enough” and fail to innovate through conceptual exploration put their companies’ revenues at risk. In a competitive market, repeated failure to meet user needs well leads to serious brand and, subsequently, financial repercussions, as other companies who do get it right take the business.

Agile is good for refining, not defining.

If you have an existing product that you want to develop to the next level, then Agile in its truest sense works because you have a base upon which to improve. This means that if you know what your requirements are and these have been properly informed with user research, comparative analysis, business objectives, and analysis of what content you have and what you can technically achieve, then Agile alone can work well.

But spending money on software development without a plan of what to build is like asking a construction crew to erect a tower with no blueprint. Some level of plan is necessary to avoid a Frankenstein of each individual’s perspective on the best design solution.

User Centered Design

UCD requires iteration – design, test with users, refine, test with users again, refine… repeat till it’s right. This is where Agile and UCD can work brilliantly together. Agile really is about presuming you’ll need to change things, and that’s a good thing when it comes to refinement.

Uncovering requirements to form a strategy

User Centered Design is not about answering requirements alone; it also includes defining requirements. When we practice UCD end-to-end, we pretend we know little: little about what the solution to a problem should be, and little about what the problem actually is, because assumptions close us off to new possibilities. We prefer to let some design research create a viewpoint and then form a hypothesis as to what we might build. In this regard, we cross into the realm of product managers, producers, program managers, business analysts and the like, trampling toes with gay abandon and meeting resistance all around. Faced with being confined to defining the boring old business need (as distinct from the user or customer need), these folks would prefer we constrain our UCD work to usability testing designs that meet the requirements they set out. They’d prefer we stick to just helping with development… and if we can do that quicker using Agile? Wahey!

Typical UCD Waterfall

Is it always appropriate to do extensive research before starting design? That’s a good question, and one that Jared Spool’s Market Maturity Framework [5] helps answer. Sometimes, just getting something off the ground, regardless of how precisely we meet users’ needs with it, is all we can afford to do. Once we graduate out of this “Raw Iron” stage into “Checklist Battles” focused on getting the right features, and then beyond, research is a core ingredient to putting our feet in the right place.

After researching what the user and the business require, we can make the “Strategy” tier of Jesse James Garrett’s Elements of User Experience [3], which underpins everything we do during the project. Do this well, and you really shouldn’t come up with something that’s fundamentally wrong. Agile doesn’t account for this beyond a planning phase (i.e. iteration zero), which may well define a strategy of sorts. But does it really define the correct strategy? Surely, that’s created through careful consideration of three things:

  1. Empathetic qualitative research that uncovers the user’s context, needs, goals and attitudes, i.e. user requirements. Cooper suggests that the customer doesn’t know what they want and advocates a role of interaction designer as requirements planner. [4] This would avert building to the wrong requirements in the first place, but the time to do this must come into the development lifecycle somewhere. It involves talking to users, preferably visiting them in their environments, to create experience models and user personas.
  2. A thorough appreciation of what else in the big wide world exists in terms of products, features and technology that can be emulated somehow (not necessarily addressing a similar situation to ours).
  3. A clear articulation of the business problem, objectives, success measures and constraints. Business people sitting in a room discussing what they think should be done must be informed by all these things if the right strategy is to emerge. Agile doesn’t preclude that kind of consideration, but it doesn’t mandate it either.

JJG's Elements of User Experience

Concept Development

If we manage to build something usable and reasonably intuitive without research or strategy, did we succeed? Most MP3 players fit this bill, but none took off like the Apple iPod. Leaving interface usability aside, the iPod had a service concept behind it, which included digitizing, replenishing and managing your entire music library with iTunes. This was part of the iPod concept from the outset and, in combination with good marketing and design, continues to eclipse the competition over seven years later. But that concept needed to be sketched and iterated at some point. If we don’t explicitly build this into our Agile methodology, we can miss that thinking time.

Holistic Design Concept

The best of both worlds

UCD can be too documentation-heavy, isolated and risky, while Agile needs help with defining requirements and concept development. How can Agile and user-centered principles work together? First, let’s understand what works well with Agile and not so well with user centered design. The work that user centered design calls the ‘design’ phase can produce buckets of documentation which isn’t read, describing interfaces specified in isolation which may not be feasibly coded in the time allotted to them. So detailed design is best done in conjunction with the development team, and in a way where the resulting interfaces can be tweaked as you go.

Best of Both Worlds

A shared vision of the interaction fundamentals

In good software development, a conceptual interaction model that has been thought through beforehand outlines how the user navigates the system, performs tasks and uses tools in generic terms, i.e. not each and every navigation label, task or tool, but rather the interface and interaction patterns that will persist. This produces something rudimentary to test with users, to see if we got the big picture right. Following this roadmap, sketched on the back of research and concepting prior to development activity, ensures consistency and cohesiveness when components are coded separately later. In many cases, the concept will need iterating to accommodate lessons from the journey, but we’ll at least have some indication of direction at a macro scale. Then, when we are in the midst of Agile iterations working out the details alongside our developer brethren, a level of expertise and experience is required of the designer, because what we design will be built before we’ve had a chance to second-guess ourselves. Domain knowledge and an understanding of interface paradigms that work are also a big help. But to build new projects from scratch without a shared vision is a mistake.

Risky interfaces that are new or significant improvements on what has been seen before are best tackled as design-only activities in a sprint prior to the one in which they will be developed (i.e. do involve developers, but don’t try to produce code). This circumvents the pressure to deliver something before proper thought, reflection and user testing, which ensures you’re not wasting time and effort. Sometimes most of the product will be done this way, and that’s fine so long as developers and designers are still working together and talking every day. The first development iterations are an important time for the developers to lay the architectural foundations based on the vision. Designers should use this time to get a jump on any high-priority tricky interfaces so the development team isn’t waiting for something meaty to start on when it comes time to build features.

Most important to success, the business needs to accept that some things won’t be right the first time around and commit to iterating them prior to release, i.e. not be led into the temptation to release something that isn’t right yet.

Conclusion

In summary, dogmatic attitudes about each of these approaches should be avoided if they are to be combined. Remember, Agile does not mandate how to define concepts or overall design direction, but it is a great way to execute on solid design research and well-laid plans. UCD needs to be flexible enough to respond to the reality on the ground when the implementation team encounters issues that mandate a different design solution. Document only what is needed to get the message across, and co-locate if at all possible, because cross-disciplinary collaboration and face-to-face communication are vital. Working a sprint ahead of the development team gives the design team enough time to test and iterate. If these rules of engagement are followed, the two approaches can work very well together.

Notes:
1. Agile Software Development with SCRUM by Ken Schwaber and Mike Beedle

2. Manifesto for Agile Software Development

3. The Elements of User Experience by Jesse James Garrett

4. Extreme Programming vs. Interaction Design: an interview with Kent Beck and Alan Cooper

5. The Market Maturity Framework is Still Important – Jared Spool