MindCanvas Review


MindCanvas describes itself as a remote research tool that uses Game-like Elicitation Methods (GEMs) to gather insights about customers’ thoughts and feelings. It was developed by Uzanto Consulting, a web product strategy firm. When I first learned about MindCanvas, I understood it to be an online card sorting tool. Happily, it’s much more than that.

As a veteran IA consultant, I have used MindCanvas a handful of times during the course of different projects. I have also conducted card sorting exercises without the tool. I am thrilled to have a useful—and user-friendly—tool at my disposal. One of my main reasons for selecting MindCanvas was the reputation of one of its creators, Rashmi Sinha. She is well known and respected, and I felt assured that any tool designed by a fellow IA for IAs couldn’t be all that bad. I was right.

MindCanvas provides open and closed card sorting capabilities, as well as a host of other user testing (UT) tools: Divide-the-Dollar, Clicky, Sticky, Concept Test, and FreeList. Clicky and Sticky allow users to react to a wireframe or prototype by answering questions about images and content, or by applying stickies (Post-it–like notes) with attributes to a visual image. FreeList and Divide-the-Dollar allow you to elicit product ideas and prioritize them by having participants list and rank the features they find most useful. All of these methods offer easy-to-use interfaces to help your research participants along.

Deciding which MindCanvas method to use is one of the more complicated parts of the tool. Its card sorting methods are good for validating a site’s navigation or information hierarchy. You can also explore user needs and values and gather feedback on brand and positioning by using some of its more specialized UT methods. MindCanvas’ website and supporting help wiki provide information on selecting the appropriate testing method for your website or product.

Using MindCanvas

The basic process for using MindCanvas is as follows:

  1. After payment, sign an agreement to obtain a login and password.
  2. Decide which method (i.e. Sticky, FreeList, etc.) addresses your research needs.
  3. Create potential research questions and tasks based on the MindCanvas method you have selected. (I’ve used OpenSort and TreeSort.)
  4. Upload questions to MindCanvas’ Workbench.
  5. Test the research study and make changes until you are satisfied with it.
  6. Send out the test site URL to your participants.
  7. Monitor the study (i.e. see how many people have completed all the tasks).
  8. When the study is concluded, send a report request to the MindCanvas team.
  9. Receive the reports in visual form and download raw data from the MindCanvas site.
  10. Embed reports into PowerPoint or Word document and review results with client.

I usually take several days to review the reports before showing them to my consulting clients. Doing so allows me to more easily explain the results. (Here’s a pointer for anyone using MindCanvas: to view the results properly, make sure PowerPoint is in “Slideshow” mode.)


MindCanvas has a couple of shining strengths I’d like to illuminate:

  1. An engaging, easy-to-use interface for your customers or end users. It’s fairly self-explanatory and makes routine UT tasks fun.
  2. Stellar data visualization tools once your study is completed.

User Interface

MindCanvas’ interface is what sets it apart from other UT software I’ve seen. Its creators took their inspiration from the world of digital gaming to develop an interface that’s engaging for the person using it, while gathering important data for researchers. Its card sorting methods employ a floating hand to deal cards, which are then sorted by users. Another method gives users virtual gold coins to vote for their favorite product features. These exercises are enhanced by accompanying sound effects. I’ve received numerous comments from users describing MindCanvas’ exercises as “fun”. They have also commented that while they don’t understand how these exercises will help me build a better website or software interface, they still enjoyed the tasks and were pleased at the conclusion of the test.

The other online research tools I’ve reviewed offer more awkward interfaces. Sorting exercises take multiple steps or the online tasks are not intuitive and confuse research participants. I’m not interested in making my users become experts at online card sorting or other UT methods. I simply want to extract what they know or understand about a particular website or service.

According to Jess McMullin of nForm User Experience Consulting, “MindCanvas is unmatched as a remote research tool in its ability to provide creative methods for gathering data [and] engaging participants…”

Data Visualization

Another MindCanvas strength is its data output. Although you can obtain the raw data and analyze it yourself (assuming you have statistical software and know how to use it), the real benefit of MindCanvas is its easy-to-understand data visualizations, which showcase the results of your study. All my clients have received clear, easy-to-interpret answers to their research questions. The visualizations can be embedded into a PowerPoint slide or Word document, making them easily accessible. Your clients don’t have to rely on your interpretation of the data; they can interpret the data themselves if they choose. Every client who has viewed MindCanvas’ data visualizations has been impressed and wondered why it wasn’t used all along.


I’ve used MindCanvas a handful of times and encountered some weaknesses:

  • Study Size: If you have a large client with complex, statistically rigorous research needs, MindCanvas is not for you. It has a limit of 200 users per study. Two hundred is plenty for most of my research needs, but some of my clients want to go beyond that.

  • Data Sorting: If you have complex user segmentation needs, MindCanvas has its limitations. It allows you to perform a single data sort to identify user sub-groups. For example, it’s easy to segment all male vs. female participants or all participants who are 21- to 50-years-old. If you need to segment 16- to 20-year-old females or men who only shop online (or any two parameters of your choice), you’ll need a different tool. There are ways around these limitations: You can create two separate research studies to deal with different users, or you can build more complex research questions to solicit the answers you need in order to sort the data required. However, these solutions have limitations of their own, so there is a trade-off.

  • Pricing Structure: The current pricing structure is $499 per study, with each accompanying report costing $99. This is adequate for quick-and-dirty research to resolve obvious user issues, but the pricing structure doesn’t scale well. For example, if you run a single study and want multiple reports for different audience segments, each $99 report adds up quickly. It can be difficult to budget up front before the research study is even developed, leaving the door open for cost increases. If a simple card sorting tool is all that you need, check out WebSort, which costs $499 for three months of unlimited use and automatically generates a dendrogram. (Please note that MindCanvas offers much more than card sorting.)

  • Data Analysis Bottleneck: Some of the back-end data analysis is done by a human, who works on a schedule. All data reports are generated once a week. If you get your report order request to Uzanto by the Tuesday deadline, results will be available by Thursday. This might not work with your tight project schedule, in which case, you’re out of luck.
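The data-sorting limitation has a practical escape hatch: since the raw data can be downloaded from the MindCanvas site, multi-parameter segments can be built by hand. Here is a minimal sketch in Python, assuming a hypothetical CSV export; the column names and values below are invented for illustration, and MindCanvas’ actual export format will differ.

```python
import csv
import io

# Stand-in for a downloaded raw-data file (hypothetical columns).
raw = io.StringIO(
    "participant,gender,age,card_group\n"
    "p1,female,18,Navigation\n"
    "p2,male,35,Products\n"
    "p3,female,19,Navigation\n"
    "p4,female,42,Products\n"
)

rows = list(csv.DictReader(raw))

# A two-parameter segment the tool itself can't produce:
# female participants aged 16 to 20.
segment = [r for r in rows
           if r["gender"] == "female" and 16 <= int(r["age"]) <= 20]

print([r["participant"] for r in segment])  # ['p1', 'p3']
```

This sidesteps the single-sort restriction without running a second study, at the cost of giving up MindCanvas’ prepared visualizations for that segment.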
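To make the pricing trade-off concrete, here is a small, hypothetical budget calculation using the published prices ($499 per study, $99 per report); the function name is my own, not anything MindCanvas provides.

```python
# Published MindCanvas prices at the time of this review.
STUDY_FEE = 499
REPORT_FEE = 99

def mindcanvas_cost(studies: int, reports: int) -> int:
    """Total cost in dollars for a given number of studies and reports."""
    return studies * STUDY_FEE + reports * REPORT_FEE

# One study with a separate report for each of five audience segments:
print(mindcanvas_cost(1, 5))  # 994
```

At five segment reports for a single study, the report fees nearly double the base fee, which is where a flat-rate tool like WebSort starts to look attractive if card sorting is all you need.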

MindCanvas’ Workbench

MindCanvas is currently offered in self-service mode. This means that you (or your researcher) need to become familiar with the finer points of MindCanvas’ Workbench for constructing studies. The upside is that some parts are made easy, like being able to “copy” another study in order to create your own (a handy feature), or to create as many preliminary studies as you like before distributing the real thing.

Figure 1: Manage Activity

The downside is that some interface elements in the study creation console are a bit mysterious. For example, under Manage Study, it’s unclear whether the data has been downloaded 164 times or whether 164 participants have completed the study. The difference between Manage Study and Track Activity is also hazy: Manage Study allows you to specify where to send users after they have completed the study and to limit the number of participants or the length of the study, while Track Activity tells you how many people have completed the study. Download Tracking CSV gives you access to a text file listing all participants’ URL information and their start and stop times.

Figure 2: Track Activity

The Workbench allows access to MindCanvas’ powerful study creation module, but you can tell that most of the design effort went into the end user’s interface, not the study designer’s. Luckily, there is a wiki available that answers a lot of questions, and the Uzanto consultants are friendly and helpful with the occasional question.


The IA community can finally say that we have a tool designed for us. For so long, we’ve had to take existing tools and try to use them in ways not intended by their designers, sometimes with frustrating results, and we’ve had to develop clever and complicated workarounds. These issues are no longer a problem. It’s a tool for us, made by one of us. It’s about time!

Using Adoption Metaphors to Increase Customer Acceptance

“Adoption metaphors have a lifecycle. They begin by introducing a new concept. They help us map something new to something we already understand and give us a framework in which to understand the new thing.”

Metaphors are used every day. We are all familiar with them and what they are. They help us understand conceptual ideas, convey complex notions, and build a shared understanding so that we can talk to each other using verbal shorthand. Take electronic mail, otherwise known as email, as an example. Email seems so much like regular mail, except that there is no paper, no ink, no envelope, no postage stamp, and no postal carrier. There is, however, something familiar about composing a message and sending it to someone else.

What about the metaphor behind the wildly popular TiVo? I applaud the decision to use a well-known comparison to explain what the product does: the video cassette recorder. TiVo does replicate many of the VCR’s abilities… and yet it offers so much more! We’re so overjoyed to be able to pause live TV that we overlook the fact that TiVo won’t play our video cassette tapes or let us transfer recorded shows from one machine to another, and that it requires a monthly fee.

Metaphors help us grasp new things, but they don’t necessarily account for all aspects of that idea. VCRs couldn’t pause live TV; regular postal mail couldn’t arrive at its destination 3.2 seconds after it had been sent. The main characteristics of the original metaphor allow us to understand the basics of the new; the rest we learn over time. Moreover, the original metaphor helps us easily understand why a product might be useful or necessary, which means we’re more likely to adopt it into our daily lives.

Adoption metaphors have a lifecycle. They begin by introducing a new concept. They help us map something new to something we already understand and give us a framework in which to understand the new thing. After a while, the concept isn’t new any more, and people usually understand it pretty well without needing the original metaphor. The internet is a perfect example. Remember the term “Information Superhighway”? It was such a buzzword back in the 1990s. Although the internet had been around for a while, its introduction as an information superhighway helped frame the whole idea so that others could understand it. They could understand it because it was framed in terms of their daily lives. Do we still use the term “information superhighway”? Not much. Does this mean that we don’t use the internet any more? Is it gone from our minds the same way the term has gone away? Hardly! The internet is such a pervasive part of our culture now. It is everyday. It is mundane. It’s the exception rather than the rule to have no internet connection available to you. We are offered connections at work, at home, in coffee shops, through cell phones and in most public libraries.

The concept of the internet has been so well adopted into our culture that it is now being used as a metaphor itself for other things. The language of the net pervades our everyday lives. For example, last week I was interviewing a woman who talked about a meeting she had with her supervisor because she needed to get up to speed on a project. Two years ago, she might have said that he informed her, he briefed her, he told her everything he could about what he knew on that particular topic. However, she said, “he downloaded it to me.” There was no computer in the room. No internet connection was involved. It was a verbal transaction, yet she invoked a metaphor that is widely recognized as being synonymous with online activity (the act of collecting or retrieving an electronic file from a remote location). Instead of an electronic file, it was ideas or thoughts. Instead of being retrieved from a remote server, it was retrieved from the supervisor’s brain. Instead of being received onto a computer, it was received into the person’s collection of thoughts. Instead of a file being “pulled” from a server and collected on one’s own computer, the information was “pushed” from one person and collected by another. As you can see, not all aspects of the metaphor fit precisely, but when she said “he downloaded that information to me,” I had no doubt what had happened between the supervisor and the direct report during the lifespan of that meeting.

With this example, you can see the lifecycle of an adoption metaphor, from its introduction, when it is a novel concept first being presented to the public, to acceptance, when our understanding moves beyond the initial adoption metaphor and fully embraces the concept on its own merits. At that point, the metaphor has outlived its usefulness and is either discarded (information superhighway) or becomes mundane (download). You know an adoption metaphor has reached the pinnacle of success when it is itself used as a metaphor for other things (“Where can I get ‘TiVo’ for my radio?”). We don’t bother to refer to cars as “horseless carriages,” but we often use cars as metaphors for other things. Three quotes from recent articles: “Cops drive home seatbelt safety [in a 3-day game event aimed at high-schoolers]”; “Congress Revs Its Engine”; and “Fitness Beginners learn how to go from zero to sixty with these workout tips.”

Metaphors accompany every new technological leap. They apply to everything from core concepts, such as the blog (originally “web log”), to the interface itself, in iconic button choices. When introducing something new into the marketplace, how do you choose the right adoption metaphor? How can you tell if it will work? Is it luck? Do you go with your gut and then “wait and see”?

Happily, there are ways to analyze what metaphors people use to think about things. Systematic analysis at the beginning of a project can better ensure that you’ve got the right metaphor for your new product or service. In part 2 of this series, I will outline ways to analyze the effectiveness of current metaphor use, as well as ways to identify new or more impactful metaphors.

The following books and articles offer more information on conceptual metaphors:

  • DesCamp, Mary Therese, and Eve E. Sweetser. “Metaphors for God: Why and How Do Our Choices Matter for Humans? The Application of Contemporary Cognitive Linguistics Research to the Debate on God and Metaphor.” Pastoral Psychology 50.3 (2005): 207-238.
  • Fauconnier, Gilles, and Mark Turner. The Way We Think: Conceptual Blending and the Mind’s Hidden Complexities. New York: Basic Books, 2002.
  • Lakoff, George, and Mark Johnson. Metaphors We Live By. Chicago: University of Chicago Press, 1980.
  • Rohrer, Tim. “Conceptual Blending on the Information Highway: How Metaphorical Inferences Work.” 1995.