From start-ups to banks, design has never been more central to business. Yet at conference after conference, I meet designers at firms talking about their struggle for influence. Why is that fabled “seat at the table” so hard to find, and how can designers get a chair?
Designers yearn for a world where companies depend on their ideas but usually work in a world where design is just one voice. In-house designers often have to advocate for design priorities versus new features or technical change. Agency designers can create great visions that fail to be executed. Is design just a service, or can designers* lead?
*Meaning anyone who provides the vision for a product, whether it be in code, wireframes, comps, prototypes, or cocktail napkins.
Have you ever asked for an update on a project you’d invested a great deal of time and energy in, only to hear “they have completely redesigned it since then”?
I did, and it left me with this very empty feeling.
After some wallowing, I realized I needed a new way to think about how I work and what really matters in my consulting career. My answer: the mark of a truly good consultant is investing in people. Focus on investing in people, and your work will continue to see results long after the application is redesigned, and that is change that matters in the long run.
In this article, I will look at three areas where we can focus our efforts: mentoring, client education, and our own team members. I hope the reflection will help us all be better consultants and make better investments.
Client mentoring as an investment
There are often opportunities for us to invest in “client side” people, but they might not be readily apparent. Here are two examples.
On a recent project, I was the designer paired with a recently hired UX director who was still a little bewildered by the new gig. When we talked, it became apparent that he was overwhelmed and feeling lost, and what he needed was someone to mentor him in an intentional way.
Now mind you, this mentoring was not part of any statement of work. It was something I did because it was the right thing to do. It was an opportunity to make an investment much bigger than the project at hand, and seeing someone blossom right before your eyes makes the time spent very much worthwhile.
By the end of the client engagement, he was extremely thankful to have had someone invest time in him and point him in the right direction, which allowed him to lead the UX capability far better than before. It turned out to be the most satisfying work I had done in ages. Fortunately, both my company and the client were extremely appreciative of the time spent with their people.
A second example is on the implementation side. I was the interface developer for an intranet project, and the client had a talented UI person who had questions about the CMS and the approach we were using. To complicate the situation, we came to the project after the client had fired another firm for an inability to deliver. This woman had been given poor advice by the previous vendor, and she naturally had lots of questions about how to do the implementation the right way. Clients can easily grow weary of external consultants, and I wanted to ensure that she and her team quickly came to trust us to deliver.
I set up bi-weekly meetings with her throughout the four-month implementation. Before we even started development, she and I mapped out the scope of the work and talked through all kinds of details, down to minutiae like CSS class names. The regular meetings gave her a chance to see the work and give input throughout the entire process.
Another advantage of this approach—beyond those that accrue through collaboration—was that there was no big knowledge handoff at the end. It was something that was built into the project from the beginning.
As companies become leaner, we get a double benefit from increased collaboration and knowledge sharing: first, we spend far less time writing copious documentation because we have been sharing all along; second, the solution has a much greater chance of long-term success because of the time invested in the individuals who take over after we leave.
Client education as an investment
We can also educate clients even if they are not themselves in the UX world. A big intranet project I worked on was scoped to be responsive, but it became apparent early on that the design we inherited was not well suited for my company to implement: it had not been designed in a mobile-first fashion.
I had two options: Either I just let it go, do my work, and move on; or, I could take the time to reach out to the client and educate them. I knew that this project was already moving forward, but I could set up a foundation for this client’s future success.
One thing to gauge is whether the client is even interested in such a relationship. Sometimes, despite your best intentions, clients are only interested in timelines and not interested in spending lots of billable time learning or re-learning.
And I had to ask: What did I value? Was I only in it for the money, or could I help enact lasting change and provide real value?
This client was not himself a UX practitioner, and he was looking for someone to be the expert he could trust. Working with non-UX people is a challenge, because you have to sell them a bit harder on why doing things the right way matters, even when they do not understand the implications or appreciate the time it requires.
I pulled him aside in a couple of private meetings and talked through everything with him, from defining responsive design correctly to understanding mobile-first design to things like home page carousel use and abuse. In the end, it not only furthered our relationship but also earned me a high level of trust and rapport with the client.
This particular client was open to the discussion and was even excited about extending the relationship, but if you have a hesitant client, don’t give up on them. Show them the quantifiable benefits of this increased collaboration by pointing to your past experience, or by explaining that the time they spend learning with you will pay dividends in the future.
Remember that even if things aren’t changeable in the short term, you can make investments in people for the next project and longer term.
Teams as an investment
There is one last, important group we can’t forget: our co-workers. These are the people who become like family in ways our clients never will. Project after project, these are the people we are tasked to work with. In some ways they are the most strategic people to invest in, but also sometimes the most difficult, because we can so easily overlook them.
During my firm’s adoption of the CSS preprocessor Sass, my team was mostly junior people looking for leadership in all kinds of areas. This time, I was given the opportunity to help others use this powerful tool. I took the lead in understanding its implications and how to use it on our teams, and then I spent concentrated time with each member of the UX team to help them understand the technology in terms of both code and process. Taking advantage of opportunities like these furthers your relationships with your team members and demonstrates that you care deeply about their professional development.
To this day, those team members reach out to me with questions and best practices due to the trust gained through leading in this way. It is amazing how doing this even on a detail like a CSS preprocessor can assist your team members greatly.
We all have different motivations for doing the work that we do, and I imagine that for most of us money—as good as money is—is not the primary factor. Instead, very talented people tend to thrive on being an expert, enacting change, and leading others. True leaders are not given an opportunity to lead—they find those opportunities. Leading inside your organization will make you as close to irreplaceable as you can get.
Design research has always been about qualitative techniques. Increasingly, our clients ask us to add a “quant part” to projects, often without much or any additional budget. Luckily for us, there are plenty of tools available to conduct online surveys, from simple ones like Google Forms and SurveyMonkey to more elaborate ones like Qualtrics and Key Survey.
Whichever tool you choose, there are certain pitfalls in conducting quantitative research on a shoestring budget. Based on our own experience, we’ve compiled a set of tips and tricks to help avoid some common ones, as well as make your online survey more effective.
We’ve organized our thoughts around three survey phases: writing questions, finding respondents, and cleaning up data.
Writing questions
Writing a good questionnaire is both art and science, and we strongly encourage you to learn how to do it. Most of our tips here are relevant to all surveys, but they are particularly important for low-budget ones: when respondents are compensated only a little, if at all, good survey writing practices matter even more.
Ask (dis)qualifying questions first
A sacred rule of surveys is to not waste people’s time. If there are terminating criteria, gather those up front and disqualify respondents as quickly as you can if they do not meet the profile. It is also more considerate to terminate with the message “Thank you for your time, but we already have enough respondents like you” rather than “Sorry, but you do not qualify for this survey.”
Keep it short
Little compensation means that respondents will drop out at higher rates. Only focus on what is truly important to your research questions. Ask yourself how exactly the information you collect will contribute to your research. If the answer is “not sure,” don’t ask.
For example, it’s common to ask about a level of education or income, but if comparing data across different levels of education or income is not essential to your analysis, don’t waste everyone’s time asking the questions. If your client insists on having “nice to know” answers, insist on allocating more budget to pay the respondents for extra work.
Keep it simple
Keep your target audience in mind and be a normal human being in framing your questions. Your client may insist on slipping in industry jargon and argue that “everyone knows what it is.” It is your job to make the survey speak the language of the respondents, not the client.
For example, in a survey about cameras, we changed the industry term “lifelogging” to a longer, but simpler phrase “capturing daily routines, such as commute, meals, household activities, and social interactions.”
Keep it engaging
People in real life don’t casually say, “I am somewhat satisfied” or “the idea is appealing to me.” To make your survey not only simple but also engaging, consider using more natural language for response choices.
For example, instead of using standard Likert-scale “strongly disagree” to “strongly agree” responses to the statement “This idea appeals to me” in a concept testing survey, we offered a scale “No, thanks” – “Meh” – “It’s okay” – “It’s pretty cool” – “It’s amazing.” We don’t know for sure if our respondents found this approach more engaging (we certainly hope so), but our client showed a deeper emotional response to the results.
Finding respondents
Online survey tools differ in how much help they provide with recruiting respondents, but most common tools will assist in finding the sample you need, if the profile is relatively generic or simple. For true “next to nothing” surveys, we’ve used Amazon Mechanical Turk (mTurk), SurveyMonkey Audience, and our own social networks for recruiting.
Be aware of quality
Cheap recruiting may easily result in low quality data. While low-budget surveys will always be vulnerable to quality concerns, there are mechanisms to ensure that you keep your quality bar high.
First of all, know what motivates your respondents. Amazon mTurk commonly pays $1 for a so-called “Human Intelligence Task,” which may include taking an entire survey. In other words, someone completing four 15-minute surveys is earning as little as $4 an hour. As such, some mTurk Workers may try to cheat the system and complete multiple surveys for which they may not be qualified.
SurveyMonkey, on the other hand, claims that their Audience service delivers better quality, since the respondents are not motivated by money. Instead of compensating respondents, SurveyMonkey makes a small donation to the charity of their choice, thus lowering the risk of people being motivated to cheat for money.
Use social media
If you don’t need thousands of respondents and your sample is pretty generic, the best resource can be your social network. For surveys with fewer than 300 respondents, we’ve had great success tapping into the collective social network of Artefact’s members, friends, and family. Write a request and ask your colleagues to post it on their networks. Of course, volunteers still need to match the profile. When we send an announcement, we include a very brief description of whom we are looking for and send volunteers to a qualifying survey. This approach costs little but yields high-quality results.
We don’t pay our social connections for surveys, but many will be motivated to help a friend and will be very excited to hear about the outcomes. Share with them what you can as a “thank you” token.
For example, we used social network recruiting in the early stages of Purple’s development. When we revealed the product months later, we posted a “thank you” link to the article on our social networks. To our surprise, many people remembered the survey they took and were grateful to see the outcomes of their contribution.
Over-recruit
If you are trying to hit a certain sample size for “good” data, you need to over-recruit to remove the “bad” data. No survey is perfect and all can benefit from over-recruiting, but it’s almost a must for low-budget surveys. There are no rules, but we suggest over-recruiting by at least 20% to hit the sample size you need at the end. Since the whole survey costs you little, over-recruiting will equally cost little.
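As a back-of-the-envelope sketch, the over-recruiting math can be written in a few lines of Python (the 20% figure is our rule of thumb from above; dividing by the expected keep-rate, rather than just multiplying by 1.2, guarantees you still hit the target after discards):

```python
import math

def recruit_target(needed: int, discard_rate: float = 0.2) -> int:
    # Recruit enough respondents so that, after discarding roughly
    # `discard_rate` of responses as duplicates, speeders, or
    # dummy-question failures, about `needed` clean responses remain.
    return math.ceil(needed / (1.0 - discard_rate))

print(recruit_target(300))        # 375 recruits to keep ~300
print(recruit_target(300, 0.1))   # 334 if you expect only 10% bad data
```

The division matters: recruiting 360 people (300 × 1.2) and then discarding 20% of them leaves only 288, short of the target.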
Cleaning up data
Cleaning up your data is another essential step of any survey that is particularly important for the one on a tight budget. A few simple tricks can increase the quality of responses, particularly if you use public recruiting resources. When choosing a survey tool, check what mechanisms are available for you to clean up your data.
Throw out duplicates
As mentioned earlier, some people may be motivated to complete the same survey multiple times, even under multiple profiles. We’ve spotted this when working with mTurk respondents by checking their Worker IDs: we had multiple cases where the same ID was used to complete a survey several times. We ended up throwing away all responses associated with the “faulty” IDs and gained more confidence in our data as a result.
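That cleanup step is easy to script. A minimal sketch in Python (the Worker IDs and field names here are invented for illustration, not mTurk’s actual export format):

```python
from collections import Counter

# Hypothetical exported responses; only the Worker ID matters here.
responses = [
    {"worker_id": "A1X9", "q1": "Yes"},
    {"worker_id": "B7K2", "q1": "No"},
    {"worker_id": "A1X9", "q1": "Maybe"},  # same Worker ID used twice
]

counts = Counter(r["worker_id"] for r in responses)
faulty_ids = {wid for wid, n in counts.items() if n > 1}

# Throw away *all* responses tied to a duplicated ID,
# not just the extra submissions.
clean = [r for r in responses if r["worker_id"] not in faulty_ids]
print(len(clean))  # 1
```

Discarding every response from a duplicated ID, rather than keeping the first, is the conservative choice: once an ID has cheated, none of its answers can be trusted.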
Check response time
With SurveyMonkey, you can calculate the time spent on the survey from the StartTime and EndTime data. We benchmarked the average completion time by piloting the survey in the office; that benchmark makes for a pretty robust screening mechanism.
If the benchmark time is eight minutes and you have surveys completed in three, you may question how carefully respondents were reading the questions. We flag such outliers as suspect and don’t include them in our analysis.
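A sketch of that check in Python, assuming timestamps exported as plain `YYYY-MM-DD HH:MM:SS` strings and a half-the-benchmark cutoff (both the format and the cutoff are our own assumptions, not SurveyMonkey’s):

```python
from datetime import datetime

def is_speeder(start: str, end: str, benchmark_min: float = 8.0,
               cutoff: float = 0.5) -> bool:
    # Flag any response completed in less than `cutoff` times the
    # benchmark duration established by piloting the survey in-house.
    fmt = "%Y-%m-%d %H:%M:%S"
    elapsed = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    minutes = elapsed.total_seconds() / 60
    return minutes < benchmark_min * cutoff

# 3 minutes against an 8-minute benchmark: suspect
print(is_speeder("2014-05-01 10:00:00", "2014-05-01 10:03:00"))  # True
# 9.5 minutes: plausible
print(is_speeder("2014-05-01 10:00:00", "2014-05-01 10:09:30"))  # False
```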
Add a dummy question
Dummy questions help filter out respondents who answer survey questions quickly and at random. A dummy question requires the respondent to read carefully and then respond; people who click and type at random might answer it correctly, but it is unlikely. If the answer is incorrect, that is another flag we use to mark a respondent’s data as suspect.
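A sketch of how such a flag might be applied during cleanup (the field name `q_dummy` and the expected answer are hypothetical, standing in for a prompt like “To show you are reading, please select ‘Somewhat agree’”):

```python
def fails_dummy(response: dict, dummy_field: str = "q_dummy",
                expected: str = "Somewhat agree") -> bool:
    # The dummy question's correct answer is spelled out in its own
    # prompt, so any attentive reader gets it right by construction.
    return response.get(dummy_field) != expected

respondent = {"q1": "Yes", "q_dummy": "Strongly disagree"}
print(fails_dummy(respondent))  # True: mark this respondent's data as suspect
```

In practice we combine this flag with the response-time check above; a respondent who trips both is almost certainly noise.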
Low-budget surveys are challenging, but not necessarily bad, and with a few tricks you can make them much more robust. If they are used as an indicative, rather than definitive, mechanism to supplement other design research activities, they can bring “good enough” insights to a project.
Educate your clients about the pros and cons of low-budget surveys and help them decide whether they want to invest more to get greater confidence in the quantitative results. Setting these expectations up front is critical for the client, and it might even prove a good tool for negotiating a higher survey budget to begin with!
Which version of the ‘suspended account’ dashboard page do you prefer?
Perhaps you don’t really care. Each one gets the job done in a clear and obvious way.
However, as the UX architect of the ‘overview’ page for a huge telecom leader, I had to tell the team which treatment we’d be using.
I was a freelancer with only four months tenure on this job, and in a company as large, diverse, and complex as this one, four months isn’t a very long time. There are a ton of things to learn—how their teams work, the latest visual standards, expected fidelity of wireframes, and most of all, selecting the ‘current’ interaction standards from a site with thousands of pages, many of which were culled from different companies following acquisitions or created at different points in time. Since I worked off-site, I had limited access to subject matter experts.
Time with the Telecom Giant’s UX leads is scarce, but Nick, my lead on this project, was a great guy with five years at the company, much of it spent on the Overview page and similar efforts. He and I had spent a lot of phone time going over this effort’s various challenges.
Version A, the yellow note treatment, had been created to highlight the suspended location if the “Home Phone” account covered more than one address. After much team discussion, we realized that this scenario could not occur, but since the new design placed what seemed like the proper emphasis on the ‘Account Suspended’ situation, I was confident that we’d be moving forward with version A.
So, why was I surprised when Nick said we’d “obviously” go with version B?
Whenever I start with a new company, I try to do a mind meld with co-workers to understand their approach, why they made certain decisions, and learn their priorities. Unless I’m certain there is a better way, I don’t want to go in with my UX guns blazing—I want to know whether they’d already considered other solutions, and if so, why they were rejected. This is especially true in a company like Telecom Giant, which takes user experience seriously.
I’d worked so closely with Nick on this project that I thought I knew his reasoning inside out. And when he came to a different conclusion, I wondered whether I’d ever be able to understand the company’s driving forces. If I wasn’t on the same page with someone who had the same job and a similar perspective, with whom I’d spent hours discussing the project, what chance did I have of seeing eye-to-eye with a business owner on the other side of the country or a developer halfway across the world?
Version A (the yellow note treatment) was created by Ken, a visual designer who had an intimate knowledge of the telco’s design standards. This adhered to other instances where the yellow note was used to highlight an important situation.
Version B was the existing model, which had worked well in a section of the site that had been redesigned a year ago following significant user testing. Because of its success, this section–“Home Usage”–was earmarked as the model for future redesigns.
Once I had a bit of distance from the situation, I realized what the problem was. Although I had worked very closely with Nick, I didn’t have the same understanding of the company’s priorities.
My priorities were:
Consistency across the site
Using the most up to date and compelling interaction and design patterns
Modeling redesign efforts on “Home Usage” where possible
Because Nick had a background in visual design, I thought that he would want to use Ken’s design pattern, which seemed both more visually distinct and a better match for the situation. But Nick preferred the Home Usage pattern, and he may have had good reasons for doing so.
First, Home Usage had been thoroughly tested, and since this was an ecommerce site with many hard-to-disentangle components, testing could have provided insight into its success factors, especially if individual components had been tested separately.
Second, by following the existing pattern, we wouldn’t wind up with two different treatments for the same situation. Even though the yellow note treatment might be more prominent, was it significant enough to shoulder the cost of changing the pattern in the existing Home Usage flow?
Now that I knew at least one piece of the puzzle, I wondered how I might have achieved a more complete ‘mind meld’ with Nick, so that we were more closely in sync.
Know your priorities—and check them out
Just being aware of the priorities I was following would have offered me the chance to discuss them directly with Nick. With so much information to take in, I hadn’t thought to clarify my priorities and compare them with my co-workers, but this would have made it easier to sync up.
Other barriers to knowledge transfer
Gabriel Szulanski1 identified three major barriers to internal knowledge transfer within a business. Although these are aimed at firm-wide knowledge, they seem relevant here for individuals as well:
Recipient’s lack of absorptive capacity
Absorptive capacity is defined as a firm’s “ability to recognize the value of new information, assimilate it, and apply it to commercial ends.”2 To encourage this, companies are urged to embrace the value of R&D and continually evaluate new information.
Szulanski notes that such capacity is “largely a function of (the recipient’s) preexisting stock of knowledge.”3 If existing knowledge might help or hinder gathering new information, how might we apply this to an individual?
As information load increases, it lessens your ability to understand it and properly place it within a mental framework.
While the new company may have hired you for your experience and knowledge, you might need to reevaluate some of that knowledge. For instance, it may be difficult to shed and reframe your priorities to be in sync with the new firm.
Causal ambiguity
Causal ambiguity refers to an inability to precisely articulate the reasons behind a process or capability. According to Szulanski, this exists “when the precise reasons for success or failure in replicating a capability in a new setting cannot be determined.”
How did causal ambiguity affect this transfer? While the site’s Home Usage section was promoted because of its successful testing and rollout, the reasons behind its success were never clear. Success of an ecommerce site depends on many factors, among them navigation, length and content of copy and labels, information density, and the site’s interaction design. Since Home Usage’s advantages had never been broken down into its components, and I hadn’t been there when usability tests were conducted, I could only see it as a black box.
To truly assimilate new knowledge, you need context. If none is provided, you need to know how to go out and get it. Ask about the reasons behind a model site. If possible, read any test reports. Keep asking until you understand and validate your conclusions.
An arduous relationship between the source and the recipient
Finally, knowledge transfer depends on the ease of communication and ‘intimacy’ between the source and recipient. Although my relationship with Nick was close, I worked off-site, which eliminated many informal opportunities for knowledge sharing. I couldn’t ask questions during a chance meeting or ‘ambush’ a manager by waiting for her to emerge from a meeting. Since I didn’t have access to Telecom Giant’s internal messaging system, I was limited to more formal methods such as email or phone calls.
A model for knowledge transfer
Thomas Jones offered this approach to knowledge transfer in a Quora post: “As they say in the Army: ‘an explanation, a demonstration, and a practical application.’ Storytelling, modeling, and task assignment … share your stories, model the behaviors you want to see and assign the tasks required to build competency.”4
Keeping “Home Usage” in mind, the story could be “how we came to follow this model,” the demonstration could be the research paper, and a practical application could be your work, evaluated by your lead.
Your ability to retain new information is essential to your success at a new company. However, your ability to understand the reasons behind the information and to place it within a framework is even more important. Some techniques to help you do so:
Be aware of your own design priorities and how they match with the firm’s. Treat the company’s priorities like any user research problem and check them out with your leads and co-workers.
To increase your absorptive capacity, evaluate your preconceptions and be prepared to change them.
Ask for the reasons behind a ‘model’ design. Read research reports if available.
Maximize your contact points. Follow-up emails can target ambiguous responses. If time with the UX leads is scarce, ask your co-workers about their view of priorities, patterns and the reasons behind them.
1 Szulanski, G 1996, ‘Exploring Internal Stickiness: Impediments to the Transfer of Best Practice within the Firm’, Strategic Management Journal, vol. 17, pp. 27-43.
In 2008, Lloyds Pharmacy conducted 20-minute interviews1 with 1,961 UK adults. Almost one in five people admitted to having taken prescription medicines incorrectly; more than eight million adults have either misread medicine labels or misunderstood the instructions, resulting in their taking the wrong dose or taking medication at the wrong time of day. The problem also appeared to be more acute among older patients.
Medicine or patient information leaflets refer to the document included inside medicine packaging and are typically printed on thin paper (see figures 1.1–1.4). They are essential for the safe use of medicines and help answer people’s questions when taking the medicine.
If the leaflet works well, it can lead to people taking the medicine correctly, hopefully improving their health and wellness. If it works poorly, it can lead to adverse side effects, harm, or even death. Consequently, leaflets are heavily regulated in how they must be designed, written, and produced. European2 and individual national legislation sets out the information to be provided, in a specific order, within a medicine information leaflet.
Adding to the design challenge is the fact that the guidelines for how medicine information leaflets are designed change from country to country, and they are often vague.
One of the changes in the 2004 European Commission directive2 was to ensure that all medicine information leaflets ‘reflect the results of consultations with target patient groups.’ In other words, when producing a leaflet, user testing (or ‘readability testing’ as it is also known4) must be done. A satisfactory test outcome is when the information requested within the package leaflet can be found by 90% of test participants, of whom 90% can show that they understand it.3
The diagnostic testing method for medicine information leaflets raises its own challenge: it is more rigorous than the level of user testing most designers are used to.
Additionally, medicine information leaflets are required to be reviewed and approved by a competent authority, which varies from country to country, before being included in the packaging with the medicine.5
Possible Design Improvements
How can these materials be designed so that people end up taking the medicine as directed?
One issue with medicine information leaflets seems to be that most people do not read the document from start to finish, even though it contains important information. Reasons for not reading the leaflet, or only skimming it, could include the sheer amount of information or the leaflet’s design.
Competing sources of information introduce additional confusion. Sometimes the pharmacist will attach to the packaging a sticker with dosage instructions. That sticker can cover the dosage instructions printed on the packaging itself.
There are now potentially three sources of dosage information: the sticker, the packaging, and the leaflet, each with a different density of information. This leads patients to assume that everything they need to know will be on the sticker, a dangerous assumption, because patients do not read through the whole of the medicine information leaflet.
Medicine information leaflets are usually long and contain a wealth of information and complex terminology. An option would be to provide the document written to different educational levels.4
Sometimes leaflets do not make the most of headings and sectioning, which keeps people from quickly finding the information they need. Medicine information leaflets usually receive minimal typographic treatment, featuring only plain text with headings in bold.
Could a more designed and illustrated appearance lead to people taking the medicine in the prescribed manner? A study6 suggests this is the case: Layouts that reduce text density, use purposeful sectioning, highlight key messages, and use a logical type hierarchy helped people to find the right information more quickly.
The example shown in figure 1.5 is a step in the right direction; the different types of information have been given a diversity of treatments to provide emphasis.
In a similar vein, the United States Food and Drug Administration (FDA) recently proposed a redesign of nutrition labels on food packaging. Among the changes were putting calorie counts in large type, adjusting portion sizes to reflect how much Americans actually eat, and additional information about sugars in food.7
The Lloyds Pharmacy research stated that older people make the most mistakes when using medicine information, due to either misreading medicine labels or misunderstanding the instructions. Clearer written instructions would address the comprehension issue; a larger-print design would enable older people, and a wider range of people generally, to use the leaflet more effectively.
Medicine information leaflets are often printed on thin paper and folded many times to fit into the medicine package. There is a lot of show-through from the information printed on the back of the leaflet, which decreases readability. When the leaflet is unfolded, the paper crease marks affect the readability of the text (see figures 1.3 and 1.4). A possible improvement would be to print the leaflet on a thicker paper.
Article 63(2) of the European Commission, 2004,2 states that: ‘The package leaflet must be written and designed to be clear and understandable, enabling the users to act appropriately, when necessary with the help of health professionals.’
Diagnostic testing is examining an existing design to find out how it performs against the agreed performance requirements set at the scoping stage; for example, a satisfactory test outcome is when the information requested within the package leaflet can be found by 90% of test participants, of whom 90% can show that they understand it. Diagnostic testing takes the actions of people using the document as symptoms of the document’s health and is concerned with finding out what is wrong with a design. Diagnostic testing should be used iteratively—that is, repeated until its performance reaches the agreed benchmark. Diagnostic test questions are designed to see whether a consumer can find information quickly and easily and perform actions appropriately.8
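The compound 90%-of-90% criterion is easy to misread, so here is a small sketch that spells out the arithmetic (our own illustration, not part of the regulation):

```python
def passes_readability_test(n_participants: int, n_found: int,
                            n_understood: int) -> bool:
    # Satisfactory outcome: at least 90% of participants find the
    # requested information, and at least 90% of *those who found it*
    # show that they understand it.
    if n_participants == 0 or n_found == 0:
        return False
    found_rate = n_found / n_participants
    understood_rate = n_understood / n_found
    return found_rate >= 0.9 and understood_rate >= 0.9

print(passes_readability_test(20, 19, 18))  # True:  95% found, ~94.7% of those understood
print(passes_readability_test(20, 18, 15))  # False: 90% found, but only ~83% understood
```

Note that the compound criterion implies at least 81% of all participants must both find and understand the information.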
Earlier research from Lloyds Pharmacy1 and Dickinson et al.6 demonstrates that design and writing have the potential to make a real difference in regard to medical errors, and that the design, writing, and production of a medicine information leaflet can have a real positive effect on people’s health.
The design of medicine information leaflets presents some interesting challenges, because it might not be seen as a typical creative graphic design job. Just because leaflets do not contain elaborate text treatments or graphics, however, does not mean creativity is not needed; in fact, creativity is usually lacking in the leaflets typically produced.
Furthermore, creativity in designing medicine information leaflets usually comes in the form of clear writing, clear layout, and user testing: more of an information design challenge than a graphic design one.
The designer’s job is to clearly communicate the desired message. The designer also has to follow guidelines—in this case, not corporate identity guidelines but guidelines laid out in legislation and vetted by a regulatory body.
Effective design can make the difference between a person deciding to read a leaflet or not, between getting the information they need about the medicine they are taking or not. And that difference can be a matter of life or death. The not-so-typical design challenge of medicine information leaflets shows the impact effective design can have.
5 Medicines and Healthcare products Regulatory Agency. (2005). Always Read the Leaflet: Getting the best information with every medicine. Report of the Committee on Safety of Medicines Working Group on Patient Information. London: The Stationery Office. Retrieved January 2014, www.mhra.gov.uk/home/groups/p-la/documents/publication/con2018041.pdf.
6 Dickinson, D., Teather, J., Gallina, S., Newsom-Davis, E. (2010). Medicine package leaflets – does good design matter? Information Design Journal 18(3). Amsterdam: John Benjamins.