Mentoring as an Investment

Written by: Chris Poteet

Have you ever asked for an update on a project you’d invested a great deal of time and energy in, only to hear “they have completely redesigned it since then”?

I did, and it left me with this very empty feeling.

After some wallowing, I realized I needed a new way to think about how I work and what really matters in my consulting career. My answer: The mark of a truly good consultant is investing in people. Focusing on investing in people ensures that your work will continue to see results long after the application is redesigned, and that is change that matters in the long run.

In the following article, I will describe three areas where we can focus our efforts: mentoring, client education, and our own team members. I hope this reflection will help us all be better consultants and make better investments.

Client mentoring as an investment

There are often opportunities for us to invest in “client-side” people, but they might not be readily apparent. I will give two examples.

On a recent project, I was the designer paired with a recently hired UX director, who was still a little bewildered by the new gig. When we talked, it became apparent that what he needed was someone to mentor him in an intentional way, because he was overwhelmed and feeling lost.

I spent lunches with this gentleman talking about UX strategy and how my company had handled process definition, and I eventually invited him to join me for user research on one of my projects.

Now mind you, this mentoring was not part of any statement of work. It was something I did because it was the right thing to do. It was an opportunity to make an investment much bigger than the project at hand, and seeing someone blossom right before your eyes makes the time investment very much worthwhile.

By the end of the client engagement, he was extremely thankful to have had someone invest time in him and point him in the right direction, which allowed him to lead the UX capability much better than before. It turned out to be the most satisfying work I had done in ages. Fortunately, both my company and the client were extremely appreciative of the time spent with their people.

A second example is on the implementation side. I was the interface developer for an intranet project, and the client had a talented UI person who had questions about the CMS and the approach we were using. To complicate the situation, we came to the project after the client had fired another firm for an inability to deliver. This woman had been given poor advice by the previous vendor, and she naturally had lots of questions about how to do the implementation the right way. It is easy to grow weary of external consultants, and I wanted to ensure that she and her team quickly came to trust us to deliver.

I set up bi-weekly meetings with her throughout the four-month implementation. Before we even started development, she and I mapped out the scope of the work and talked through all kinds of details, down to minutiae like CSS class names. The regular meetings gave her a chance to see the work and give input throughout the entire process.

Another advantage of this approach—beyond those that accrue through collaboration—was that there was no big knowledge handoff at the end. It was something that was built into the project from the beginning.

As companies become leaner, we get a double benefit of increased collaboration and knowledge sharing: First, we spend far less time writing copious documentation because we have been sharing all along, and second, the solution has a much greater chance of long-term success because of the time spent investing in the individuals who take over after we leave.

Client education as an investment

We can also educate clients who are not themselves in the UX world. A big intranet project I worked on was scoped to be responsive, but it became apparent early on that the design we inherited was not well suited for my company to implement; it had not been designed in a mobile-first fashion.

I had two options: I could just let it go, do my work, and move on; or I could take the time to reach out to the client and educate them. I knew this project was already moving forward, but I could still lay a foundation for the client’s future success.

One thing to gauge is whether the client is even interested in such a relationship. Sometimes, despite your best intentions, clients are only interested in timelines and not interested in spending lots of billable time learning or re-learning.

And I had to ask: What did I value? Was I only in it for the money, or could I help enact lasting change and provide real value?

This client was not a UX practitioner himself, and he was looking for an expert he could trust. Working with non-UX people is a challenge, because you have to sell them a bit harder on why doing things the right way matters, even when they do not understand the implications or appreciate the time it takes to do it well.

I pulled him aside in a couple of private meetings and talked through everything with him, from defining responsive design correctly to understanding mobile-first design to home page carousel use and abuse. In the end, it not only furthered our relationship, it also earned me a high level of trust and rapport with the client.

This particular client was open to the discussion and was even excited about extending the relationship, but if you have a hesitant client, don’t give up on them. Show them the quantifiable benefits of this increased collaboration by pointing to your past experience, or by explaining that the time they spend learning with you will pay dividends in the future.

Remember that even if things aren’t changeable in the short term, you can make investments in people for the next project and longer term.

Teams as an investment

There is one last, important group we can’t forget: our co-workers. These are the people who become like family in ways our clients never will. Project after project, these are the people we are tasked to work with. In some ways they are often the most strategic people to invest in, but also sometimes the most difficult, because we can so easily overlook them.

During my firm’s adoption of the CSS preprocessor Sass, my team was mostly junior people looking for leadership in all kinds of areas. This was an opportunity for me to help others use a powerful tool. I took the lead in understanding its implications and how to use it on our teams, and then I spent concentrated time with each member of the UX team to help them understand how to use the technology, both technically and as part of our process. Taking advantage of opportunities like these furthers your relationship with your team members and demonstrates that you care deeply about their professional development.

To this day, those team members reach out to me with questions and best practices because of the trust gained through leading in this way. It is amazing how much leading on even a detail like a CSS preprocessor can help your team members.

We all have different motivations for doing the work that we do, and I imagine that for most of us money—as good as money is—is not the primary factor. Instead, very talented people tend to thrive on being an expert, enacting change, and leading others. True leaders are not given an opportunity to lead—they find those opportunities. Leading inside your organization will make you as close to irreplaceable as you can get.

Online Surveys On a Shoestring: Tips and Tricks

Written by: Gabriel Biller

Design research has always been about qualitative techniques. Increasingly, our clients ask us to add a “quant part” to projects, often without much or any additional budget. Luckily for us, there are plenty of tools available to conduct online surveys, from simple ones like Google Forms and SurveyMonkey to more elaborate ones like Qualtrics and Key Survey.

Whichever tool you choose, there are certain pitfalls in conducting quantitative research on a shoestring budget. Based on our own experience, we’ve compiled a set of tips and tricks to help avoid some common ones, as well as make your online survey more effective.

We’ve organized our thoughts around three survey phases: writing questions, finding respondents, and cleaning up data.

Writing questions

Writing a good questionnaire is both art and science, and we strongly encourage you to learn how to do it well. Most of our tips here are relevant to all surveys, but they are particularly important for low-budget ones. When respondents are compensated only a little, if at all, good survey-writing practices matter even more.

Ask (dis)qualifying questions first

A sacred rule of surveys is not to waste people’s time. If there are terminating criteria, gather those up front and disqualify respondents as quickly as you can if they do not meet the profile. It is also more considerate to terminate them with a message like “Thank you for your time, but we already have enough respondents like you” rather than “Sorry, but you do not qualify for this survey.”

Keep it short

Little compensation means that respondents will drop out at higher rates. Only focus on what is truly important to your research questions. Ask yourself how exactly the information you collect will contribute to your research. If the answer is “not sure,” don’t ask.

For example, it’s common to ask about level of education or income, but if comparing data across different levels of education or income is not essential to your analysis, don’t waste everyone’s time asking the questions. If your client insists on having “nice to know” answers, insist on allocating more budget to pay the respondents for the extra work.

Keep it simple

Keep your target audience in mind and frame your questions the way a normal human being would. Your client may insist on slipping in industry jargon and argue that “everyone knows what it is.” It is your job to make the survey speak the language of the respondents, not the client.

For example, in a survey about cameras, we changed the industry term “lifelogging” to a longer, but simpler phrase “capturing daily routines, such as commute, meals, household activities, and social interactions.”

Keep it engaging

People in real life don’t casually say, “I am somewhat satisfied” or “the idea is appealing to me.” To make your survey not only simple but also engaging, consider using more natural language for response choices.

For example, instead of using standard Likert-scale “strongly disagree” to “strongly agree” responses to the statement “This idea appeals to me” in a concept testing survey, we offered a scale “No, thanks” – “Meh” – “It’s okay” – “It’s pretty cool” – “It’s amazing.” We don’t know for sure if our respondents found this approach more engaging (we certainly hope so), but our client showed a deeper emotional response to the results.

Finding respondents

Online survey tools differ in how much help they provide with recruiting respondents, but most common tools will assist in finding the sample you need, if the profile is relatively generic or simple. For true “next to nothing” surveys, we’ve used Amazon Mechanical Turk (mTurk), SurveyMonkey Audience, and our own social networks for recruiting.

Be aware of quality

Cheap recruiting can easily result in low-quality data. While low-budget surveys will always be vulnerable to quality concerns, there are mechanisms to help you keep your quality bar high.

First of all, know what motivates your respondents. Amazon mTurk commonly pays $1 for a so-called “Human Intelligence Task,” which may consist of taking an entire survey. In other words, someone completing four 15-minute surveys earns as little as $4 an hour. As such, some mTurk Workers may try to cheat the system and complete multiple surveys for which they are not qualified.

SurveyMonkey, on the other hand, claims that their Audience service delivers better quality, since the respondents are not motivated by money. Instead of compensating respondents, SurveyMonkey makes a small donation to the charity of their choice, thus lowering the risk of people being motivated to cheat for money.

Use social media

If you don’t need thousands of respondents and your sample is pretty generic, the best resource can be your own social network. For surveys with fewer than 300 respondents, we’ve had great success tapping into the collective social networks of Artefact’s members, friends, and family. Write a request and ask your colleagues to post it on their networks. Of course, volunteers still need to match the profile: when we send an announcement, we include a very brief description of whom we are looking for and send volunteers to a qualifying survey. This approach costs little but yields high-quality results.

We don’t pay our social connections for surveys, but many will be motivated to help a friend and will be very excited to hear about the outcomes. Share with them what you can as a “thank you” token.

For example, we used social network recruiting in the early stages of Purple’s development. When we revealed the product months later, we posted a “thank you” link to the article on our social networks. To our surprise, many people remembered the survey they had taken and were grateful to see the outcomes of their contribution.

Over-recruit

If you are trying to hit a certain sample size of “good” data, you need to over-recruit so you can remove the “bad” data. No survey is perfect and all can benefit from over-recruiting, but it is almost a must for low-budget surveys. There are no hard rules, but we suggest over-recruiting by at least 20% to hit the sample size you need at the end. Since the whole survey costs you little, over-recruiting costs little too.

Cleaning up data

Cleaning up your data is another essential step of any survey, and it is particularly important for one on a tight budget. A few simple tricks can increase the quality of responses, particularly if you use public recruiting resources. When choosing a survey tool, check what mechanisms are available to help you clean up your data.

Throw out duplicates

As mentioned earlier, some people may be motivated to complete the same survey multiple times, even under multiple profiles. We’ve spotted this when working with mTurk respondents by checking their Worker IDs: we had multiple cases where the same ID had been used to complete a survey several times. We ended up throwing away all responses associated with those “faulty IDs” and gained more confidence in our data as a result.
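If your responses come back as a spreadsheet, this check is easy to automate. Below is a minimal sketch in Python with pandas, assuming a CSV export with a WorkerId column; the file and column names are placeholders for whatever your survey tool actually produces.

```python
# Minimal sketch: drop every response tied to a repeated mTurk Worker ID.
# "survey_responses.csv" and the "WorkerId" column are illustrative names.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Worker IDs that appear more than once are treated as "faulty."
counts = responses["WorkerId"].value_counts()
faulty_ids = counts[counts > 1].index

# Discard all responses associated with those IDs, not just the extras.
clean = responses[~responses["WorkerId"].isin(faulty_ids)]
print(f"Kept {len(clean)} of {len(responses)} responses")
```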

Check response time

With SurveyMonkey, you can calculate the time spent on the survey using the StartTime and EndTime data. We benchmarked the average completion time by piloting the survey in the office, which gives you a reasonably robust sanity check.

If the benchmark time is eight minutes and you have surveys completed in three, you may question how carefully respondents were reading the questions. We flag such outliers as suspect and don’t include them in our analysis.
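The same kind of check can be scripted. The sketch below assumes StartTime and EndTime columns and flags anything completed in less than half the piloted benchmark; both the column names and the cutoff are assumptions to adapt to your own export and your own judgment.

```python
# Minimal sketch: flag responses completed suspiciously fast.
# Column names and the cutoff are assumptions, not a SurveyMonkey standard.
import pandas as pd

responses = pd.read_csv("survey_responses.csv",
                        parse_dates=["StartTime", "EndTime"])

benchmark_minutes = 8                    # average time from the office pilot
cutoff_minutes = benchmark_minutes / 2   # e.g., under four minutes is suspect

minutes_spent = (responses["EndTime"] -
                 responses["StartTime"]).dt.total_seconds() / 60
responses["too_fast"] = minutes_spent < cutoff_minutes

# Exclude flagged outliers from the analysis, or review them by hand first.
analysis_set = responses[~responses["too_fast"]]
```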

Add a dummy question

Dummy questions help filter out respondents who answer survey questions at random. A dummy question requires the respondent to read carefully and then respond; people who click and type at random might answer it correctly, but it is unlikely. If the answer is incorrect, that is another flag we use to mark a respondent’s data as suspect.
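This check follows the same flagging pattern. In the sketch below, the attention_check column and its expected answer are hypothetical stand-ins for your own dummy question and its instructed response.

```python
# Minimal sketch: flag respondents who fail the dummy (attention-check) question.
# "attention_check" and EXPECTED_ANSWER are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

EXPECTED_ANSWER = "Blue"   # whatever the dummy question tells people to select
responses["failed_check"] = responses["attention_check"] != EXPECTED_ANSWER

suspect = responses[responses["failed_check"]]
print(f"{len(suspect)} responses flagged as suspect")
```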

Low-budget surveys are challenging, but not necessarily bad, and with a few tricks you can make them much more robust. If they are used as an indicative, rather than definitive, mechanism to supplement other design research activities, they can bring “good enough” insights to a project.

Educate your clients about the pros and cons of low-budget surveys and help them decide whether or not they want to invest more to get greater confidence in the quantitative results. Setting these expectations up front is critical for the client, and you never know: it could also be a good tool for negotiating a higher survey budget to begin with!

Enhancing the Mind-Meld

Written by: Mark Richman

Which version of the ‘suspended account’ dashboard page do you prefer?

Version A

Version A highlights the address with black text on a soft yellow background.

Version B

Version B does not highlight the service address.

Perhaps you don’t really care. Each one gets the job done in a clear and obvious way.

However, as the UX architect of the ‘overview’ page for a huge telecom leader, I had to tell the team which treatment we’d be using.

I was a freelancer with only four months’ tenure on this job, and in a company as large, diverse, and complex as this one, four months isn’t a very long time. There are a ton of things to learn—how their teams work, the latest visual standards, the expected fidelity of wireframes, and, most of all, how to select the ‘current’ interaction standards from a site with thousands of pages, many of which were inherited from different companies following acquisitions or created at different points in time. Since I worked off-site, I had limited access to subject matter experts.

Time with the Telecom Giant’s UX leads is scarce, but Nick, my lead on this project, was a great guy with five years at the company, much of it spent on the Overview page and similar efforts. He and I had spent a lot of phone time going over this effort’s various challenges.

Version A, the yellow note treatment, had been created to highlight the suspended location if the “Home Phone” account covered more than one address. After much team discussion, we realized that this scenario could not occur, but since the new design placed what seemed like the proper emphasis on the ‘Account Suspended’ situation, I was confident that we’d be moving forward with version A.

So, why was I surprised when Nick said we’d “obviously” go with version B?

Whenever I start with a new company, I try to do a mind meld with co-workers to understand their approach, why they made certain decisions, and learn their priorities. Unless I’m certain there is a better way, I don’t want to go in with my UX guns blazing—I want to know whether they’d already considered other solutions, and if so, why they were rejected. This is especially true in a company like Telecom Giant, which takes user experience seriously.

I’d worked so closely with Nick on this project that I thought I knew his reasoning inside out. And when he came to a different conclusion, I wondered whether I’d ever be able to understand the company’s driving forces. If I wasn’t on the same page with someone who had the same job and a similar perspective, with whom I’d spent hours discussing the project, what chance did I have of seeing eye-to-eye with a business owner on the other side of the country or a developer halfway across the world?

Historical perspective

Version A (the yellow note treatment) was created by Ken, a visual designer with an intimate knowledge of the telco’s design standards. It was consistent with other instances where the yellow note was used to highlight an important situation.

Version B was the existing model, which had worked well in a section of the site that had been redesigned a year earlier following significant user testing. Because of its success, this section, “Home Usage,” was earmarked as the model for future redesigns.

Once I had a bit of distance from the situation, I realized what the problem was. Although I had worked very closely with Nick, I didn’t have the same understanding of the company’s priorities.

My priorities were:

  • Consistency across the site
  • Accessibility
  • Using the most up-to-date and compelling interaction and design patterns
  • Modeling redesign efforts on “Home Usage” where possible

Because Nick had a background in visual design, I thought he would want to use Ken’s design pattern, which seemed both more visually distinct and a better match for the situation. But Nick preferred the Home Usage pattern, and he may have had good reasons for doing so.

First, Home Usage had been thoroughly tested, and since this was an ecommerce site with many hard-to-disentangle components, testing could have provided insight into its success factors, especially if individual components had been tested separately.

Second, by following the existing pattern, we wouldn’t wind up with two different treatments for the same situation. Even though the yellow note treatment might be more prominent, was it significant enough to shoulder the cost of changing the pattern in the existing Home Usage flow?

Now that I knew at least one piece of the puzzle, I wondered how I might have achieved a more complete ‘mind meld’ with Nick, so that we were more closely in sync.

Know your priorities—and check them out

Just being aware of the priorities I was following would have given me the chance to discuss them directly with Nick. With so much information to take in, I hadn’t thought to clarify my priorities and compare them with my co-workers’, but doing so would have made it easier to sync up.

Other barriers to knowledge transfer

Gabriel Szulanski1 identified three major barriers to internal knowledge transfer within a business. Although these are aimed at firm-wide knowledge, they seem relevant here for individuals as well:

Recipient’s lack of absorptive capacity

Absorptive capacity is defined as a firm’s “ability to recognize the value of new information, assimilate it, and apply it to commercial ends.”2 To encourage this, companies are urged to embrace the value of R&D and continually evaluate new information.

Szulanski notes that such capacity is “largely a function of (the recipient’s) preexisting stock of knowledge.”3 Given that existing knowledge can either help or hinder the absorption of new information, how might we apply this to an individual?

  • As your information load increases, your ability to understand new information and place it properly within a mental framework decreases.
  • While the new company may have hired you for your experience and knowledge, you might need to reevaluate some of that knowledge. For instance, it may be difficult to shed and reframe your priorities to be in sync with the new firm.

Causal ambiguity

Causal ambiguity refers to an inability to precisely articulate the reasons behind a process or capability. According to Szulanski, this exists “when the precise reasons for success or failure in replicating a capability in a new setting cannot be determined.”

How did causal ambiguity affect this transfer? While the site’s Home Usage section was promoted because of its successful testing and rollout, the reasons behind its success were never made clear. The success of an ecommerce site depends on many factors, among them navigation, the length and content of copy and labels, information density, and the site’s interaction design. Since Home Usage’s advantages had never been broken down by component, and I hadn’t been there when the usability tests were conducted, I could only see it as a black box.

To truly assimilate new knowledge, you need context. If none is provided, you need to know how to go out and get it. Ask about the reasons behind a model site. If possible, read any test reports. Keep asking until you understand and validate your conclusions.

An arduous relationship between the source and the recipient

Finally, knowledge transfer depends on the ease of communication and ‘intimacy’ between the source and recipient. Although my relationship with Nick was close, I worked off-site, which eliminated many informal opportunities for knowledge sharing. I couldn’t ask questions during a chance meeting or ‘ambush’ a manager by waiting for her to emerge from a meeting. Since I didn’t have access to Telecom Giant’s internal messaging system, I was limited to more formal methods such as email or phone calls.

A model for knowledge transfer

Thomas Jones offered this approach to knowledge transfer in a Quora post: “As they say in the Army: ‘an explanation, a demonstration, and a practical application.’ Storytelling, modeling, and task assignment … share your stories, model the behaviors you want to see and assign the tasks required to build competency.”4

Keeping “Home Usage” in mind, the story could be “how we came to follow this model,” the demonstration could be the research paper, and a practical application could be your work, evaluated by your lead.

In conclusion

Your ability to retain new information is essential to your success at a new company. However, your ability to understand the reasons behind that information and to place it within a framework is even more important. Some techniques to help you do so are:

  • Be aware of your own design priorities and how they match with the firm’s. Treat the company’s priorities like any user research problem and check them out with your leads and co-workers.
  • To increase your absorptive capacity, evaluate your preconceptions and be prepared to change them.
  • Ask for the reasons behind a ‘model’ design. Read research reports if available.
  • Maximize your contact points. Follow-up emails can target ambiguous responses. If time with the UX leads is scarce, ask your co-workers about their view of priorities, patterns and the reasons behind them.

Further reading

1 Szulanski, G. (1996). “Exploring Internal Stickiness: Impediments to the Transfer of Best Practice within the Firm.” Strategic Management Journal, 17, 27–43.

2 “Absorptive capacity.” Wikipedia entry.

3 Dierickx, I., and Cool, K. (1989). “Asset Stock Accumulation and Sustainability of Competitive Advantage.” Management Science, 35 (December), 1504–1511.

4 “What patterns of behavior have proven to be most helpful in knowledge transfer?” Quora post.

When Information Design is a Matter of Life or Death

Written by: Thomas Bohm

In 2008, Lloyds Pharmacy conducted 20-minute interviews1 with 1,961 UK adults. Almost one in five people admitted to having taken prescription medicines incorrectly; more than eight million adults have either misread medicine labels or misunderstood the instructions, resulting in them taking the wrong dose or taking medication at the wrong time of day. In addition, the overall problem seemed to be more acute among older patients.

Medicine or patient information leaflets are the documents included inside medicine packaging, typically printed on thin paper (see figures 1.1–1.4). They are essential for the safe use of medicines and help answer people’s questions about taking the medicine.

If the leaflet works well, it can lead to people taking the medicine correctly, hopefully improving their health and wellness. If it works poorly, it can lead to adverse side effects, harm, or even death. Consequently, leaflets are heavily regulated in the way they need to be designed, written, and produced. European2 and individual national legislation sets out the information to be provided, in a specific order, within a medicine information leaflet.

Figure 1.1: Paracetamol packaging (front).

Figure 1.2: Paracetamol packaging (back).

Figure 1.3: Paracetamol medicine information leaflet (front).

Figure 1.4: Paracetamol medicine information leaflet (back).

Adding to the design challenge is the fact that the guidelines for how medicine information leaflets should be designed change from country to country, and the guidelines are often vague.

One of the changes in the 2004 European Commission directive2 was to ensure that all medicine information leaflets ‘reflect the results of consultations with target patient groups.’ In other words, when producing a leaflet, user testing (or ‘readability testing’ as it is also known4) must be done. A satisfactory test outcome is when the information requested within the package leaflet can be found by 90% of test participants, of whom 90% can show that they understand it.3

The diagnostic testing method for medicine information leaflets also raises a unique challenge when designing leaflets and is more rigorous than the level of user testing most designers are used to.

Additionally, medicine information leaflets are required to be reviewed and approved by a competent authority, which varies from country to country, before being included in the packaging with the medicine.5

Possible Design Improvements

How can these materials be designed so that people end up taking the medicine as directed?

One issue with medicine information leaflets seems to be that most people do not read the document from start to finish, even though it contains important information. People may skip or only skim the leaflet because of the amount of information it contains or because of the way it is designed.

Competing sources of information introduce additional confusion. Sometimes the pharmacist will attach a sticker with dosage instructions to the packaging, and that sticker can cover the dosage instructions printed on the packaging itself.

There are now potentially three sources of dosage information: the sticker, the packaging, and the leaflet, all with different densities of information. This encourages the patient to assume that everything they need to know will be on the sticker, a dangerous assumption, because patients then do not read through the whole of the medicine information leaflet.

Medicine information leaflets are usually long and contain a wealth of information and complex terminology. One option would be to provide versions of the document written to different educational levels.4

Sometimes leaflets do not make the most of headings and sectioning, which keeps people from quickly finding the information they need. Medicine information leaflets are usually minimally treated, featuring only plain text with headings in bold.

Could a more designed and illustrated appearance lead to people taking the medicine in the prescribed manner? A study6 suggests this is the case: Layouts that reduce text density, use purposeful sectioning, highlight key messages, and use a logical type hierarchy helped people to find the right information more quickly.

The example shown in figure 1.5 is a step in the right direction; the different types of information have been given a diversity of treatments to provide emphasis.

Figure 1.5: Redesigned medicine information leaflet from Dickinson et al. (2010), www.consumation.com.

In a similar vein, the United States Food and Drug Administration (FDA) recently proposed a redesign of nutrition labels on food packaging. Among the changes were putting calorie counts in large type, adjusting portion sizes to reflect how much Americans actually eat, and additional information about sugars in food.7

The Lloyds Pharmacy research stated that older people make the most mistakes when using medicine information, due either to misreading medicine labels or to misunderstanding the instructions. Clearer written instructions would address the comprehension issue; a larger-print design would enable older people, and a wider range of people generally, to make better use of the leaflet.

Medicine information leaflets are often printed on thin paper and folded many times to fit into the medicine package. There is a lot of show-through from the information printed on the back of the leaflet, which decreases readability. When the leaflet is unfolded, the paper crease marks affect the readability of the text (see figures 1.3 and 1.4). A possible improvement would be to print the leaflet on a thicker paper.

Article 63(2) of the European Commission, 2004,2 states that: ‘The package leaflet must be written and designed to be clear and understandable, enabling the users to act appropriately, when necessary with the help of health professionals.’

Diagnostic testing is examining an existing design to find out how it performs against the agreed performance requirements set at the scoping stage; for example, a satisfactory test outcome is when the information requested within the package leaflet can be found by 90% of test participants, of whom 90% can show that they understand it. Diagnostic testing takes the actions of people using the document as symptoms of the document’s health and is concerned with finding out what is wrong with a design. Diagnostic testing should be used iteratively—that is, repeated until its performance reaches the agreed benchmark. Diagnostic test questions are designed to see whether a consumer can find information quickly and easily and perform actions appropriately.8

Conclusion

Earlier research from Lloyds Pharmacy1 and Dickinson et al.6 demonstrates that design and writing have the potential to make a real difference with regard to medical errors, and that the design, writing, and production of a medicine information leaflet can have a real, positive effect on people’s health.

The design of medicine information leaflets provides some interesting challenges because it might not be seen as a typical creative graphic design job. Just because leaflets do not contain elaborately designed text or graphics, however, does not mean creativity is not needed; in fact, creativity is usually what is lacking in the leaflets typically produced.

Furthermore, creativity when designing medicine information leaflets usually comes in the form of clear writing, clear layout, and user testing—more of an information design challenge than a graphic design one.

The designer’s job is to clearly communicate the desired message. The designer also has to follow guidelines—in this case, not corporate identity guidelines but guidelines laid out in legislation and vetted by a regulatory body.

Effective design can make the difference between a person deciding to read a leaflet or not, and between their getting the information they need about the medicine they are taking or not. That difference can be a matter of life or death. The not-so-typical design challenge of medicine information leaflets shows just how much impact effective design can have.

Endnotes

1 Lloyds Pharmacy. (2008). More than eight million patients admit medicine mistakes. Retrieved April 2008, from www.lloydspharmacy.com/wps/portal/aboutus/pr.

2 European Commission. (2004). Directive 2004/27/EC of the European Parliament and of the Council of 31 March 2004 amending Directive 2001/83/EC on the Community code relating to medicinal products for human use. Brussels: European Commission. Retrieved January 2014, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2004:136:0034:0057:EN:PDF.

3 European Commission. (2009). Guideline on the readability of the labelling and package leaflet of medicinal products for human use. Revision 1. Brussels: European Commission. Retrieved January 2014, http://ec.europa.eu/health/files/eudralex/vol-2/c/2009_01_12_readability_guideline_final_en.pdf.

4 van der Waarde, K. (2008a). Designing information about medicine for people. Retrieved April 2014, from www.valedesign.org.br/pdf/karen.pdf.

5 Medicines and Healthcare products Regulatory Agency. (2005). Always Read the Leaflet: Getting the best information with every medicine. Report of the Committee on Safety of Medicines Working Group on Patient Information. London: The Stationery Office. Retrieved January 2014, www.mhra.gov.uk/home/groups/p-la/documents/publication/con2018041.pdf.

6 Dickinson, D., Teather, J., Gallina, S., Newsom-Davis, E. (2010). Medicine package leaflets – does good design matter? Information Design Journal 18(3). Amsterdam: John Benjamins.

7 Tavernise, S. (2014). New F.D.A. Nutrition Labels Would Make ‘Serving Sizes’ Reflect Actual Servings. New York Times. 27 February 2014. Retrieved September 2014, from http://www.nytimes.com/2014/02/27/health/new-fda-nutrition-labels-would-make-serving-sizes-reflect-actual-servings.html.

8 Sless, D., and Shrensky, R. (2007). Writing about medicines for people. Australia: Communication Research Institute and The Australian Self-Medication Industry.

Teaching/Learning UX: Considerations for Academic-Industry Partnerships

Written by: Guiseppe Getto

Higher education is poised to help produce the next generation of user experience designers, but we can’t do it alone. In the wake of Fred Beecher’s recent “Ending the UX Designer Drought” and studies by Onward Search, UserTesting, and the Nielsen Norman Group, it is clear that the UX market is booming and that UX designers enjoy a high level of job satisfaction. It is also clear that too few UX professionals exist to meet current demand.

And while apprenticeship programs like Fred’s can help meet much of this demand, those of us in higher ed who have hitched our research, teaching, and service agendas–our entire professional identities–to UX are uniquely positioned to help students navigate to those apprenticeship programs, or even to shoulder some of this training ourselves.

If we are to do so, however, we need members of industry to partner with us.

Obstacles to establishing UX workflows in higher ed

Before discussing ways that higher ed can help produce UX designers, let me outline the obstacles that we academics face, obstacles that make it difficult to nimbly respond to industry realities. I’m sure it won’t be news to anyone, for example, that the academy is very siloed.

At the same time that knowledge in the academy is often chopped up into discrete units called disciplines, however, there are other obstacles to establishing UX as a real concern that are specific to state- and federally funded institutions of higher learning. The first of these, which must not be underestimated, is a status quo mentality that is endemic to academia but that doesn’t come from academics.

That may sound confusing, but remember that our schools are tied to state and federal budgets that are ultimately decided by politicians, not educators. Every program, major, minor, course, or even revision to an existing course, has to be vetted by administrators under increasing pressure by legislators to make every dollar count. This is often called a “strategic plan” in higher ed, and is our version of a business process model.

In this context, administrative response historically has been to veto any new programs that don’t clearly extend from an existing and well-established discipline. This may sound like a death knell for UX, an interdisciplinary practice that by its nature draws on knowledge from a wide variety of contexts, some academic and some not. And though it is a significant challenge, it is not an insurmountable one. It simply means that being a UX professional in higher ed means working within, and sometimes against, disciplinary boundaries.

What I mean by “disciplinary boundaries” is that every academic must do work that is understandable as a form of research, teaching, and/or service to a particular academic discipline and to a particular academic institution. Let me give a personal example to demonstrate what this workflow looks like. From my own discipline of technical communication and my own program housed at East Carolina University, I’m responsible for doing research, teaching, and service work that is recognizable as a contribution to the field of technical communication as well as to a mid-size, doctoral-granting state university.

This means that whatever I research has to be intelligible to other members of my field or I won’t get published and I won’t get tenure. This also means that I’m responsible for teaching a dizzying array of skill sets from technical writing to web design to UX design. Service is notoriously ill-defined in higher ed but typically requires serving on committees, serving on boards of professional organizations, and some form of community service (e.g. service-learning or volunteering).

I also should mention that there are exceptions to this workflow. Academic programs capable of reliably training students in the discipline of UX all on their own have sprung up relatively recently at places like Kent State, Michigan State, and the University of Washington. Largely interdisciplinary, these programs require a kind of perfect storm to form, such as a massive restructuring of a university system, a cluster of faculty hires in a particular specialization, or some very adventurous college administrators.

For the most part, what I’ve described is the uber-structure many academic professionals must work within. But there are ways to inject UX practices into this structure. Say I’m asked to teach a technical writing class, for instance, which is a central course in our program. A big focus of this class is the creation of documentation, because that’s part of the course description that was approved. But how is “documentation” increasingly being produced? In agile environments, where UXers or tech writers often build it directly into the information architecture of the product they’re designing. Following this trend, my recent technical writing course featured a series of learning modules that focused as much on traditional modes of documentation as it did on content strategy and IA.

Why is this important? Because if most academic professionals must work within this basic workflow, then we must always produce scholarship, teach courses, and engage in service that furthers the missions of our institutions as well as the knowledge-making practices of our disciplines. For UX partnerships to work in higher ed, they must be enacted as a form of research, teaching, and/or service.

Why academic-industry partnerships MUST exist

At the same time, it is also arguable that industry organizations cannot meet the need for new UX professionals alone, and more importantly, they shouldn’t have to. Currently, there are over 20 million students enrolled in institutions of higher education. And though of course only a fraction of those students might have the inclination to become a UX designer, that is simply too big a pool of potential new designers to pass up.

Academics can’t train UX professionals by ourselves, either. The discipline moves too fast and we are tied to a workflow that is very different from that of industry. This is thus a partnership that must happen to sustainably produce sufficient UX professionals to meet current demand. Millions of people from all walks of life pass through our classrooms, and if we can reach even a tiny percentage of those people and can set them on a path towards a career in UX, then UX professionals in both the academy and industry have an obligation to do so.

So, how can academic-industry partnerships be built that allow academics to keep their jobs? That’s what I turn to next.

Research partnerships

Academic research can be produced in many forms: empirical, theoretical, practice-based, quantitative, qualitative, and the like. All we really need from industry is access. We need to be able to see inside your organizations, to pick your brains through research interviews, and to survey you about new trends. We need to do focus groups about theories we have held for years to see if they pass muster with industry realities.

The best thing you can do for us in this context, in other words, is collaborate with us on our research projects.

Teaching partnerships

The largest venue within which UX could take off in higher ed is teaching. In my more than ten years as a college-level instructor, I have taught 4-8 courses per year, with 20-30 students per course. That means that, conservatively, I have personally taught over 1,500 students, ranging from college freshmen all the way up to graduate students, in topics ranging from basic writing to UX design.

The only reason I know what UX is, and how to give students some practice in it, however, is because I have reached out to folks in industry and they have responded. I have gone to conferences like the IA Summit. I have completed dozens of webinars and workshops. I am an active member of my local chapter of the UXPA. And I have collaborated with industry partners on projects ranging from individual articles on how to teach UX to entire courses in UX.

If it weren’t for these partnerships, when I was recently asked to teach a graduate level course in UX for our program, I wouldn’t have been able to do so.

The best thing you can do for those of us who have opportunities to teach UX, then, is to help us with the subject matter we teach in our courses. We’re educators. We know how to teach, but we will never be as cutting-edge at UX as those of you who work as UX designers every single day. We need help with what to teach.

Service partnerships

There are also endless opportunities to get involved in academic service that fuels the UX discipline. This can range from the creation of book clubs and informal meet-ups to service-learning partnerships and other forms of contextualized learning. In any given community, there are thousands of local non-profits and schools in need of UX help, from content strategy to CMS development to usability testing existing websites. Hardly any of these organizations can afford to hire a UX professional, but students could work in partnership with a UX professional to serve them, and could develop expertise as a result.

On any given college campus, there are also dozens of student organizations focused on everything from business management to community service to web design, many of which look for guest speakers to talk to them about career opportunities, how to land a job, how to gain traction in a new field, and so on. There is simply no shortage of mentoring opportunities for industry professionals.

Mentoring is thus the simplest and yet most valuable form of service. And it is incredibly simple: think about the first opportunity you got in UX and try to steer someone else toward that kind of opportunity, or even create one for them in your own organization.

This is what these partnerships could look like

We academics are the people who write those textbooks you were forced to buy in every introductory class in college. Imagine if they were relevant. Imagine if you helped us write them. Or imagine if you helped us write how-to articles for our academic journals so that other academic professionals could learn about UX in a venue that is intelligible to them. Or imagine if you helped perform research projects that were actually sound instances of UX design so that we could begin to solve our own UX problems and maybe even contribute some new solutions for you as well.

Imagine if you volunteered to come speak at our conferences about any of these topics, or even at informal meet-ups and colloquia at our institutions. We can rarely afford to pay you what you’re worth, or what you get for industry-sponsored conferences, but we will always appreciate the knowledge you bring.

Think of it this way: how many eighteen-year-olds graduate from high school and when asked by their guidance counselor what they want to be, say: “I want to be a UX designer!” Few, if any. Many of those bright young people enroll in college courses, however, part-time or full-time. Perhaps they decide to major in computer science, where they first learn about usability and its role in application development. Or perhaps they take a class in technical communication where they first encounter the central concepts of information architecture. Those of us in higher ed are in a position to introduce UX on a scale upon which it has never before been introduced.

But we need your help. If you don’t think we’re preparing students for opportunities as UX designers fresh out of our programs, then tell us that. Better yet: agree to be a guest speaker in our classes. Better yet: help us create learning modules on UX that we can use in multiple classes. Better yet: help us create and advocate for an entire class in UX. Better yet: volunteer to co-teach the class with us. Better yet: help us form an industry advisory committee for our entire program. The list goes on and on.

If you want that 18-year-old who is fresh out of high school to consider going into a field they may have never heard of, a field that probably first makes sense to them as a subset of an existing major, partner with your local university or community college to make sure that 18-year-old does hear about UX, to make sure they get some training in it, and to make sure they’re on a path toward being useful to you someday.

This is how all of this makes for a stronger UX community

If nothing else, I hope to start a conversation around expanding UX partnerships between interested professionals within both higher ed and industry. These partnerships exist, but they need to be the norm, not the exception. If every industry professional who reads this article volunteers as a research partner, teacher consultant, mentor, or service-learning partner at their local university, UX could become one of the most sought-after careers in higher ed. Students could be beating down academic doors asking for more experience in UX.

That kind of demand could create the kind of pressure we academics need to start new courses and majors in UX. If our students are demanding UX, then we can give it to them. If our students have no idea what UX is because they don’t hear about it until their senior year, then it becomes very difficult to create high-quality UX learning opportunities.

In the past three years, since my real interest in UX began, I have been working diligently to introduce students to UX in every course I teach. That’s a tall order considering that I can’t simply teach five sections (my current teaching load) of “Introductory UX Design” every year. I have had the opportunity to teach some standalone UX courses, but, as I mentioned above, mostly I introduce UX through individual learning modules and homework assignments in classes in technical communication, business writing, and web design.

But I have also met resistance, and not only from the academic side. A lot of industry-based people have helped me and inspired me in immeasurable ways, but I have also been met with downright suspicion. I have been asked if I feel like I am a form of competition for industry-based apprenticeship programs and for the newly formed Unicorn Institute, the UX field’s first standalone design school. Mostly, I have been asked repeatedly how I know enough about UX to teach it.

And the simple fact of the matter is: I don’t. I have been researching and practicing UX for three years now, research that has included in-depth practice with nearly every UX method on the market. I have published both research and practice-based articles on it (some of which were co-authored by UX designers), and have taught numerous courses in it, but I am not a full-time UX designer. Professionals like me are also all that stand between industry organizations looking to hire new UX talent and those 20 million people I mentioned earlier, few of whom start college even knowing what UX is.

So, rather than seeing professionals like me as a form of competition–we simply don’t have the resources to compete with industry-based organizations, even if we wanted to–my invitation is to see us as partners in helping to improve the quality of UX education regardless of where students and mentees are getting that education.

Let’s face it: UX is difficult for many of us who study it, teach it, and do it for a living to define. We owe it to the next generation of UX professionals to introduce them to UX as soon as possible in their professional development, to be the frontline of UX education, so-to-speak. That’s the only way to make certain that students who are dedicated to becoming UX professionals have the opportunities they need to make that possibility a reality.