Researching Video Games the UX Way

Video games are often overlooked in the scope of usability testing simply because, in a broad sense, their raison d'être is so different from that of a typical functional interface: fun, engagement, and immersion, rather than usability and efficiency. Players are supposed to get a feeling of satisfaction and control from the interface itself, and in that sense, interaction is both a means and an end. The novelty and whimsy of the design occasionally comes at the expense of usability, which isn't always a bad thing. That said, video games still have interfaces in their own right, and designing one that is easy to use and intuitive is critical for players to enjoy the game.

Consider how video games are currently researched: market research-based focus groups and surveys dominate the landscape, measuring opinion and taste in a controlled lab environment, and largely ignoring players’ actual in-game behaviors. Behavior is obviously the most direct and unbiased source of understanding how players interact with the game—where they make errors, where they become irritated, where they feel most engaged. When Electronic Arts engaged Bolt|Peters to lead the player research project for Spore, we set out to do one better than the usual focus group dreck by coming at it from a UX research perspective.


One overarching principle guided the design of this study: we would let the users play the game in a natural environment, without the interference of other players, research moderators, or arbitrary tasks. This took a good bit of planning. Usually, we prefer to use remote research methods, which allow us to talk to our users in the comfort of their own homes. Spore, however, was a top-secret hush-hush project; we couldn't very well send out disks for just anybody to get their hands on. Instead, CEO Nate Bolt came up with what we call a "Simulated Native Environment." For each of the ten research sessions, we invited six participants to our loft office, where they were seated at a desk with a laptop, a microphone headset, and a webcam. We told them to play the game as if they were at home, with only one difference: they should think aloud, saying whatever was going through their minds as they played. Whenever they reached certain milestones in the game, they filled out a quick touchscreen survey at their side, answering a few questions about their impressions of the game.

Elsewhere, Nate, the clients from EA, and I were stationed in an observation room, where we set up projectors to display the players’ gameplay, the webcam video, and the survey feedback on the wall, which let us see the players’ facial expressions alongside their in-game behaviors. Using the microphone headset and the free game chat app TeamSpeak, we were able to speak with players one-on-one, occasionally asking them what they were trying to do or to go a little more in depth about something they’d said or done in the game.

Doesn't that sound simple? Actually, the setup was a little brain-hurting: we had six stations, and each station broadcast its webcam, gameplay, survey, and TeamSpeak feeds live to the observation room. That's 18 video feeds and 6 audio feeds in all, and not only did the two (that's right, two!) moderators have to be able to hear the participants' comments, but so did the dozen or so EA observers. On top of that, everything was recorded for later analysis.

“The feedback we received from users wasn’t based on tasks we’d ordered them to do, but rather on self-directed gameplay tasks the users performed on their own initiative”

The important thing about this approach is that the feedback we received from players wasn't based on tasks we'd ordered them to do, but rather on self-directed gameplay tasks the players performed on their own initiative. We didn't tell players outright what to do or how to do things in the game unless they were critically stuck (which was useful to know in itself). The observed behavior and comments were more stream-of-consciousness and less calculated in nature.

The prime benefits of our approach were the absence of moderators, which mitigated the Hawthorne effect, and the absence of other participants, which eliminated groupthink. Additionally, the players were more at ease: it's hard to imagine these video outtakes (see below) being replicated in a focus group setting. Most importantly, they weren't responding to focus questions; they were just voicing their thoughts aloud, unprompted, which gave us insight into the things they noticed most about the game, rather than what we just assumed were the most important elements.


Over the year-long course of the project, there was one incident that proved to us just how important it was to preserve the self-directed task structure of our research. Because of the multiphase progression of Spore, we believed it was important to carefully structure the sessions to give players a chance to play each phase for a predetermined amount of time, and in a set order, as if they were experiencing the game normally.

Partway through the second session, we started having doubts: even though we weren't telling players what to do within each phase, what if our rigid timing and sequencing was affecting the players' engagement and involvement with the game?

To minimize this, between sessions we made a significant change to the study design: instead of telling players to stop at predetermined intervals and proceed to the next phase of the game, we threw out the timing altogether and allowed them to play any part of the game they wanted, for as long as they wanted, in whatever order they wanted. The only stipulation was that they should try each phase at least once. Each session lasted six hours spread over two nights, so there was more than enough time to cover all five phases, even without prompting players to do so.

Sure enough, we saw major differences in player feedback. We are unable to provide specific findings for legal reasons, but we can say that the ratings for certain phases consistently improved (as compared with previous sessions). Additionally, a few of the lukewarm comments players had made about certain aspects of the game seemed to stem from the limiting research format, rather than the game itself.

It became clear that when conducting game research, it was vitally important to stick to the actual realities of natural gameplay as much as possible, even at the expense of a precisely structured research format. You have to loosen up the control a little bit; video games are, after all, interactive and fun. It makes no more sense to formalize a gameplay experience than it does to add flashy buttons and animated graphics to a spreadsheet application.


There are a lot of ways to go with the native environment approach. Even with all efforts to keep the process as natural and unobtrusive as possible, there are still lots of opportunities to bring the experience even closer to players' typical behaviors. The most obvious improvement is the promise of doing remote game research: allowing participants to play right at home, without even getting up.

Let’s consider what a hypothetical in-home game research session might look like: a player logs into XBox Live, and is greeted with a pop-up inviting him to participate in a one-hour user research study, to earn 8000 XBox Live points. (The pop-up is configured to appear only to players whose accounts are listed as 18 or older, to avoid issues of consent with minors.) The player agrees, and is automatically connected by voice chat to a research moderator, who is standing by. While the game is being securely delivered and installed to the player’s XBox, the moderator introduces the player to the study, and gets consent to record the session. Once the game is finished installing, the player tests the game for an hour, giving his think-aloud feedback the entire time, while the moderator takes notes and records the session. At the end of the session, the game is automatically and completely uninstalled from the player’s XBox, and the XBox Live points are instantly awarded to the player’s account.
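To make the sequence concrete, here is a minimal sketch of that hypothetical session flow. Everything in it is an assumption for illustration: there is no such XBox Live research API, and the function and event names are invented; the code only models the order of steps described above.

```python
# Hypothetical model of a remote in-home research session. None of these
# steps call a real console API; each string simply marks a stage of the
# flow described in the article.

MINIMUM_AGE = 18        # invitation shown only to accounts listed as 18+
REWARD_POINTS = 8000    # points awarded at the end of the session

def run_session(player_age, consents_to_record):
    """Simulate one session and return the ordered event log."""
    log = []
    # 1. The pop-up invitation is suppressed for minors (consent issues).
    if player_age < MINIMUM_AGE:
        log.append("invite_suppressed")
        return log
    log.append("invite_shown")
    # 2. The moderator introduces the study and asks consent to record.
    if not consents_to_record:
        log.append("declined")
        return log
    log.append("consent_recorded")
    # 3. The test build is securely delivered and installed during the intro.
    log.append("build_installed")
    # 4. One hour of think-aloud play, recorded while the moderator takes notes.
    log.append("session_recorded")
    # 5. Cleanup: the build is fully uninstalled, points awarded instantly.
    log.append("build_uninstalled")
    log.append(f"awarded_{REWARD_POINTS}_points")
    return log
```

Calling `run_session(25, True)` walks through the full happy path, while `run_session(16, True)` stops at the age gate — which is the point of modeling it this way: the consent and eligibility checks come before any test content ever reaches the player's console.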

Naturally, there are lots of basic infrastructure advances and logistical challenges to overcome before this kind of research becomes viable:

  • Broadband penetration
  • Participant access to voice chat equipment
  • Online recruiting for games, preferably integrated into an online gaming framework
  • Secure digital delivery of prototype or test build content
  • Gameplay screensharing or mirroring

For many PC users, these requirements are already feasible, and for games with built-in chat and/or replay functionality, the logistics are even easier to meet. Remote research on PCs is already viable (and, in fact, happens to be Bolt|Peters's specialty). Console game research, on the other hand, would likely require a substantial investment by console developers; handheld consoles present even more challenges.

We expect that allowing players to give feedback at home, the most natural environment for gameplay, would yield the most natural feedback, bringing game evaluation and gameplay testing further into the domain of good UX research.

Spore Research: Outtakes from bolt peters on Vimeo.

Science of Fun from bolt peters on Vimeo.


  1. I really enjoyed how you pull the Kimono back, even going so far as to post the sample video. The second video felt a little more like marketing, but it shows a lot of the rationale, and I love the shots of the observation room.

    Did you have to get permission from EA to post these? Was it difficult?

    I’d love to see more write ups and sample videos from other research.

  2. Thanks Austin, glad you liked it. We did have to get permission from both the individual gamers and the Spore team at EA. But the Spore folks were unbelievably cool about giving us permission to talk about the details, with two big caveats:

    (1) It all had to be post-launch. While the game was still in development, when all the research took place, everything was super-duper top secret.
    (2) We cannot ever discuss the actual findings from the study, only the techniques.

    The most difficult part was planning this before we even started testing: drafting really rigorous consent forms for participants and sending them info on how we would use their recordings. Some people opted out, which of course was totally cool. Most were like "sure that's hella funny lulz"

  3. I love the videos! The gamers are hella funny, LULZ. Makes me think about my own game face 😛

  4. Nate and Tony – AWESOME STUFF! I am trying to understand your technique a little bit better. Did you have two moderators, 6 subjects, and 12+ clients all going at the same time? That sounds like mayhem! How did you control the situation – at least from the observation room? For example, I've been in many usability tests where it was just me moderating, one participant, and one colleague hanging out with three clients at the same time – and that was tough to control... granted, we have chatty clients, but that's good; that means they are paying attention and they are engaged with what is being discovered. How did you guys do it? How did you know which subject to focus on at any given point? Write a part II on this! Thanks for sharing.

  5. Nate and Tony, your writeup really illustrates the importance of being flexible and responsive with research methods. Thanks for sharing the story.

  6. Hi Andres— Yeah, you’re absolutely right about the mayhem:

    We had two moderators dividing up the six participants between each other; otherwise you run the risk of asking redundant questions.

    We had the observers chat with us over IM whenever possible, or had them pass questions to us during the (very brief) lulls in testing; we also had a chat room in which the moderators and support staff could all coordinate with one another.

    As for who to focus on, that was largely driven by observing what each player was doing. If they were performing a task the developers were interested in, we switched the focus to them; we also encouraged players to think aloud while playing, not necessarily anticipating a response. If there was a technical glitch or request, we just had the participants wave at the camera to catch our attention.

  7. Good article. I’d been wondering for quite some time how research was performed on games considering the fun factor. It sets a great precedent for future research in games, especially how successful Spore has been. It was also great to hear that EA was willing to commit to such a costly research commitment.

  8. Hi Nate and Tony, this study sounds really fun! I had a quick question about the technical setup. Did you guys use any special remote viewing app for recording the screen running a 3D app, or was it through DV cams? We had issues in the past with 3D game testing, as we could not view the 3D content remotely with a remote viewing client. We tried Morae Recorder, RealVNC, Windows Remote Desktop, DameWare Mini Remote Control… but nothing worked. As soon as the game started, the remote viewer would go black, and it would only come back on the regular desktop. I would really appreciate any suggestions here. Thanks

  9. Trevor and Anuj – awesome questions. I *wish* we were allowed to talk about the findings of the study and how it related to game design decisions, but while EA has been absolutely amazing in letting us discuss the research, they have made it very clear we are not allowed to discuss any of the findings publicly. So I guess we can’t officially comment on what the game getting knocked says about the testing, other than that I would love to be able to talk about it someday.

    For the remote viewing software, we went through all the same frustrations you listed, Anuj, and our final solution was just to run 300ft of VGA cable around the testing area and mirror the gaming laptop output on six separate LCD monitors (DVI+VGA) that matched each of the participants' screens. Then for recording we used ZD Soft game recorder running on each gaming machine. Pretty duct-tape-ish solution, I know, but it worked for our setup. In the future, I want to look at software+hardware that sends VGA signals over ethernet, since as you've noticed the 3D acceleration on games makes the regular software totally FAIL.
