5 Steps to Your First User Test

The barrier to entry for user testing has never been lower. Forget size and budget: companies that interact with their users regularly will swim, while others sink. Despite the ease of entry, many still neglect testing and rely on internal opinions to guide product decisions.

Though investing in a dedicated user researcher is ideal, the lack of one shouldn’t stop you. Look to your team for a strategic thinker and a good listener to get started. When you witness the insights and aha moments that come from a test, the value will be clear as day.

These 5 steps are a simple introduction to user testing for non-researchers. It’s a flexible process to follow, one that avoids complexity in favor of action. With clear goals and a sincere desire to gather honest feedback, you’re already well on your way. Jump in and get started. You may never look back.

STEP 1: Think First, Think Hard

User testing doesn’t start with tools and software. It doesn’t even start with a test plan or script. It starts with a question, paired with hard thinking about the goals and objectives of the test. In other words: “why talk to our users?”

Conducting a round of testing doesn’t mean you’ll see a measurable and immediate result from it. If you don’t think hard enough at the outset to answer the big ‘why,’ you’re setting yourself up for problems. Those problems could come in the form of invalid findings or, worse yet, confirmation bias.

Know the question (or questions) you’re seeking to answer. Have a goal, and be clear on what you’re testing and why you’re testing it. Figure out which types of people in your target market will yield useful insights.

STEP 2: Plan the Test, Check It Twice

Once your goals are set, put together a test plan. Don’t worry about what it looks like. Focus on its purpose: clarifying how you intend to run the sessions.

Will it be in person or remote? Random or preselected participants? Recorded audio/video, or do you need a designated notetaker? Run through a few mock sessions with colleagues to check the session flow and improve your comfort with it.

Know what kinds of tools are available, but don’t seek to be an expert in them all. There are many, at different price points and with different functionality. Assess which one best answers your question and achieves your goal.

Write a script, but keep in mind you’ll likely deviate from it. Like everything we do, make the test as usable as you can (both for yourself and for participants). Orient your participants to how the test will work, and to how vital their feedback is.

Develop a contingency plan, since it’s unlikely everything will work as well as you expect. Test the plan with your colleagues, and prepare for technical and non-technical snafus. If you run into a lot of difficulties, reevaluate what’s causing the problem and correct course.

STEP 3: Do It Right

Because you planned for this step, the doing, it will go as smoothly as possible and provide the best inputs for steps 4 and 5. The most important part of ‘doing it right’ is to relax and put the participants at ease. If you stay calm and work through your test plan, all will work out.

Assume participants know nothing about technology, and avoid tech jargon as much as you can. Even terms that seem obvious to you could be confusing. Sure, you know it’s called a navigation bar, but they may understand it as “those 5 words at the top of the page.” Ask yourself, “would Grandma understand this question?” If not, rewrite it. Always err on the side of caution here, or participants may feel embarrassed to admit they don’t know what you mean.

You’ll be successful in this step to the extent that you can get your participants talking. Think of yourself less as a researcher looking for an answer to a question and more as a psychoanalyst trying to understand the inner workings of their minds. Sure, sometimes you may prefer a simple answer, but never miss the ‘why’ beneath it. After all, a data point is far less valuable without an explanation of the ‘why.’

STEP 4: Listen, Analyze, Repeat

Documenting the test is not optional. Document everything, whether as audio, video, or detailed notes from your notetaker. You will forget most things, especially the details. You also run the risk of remembering something that didn’t happen or that you misheard. If the test didn’t involve talking, organize the assets so you can analyze them and later share them with stakeholders.

Beware: internal stakeholders may jump to conclusions on seeing these recordings. “Participant A said this, so we should go in this direction.” This is dangerous. Your job is to listen to everything, analyze it, internalize it, sleep on it, and make sense of it. Cross-reference the results with other research, and ensure stakeholders see the full picture. Be honest about contradictions in the research, and firm on the trends you’ve discovered.

Internal stakeholders are not the real concern here, though. You, the researcher/tester, are the first line of defense against biased results. Your preferences are irrelevant. If you asked leading questions, consider the answers invalid. It is your responsibility to analyze both what the users said and how you asked your questions.

STEP 5: Translate to Reality

Connect your findings to reality. The results will directly affect the work you and your colleagues do. What does this mean for design? For development? What does it mean for content producers, or for customer service teams? How could it affect the strategic direction of a product or department? When you report back, speak to how your findings impact each group.

Too often, entire teams aren’t looped into research insights, and even more often they’re left out of the raw testing data. This is a missed opportunity. Stakeholders who know the perspectives, needs, and pains of users can make conscious efforts to respond to them. Through user testing, companies open the door to a user-centered culture.

Now go run a test and share what you learned!


This piece was originally published on Craig’s blog on Medium.