# Methods team research strategy
This page describes the research activities the Method Cards team should regularly perform. Our research strategy has three components: analytics reporting, feature-focused usability testing, and full experience user research.
## Analytics reporting

Frequency: weekly
The team should designate one person to look at our Google Analytics dashboard each week, document progress toward our goals here, and identify potential problem areas.
Here are some hypotheses we are currently testing and how analytics will help us evaluate them.
We believe that outreach and improvements to the site will result in more government users learning about and using the cards. We'll know we're right when we see the number of government visitors to the site increase.
We believe that the new print capability, which lets people print cards directly from the page, will result in more people printing individual cards. We'll know we're right when we see an increase in the number of print events from individual card pages.
| User story | Question | Metric |
|---|---|---|
| As a user, I need to be able to easily print a single card so that I can have a physical copy of only the method I am interested in. | Do users use the print links? Do users print the whole deck or individual cards? Do users want to print cards by category? | Number of print events from the home page, number of print events from individual cards, number of print events from category pages |
| As a government designer, I need to know that the cards exist and how I can get them, use them, and access them. | Are our improvements and recent outreach efforts increasing the number of visitors to the site? How about visits from government users specifically? | Number of overall visits, number of visits by government users, ratio of government users to other users |
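The print-event metrics in the table above assume the site reports print actions to its analytics. Below is a minimal sketch of what that reporting might look like with Google Analytics' gtag.js; the event name (`print_card`), its parameters, and the `a.print-card` selector are illustrative assumptions, not the site's actual event schema.

```ts
// A minimal sketch, assuming the site uses Google Analytics 4 via gtag.js.
// The event name "print_card" and the parameters below are assumptions
// for illustration, not the site's actual implementation.

declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>,
): void;

type PageType = 'home' | 'category' | 'card';

function trackPrintEvent(pageType: PageType, cardSlug?: string): void {
  gtag('event', 'print_card', {
    page_type: pageType,          // home page, category page, or individual card page
    card_slug: cardSlug ?? 'all', // which method was printed, or 'all' for the full deck
  });
}

// Example: fire the event when a per-card print link is clicked.
document.querySelectorAll<HTMLAnchorElement>('a.print-card').forEach((link) => {
  link.addEventListener('click', () => trackPrintEvent('card', link.dataset.cardSlug));
});
```

Tagging the page type on each event is what lets us break print counts out by home page, category page, and individual card, matching the metrics in the table.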
We’d also like to use analytics to help us refine our product strategy. Tracking page views and print events for the individual methods will help us prioritize upcoming work.
Questions: Which methods should be our focus for various outreach activities? Which methods should we archive?
Metrics: individual method page views relative to overall site visits.
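A minimal sketch of how that comparison might be computed from an analytics export is below; the `MethodStats` shape and the example numbers are hypothetical.

```ts
// A minimal sketch, assuming per-method page-view counts exported from
// Google Analytics. The data shape and numbers are made up.

interface MethodStats {
  method: string;
  pageViews: number;
}

// Rank methods by their share of overall site visits: the top of the list
// suggests outreach focus, the bottom suggests candidates for archiving.
function rankMethods(stats: MethodStats[], overallSiteVisits: number) {
  return stats
    .map((s) => ({ ...s, share: s.pageViews / overallSiteVisits }))
    .sort((a, b) => b.share - a.share);
}

// Example with made-up numbers.
console.log(rankMethods(
  [
    { method: 'contextual-inquiry', pageViews: 420 },
    { method: 'card-sorting', pageViews: 1130 },
  ],
  10_000,
));
```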
## Feature-focused usability testing

Frequency: ~quarterly
For course correction and input on specific features or UI components, we will talk with users regularly to get their feedback. These interviews should focus on a handful of tasks, features, or user interface components and should take about 30 minutes.
It’s often said that 85% of usability issues are found by talking with five users during an iterative testing process, so we will aim to talk to about five users in each round.
There are many methods for analyzing usability test data, ranging in the time and experience they require. If the team does not have a dedicated user researcher, a lightweight way of prioritizing which issues to address is to use severity ratings that look at the frequency, impact, and persistence of each problem.
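One lightweight way to turn those three dimensions into a priority order is a simple additive score; the 1–3 scales and the sum below are a common convention assumed here, not a prescribed formula.

```ts
// A minimal sketch of a lightweight severity rating. The 1-3 scales and the
// simple sum are assumptions: one common convention, not a required formula.

interface UsabilityIssue {
  description: string;
  frequency: 1 | 2 | 3;   // 1 = one participant hit it, 3 = most participants hit it
  impact: 1 | 2 | 3;      // 1 = minor annoyance, 3 = blocks the task
  persistence: 1 | 2 | 3; // 1 = easily worked around, 3 = recurs every time
}

// Total severity ranges from 3 (low) to 9 (high).
const severity = (issue: UsabilityIssue): number =>
  issue.frequency + issue.impact + issue.persistence;

// Sort a round's findings so the most severe issues are addressed first.
function prioritize(issues: UsabilityIssue[]): UsabilityIssue[] {
  return [...issues].sort((a, b) => severity(b) - severity(a));
}
```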
## Full experience user research

Frequency: ~bi-annually
End-to-end testing differs from feature-focused testing in that the team assesses the entire experience of the product rather than individual features.
The team should use affinity diagramming to pull out salient themes across test sessions, documenting each theme we identify and the evidence that supports it. We can then prioritize issues the same way we would for feature-focused testing, using severity ratings.