27 Oct 2008

Ex 3-2: Usability Testing Exercise

  1. Read the Usability Testing handout that's also on the list for today.
  2. Find two websites that serve an intended audience that is accessible to you. (If you live on campus, pick a site intended to appeal to dorm residents.  If you live under a bridge, pick a site that appeals to the neighboring family of trolls.) You are going to need to watch at least two different people using the websites you choose, so you'll need to pick a site that's intended for the kinds of people you can use as usability testers.
  3. Analyze the purpose of the sites. What are users supposed to do on the site? (Sign up for a service, download a file, fill out an application, reserve a ticket, etc.)  Design a usability test that asks your volunteers to carry out the kinds of tasks that real users will need to carry out.
    1. Note that questions such as "Do you like the color scheme?" or "Did you understand the layout?" are meaningless.
    2. Come up with concrete tasks -- facts to find, newsletters to sign up for, files to upload, etc.
    3. Based on your findings, write up a series of recommendations. (Feel free to send your recommendations to the webmasters for real -- just make sure that you frame your comments as constructive criticism, not an attack.)
    4. Upload your report (2-3 pages) into the slot on Turnitin.com. (Here are some models of a typical business report... I'd accept a five-paragraph essay, too, but if you want practice communicating in the corporate world, here are better formats to use: the memo, or the short report.)
    5. You are welcome to post your findings on your blog, but that's optional.
Sample usability research.  (You should pick a different topic, but use this narrative to plan your own usability testing.)

  • I'm going to test SHU's website and St. Vincent's website, and I'm going to take the point of view of a high school junior who's in the early stages of the college application process.
  • I send a survey out to current SHU students and current St. Vincent's students, and I learn that most of them trade web links with their parents, and most of them spend at least a little time exploring college websites together.
  • Based on what I learned from the survey, I'm going to set up a test in which I invite a high school junior and his/her parent to look at the school websites for the first time.  I want to observe what happens when a visitor comes to the site for the first time, so I'm not going to test anyone who has been to either website before.
  • My survey indicated that current students were attracted by the campus facilities, range of programs, athletics, and financial aid package (in that order). So, I'm going to create a series of questions, and see how long it takes for the testers to answer the questions.
    1. What is the most recent construction project on campus? What is the next big planned project on campus?
    2. How many majors does the school offer? How many faculty teach in the history program, the English program, the math program, and the psychology program?
    3. What percentage of students play a sport on campus?
    4. What is the average tuition, and how much does the average student actually pay out of pocket (minus loans and scholarships)?
  • Before my test subjects look at either website, I ask them what their opinions are.  "Rate Seton Hill's athletics program on a scale of 1 to 10, with 10 being the best."  I'd ask for several such ratings, including some that aren't that important to the test subjects. (When Fred Flintstone runs underneath a rock that's a different color than the background, you know that's the rock that's going to fall on him... so if you ask your subjects what they think about a specific thing, they'll be looking for it when they go to the website.) 
  • I would time how long it takes for each test subject to find the answer (or how long they try before they give up). (See the timing sketch after this list.)
  • When they are finished, I would give the testers another questionnaire -- "Now rate the SHU athletics program again" or "Are you much more likely, somewhat more likely, neither more nor less likely, somewhat less likely, or much less likely to choose SHU?"  After each of these rating questions, I'd ask "Why?"
  • I'd then synthesize all the responses, and use them to suggest improvements to the website.
  • While you won't be asked to do this for the current homework assignment, in the real world I would then create a revision of the SHU website that implements some changes, and I'd test the site again. Did the changes make the users find the answers faster, make fewer mistakes, have a more positive attitude?  Or did the changes hurt the site's usability score?  Were gains in one area wiped out by losses in another area?
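If you want to be systematic about the timing step above, a stopwatch and a notepad are fine, but a tiny script works too. Here's a minimal sketch in Python (just an illustration, not part of the assignment; the questions, file name, and tester labels are placeholders): it starts a clock when you show the tester a question, stops when they find the answer or give up, and appends each result to a CSV file so you can compare the two sites later.

    import csv
    import time

    # Hypothetical task list -- substitute the questions from your own test plan.
    QUESTIONS = [
        "What is the most recent construction project on campus?",
        "How many majors does the school offer?",
        "What percentage of students play a sport on campus?",
        "What is the average tuition, minus loans and scholarships?",
    ]

    def run_session(site, tester, outfile="usability_times.csv"):
        """Time each task for one tester on one site and append the results to a CSV."""
        with open(outfile, "a", newline="") as f:
            writer = csv.writer(f)
            for question in QUESTIONS:
                input("\n" + question + "\nPress Enter to start the clock...")
                start = time.time()
                answer = input("Press Enter when they find it, or type g if they give up: ")
                seconds = round(time.time() - start, 1)
                gave_up = answer.strip().lower() == "g"
                writer.writerow([site, tester, question, seconds, gave_up])
                print("  %s seconds (%s)" % (seconds, "gave up" if gave_up else "found it"))

    run_session("SHU", "Tester 1")

Run it once per tester per site, and you end up with one spreadsheet-friendly file of times you can average and compare.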
Nobody knows how long it "should" take for a user to answer a question that you come up with in your usability test, but if you can show that each of your revisions to a site makes the site more usable, then you can quantify your value to an employer: "Users find their information 56% faster, they make 20% fewer mistakes, and they are 10% more likely to say the site gives them a more favorable opinion of the organization."
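The arithmetic behind a claim like that is just percentage change. Here's a minimal sketch (the before-and-after numbers below are made up for illustration, not real data):

    def percent_less(before, after):
        """How much a measure where lower is better (time, mistakes) dropped."""
        return round((before - after) / before * 100)

    def percent_more(before, after):
        """How much a measure where higher is better (a 1-to-10 rating) rose."""
        return round((after - before) / before * 100)

    # Hypothetical averages from testing the old site and then the revised site.
    # "Faster" here just means "takes that much less time."
    print(percent_less(90, 40), "% faster finding information")   # 56
    print(percent_less(5, 4), "% fewer mistakes")                 # 20
    print(percent_more(6.0, 6.6), "% more favorable rating")      # 10

The same two functions cover every measure in your report; the only judgment call is which direction counts as "better."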
 

I'm going to test high school juniors, each sitting at the computer with one of their parents.

To control for order effects, I'll ask test subject 1 to look at SHU's website first and St. Vincent's website second, and ask test subject 2 to look at St. Vincent's first and SHU's second.
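If you recruit more than two pairs, you can balance which site each pair sees first in the same alternating way. A minimal sketch (the pair labels are placeholders):

    SITES = ["SHU", "St. Vincent"]

    def assign_orders(pairs):
        """Alternate which site each tester pair sees first, to balance out order effects."""
        return {pair: (SITES[i % 2], SITES[(i + 1) % 2]) for i, pair in enumerate(pairs)}

    for pair, (first, second) in assign_orders(["Pair 1", "Pair 2", "Pair 3", "Pair 4"]).items():
        print(pair + ":", first, "first, then", second)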


