7 – Evaluation

To uncover usability problems in the UI, so that they can be addressed during the iterative design process, we conducted a stage of product evaluation using Nielsen’s heuristics. These heuristics are a set of ten user interface design guidelines that usability experts apply when assessing a UI.

Nielsen’s heuristics were published by Jakob Nielsen in 1994 as a refinement of a list he originally developed with Rolf Molich in 1990.

The Evaluators

The evaluators were software engineer Gordon McGuire and myself, Cormac Maher (UI designer). Both of us were involved in the design of the prototype, and neither of us had domain knowledge prior to this project.

Procedure for the Heuristic Evaluation

The evaluation consisted of four phases:
  1. The evaluation
  2. A debriefing session to discuss the evaluation findings
  3. Joint estimation of the severity rating of each usability problem found
  4. Documentation of results and summary

The tasks/scenarios evaluated were those developed in the previous stage of the assignment, “Paper Prototypes and Testing”.

The Screens Being Evaluated

  1. Screens used in Task #1 “An audience member should be able to ask a question, mid-event, within 3 clicks”
  2. Screens used in Task #2 “At the end of the day, an audience member should be able to access downloaded materials within 2 clicks”

The Evaluation

Each evaluator performed the evaluation alone. Each task/scenario was fully evaluated in a one-hour session involving a detailed analysis of the individual screens and their components.

As an aid, I used a Xerox Corporation heuristic evaluation checklist. Although the checklist is dated 1995, many of its checkpoints are still valid, and I found them a great help in identifying issues.

Debriefing

After completing the evaluation, we held a short debriefing session in which we discussed the usability problems found and assigned a severity rating to each.

Problems directly related to the tasks being evaluated were considered “core”; these core problems were prioritised and scrutinised most closely during the debriefing session.

Severity Ratings

Severity ratings were assigned in order to assess the potential impact of each issue on the usability of the system. Each rating was based on:

  • The frequency with which the issue occurs
  • The impact of the issue when it occurs
  • The persistence of the issue: is it a one-time problem that users can overcome, or will users be persistently bothered by it?

Ratings were applied as follows:

0 – I don’t agree that this is a usability problem at all
1 – Cosmetic problem only; need not be fixed unless extra time is available on the project
2 – Minor usability problem; fixing this should be given low priority
3 – Major usability problem; important to fix, so it should be given high priority
4 – Usability catastrophe; imperative to fix before the product can be released
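
To make the rating log concrete, here is a minimal sketch of how each recorded issue could be represented in code. It is purely illustrative: the evaluation itself was done on paper, and the TypeScript type and field names below are my own invention, mirroring the columns of the issue tables that follow.

```typescript
// Severity scale used during the debriefing session (0–4).
enum Severity {
  NotAProblem = 0,  // not a usability problem at all
  Cosmetic = 1,     // fix only if extra time is available
  Minor = 2,        // low priority
  Major = 3,        // high priority
  Catastrophe = 4,  // must be fixed before release
}

// One row of the evaluation tables below: where the issue was found,
// what it is, its jointly estimated severity, and who discovered it.
interface Issue {
  location: string;        // e.g. "Question List screen"
  description: string;
  severity: Severity;      // agreed at the debriefing session
  evaluator: 'CM' | 'GMG';
}
```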

Evaluation of the Audience Engagement app

The evaluation results for each task are listed below. For each issue we record its location, a description, the severity rating jointly estimated by the two evaluators, and the evaluator who discovered it.

1. Visibility of system status

  • Always keep users informed about what is going on.
  • Provide appropriate feedback within reasonable time.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
Question List screen | Ensure the status of the Edit icon is obvious when selected. | 1 | CM
Question List screen | Unclear how interactive the top image is. | 1 | GMG

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

Location | Issue | Severity | Evaluator
Event screens A and B | Page title doesn’t describe the screen contents. | 1 | CM

2. Match between system and the real world

  • Speak the users’ language, with words, phrases and concepts familiar to the user, rather than system-oriented terms.
  • Follow real-world conventions, making information appear in a natural and logical order.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
Join Event screen | The Join Event label makes sense, but suggest adding an action word, e.g. “View”, to the Previous Events list heading. | 2 | CM
Question List screen | The select menu has no label to indicate its function; consider adding “View”. | 2 | CM

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

No issues were found for this task.

3. User control and freedom

  • Users often choose system functions by mistake.
  • Provide a clearly marked “out” to leave an unwanted state without having to go through an extended dialogue.
  • Support undo and redo.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
Add Question screen | Ask for confirmation if the user cancels a question after they have begun typing, to prevent accidental loss of entered text (see the sketch below). | 2 | CM
Join Event screen | It’s not clear whether it does, but the Event Code text field should support copy and paste. | 0 | CM
Add Question screen | It’s not clear whether it does, but the Question textarea should support copy and paste. | 0 | CM
Question List screen | No clear way of returning to the event login screen. | 3 | GMG
Question List screen | No way to delete a question. | 2 | GMG
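
The first suggestion above can be illustrated with a small sketch. This assumes a web implementation and hypothetical element IDs (question-input, cancel-button); the prototype’s actual platform and markup are not specified, so treat this as one possible shape of the fix rather than the design itself.

```typescript
// Ask for confirmation before discarding a partially typed question.
const questionInput = document.getElementById('question-input') as HTMLTextAreaElement;
const cancelButton = document.getElementById('cancel-button') as HTMLButtonElement;

cancelButton.addEventListener('click', (event) => {
  // Only interrupt the user if cancelling would actually lose typed text.
  if (questionInput.value.trim().length > 0) {
    const discard = window.confirm('Discard your question? The text you have typed will be lost.');
    if (!discard) {
      event.preventDefault(); // stay on the Add Question screen
    }
  }
});
```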

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

Location | Issue | Severity | Evaluator
Event screen | No clear way to return to the login screen. | 3 | GMG

4. Consistency and standards

  • Users should not have to wonder whether different words, situations, or actions mean the same thing.
  • Follow platform conventions.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
All screens | Caps are used for page titles; the checklist flags this in checkpoint 4.1. Discuss further. | 1 | CM
Add Question screen | Fonts seem larger than on other screens. | 0 | GMG
Join Event screen | Uneven spacing on the screen. | 0 | GMG

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

Location | Issue | Severity | Evaluator
All screens | Windows don’t appear to have titles. | 1 | CM
Event screens A and B | Question filtering drop-down has no label. | 2 | CM
Event screen B | Location of the video player interrupts the flow of the screen. | 1 | GMG

5. Error prevention

  • Even better than good error messages is a careful design which prevents a problem from occurring in the first place.
  • Eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
All screens | Ensure fields with errors receive focus (see the sketch below). | 0 | CM
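
One way to satisfy this checkpoint on the web, sketched under the assumption that the form uses standard HTML validation (the selector and validation mechanism are assumptions, not details of the prototype):

```typescript
// On a failed submit, move keyboard focus to the first invalid field
// so the user is taken straight to the problem.
function focusFirstInvalidField(form: HTMLFormElement): void {
  const firstInvalid = form.querySelector<HTMLElement>(':invalid');
  firstInvalid?.focus();
}
```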

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

Location | Issue | Severity | Evaluator
All screens | Ensure fields with errors receive focus. | 0 | CM

6. Recognition rather than recall

  • Make objects, actions, and options visible.
  • The user should not have to remember information from one part of the dialogue to another.
  • Instructions for use of the system should be visible or easily retrievable whenever appropriate.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
Join Event screen | Add breathing space around the Event Code input. | 1 | CM
Join Event screen | Create more distinction between the Join Event and Previous Events zones. | 1 | CM
All screens | Mark mandatory fields as mandatory? | 2 | CM
Add Question screen | Add an “All” option to the Presenter select menu. | 2 | GMG

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

Location | Issue | Severity | Evaluator
Event screens A and B | Download link needs an icon. | 1 | GMG
All screens | Mark mandatory fields as mandatory? | 2 | CM

7. Flexibility and efficiency of use

  • Accelerators – unseen by the novice user – may often speed up the interaction for the expert user, so that the system can cater to both inexperienced and experienced users.
  • Allow users to tailor frequent actions.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
Join Event screen | Should the Event Code field support autocomplete? | 0 | GMG

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

No issues were found for this task.

8. Aesthetic and minimalist design

  • Dialogues should not contain information which is irrelevant or rarely needed.
  • Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Location | Issue | Severity | Evaluator
Question List screen | Make the “Up” arrow more obvious. | 1 | CM
Question List screen | Add more whitespace between the dropdown and the questions. | 1 | CM
Question List screen | Increase the size of the “Event Title” to match other page titles. | 0 | CM

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

Location | Issue | Severity | Evaluator
Join Event screen | No Submit button for joining an event. | 2 | CM
Event screen | Review the amount of information, or give the content more breathing space. | 1 | GMG

9. Help users recognise, diagnose, and recover from errors

  • Error messages should be expressed in plain language (no codes).
  • They should precisely indicate the problem.
  • They should constructively suggest a solution.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

No issues were found for this task.

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

No issues were found for this task.

10. Help and documentation

  • Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation.
  • Help information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large.

Evaluation

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

No issues were found for this task.

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

No issues were found for this task.

Evaluation Summaries

Task #1 – An audience member should be able to ask a question, mid-event, within 3 clicks

Total number of issues: 21

Severity level | Count
0 | 7
1 | 7
2 | 6
3 | 1
4 | 0

Task #2 – At the end of the day, an audience member should be able to access downloaded materials within 2 clicks

Total number of issues: 10

Severity level | Count
0 | 1
1 | 5
2 | 3
3 | 1
4 | 0
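
The severity counts above are simple tallies over the issue tables. As a sanity check, here is the arithmetic for Task #1 in code, with the severities read directly off the tables above (the helper function is my own illustration, not part of the evaluation procedure):

```typescript
// Tally issues by severity level (0–4), as in the summaries above.
function tallyBySeverity(severities: number[]): Map<number, number> {
  const counts = new Map<number, number>([[0, 0], [1, 0], [2, 0], [3, 0], [4, 0]]);
  for (const s of severities) {
    counts.set(s, (counts.get(s) ?? 0) + 1);
  }
  return counts;
}

// Severities of the 21 Task #1 issues, in the order they appear above.
const task1 = [1, 1, 2, 2, 2, 0, 0, 3, 2, 1, 0, 0, 0, 1, 1, 2, 2, 0, 1, 1, 0];
console.log(task1.length);           // 21 issues in total
console.log(tallyBySeverity(task1)); // Map { 0 => 7, 1 => 7, 2 => 6, 3 => 1, 4 => 0 }
```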

View the original analysis data for Task 1 here.

View the original analysis data for Task 2 here.

References

Heuristic Evaluations and Expert Reviews. Usability.gov. Retrieved 25 November 2016, from https://www.usability.gov/how-to-and-tools/methods/heuristic-evaluation.html

Nielsen, J. 10 Heuristics for User Interface Design. Nngroup.com. Retrieved 25 November 2016, from https://www.nngroup.com/articles/ten-usability-heuristics/

Nielsen, J. Severity Ratings for Usability Problems. Nngroup.com. Retrieved 26 November 2016, from https://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/

Chambers, L. How to run an heuristic evaluation. UX Mastery. Retrieved 26 November 2016, from http://uxmastery.com/how-to-run-an-heuristic-evaluation/