
User Testing: Expert Review & Heuristics


Expert review and heuristics are distinct but closely related: when an expert review is carried out, the reviewer draws on their knowledge of heuristics to evaluate the website for usability issues.


Heuristics are a set of principles that dictate what makes good user interaction design in a product. Heuristics can be used as a guide for making appropriate design decisions.

Jakob Nielsen's 10 heuristics for interaction design are used heavily in the UX (user experience) industry.

1. Visibility of system status
The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

2. Match between system and the real world
The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

3. User control and freedom
Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

4. Consistency and standards
Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

5. Error prevention
Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

6. Recognition rather than recall
Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

7. Flexibility and efficiency of use
Accelerators — unseen by the novice user — may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

8. Aesthetic and minimalist design
Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

9. Help users recognize, diagnose, and recover from errors
Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

10. Help and documentation
Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

Expert Review

In an expert review, experts in UX or web design examine the product and, using their own knowledge and established interaction design principles, decide whether it complies with recognized usability principles/heuristics.

How Many Reviewers?

Jakob Nielsen recommends 3-5 expert evaluators as best practice: no single evaluator can discover all the usability problems, and different evaluators tend to find different issues. This is illustrated by a case study of a heuristic evaluation in which 19 evaluators found 16 usability problems in a voice response system that allowed customers to access their bank accounts. Many of the issues were spotted by only a few evaluators, and some evaluators who found fewer problems overall still discovered some of the more complex ones.
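The diminishing returns behind the 3-5 evaluator recommendation come from Nielsen and Landauer's problem-discovery model: the share of problems found by n independent evaluators is 1 - (1 - λ)^n, where λ is the chance that a single evaluator spots any given problem (Nielsen reports an average of about 31% across projects). A minimal sketch, assuming that published λ value:

```python
# Expected share of usability problems found by n independent evaluators,
# per Nielsen & Landauer's model: found(n) = 1 - (1 - lam)^n.
# lam = 0.31 is the average single-evaluator discovery rate Nielsen reports;
# real projects vary, so treat it as illustrative.

def proportion_found(n_evaluators: int, lam: float = 0.31) -> float:
    """Expected fraction of all problems found by n evaluators."""
    return 1 - (1 - lam) ** n_evaluators

if __name__ == "__main__":
    for n in (1, 3, 5, 10):
        print(f"{n:>2} evaluators -> {proportion_found(n):.0%} of problems found")
```

With λ = 0.31, three evaluators are expected to uncover roughly two-thirds of the problems and five evaluators around 84%, which is why adding evaluators beyond five buys relatively little.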

How Often?

Like facilitated user testing, expert review is part of an iterative test process.

Test Process

In an expert evaluation (see the resources section for the full email and feedback script), each evaluator inspects the interface alone. Results can be recorded either in a written report by the evaluator or discussed verbally with an observer/facilitator who takes notes. In contrast to a user test, the observer does not need to record and interpret the evaluator's actions, only their comments. Another difference from traditional user testing is that the observer may answer questions: this is encouraged in a heuristic evaluation, as it helps the evaluator assess the user interface more accurately.

It is up to the evaluator to decide how they want to do the evaluation. Generally, the evaluators will go through the interface once to get a feel for the system, and then they will go through it again, this time knowing how specific elements fit into the larger design.

The output of a heuristic evaluation is a list of usability problems, each with a reference to the usability principle it violates. The severity of each violation can be described using a severity rating scale.

Nielsen (1994) developed the following 0 to 4 rating scale to be used to rate the severity of usability problems:

0 = I don't agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available on the project
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, so should be given high priority
4 = Usability catastrophe: imperative to fix this before the product can be released

(Nielsen 1994)
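The report described above — a list of problems, each tied to a violated heuristic and rated on Nielsen's 0-4 scale — might be recorded as a simple structure like the following. This is an illustrative sketch, not a standard tool or API; all names are hypothetical.

```python
# Hypothetical sketch of one evaluator's findings from a heuristic evaluation:
# each problem records which of Nielsen's 10 heuristics it violates and a
# severity rating on his 0-4 scale.
from dataclasses import dataclass

SEVERITY = {
    0: "Not a usability problem",
    1: "Cosmetic problem only",
    2: "Minor usability problem",
    3: "Major usability problem",
    4: "Usability catastrophe",
}

@dataclass
class Finding:
    heuristic: int     # 1-10: which Nielsen heuristic is violated
    description: str   # what the evaluator observed
    severity: int      # 0-4 on Nielsen's severity scale

    def summary(self) -> str:
        return (f"[H{self.heuristic}] {self.description} "
                f"(severity {self.severity}: {SEVERITY[self.severity]})")

# Example report, sorted so the worst violations surface first.
report = [
    Finding(1, "No feedback shown after form submission", 3),
    Finding(9, "Error page displays a raw error code", 2),
]
for finding in sorted(report, key=lambda f: -f.severity):
    print(finding.summary())
```

Sorting by severity mirrors how teams typically triage heuristic findings: high-severity violations are fixed first, cosmetic ones only if time allows.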


Generally, the evaluator does not propose fixes to these problems, but a solution can usually be derived by examining the violation in question and bringing the design into line with the principle.


  • To be used in conjunction with user testing
  • Can be done at any time, but may work best on a functional prototype, after the facilitated user testing
  • Reviewers should be financially compensated for their knowledge and time
