A Note on
ISO Testing of Computer Pointing Devices

I. Scott MacKenzie

Dept. of Computer Science and Engineering

York University

Toronto, Ontario, Canada M3J 1P3

mack “at” cse.yorku.ca

Last update: 15-May-17

Background

The evaluation of computer pointing devices is tricky. Although there is an abundance of published evaluations in the disciplines of human-computer interaction and human factors, the methodologies are often ad hoc. The experimental procedures, while perhaps internally valid, are inconsistent from one study to the next, and this greatly diminishes our ability to interpret the results or to undertake comparisons between studies. As a consequence, there is a lot of literature to examine, but the reader is left in a quandary about what it all means.

Fortunately, an ISO standard published in 2000 addresses this particular problem. The full standard is ISO 9241, Ergonomic requirements for office work with visual display terminals (VDTs). The standard is in seventeen parts. Part 9 of the standard is called Requirements for non-keyboard input devices. An updated version was released in 2012 as ISO/TS 9241-411.

ISO 9241-9 and ISO/TS 9241-411 describe tests to evaluate the performance, comfort, and effort of using computer pointing devices. This note focuses on performance evaluations.

The procedures for conducting a performance evaluation are well laid out and, if followed, yield a strong and valid performance evaluation of one or more pointing devices. Between-study comparisons are improved for (at least) three reasons:

1.      The methodology is consistent from one study to the next (as long as the standard is followed).

2.      The standard suggests including a representative device, such as a mouse, for comparison purposes. Thus, a "base-line condition" exists, and this will strengthen across-study comparisons if other studies use the same base-line device.

3.      The metric for comparison is Throughput, which includes both the speed and accuracy of users' performance. Hence, Throughput has an inherent ability to normalize for behavioural differences (i.e., speed vs. accuracy) across users or across studies.
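As a rough sketch of how Throughput combines speed and accuracy, consider the effective-width method used with ISO 9241-9 (see Soukoreff & MacKenzie, 2004, for the full treatment; the function and variable names below are illustrative, and the sketch uses the nominal rather than effective movement amplitude):

```python
import math
import statistics

def throughput(amplitude, selection_xs, movement_times):
    """Fitts' law throughput (bits/s) for one sequence of trials,
    using the effective-width method associated with ISO 9241-9.

    amplitude      -- nominal movement distance (same units as selection_xs)
    selection_xs   -- selection coordinates projected onto the task axis
    movement_times -- movement time for each trial, in seconds
    """
    # Effective target width: 4.133 x the standard deviation of the
    # selection coordinates (normalizes for the accuracy users exhibit).
    we = 4.133 * statistics.stdev(selection_xs)

    # Effective index of difficulty (Shannon formulation), in bits.
    ide = math.log2(amplitude / we + 1)

    # Throughput = effective index of difficulty / mean movement time.
    return ide / statistics.mean(movement_times)
```

Because the effective width is derived from the observed spread of selections, a participant who moves quickly but sloppily and one who moves slowly but precisely can yield similar throughput values, which is what makes the metric robust across speed-accuracy trade-offs.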

Why Use ISO 9241-9?

Obviously, within-study comparisons are also valid, and herein lies one of the most appealing features of the tests described in ISO 9241-9. Companies in the business of developing, manufacturing, and marketing computer pointing devices want to know how good their product is. It might be a new pointing device, or it might be an improvement on existing technology. The company may want the evaluation performed for "internal" purposes (Should we bring this prototype to market?) or for "external" purposes (Is our device as good as a competitor's device?).

ISO testing can answer a variety of questions, such as the following:

·         Is our pointing device as good as a mouse?

·         Is "Brand A" mouse as good as "Brand B" mouse?

·         Is a touchpad as good as a pointing stick?

·         Is a finger-controlled trackball as good as a thumb-controlled trackball?

·         Is our trackball (or touchpad, or joystick, or whatever) as good as our competitor’s?

·         Should buttons be placed "above" or "below" a touchpad in a notebook computer?

·         Is "lift-and-tap" as good as "buttons" for select operations on a touchpad?

·         Is "Device Driver A" as good as "Device Driver B" for a certain pointing device?

The possibilities are unlimited. You can see in the questions above that the evaluations and comparisons are not limited to "Device A" vs. "Device B". ISO 9241-9 can also evaluate and compare any characteristic of a device or interaction technique that might affect user performance. For example, for a given diameter of trackball, which of two ball masses is better? This is a straightforward question to answer in the context of ISO testing.

Each bullet above includes the phrase “as good as” as the basis of comparison.  Operationally, the phrase implies testing using both quantitative and qualitative procedures.  For the quantitative tests, users are given a series of tasks and measurements are made on their performance.  These measurements serve as the basis for comparison. (This is a greatly simplified explanation, but that's the general idea.  See any of the papers below for further details.)
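For illustration, one quantitative test in the standard is a multi-directional tapping task: targets are arranged evenly around a circle, and each selection jumps (roughly) across the circle so that movements cover many directions. A minimal sketch of such a layout follows; the geometry matches the standard's circular arrangement, but the function names and parameters are this note's own:

```python
import math

def target_positions(n, radius, cx=0.0, cy=0.0):
    """Centres of n targets spaced evenly around a circle of the given
    radius, centred at (cx, cy) -- the multi-directional task layout."""
    return [(cx + radius * math.cos(2 * math.pi * i / n),
             cy + radius * math.sin(2 * math.pi * i / n))
            for i in range(n)]

def selection_order(n):
    """Order in which the n targets are selected.  Stepping by
    (n + 1) // 2 positions makes each movement cross the circle and,
    for odd n, visits every target exactly once."""
    assert n % 2 == 1, "use an odd number of targets"
    step = (n + 1) // 2
    return [(i * step) % n for i in range(n)]
```

Users tap through the full sequence at each amplitude-width condition; the software logs movement times and selection coordinates, and those logs are what the performance measures (throughput among them) are computed from.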

For the qualitative tests, users are given a questionnaire on their comfort and preferences. The response items are provided in Douglas, Kirkpatrick, and MacKenzie (1999), Zhang and MacKenzie (2007), McArthur, Castellucci, and MacKenzie (2009), and Natapov, Castellucci, and MacKenzie (2009).  See below.  As well, the experimenter maintains a journal during the experiment and notes any anomalous behaviour that is not captured by the experimental software (e.g., a preponderance of clutching operations).

Resources

If you would like a copy of ISO 9241-9, it is available for purchase through ANSI’s Electronic Standards Store:

http://webstore.ansi.org/ansidocstore/

The author has developed testing tools that implement the evaluation tests in ISO 9241-9.  They are freely available as downloads on the web site for his book, Human-Computer Interaction: An Empirical Research Perspective.  Click on “FittsTaskTwo”.

The tool also records and stores trace data and performance measures associated with the cursor path.  A separate utility, included in the download, facilitates viewing the trace data.  It is called FittsTrace.  An API for the software is also available on the same web site.

For more information, please contact Scott MacKenzie at mack “at” cse.yorku.ca or visit his home page.

References

The following papers present pointing device performance evaluations conforming to ISO 9241-9:

1.      Sasangohar, F., MacKenzie, I. S., & Scott, S. D. (2009). Evaluation of mouse and touch input for a tabletop display using Fitts’ reciprocal tapping task.  Proceedings of the 53rd Annual Meeting of the Human Factors and Ergonomics Society – HFES 2009, pp. 839-843. Santa Monica, CA: Human Factors and Ergonomics Society.

2.      McArthur, V., Castellucci, S. J., & MacKenzie, I. S. (2009). An empirical comparison of “Wiimote” gun attachments for pointing tasks.  Proceedings of the ACM Symposium on Engineering Interactive Computing Systems – EICS 2009, pp. 203-208. New York: ACM.

3.      Natapov, D., Castellucci, S. J., & MacKenzie, I. S. (2009). ISO 9241-9 evaluation of video game controllers. Proceedings of Graphics Interface 2009, pp. 223-230. Toronto: Canadian Information Processing Society.

4.      Zhang, X., & MacKenzie, I. S. (2007). Evaluating eye tracking with ISO 9241 – Part 9. Proceedings of HCI International 2007, pp. 779-788. Heidelberg: Springer.

5.      Soukoreff, R. W., & MacKenzie, I. S. (2004). Towards a standard for pointing device evaluation: Perspectives on 27 years of Fitts’ law research in HCI. International Journal of Human-Computer Studies, 61, 751-789.

6.      MacKenzie, I. S., & Jusoh, S. (2001). An evaluation of two input devices for remote pointing. Proceedings of the Eighth IFIP Working Conference on Engineering for Human-Computer Interaction – EHCI 2001. Heidelberg, Germany: Springer-Verlag.

7.      Silfverberg, M., MacKenzie, I. S., & Kauppinen, T. (2001). An isometric joystick as a pointing device for hand-held information terminals. Proceedings of Graphics Interface 2001, pp. 119-126. Toronto, Canada: Canadian Information Processing Society.

8.      MacKenzie, I. S., Kauppinen, T., & Silfverberg, M. (2001). Accuracy measures for evaluating computer pointing devices. Proceedings of the ACM Conference on Human Factors in Computing Systems – CHI 2001, pp. 9-16. New York: ACM.

9.      Douglas, S. A., Kirkpatrick, A. E., & MacKenzie, I. S. (1999). Testing pointing device performance and user assessment with the ISO 9241, Part 9 standard. Proceedings of the ACM Conference on Human Factors in Computing Systems – CHI '99, pp. 215-222. New York: ACM.

10.  MacKenzie, I. S., & Oniszczak, A. (1998). A comparison of three selection techniques for touchpads. Proceedings of the ACM Conference on Human Factors in Computing Systems – CHI ‘98, pp. 336-343. New York: ACM.