Fitts's Law

Fitts's law is a mathematical relationship that captures how speed and accuracy trade off in human movement. Originally formulated by Paul M. Fitts (Fitts, 1954), it predicts the time it takes to move to a target as a function of the target's size and the distance that must be covered to reach it.

In Fitts's study, people were asked to move a hand-held stylus between two targets as fast as possible without missing them. The targets were of identical width (W) and were separated center-to-center by a movement amplitude (A). By systematically varying the width of the targets and the distance between them, he found that the average movement time (MT) to go from one target to the other was given by the following equation:

    MT = a + b·log2(2·A/W)

where a and b are empirically determined constants, and the expression log2(2·A/W) was termed the index of difficulty (ID) of the movement. Thus MT is predicted to increase linearly with ID: MT = a + b·ID. The index of difficulty is defined such that it increases with the distance to be covered and with the narrowness of the targets (i.e., the required movement accuracy). Moreover, MT is predicted to remain the same as long as the ratio between movement amplitude and target width (A/W) is unchanged.

Keywords: Fitts's law; aiming; speed-accuracy trade-off; motor control
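
As a concrete illustration, the short Python sketch below computes ID and the predicted MT for a given amplitude and width. The constants a and b are illustrative placeholders, not values reported by Fitts (1954); in practice they are obtained by fitting the linear model to measured movement times.

    import math

    def index_of_difficulty(amplitude, width):
        """Fitts's index of difficulty, ID = log2(2*A/W), in bits."""
        return math.log2(2 * amplitude / width)

    def movement_time(amplitude, width, a=0.05, b=0.1):
        """Predicted movement time MT = a + b*ID.

        a and b are empirically determined constants; the defaults
        here are placeholders for illustration only.
        """
        return a + b * index_of_difficulty(amplitude, width)

    # Doubling both A and W leaves A/W, and hence ID and MT, unchanged:
    print(movement_time(amplitude=160, width=20))  # ID = 4 bits
    print(movement_time(amplitude=320, width=40))  # same ID, same predicted MT

Because ID depends only on the ratio A/W, the two calls above yield identical predictions, matching the claim that MT stays constant whenever A/W is constant.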