A law used in the Keystroke Level Model for estimating the time an operator takes to complete a computing task. Fitts's Law predicts the time needed to move a mouse pointer onto an on-screen target area.

t = a + b * log2((D/S) + 1)

Where a and b are constants determined empirically (typically a = 0, b = 0.1), D is the distance to the target area and S is the size of the area.
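
As a rough illustration, the Python sketch below computes this estimate using the typical constants above; the function name and the figures in the example call are illustrative, not part of the law itself.

    import math

    def fitts_time(distance, size, a=0.0, b=0.1):
        """Estimate pointing time in seconds using Fitts's Law.

        distance and size must be in the same units; a and b are the
        empirically determined constants (defaults are the typical
        values quoted above).
        """
        if size <= 0 or distance < 0:
            raise ValueError("size must be positive and distance non-negative")
        return a + b * math.log2(distance / size + 1)

    # Example: a target 400 pixels away that is 50 pixels wide.
    print(round(fitts_time(400, 50), 3))  # about 0.317 seconds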

The Keystroke Level Model is used in user-interface analysis to predict how long a task will take by summing the durations of its elementary steps, such as a keystroke or, as here, a mouse movement.
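
For example, a KLM-style estimate can be sketched by adding up per-step times. The keystroke and mental-preparation durations below are commonly cited approximations rather than definitive values, and the pointing time reuses the illustrative fitts_time function from the sketch above.

    # A minimal KLM-style estimate: sum the duration of each elementary step.
    K = 0.2    # seconds per keystroke (skilled typist, approximate)
    M = 1.35   # seconds of mental preparation (approximate)

    # Task: think, point at a text field, then type a five-character word.
    steps = [M, fitts_time(400, 50)] + [K] * 5
    print(round(sum(steps), 2))  # roughly 2.67 seconds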