Or, more objectively, the number of digits that can be relied upon to be accurate.

Each successive computation performed with approximate numbers (such as those stored in a computer's floating-point registers) will introduce some amount of additional error.
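
For illustration, here is a minimal Python sketch (the loop count is arbitrary): the decimal fraction 0.1 has no exact binary floating-point representation, so every addition below rounds and contributes a tiny additional error.

```python
total = 0.0
for _ in range(10_000):
    total += 0.1  # each addition rounds, adding a tiny error

print(total)           # slightly different from the exact answer, 1000.0
print(total - 1000.0)  # the accumulated error, on the order of 1e-10
```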

If you perform many calculations using numbers with relatively few significant digits, these errors compound, and each successive step in the calculation may leave fewer significant digits in the result.
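
A standard textbook demonstration of this is an unstable recurrence. The integrals I_n = ∫₀¹ x^n e^(x−1) dx satisfy I_n = 1 − n·I_{n−1} exactly, but in floating point each step multiplies the inherited error by n, stripping away significant digits until none remain. A Python sketch:

```python
import math

# I_0 = 1 - 1/e, correct to nearly 16 significant digits.
I = 1.0 - 1.0 / math.e
for n in range(1, 21):
    # Exact in real arithmetic, but the multiplication by n also
    # multiplies whatever error I already carries by n.
    I = 1.0 - n * I

# The true I_20 lies between 1/(21e) and 1/21 (about 0.018 to 0.048),
# but the amplified rounding error has long since swamped every digit.
print(I)
```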

It is worth contrasting accuracy with precision. Each additional digit written increases precision. Each approximated digit that actually matches the correct value increases accuracy.
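
For example, 22/7 written out in double precision carries about 16 digits of precision, yet it agrees with π only in its first three significant digits, so it has only about three digits of accuracy. A Python sketch:

```python
import math

print(f"{math.pi:.15f}")  # 3.141592653589793  (reference value)
print(f"{22 / 7:.15f}")   # 3.142857142857143  (16 digits of precision,
                          #  but only the leading 3.14 is accurate)
```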

The number of significant digits is the total count of digits, starting from the leftmost nonzero digit, that are expected to be accurate.
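
One way to make the definition concrete is to estimate the significant digits an approximation shares with a reference value from their relative error. The helper below is purely illustrative; significant_digits_in_common is a name invented for this sketch, not a standard function.

```python
import math

def significant_digits_in_common(approx: float, exact: float) -> int:
    """Estimate how many leading significant digits of approx match
    exact, based on the relative error. Illustrative sketch only;
    assumes exact is nonzero."""
    if approx == exact:
        return 17  # a double carries at most ~17 significant decimal digits
    rel_err = abs(approx - exact) / abs(exact)
    return max(0, math.floor(-math.log10(rel_err)))

print(significant_digits_in_common(22 / 7, math.pi))     # 3
print(significant_digits_in_common(3.1415926, math.pi))  # 7
```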