Big-O notation describes how an algorithm's running time (or other cost) grows with input size. Only the highest-order (fastest-growing) term is kept and constant factors are dropped, so for small inputs it can be misleading. For example, if an algorithm takes x^3 + 10000x seconds to perform a task, its order is O(x^3) -- yet at x = 10 the discarded 10000x term contributes 100000 seconds while the cubic term contributes only 1000. As x tends toward infinity, however, the highest-order term dominates and the notation becomes an accurate description of the cost.
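A quick numerical check of this crossover, using the cost function from the example above (the function name `f` and the sample input sizes are illustrative choices, not from the original):

```python
def f(x):
    # Hypothetical cost from the example: x^3 + 10000x "seconds".
    return x**3 + 10000 * x

# Small input: the lower-order term that Big-O discards dominates the total.
x = 10
print(x**3, 10000 * x)           # cubic term: 1000, linear term: 100000
assert 10000 * x > x**3          # O(x^3) is misleading at this scale

# Large input: the highest-order term takes over, as Big-O predicts.
x = 10**4
print(x**3, 10000 * x)           # cubic term: 1e12, linear term: 1e8
assert x**3 > 10000 * x
```

Running it shows the linear term is 100x larger at x = 10, while the cubic term is 10000x larger at x = 10000.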