Simply stated, Big-O notation is a formal mechanism for describing the performance or complexity of an algorithm as a function of its input size.
The performance of an algorithm is described by its time complexity and its space complexity. Time complexity describes how the algorithm's running time grows as the input grows; space complexity describes how much memory the algorithm requires as the input grows. Together, these concepts are commonly referred to as algorithmic complexity.
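As a minimal sketch of what different time complexities look like in practice (the function names here are illustrative, not from any particular library), consider three small Python functions:

```python
def first_item(items):
    # O(1) time: a single operation, regardless of how large the list is.
    return items[0]

def contains(items, target):
    # O(n) time: in the worst case, every element is inspected once.
    for item in items:
        if item == target:
            return True
    return False

def has_duplicate(items):
    # O(n^2) time: each element is compared against every later element,
    # so doubling the input roughly quadruples the work.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Note that all three use O(1) extra space: none allocates memory that grows with the input. A version of `has_duplicate` that stores seen elements in a set would trade space (O(n)) for time (O(n) on average).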