We introduce big O notation, which gives an upper bound on an algorithm's running time, to simplify our analysis. We usually choose the simplest and tightest bound, for example O(n) or O(n log n). Understanding algorithm analysis is important and necessary in this information-exploding world. Although the effect of choosing a time-saving algorithm is not very obvious in our current coursework, when we are dealing with large data sets the difference between two choices of algorithm can be significant. The performance of an algorithm really does count.
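To make this concrete, here is a little sketch of my own (not from lecture) that counts the basic steps a linear pass does versus a nested double pass, which is roughly the difference between an O(n) algorithm and an O(n^2) one:

```python
# My own toy illustration of growth rates: count the "basic steps"
# performed by an O(n) loop versus an O(n^2) nested loop.

def linear_steps(n):
    """Roughly O(n): one pass over an input of size n."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def quadratic_steps(n):
    """Roughly O(n^2): a nested pass, like selection sort's comparisons."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

for n in [10, 100, 1000]:
    print(n, linear_steps(n), quadratic_steps(n))
```

At n = 1000 the quadratic version already does a million steps versus a thousand, which is why the choice of algorithm starts to matter on large data.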
In addition to selection sort, bubble sort, and insertion sort, we learned merge sort and quick sort in CSC148. We can tell from the chart above that merge sort and quick sort are more efficient than the sorting algorithms we learned in CSC108. Both algorithms are based on recursion. We can use the Master Theorem to compute the time complexity of a recursive algorithm, which I learned from my TA in lab.
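As a reminder of how the recursion works, here is a minimal merge sort sketch (my own version, so details may differ from the one in lecture). Its recurrence is T(n) = 2T(n/2) + O(n), which the Master Theorem resolves to O(n log n):

```python
# A minimal merge sort sketch. Recurrence: T(n) = 2T(n/2) + O(n),
# which gives O(n log n) by the Master Theorem.

def merge_sort(lst):
    """Return a new sorted list, using recursion."""
    if len(lst) <= 1:               # base case: 0 or 1 items is sorted
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])    # T(n/2)
    right = merge_sort(lst[mid:])   # T(n/2)
    return merge(left, right)       # O(n) merge step

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])         # at most one of these
    result.extend(right[j:])        # still has items left
    return result

print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```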
I found this website very useful for understanding the efficiency of algorithms: http://bigocheatsheet.com/. So I'm sharing it with you all.
However, sorting algorithms are not limited to those I mentioned above. There's also an important sorting algorithm called Timsort, which is what Python's built-in sort uses.
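In practice this means we get Timsort for free whenever we call `sorted()` or `list.sort()`. A quick sketch of both:

```python
# Python's built-in sorted() and list.sort() are implemented with Timsort,
# which exploits already-sorted runs in the data.

data = [3, 1, 4, 1, 5, 9, 2, 6]

print(sorted(data))          # returns a NEW sorted list
data.sort(reverse=True)      # sorts the list in place, descending
print(data)

# Both accept a key function, e.g. sorting strings by length:
words = ['pear', 'fig', 'banana']
print(sorted(words, key=len))  # ['fig', 'pear', 'banana']
```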