Sunday, March 30, 2014

Week 11 - March 24

The two new sorting algorithms I have encountered are "Quick" sort and "Merge" sort. Since I have already learnt recursion, these algorithms make much more sense now than they would have before. This week was all about the worst-case efficiency of sorting algorithms (Big-Oh).

Both Quick sort and Merge sort use recursion (divide and conquer), which is a big part of what makes them faster than the simpler sorts we saw earlier. Quick sort partitions the array into two parts with respect to a pivot and sorts these partitions separately, doing this recursively and then combining the sorted partitions back together. This algorithm has O(n^2) worst-case complexity (when the pivot choices are unlucky), which is not close to linear complexity, although on average it actually runs in O(n log n) time. Merge sort is similar, but instead it keeps dividing the array in half until it has n sub-arrays with only 1 element each. Then it recursively merges the sub-arrays back together in order. This algorithm has O(n log n) worst-case complexity, which is close to linear, and hence it is very efficient (in fact, no comparison-based sort can do better than O(n log n) in the worst case).
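To make the two ideas concrete, here is a minimal sketch of both algorithms in Python. These are list-based versions written for readability rather than the in-place versions a library would use; the function and helper names are my own.

```python
def quick_sort(lst):
    """Quick sort: pick a pivot, partition the rest into smaller
    and larger items, recursively sort each part, then combine."""
    if len(lst) <= 1:
        return lst
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x < pivot]
    larger = [x for x in lst[1:] if x >= pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(larger)


def merge_sort(lst):
    """Merge sort: split the list in half, recursively sort each
    half, then merge the two sorted halves."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))


def merge(a, b):
    """Merge two already-sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i])
            i += 1
        else:
            result.append(b[j])
            j += 1
    # One of the two lists is exhausted; append the leftovers.
    return result + a[i:] + b[j:]
```

Calling `quick_sort([3, 1, 2])` or `merge_sort([3, 1, 2])` both return `[1, 2, 3]`. Note how the O(n^2) worst case of quick sort shows up here: if the list is already sorted, the first element is always the smallest, so one partition is empty and the other holds n-1 items, giving n levels of recursion instead of log n.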

I think that learning about the complexities of algorithms is a very important topic in computer science, because we (humans) want everything to be done as fast as possible. Humans hate waiting! The faster the program, the more incentive others have to use it! This is a really fun topic, and this week's lab was the easiest one yet. However, I am still getting used to Big-Oh notation, and it is a bit tricky to understand.
