Searching and Sorting
Searching in a list:
- if the list doesn't change frequently, it's better to sort it once, then use binary search for every lookup.
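A minimal sketch of the "sort once, binary-search every time" idea, using Python's standard `bisect` module (the helper name `binary_search` is my own):

```python
from bisect import bisect_left

def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent. O(log n)."""
    i = bisect_left(sorted_list, target)
    if i < len(sorted_list) and sorted_list[i] == target:
        return i
    return -1

data = [7, 2, 9, 4, 4, 1]
data.sort()                      # sort once: O(n log n)
print(binary_search(data, 4))    # each lookup afterwards is O(log n) -> 2
print(binary_search(data, 3))    # -> -1
```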
Divide and conquer
Divide the problem into subproblems, solve them recursively, then merge the results.
- in-place: uses only a constant amount of extra memory. Everything here can be made in-place except merge sort. (In our class's definition, the in-place property ignores the memory used by loops/recursion stacks.)
- stable: preserves the relative order of elements with equal values.
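A quick demonstration of stability using Python's built-in `sorted()` (Timsort, which is stable):

```python
# Records with equal keys keep their original relative order under a stable sort.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]
by_score = sorted(records, key=lambda r: r[1])
print(by_score)
# bob stays before dave (both 1), alice stays before carol (both 2)
```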
|Algo/Input type|Random|Equal|Ascending|Descending|Nearly Ascending|Nearly Descending|
|---|---|---|---|---|---|---|
|(Optimal) Bubble sort|O(N^2)|O(N)|O(N)|O(N^2)|O(kN), but k not significant|O(N^2)|
|(Min) Selection sort: always poor, but you can find the 3 smallest elements|O(N^2)|O(N^2)|O(N^2)|O(N^2)|O(N^2)|O(N^2)|
|Merge sort: best!|O(N logN)|O(N logN)|O(N logN)|O(N logN)|O(N logN)|O(N logN)|

Insertion sort: O(n^2) in the worst and average case, O(n) on nearly sorted input.
- about why it's not in-place: merging two lists requires allocating a new list.
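A sketch of merge sort showing exactly where the extra memory goes: `merge()` builds its output in a new list, which is why merge sort is not in-place (function names are my own):

```python
def merge_sort(a):
    """O(n log n) in every case, but merge() allocates a new list,
    so merge sort is not in-place."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

def merge(left, right):
    out = []                      # the extra O(n) buffer
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps the sort stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```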
about why it's in-place: we do say quicksort is in-place, meaning its space complexity is \(O(1)\), but we are only counting heap memory. Counting the stack frames of the recursive calls as well, quicksort needs \(O(n)\) space in the worst case and \(O(\log n)\) on average. However, since the worst case is basically bubble sort, with some optimization (recursing into the smaller partition first) we don't really need \(O(n)\) space, and the worst-case space complexity becomes \(O(\log n)\). An iterative quicksort must still maintain an explicit stack, which also takes \(O(\log n)\) space, so strictly it is correct to say quicksort's space complexity is \(O(\log n)\). Calling it \(O(1)\) simply pretends stack frames take no memory :P (i.e., it is \(O(1)\) with respect to the heap).
So I think it's fine to call this trading space for time.
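A minimal in-place quicksort sketch (Lomuto partition; names are my own). The partition step only swaps within the array, so the heap usage is O(1); the recursion is where the O(log n) stack space comes from:

```python
def quicksort(a, lo=0, hi=None):
    """In-place with respect to heap memory: partition() only swaps
    within the array; recursion still uses O(log n) stack on average."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)

def partition(a, lo, hi):
    pivot = a[hi]                 # Lomuto partition: O(1) extra space
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

xs = [3, 7, 1, 9, 2]
quicksort(xs)
print(xs)  # [1, 2, 3, 7, 9]
```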
3. Selection sort
about why it's not stable: by definition it swaps two elements, which can break stability. However,
selection sort can be made stable if, instead of swapping, the minimum element is placed in its position without swapping, i.e. by pushing every element in between one step forward. But we should then use a linked list to implement the stable version, or the
insertion would be as slow as bubble sort.
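A sketch of the shift-based stable variant on a plain array (function name is my own). It preserves relative order, but the shifting costs O(n^2) moves, which is why a linked list would be preferable:

```python
def stable_selection_sort(a):
    """Selection sort made stable: instead of swapping the minimum into
    place, shift the elements between a[i] and the minimum one step right."""
    n = len(a)
    for i in range(n):
        m = i
        for j in range(i + 1, n):
            if a[j] < a[m]:       # strict < picks the FIRST minimum (stable)
                m = j
        key = a[m]
        while m > i:              # shift instead of swap, preserving order
            a[m] = a[m - 1]
            m -= 1
        a[i] = key
    return a

print(stable_selection_sort([3, 1, 2, 1]))  # [1, 1, 2, 3]
```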
4. Bubble sort
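A sketch of the "optimal" bubble sort from the table above: stopping as soon as a full pass makes no swap gives O(N) on already-sorted input (function name is my own):

```python
def bubble_sort(a):
    """Bubble sort with early exit: O(n) on sorted input, O(n^2) worst case."""
    n = len(a)
    for end in range(n - 1, 0, -1):
        swapped = False
        for j in range(end):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:           # no swap in a full pass => already sorted
            break
    return a

print(bubble_sort([2, 1, 4, 3]))  # [1, 2, 3, 4]
```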
Non-comparison Sorting Algorithms
1. counting sort
5. Heap Sort
top k elements: O(n + k logn) (output-sensitive)
- can stop at any time: build the heap in O(n), then extract just the first k elements. When k << n, this is very useful: O(n + k logn) total.
- can be in-place: after extracting the top element, put it at the end of the array.
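A sketch of the output-sensitive top-k idea using the standard `heapq` module (the function name is my own; `heapq` is a min-heap, so values are negated to simulate a max-heap):

```python
import heapq

def top_k_largest(a, k):
    """Build a heap in O(n), then pop k times: O(n + k log n) total.
    We can stop after k extractions instead of sorting everything."""
    heap = [-x for x in a]        # negate to turn the min-heap into a max-heap
    heapq.heapify(heap)           # O(n)
    return [-heapq.heappop(heap) for _ in range(k)]  # O(k log n)

print(top_k_largest([5, 1, 9, 3, 7, 2], 3))  # [9, 7, 5]
```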
Heapsort is not cache-friendly: its accesses jump around the array, while the hardware prefetcher assumes you will read the array in sequence and caches the following elements. Quicksort's sequential scans take advantage of this; mergesort does not.