# Some notes on Searching and Sorting

## Searching and Sorting

### Searching algorithms

Searching for things in a list:

- If the list doesn't change frequently, it is better to sort it once first, then use binary search every time.
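A minimal sketch of the sort-once, search-many idea (the function name is my own choice, not from the notes):

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent. O(log n)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the search range each step
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = sorted([9, 3, 7, 1, 5])        # sort once up front...
print(binary_search(data, 7))         # ...then each lookup is O(log n)
```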

#### Divide and conquer

Divide the problem into subproblems, solve them, then merge the results.

### Sorting algorithms

- in-place: uses a constant amount of extra memory. Every algorithm here can be in-place except mergesort. The in-place property is independent of loop/recursion stacks (by our class's definition).
- stable: preserves the relative order of elements with the same value.

Sorting Algorithm | Average | Best | Worst | In-place | Stable |
---|---|---|---|---|---|
Bubble sort | O(n^2) | O(n) | O(n^2) | Yes | Yes |
Selection sort | O(n^2) | O(n^2) | O(n^2) | Yes | No |
Insertion sort | O(n^2) | O(n) (nearly sorted) | O(n^2) | Yes | Yes |
Mergesort | O(n log n) | O(n log n) | O(n log n) | No | Yes |
Quicksort | O(n log n) | O(n log n) | O(n^2) | Yes | No |

Algorithm / Input type | Random | All equal | Ascending | Descending | Nearly ascending | Nearly descending |
---|---|---|---|---|---|---|
(Optimized) Bubble sort | O(N^2) | O(N) | O(N) | O(N^2) | O(kN), but k is not significant | O(N^2) |
(Min) Selection sort | O(N^2) | O(N^2) | O(N^2) | O(N^2) | O(N^2) | O(N^2) |
Insertion sort | O(N^2) | O(N) | O(N) | O(N^2) | O(N) | O(N^2) |
Merge sort | O(N log N) | O(N log N) | O(N log N) | O(N log N) | O(N log N) | O(N log N) |

Selection sort performs poorly on every input type, but it can cheaply find the k smallest elements (e.g. the 3 smallest). Merge sort is the best: O(N log N) on every input type.
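A sketch of why insertion sort hits O(N) on ascending or nearly ascending input: each element only shifts past the (few) larger elements before it, so the inner loop exits almost immediately.

```python
def insertion_sort(arr):
    """In-place insertion sort. The inner while loop exits early on
    (nearly) sorted input, giving the O(N) best case from the table."""
    for i in range(1, len(arr)):
        x = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > x:  # shift only past larger elements
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = x                # drop x into its slot
    return arr
```

On `[1, 2, 4, 3, 5]` the inner loop runs only once in total, which is the "nearly ascending" O(N) case.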

##### 1. Mergesort

**Why it's not in-place:** merging two sorted lists requires a new list to hold the result.
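A sketch of the merge step, showing where the extra O(n) list comes from (this is one common way to write it, not necessarily the version from class):

```python
def merge(left, right):
    """Merge two sorted lists into a NEW sorted list -- this extra
    list is exactly why mergesort is not in-place."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps the merge stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # at most one of these is non-empty
    merged.extend(right[j:])
    return merged
```

Using `<=` (rather than `<`) when keys tie is what makes mergesort stable: on equal keys, the element from the left half is taken first.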

##### 2. Quicksort

**Why we call it in-place:** we do say quicksort is in-place, meaning its space complexity is \(O(1)\), but that only counts consumption in `heap memory`. If we also count the extra `stack memory`, we still need \(O(\log n)\) space, whether the implementation is recursive or iterative.

Yongqi's explanation:

Regarding the space complexity of quicksort, we said that it is in-place and therefore has O(1) space complexity. Strictly speaking, if we include the space required for the stack frames of the recursive calls, we need O(n) space in the worst case and O(log n) space in the average case. However, since we have seen that in the worst case we are basically doing bubble sort, with some optimization we don't really need O(n) space, and the worst-case space complexity becomes O(log n). Furthermore, if we did quicksort iteratively, we would still need to maintain a stack data structure taking up O(log n) space, so it is correct to say that the space complexity of quicksort is O(log n). We said it has O(1) space complexity because we are pretending that stack frames take up 0 memory :P (or: it is O(1) space complexity with respect to the heap memory).

So I think it's fair to say it trades space for time (空间换时间).
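A sketch of why quicksort needs no heap memory: partitioning only swaps elements within the array. This uses the Lomuto partition scheme as an illustration (the class may have used a different scheme):

```python
def partition(arr, lo, hi):
    """Lomuto partition: rearranges arr[lo..hi] around arr[hi] as pivot,
    using only swaps within arr -- no auxiliary list, O(1) heap memory."""
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # place pivot between the halves
    return i

def quicksort(arr, lo=0, hi=None):
    """In-place quicksort; the only extra space is the recursion stack."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quicksort(arr, lo, p - 1)
        quicksort(arr, p + 1, hi)
```

The O(log n)-stack optimization mentioned above amounts to recursing on the smaller half and looping on the larger one, so the recursion depth stays logarithmic even on adversarial input.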

##### 3. Selection sort

**Why it's not stable:** by definition it swaps two elements, and the swap can break the relative order of equal elements. However, selection sort can be made stable if, instead of swapping, the minimum element is placed into its position without a swap, i.e. by pushing every element in between one step forward. We should also use a linked list to implement this stable version, otherwise the `insert` would be as slow as bubble sort.
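A sketch of the shift-instead-of-swap idea (on a Python list, so each insert is O(n) and the warning above applies; the `key` parameter is my own addition for demonstrating stability):

```python
def stable_selection_sort(arr, key=lambda x: x):
    """Selection sort made stable: the minimum is removed and re-inserted,
    shifting the elements in between forward instead of swapping.
    min() returns the FIRST minimal index, which preserves tie order."""
    for i in range(len(arr)):
        m = min(range(i, len(arr)), key=lambda k: key(arr[k]))
        arr.insert(i, arr.pop(m))   # pop+insert shifts; no swap
    return arr
```

With `(value, tag)` pairs sorted by value only, the tags of equal values keep their original order, which a plain swap-based selection sort would not guarantee.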

##### 4. Bubble sort

#### Non-comparison Sorting Algorithms

##### 1. Counting sort

Suitable for: integers, especially with many duplicates of the same values, e.g. when all values fall between 0 and 100.

Drawback: needs extra space.

Advantage: O(n) time.
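A minimal sketch for the case described above, where all values lie in a small known range:

```python
def counting_sort(arr, max_value):
    """Counting sort for integers in [0, max_value].
    O(n + max_value) time; the counts array is the extra-space drawback."""
    counts = [0] * (max_value + 1)
    for x in arr:
        counts[x] += 1            # tally each value
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)  # emit each value count times
    return out
```

This simple version discards the original objects and rebuilds from counts; a prefix-sum variant that places whole records is what makes counting sort stable, but is omitted here for brevity.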

##### 2. Heap Sort

Heapify: O(n)

Top k elements: O(k log n) *(output-sensitive)*

- Can stop at any time: extract just the first k elements. When k << n this is very useful: O(n + k log n) total.
- Can be in-place: after extracting the top element, put it at the end of the array.
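The top-k idea, sketched with Python's standard `heapq` module (which builds a min-heap; heapify is O(n), each pop is O(log n)):

```python
import heapq

def top_k_smallest(arr, k):
    """O(n) heapify, then k pops of O(log n) each: O(n + k log n) total.
    Stops after k elements -- no need to sort the whole array."""
    heap = list(arr)              # copy so the input is untouched
    heapq.heapify(heap)           # O(n) bottom-up heap construction
    return [heapq.heappop(heap) for _ in range(min(k, len(heap)))]
```

This is exactly the "can stop at any time" property: when k << n, O(n + k log n) beats the O(n log n) of a full sort.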

Heapsort is **not cache-friendly**: its access pattern jumps around the array, while the hardware predicts that you'll read the array in sequence and caches the following elements. Quicksort's sequential scans take advantage of this; mergesort does not.