## Insertion Sort Algorithm

### Implementation

This algorithm is [nicely explained and illustrated on Wikipedia](https://en.wikipedia.org/wiki/Insertion_sort), and can be implemented as follows:

```
!include`snippetStart="// Insertion Algorithm", snippetEnd="// Done with insertion Algorithm"` code/projects/Sorting/Sorting/Sorting.cs
```

### Description

This algorithm moves the `bar` from the beginning of the list to the end, one position at a time.
At every step, it positions a `slot` at the bar and looks *back*, moving the value at the `slot` earlier and earlier as long as its predecessor is greater than it.
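The same logic can be sketched in Python (an illustrative translation, not the lecture's C# snippet; the names `bar` and `slot` mirror the description above):

```python
def insertion_sort(values):
    """Sort `values` in place, following the bar/slot description."""
    for bar in range(1, len(values)):  # the bar moves from start to end
        current = values[bar]
        slot = bar
        # Look back: shift predecessors forward while they are greater.
        while slot > 0 and values[slot - 1] > current:
            values[slot] = values[slot - 1]
            slot -= 1
        values[slot] = current  # drop the value into its final slot
    return values
```

For example, `insertion_sort([3, 1, 2])` returns `[1, 2, 3]`.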

### Complexity

[As explained on wikipedia](https://en.wikipedia.org/wiki/Insertion_sort#Best,_worst,_and_average_cases), the simplest worst case input is an array sorted in reverse order.
With an array sorted in reverse order, every iteration of the inner loop will scan and shift the entire sorted subsection of the array (i.e., from `bar` to the beginning) before inserting the next element. This gives a quadratic running time (i.e., $O(n^2)$).
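One way to see the quadratic behavior is to count the shifts performed by the inner loop on a reverse-sorted array; the counter below is an illustrative Python helper, not part of the lecture's code:

```python
def count_shifts(values):
    """Insertion sort that also counts how many elements get shifted."""
    shifts = 0
    for bar in range(1, len(values)):
        current = values[bar]
        slot = bar
        while slot > 0 and values[slot - 1] > current:
            values[slot] = values[slot - 1]
            slot -= 1
            shifts += 1
        values[slot] = current
    return shifts

# A reverse-sorted array of size n forces n * (n - 1) / 2 shifts:
print(count_shifts(list(range(10, 0, -1))))  # prints 45, i.e., 10 * 9 / 2
```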

## Heapsort Algorithm

### Implementation

We first define some helper methods:
```
…
```
and then leverage the heap structure to sort:
```
!include`snippetStart="// Heapsort algorithm", snippetEnd="// Done with heapsort algorithm"` code/projects/Sorting/Sorting/Sorting.cs
```
Note that `PercDown` builds a *max heap*: since the values are "pre-sorted **greater value first**", repeatedly removing the first one and moving it to the *end* of the list leaves the list sorted from smallest to greatest value once we are done.

### Description

This algorithm works in two steps:
- First, it constructs the heap *in place*, by arranging the elements from the last parent (`listP.Count / 2` corresponds to "the last parent") up to the root (`i >= 0`); this builds a max heap, so the greatest value comes first,
- Then, it repeatedly extracts the first element and places it after the end of the heap: note that `PercDown(listP, 0, i)` makes the heap stop at index `i`, which decreases by 1 at every iteration.
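The two steps can be sketched in Python as follows; `perc_down` plays the role of `PercDown`, but this translation is mine, not the C# snippet included above:

```python
def perc_down(values, parent, size):
    """Sift values[parent] down until the max-heap property holds
    below it; the heap is the slice values[0:size]."""
    while True:
        child = 2 * parent + 1  # left child
        if child >= size:
            break
        if child + 1 < size and values[child + 1] > values[child]:
            child += 1  # pick the greater of the two children
        if values[parent] >= values[child]:
            break
        values[parent], values[child] = values[child], values[parent]
        parent = child

def heapsort(values):
    n = len(values)
    # Step 1: build the max heap in place, from the last parent to the root.
    for i in range(n // 2, -1, -1):
        perc_down(values, i, n)
    # Step 2: move the greatest value to the end, then shrink the heap by 1.
    for i in range(n - 1, 0, -1):
        values[0], values[i] = values[i], values[0]
        perc_down(values, 0, i)
    return values
```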

### Complexity

- The `PercDown` method has complexity $O(\log(n))$: it iterates through the tree from top to bottom, so its cost is proportional to the tree height, which is $O(\log(n))$.
- The first step calls `PercDown` $n / 2$ times, which is equivalent to $O(n)$, so overall this first step is $O(n \times \log(n))$.
- The second step also calls `PercDown` $n$ times, so it is overall $O(n \times \log(n))$ as well.
Hence, the complexity of heapsort is $O(n \times \log(n))$ by [the sum rule](./docs/programming_and_computer_usage/complexity#simplifications).

## Bubble Algorithm

### Implementation

```
!include`snippetStart="// Bubble Algorithm", snippetEnd="// Done with bubble algorithm."` code/projects/Sorting/Sorting/Sorting.cs
```

### Description

The nested loop accomplishes the following: "from the beginning of the list up to one position before where the previous pass stopped, go through the elements one by one and swap adjacent elements if they are out of order".
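An illustrative Python version of that nested loop (the actual C# is included above; the names here are assumptions):

```python
def bubble_sort(values):
    """Each pass bubbles the greatest remaining value to the end."""
    for stop in range(len(values) - 1, 0, -1):  # where we stopped last time
        for i in range(stop):  # up to stop - 1, comparing adjacent pairs
            if values[i] > values[i + 1]:
                values[i], values[i + 1] = values[i + 1], values[i]
    return values
```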

### Complexity

Since both loops depend on the size of the list, $n$, the algorithm is overall $O(n^2)$: we perform on the order of $n \times n$ operations.

## ShellSort Algorithm

### Implementation

```
!include`snippetStart="// ShellSort Algorithm", snippetEnd="// Done with shellSort algorithm."` code/projects/Sorting/Sorting/Sorting.cs
```

### Description

Consider a list of size 30; assuming `current.CompareTo(listP[slot - gap]) < 0` is always `true`, we have:

`gap` | `next` | `slot` | `slot - gap`
----- | ------ | ------ | ------------
11 | 11 | 11 | 0
" | 12 | 12 | 1
" | 13 | 13 | 2
… | … | … | …
" | 22 | 22 | 11
" | " | 11 | 0
" | 23 | 23 | 12
" | " | 12 | 1
… | … | … | …
11 | 30 | 30 | 19
" | " | 19 | 8
5 | 5 | 5 | 0
" | 6 | 6 | 1
" | 7 | 7 | 2
… | … | … | …
" | 10 | 10 | 5
" | " | 5 | 0
" | 11 | 11 | 6
" | " | 6 | 1
… | … | … | …
2 | … | … | …
… | … | … | …
1 | … | … | …

The important point is to understand that we generate the sequences of indices separated by `gap` (for `gap = 11`: 11, 22, then 12, 23, etc., looking back to `slot - gap` each time), each sorted by insertion; the final pass, with `gap = 1`, is a plain insertion sort on an almost-sorted list.
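A Python sketch of these gapped passes may help; the gap sequence is hard-coded here to the 11, 5, 2, 1 shown in the table for a list of size 30, while the C# snippet may compute it differently:

```python
def shell_sort(values, gaps=(11, 5, 2, 1)):
    """Insertion-sort each subsequence of elements `gap` apart,
    for smaller and smaller gaps; the last gap must be 1."""
    for gap in gaps:
        for next_ in range(gap, len(values)):  # `next` in the table above
            current = values[next_]
            slot = next_
            # Look back in steps of `gap` (the `slot - gap` column above).
            while slot >= gap and values[slot - gap] > current:
                values[slot] = values[slot - gap]
                slot -= gap
            values[slot] = current
    return values
```

Since the final gap is 1, the last pass is a plain insertion sort, which guarantees a sorted result; the earlier passes only make that last pass cheap.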