Add notes about O(n) vs. O(n log n) make_heap issue
Morwenn committed Feb 5, 2018
1 parent 22f2fdb · commit 0da94da
Showing 1 changed file with 6 additions and 0 deletions.
README.md: 6 additions, 0 deletions
```diff
@@ -203,6 +203,9 @@ than O(n log n) time. In both cases, sorting the heap dominates the algorithm an
 original algorithm described by Bron & Hesselink or the revisited algorithm that I will describe later, both are split
 into these two distinct phases.
 
+*Note: I am actually wondering whether one of the `make_heap` functions described here runs in O(n), see the
+[corresponding issue](https://github.com/Morwenn/poplar-heap/issues/1).*
+
 ## Original poplar sort
 
 The original poplar sort algorithm actually stores up to log2(n) integers to represent the positions of the poplars. We
```
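The two distinct phases mentioned in the context lines above are poplar heap construction followed by heap sorting. As a minimal sketch of that split, assuming the library exposes `poplar::make_heap` and `poplar::sort_heap` in a single `poplar.h` header with the same iterator interface as the standard heap functions (the exact names and header are assumptions here, not taken from this diff):

```cpp
#include <vector>
#include "poplar.h"  // assumed single-header location of the poplar-heap library

// Illustration of the two-phase structure: build the poplar heap over the
// whole range, then sort it. The sorting phase is the O(n log n) part that
// dominates the overall complexity.
void poplar_sort(std::vector<int>& values)
{
    poplar::make_heap(values.begin(), values.end());  // phase 1: heap construction
    poplar::sort_heap(values.begin(), values.end());  // phase 2: sorting the heap
}
```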
```diff
@@ -755,6 +758,9 @@ due to the insertion sort optimization, but also to the fact computing the size
 O(1) and not in O(log n). That said, the complexity is the same: O(n log n) time and O(1) space. We might not have
 found an O(n) algorithm to construct the poplar heap, but this one is definitely interesting.
 
+*Note: I am actually wondering whether this version of `make_heap` runs in O(n), see the [corresponding
+issue](https://github.com/Morwenn/poplar-heap/issues/1).*
+
 ## Additional poplar heap algorithms
 
 While these functions are not needed to implement poplar sort, the C++ standard library also defines two functions to
```
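One way to probe the O(n) versus O(n log n) question raised in both added notes is to count the comparisons performed during heap construction for increasing input sizes and watch whether the count per element levels off or keeps growing. A rough sketch, assuming `poplar::make_heap` accepts a custom comparator the way `std::make_heap` does (that overload, the namespace, and the `poplar.h` header are assumptions here):

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>
#include "poplar.h"  // assumed single-header location of the poplar-heap library

int main()
{
    std::mt19937 rng(42);
    for (std::size_t n = 1024; n <= 1048576; n *= 4) {
        std::vector<int> values(n);
        std::iota(values.begin(), values.end(), 0);
        std::shuffle(values.begin(), values.end(), rng);

        // Count only the comparisons performed while building the poplar heap.
        long long comparisons = 0;
        auto counting_less = [&comparisons](int lhs, int rhs) {
            ++comparisons;
            return lhs < rhs;
        };
        poplar::make_heap(values.begin(), values.end(), counting_less);

        // If construction were O(n), comparisons / n should settle around a
        // constant; growth proportional to log2(n) would point to O(n log n).
        std::cout << "n = " << n << ", comparisons per element = "
                  << static_cast<double>(comparisons) / n << '\n';
    }
}
```

If the per-element count stays roughly flat as n grows, that would support the O(n) hypothesis tracked in the linked issue; steady growth with log2(n) would suggest the construction really is O(n log n).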
