Commit 0bc0718

v.0.3.0

Author: Nikos M (committed)
1 parent 072c39b commit 0bc0718

24 files changed: +848 -1209 lines
README.md

Lines changed: 4 additions & 1 deletion
@@ -1,7 +1,10 @@
Sorting Algorithms
===================

-__Various sorting algorithms implementations in JavaScript (IN PROGRESS)__
+![sort.js](/sort.jpg)
+
+
+__Various sorting algorithms implementations in JavaScript__


[sort.min.js](https://raw.githubusercontent.com/foo123/SortingAlgorithms/master/test/js/sort.min.js)

beeld.config

Lines changed: 3 additions & 2 deletions
@@ -37,6 +37,7 @@ tasks =[{}]
./src/comparison/LibrarySort.js
./src/comparison/MergeSort.js
./src/comparison/QuickSort.js
+./src/comparison/TreeSort.js

### Number/Count-Based Algorithms ###
./src/arithmetic/CountingSort.js
@@ -47,7 +48,7 @@ tasks =[{}]
## a couple of custom algorithms ##
./src/arithmetic/IndexSort.js
# not complete, in progress
-./src/arithmetic/StatisticalSort.js
+#./src/arithmetic/StatisticalSort.js

# !!not implemented yet!!
./src/other/TimSort.js
@@ -64,7 +65,7 @@ tasks =[{}]
replace =[{}]

"@@ROOT@@" = "this"
-"@@VERSION@@" = "0.2.5"
+"@@VERSION@@" = "0.3.0"
"@@USE_STRICT@@" = '"use strict";'
"@@MODULE@@" = "Sort"

manual.md

Lines changed: 69 additions & 36 deletions
@@ -1,68 +1,99 @@


-Sorting Series (a kind of discrete optimization problem)
-lies at the center of Computer Science and Algorithms
-because of its many uses
+Sorting Series, which is also a **kind of discrete optimization problem** (i.e. the permutation function `p` of `0..N-1` which **maximizes** `0*a[p[0]]+1*a[p[1]]+..+(N-1)*a[p[N-1]]` is the **permutation which sorts the array `a` in ascending order**, that is `a[p[0]] <= a[p[1]] <= .. <= a[p[N-1]]`), lies at the center of Computer Science and Algorithms because of its many uses.
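To make the optimization view above concrete, here is a minimal JavaScript sketch (illustrative only, not taken from the library's sources): the index permutation that sorts `a` ascending is also the one that maximizes the weighted sum.

```javascript
// Weighted sum 0*a[p[0]] + 1*a[p[1]] + ... + (N-1)*a[p[N-1]] for a permutation p.
function weightedSum(a, p) {
    return p.reduce(function (s, pi, i) { return s + i * a[pi]; }, 0);
}

var a = [3, 1, 4, 1, 5, 9, 2, 6];

// The sorting permutation: indices of `a` ordered by ascending value.
var pSorted = a.map(function (_, i) { return i; })
               .sort(function (i, j) { return a[i] - a[j]; });

// The identity permutation, for comparison.
var pIdentity = a.map(function (_, i) { return i; });

console.log(weightedSum(a, pSorted));   // the maximum achievable value
console.log(weightedSum(a, pIdentity)); // smaller, unless `a` was already sorted
```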

(Ref. http://en.wikipedia.org/wiki/Sorting_algorithm)

-Also Sorting, in one way or another, is integral part
-of many other important algorithms and applications (see eg. Knuth TAOCP)
+Furthermore Sorting, in one way or another, is an integral part of many other important algorithms and applications (see e.g. Knuth, TAOCP)

-For example Sorting is very closely associated to Searching,
-another topic of immense importance and applications
+For example, Sorting is very closely associated with Searching, another topic of immense importance and many applications.

-Under certain sorting states, searching can be achieved in O(logN) time
-or even in O(1) time (constant) for almost every search term
+Under certain sorting states, searching can be achieved in `O(logN)` time, or even in `O(1)` (constant) time, for almost every search term.
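As an aside, a minimal binary-search sketch (illustrative, not part of the library) shows the `O(logN)` case; the `O(1)` case corresponds to direct indexing or hashing on a suitably prepared array.

```javascript
// Binary search on an already sorted array: O(logN) comparisons per query.
function binarySearch(sorted, x) {
    var lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
        var mid = (lo + hi) >> 1;      // midpoint of the current range
        if (sorted[mid] === x) return mid;
        if (sorted[mid] < x) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;                          // not found
}

console.log(binarySearch([1, 2, 3, 5, 8, 13], 8)); // 4
console.log(binarySearch([1, 2, 3, 5, 8, 13], 7)); // -1
```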

Sorting has 3 approaches:

(eg. NIST.gov maintains a dictionary of various algorithms at: http://xlinux.nist.gov/dads// )

+
###Block vs. Online/Adaptive:

-* In the Block case, the whole array is available at once
-for this case many algorithms are known
-(comparison-based=> O(N^2), O(NlogN) complexities)
-and
-(number/count based=> O(N) complexity) (see below)
+1. In the Block case, the whole array is available at once;
+for this case many algorithms are known (comparison-based => `O(N^2)`, `O(NlogN)` complexities) and (number/count-based => `O(N)` complexity) (see below)
+
+2. In the Adaptive/Online case, the input series is
+accessed one at a time (for example a time-input signal). In this case some of the previous algorithms can be transformed to work adaptively

-* In the Adaptive/Online case, the input series is
-accesed one at a time (for example an time-input signal)
-In this case some of the previous algorithms can be transformed to work adaptively
+Apart from that, there are algorithms (like Dynamic Lists, Dynamic Heaps and Balanced Trees, Tries, e.g. AVL Trees)
+which keep an input sequence always in a 'sorted' state (with each new input) with relatively low complexity (e.g. `O(logN)`)

-Apart from that, there are algorithms
-(like Dynamic Lists, Dynamic Heaps and Balanced Trees, Tries, eg AVL Trees)
-which keep an input sequence always in a 'sorted' state (with each new input)
-With relatively low complexity (eg O(logN))

3930
###Comparison-Based vs. Arithmetic/Count-Based:
4031

4132
* Comparison-based sorting algorithms (InsertionSort, MergeSort, QuickSort, etc..) sort
4233
a series by comparing elements with each other in some optimum sense
4334

44-
The best time complexity of these algorithms is (at present) O(NlogN)
35+
The best time complexity of these algorithms is `O(NlogN)`
4536

4637
However better than this can be achieved
4738

48-
* Arithmetic/Count-based sorting algorithms (CountingSort, BucketSort, RadixSort, etc..),
49-
do not use comparisons (of any kind) between elements,
50-
but instead use their arithmetic/counting/statistical properties
39+
* Statistics-based sorting algorithms (CountingSort, BucketSort, RadixSort, etc..), do not use comparisons (of any kind) between elements, but instead use their arithmetic/statistical properties
5140

52-
This makes possible algorithms which can sort in linear O(N) time (the fastest possible)
41+
This makes possible algorithms which can sort in linear `O(N)` time (the fastest possible)
5342
However these algorithms have some limitations (eg only Integers, or special kinds of Numbers)
5443

55-
Is O(N) sorting possible for arbitrary random numbers??
44+
45+
> Is `O(N)` sorting possible for arbitrary random numbers??
46+
47+
48+
Computing the value of a certain number `n` requires approximately `O(logn)` *"primitive digit"* operations. Since (statisticaly) the **values of numbers in a list is correlated to the size of the list itself** (i.e a list of size `N` contains random numbers in the range `0..N` with **very high probability** over lists of same size for numbers in a given range), one then has an overall complexity of `O(NlogN)` even for arithmetic-based sorting algorithms (see for example *"what is the true complexity of radix sort?"*).
49+
50+
> Classical algorithms for integer sorting require **assumptions about the size of the integers** to be sorted, or else have a **running time dependent on the size**.
51+
52+
-- [Sorting in Linear Time?](https://www.cs.unc.edu/~plaisted/comp550/linear%20time%20sorting.pdf)
53+
54+
However the catch here is that same holds for comparing arbitrary numbers, computationaly one has to compare `primitive digit` by `primitive digit` in sequence on average, hence an additional `O(logn)` complexity for comparison-based algorithms.
55+
56+
57+
> Is `O(NlogN)` complexity a kind of *strict base line* for this computational model??
58+
59+
According to Knuth's theoretical lower bound theorem for general (comparison) sorting algorithms (note `O(logN!) = O(NlogN)`): the `O(NlogN)` bound is asymptoticaly tight (see also [information-theoretic lower bound for comparison sorts](http://www.inf.fh-flensburg.de/lang/algorithmen/sortieren/lowerbounden.htm) ie &Omega;(NlogN) ).
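A quick numeric check of the `O(logN!) = O(NlogN)` remark (illustrative only): `log2(N!)` is the sum of `log2(k)` for `k = 1..N`, and its ratio to `N*log2(N)` approaches 1 as `N` grows.

```javascript
// Compare log2(N!) with N*log2(N) for increasing N.
function log2Factorial(n) {
    var s = 0;
    for (var k = 2; k <= n; k++) s += Math.log(k) / Math.LN2;
    return s;
}

[10, 100, 1000, 10000].forEach(function (n) {
    var ratio = log2Factorial(n) / (n * Math.log(n) / Math.LN2);
    console.log(n, ratio.toFixed(3)); // slowly tends to 1, since log2(N!) ~ N*log2(N) - N*log2(e)
});
```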
+
+
+A summary of various sorting/searching algorithms can be found in [this pdf](http://epaperpress.com/sortsearch/download/sortsearch.pdf)
+
+
+**Included Algorithms**
+
+* Builtin (JavaScript's default sorting algorithm)
+* [Bubble Sort](http://en.wikipedia.org/wiki/Bubble_sort)
+* [Cocktail Sort](http://en.wikipedia.org/wiki/Cocktail_shaker_sort)
+* [Cycle Sort](http://en.wikipedia.org/wiki/Cycle_sort)
+* [Heap Sort](http://en.wikipedia.org/wiki/Heap_sort)
+* [Insertion Sort](http://en.wikipedia.org/wiki/Insertion_sort)
+* [Library Sort](http://en.wikipedia.org/wiki/Library_sort)
+* [Shell Sort](http://en.wikipedia.org/wiki/Shellsort)
+* [Quick Sort](http://en.wikipedia.org/wiki/Quicksort)
+* [Tree Sort](http://en.wikipedia.org/wiki/Tree_sort)
+* [Merge Sort](http://en.wikipedia.org/wiki/Merge_sort)
+* [Counting Sort](http://en.wikipedia.org/wiki/Counting_sort)
+* [Bucket Sort](http://en.wikipedia.org/wiki/Bucket_sort)
+* [Radix Sort](http://en.wikipedia.org/wiki/Radix_sort) (**not implemented yet**)
+* [Burst Sort](http://en.wikipedia.org/wiki/Burstsort) (**not implemented yet**)
+* [Tim Sort](http://en.wikipedia.org/wiki/Timsort) (**not implemented yet**)
+* Permutation Sort (**custom**)
+* Index Sort (**custom**)
+* Statistical Sort (**custom, in progress**)
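On the "Builtin" entry above, a small illustrative reminder (not part of the library): JavaScript's default `Array#sort` compares elements as strings, so numeric data needs an explicit comparator.

```javascript
// Default sort is lexicographic, even for numbers.
console.log([10, 2, 1].sort());                                  // [1, 10, 2]
// Supply a numeric comparator to sort by value.
console.log([10, 2, 1].sort(function (a, b) { return a - b; })); // [1, 2, 10]
```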
+

------------------------------------------------------

NOTE: The calculation of asymptotic complexity is done usually (using recursive relations)
with the Master Theorem :

Refs.
-http://en.wikipedia.org/wiki/Master_theorem,
-http://en.wikipedia.org/wiki/Introduction_to_Algorithms
-
+http://en.wikipedia.org/wiki/Master_theorem,
+http://en.wikipedia.org/wiki/Introduction_to_Algorithms
+

T(n) = aT(n/b) + f(n), a>=1, b>1
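For a concrete instance of the recurrence above (added here for illustration): MergeSort satisfies `T(n) = 2T(n/2) + O(n)`, i.e. `a = 2`, `b = 2`, `f(n) = O(n) = O(n^(log_2 2))`; this is the second case of the Master Theorem and yields `T(n) = O(nlogn)`.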

@@ -78,14 +109,16 @@ In a concice library


> __Algorithms as a technology__ Suppose computers were infinitely fast and memory was free. Would you have any reason to study algorithms? The answer is yes, if for no other reason than that you would still like to demonstrate that your solution method terminates and does so with the correct answer.
-...Of course, computers may be fast but not infinitely fast and memory may be cheap but not completely free. Computing time is therefore a bounded resource, and so is space in memory. These resources should be used wisely and algorithms that are efficient in terms of time and space will help you do so.
-This demostrates that algorithms, like computer hardware, are a __technology__ . Total system performance depends on choosing efficient algorithms as much as choosing fast hardware. Just as rapid advances are being made in other computer technologies, they are being made in algorithms as well. (__Introduction to algorithms, 2nd Ed. Cormen,Leiserson,Rivest,Stein__)
-
-
-
+...Of course, computers may be fast but not infinitely fast and memory may be cheap but not completely free. Computing time is therefore a bounded resource, and so is space in memory. These resources should be used wisely and algorithms that are efficient in terms of time and space will help you do so.
+This demonstrates that algorithms, like computer hardware, are a __technology__. Total system performance depends on choosing efficient algorithms as much as choosing fast hardware. Just as rapid advances are being made in other computer technologies, they are being made in algorithms as well. (__Introduction to Algorithms, 2nd Ed., Cormen, Leiserson, Rivest, Stein__)
+
+
+
+__Algorithms as a "green" technology__
+
Additionally, every operation/instruction a computer performs has an energy consumption cost. So an efficient algorithm saves energy!
An efficient algorithm performs a computation by trying to use the resources in the best possible manner, so effectively uses energy in the best possible manner.
Where does energy come from? It comes from burning coal (mainly).
So there you have it, efficient code is ecological!
-Better start learning your [complexity](http://en.wikipedia.org/wiki/Computational_complexity_theory) soon.
+Better start learning your [complexity]( http://en.wikipedia.org/wiki/Computational_complexity_theory) soon.

sort.jpg

6.86 KB
