Big oh uniform usage #82

Merged: 4 commits, Jul 8, 2023

Changes from all commits
1 change: 0 additions & 1 deletion README.md
@@ -75,7 +75,6 @@ Welcome to **Data Structures and Algorithms in Go**! 🎉 This project is design
* [Palindrome](./recursion/is_palindrome_test.go)
* [Climbing Stairs](./recursion/climbing_stairs_test.go)
* [Exponentiation](./recursion/exponentiation_test.go)
- * [Permutations](./recursion/permutations_test.go)
* [Regular Expressions Matching](./recursion/)
* [Divide and Conquer](./dnc//README.md)
* [Binary Search](./dnc/binary_search_test.go)
55 changes: 33 additions & 22 deletions complexity.md
@@ -32,7 +32,7 @@
t│ t│ ... t│
└────────────────────────► └────────────────────────► └────────────────────────►
n n n

- O(N Log N) O(Log 2^n) O(2^n)
+ O(n*Log n) O(Log 2^n) O(2^n)
▲ . ▲ . ▲ .
│ .. │ . │ .
│ . │ . │ .
@@ -75,57 +75,68 @@ However, it is essential to note that this is not always the case. In practice,

Big O notation of an algorithm can be simplified using the following two rules:

- 1. Remove constants. `O(n) + 2*O(n Log n) + 3*O(K) + 5` is simplified to `O(n) + O(n Log n) + O(K)`.
- 2. Remove non dominant, or slower terms. `O(n) + O(n Log n) + O(K)` is simplified to `O(n Log n)` because `O(n Log n)` is the most dominant term..
+ 1. Remove constants. `O(n) + 2*O(n*Log n) + 3*O(K) + 5` is simplified to `O(n) + O(n*Log n) + O(K)`.
+ 2. Remove non-dominant, or slower, terms. `O(n) + O(n*Log n) + O(K)` is simplified to `O(n*Log n)` because `O(n*Log n)` is the dominant term.

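To make the simplification concrete, here is a short editorial Go sketch (the function name and logic are illustrative, not taken from the repository): the O(n*Log n) sort dominates the O(n) scan and the constant-time bookkeeping, so the routine as a whole simplifies to O(n*Log n).

```go
package main

import (
	"fmt"
	"sort"
)

// DistinctSorted returns the distinct values of list in ascending order.
// Cost breakdown: the sort is O(n*Log n), the scan is O(n), and slice
// bookkeeping is O(1) amortized per element. The total,
// O(n*Log n) + O(n) + O(1), simplifies to O(n*Log n).
func DistinctSorted(list []int) []int {
	sort.Ints(list) // O(n*Log n)

	out := make([]int, 0, len(list))
	for i, v := range list { // O(n)
		if i == 0 || v != list[i-1] {
			out = append(out, v)
		}
	}
	return out
}

func main() {
	fmt.Println(DistinctSorted([]int{3, 1, 2, 3, 1})) // [1 2 3]
}
```
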
### Constant - O(K) or O(1)

Constant time complexity represents the most efficient scenario for an algorithm, where the execution time remains constant regardless of the input size. Achieving constant time complexity often involves eliminating loops and recursive calls. Examples:

- * Reads and writes in a [hash table](../hashtable)
- * Enqueue and Dequeue in a [queue](../queue)
- * Push and Pop in a [stack](../stack)
- * Finding the minimum or maximum in [heap](../heap)
- * Removing the last element of a [doubly linked list](../linkedlist)
+ * Reads and writes in a [hash table](./hashtable/README.md)
+ * Enqueue and Dequeue in a [queue](./queue/README.md)
+ * Push and Pop in a [stack](./stack/README.md)
+ * Finding the minimum or maximum in [heap](./heap/README.md)
+ * Removing the last element of a [doubly linked list](./linkedlist/README.md)
+ * [Max without conditions](./bit/max_function_without_conditions.go)

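As an editorial aside, a minimal Go sketch of constant-time operations (illustrative only, not the repository's implementations); each statement touches a fixed number of elements no matter how large the input grows:

```go
package main

import "fmt"

func main() {
	// Hash table read and write: O(1) on average.
	ages := map[string]int{"ada": 36}
	ages["alan"] = 41
	fmt.Println(ages["ada"])

	// Stack push, peek, and pop on a slice: O(1) amortized.
	stack := []int{}
	stack = append(stack, 7)     // push
	top := stack[len(stack)-1]   // peek
	stack = stack[:len(stack)-1] // pop
	fmt.Println(top, len(stack)) // 7 0
}
```
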
### Logarithmic - O(Log n)

Attaining logarithmic time complexity in an algorithm is highly desirable as it eliminates the need to iterate through every input in order to solve a given problem. Examples:

- * Searching sorted items using [Binary Search](../dnc)
- * Inserting, Deleting and Searching in a [Binary Search Tree](../tree)
- * Push and Pop in [heap](../heap)
+ * Searching sorted items using [Binary Search](./dnc/binary_search.go)
+ * Inserting, Deleting and Searching in a [Binary Search Tree](./tree/README.md)
+ * Push and Pop in [heap](./heap/README.md)
+ * [Square Root](./dnc/square_root.go)
+ * [Median in a Stream](./heap/median_in_a_stream.go)

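For illustration, a generic iterative binary search sketch (an editorial example, not the repository's dnc/binary_search.go); the candidate range halves on every iteration, so at most Log n iterations are needed:

```go
package main

import "fmt"

// binarySearch returns the index of target in the sorted list, or -1.
// The search range halves each iteration, hence O(Log n) time.
func binarySearch(list []int, target int) int {
	low, high := 0, len(list)-1
	for low <= high {
		mid := low + (high-low)/2
		switch {
		case list[mid] == target:
			return mid
		case list[mid] < target:
			low = mid + 1
		default:
			high = mid - 1
		}
	}
	return -1
}

func main() {
	fmt.Println(binarySearch([]int{1, 3, 5, 7, 9}, 7)) // 3
}
```
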
### Linear - O(n)

Linear time complexity is considered favorable when an algorithm necessitates traversing every input, with no feasible way to avoid it. Examples:

- * Removing the last element in a [singly linked list](../linkedlist)
- * Searching an unsorted [array](../array) or [linked list](../linklist)
+ * Removing the last element in a [singly linked list](./linkedlist/README.md)
+ * Searching an unsorted [array](./array/README.md) or [linked list](./linkedlist/README.md)
+ * [Number of Islands](./graph/number_of_islands.go)
+ * [Missing Number](./hashtable/missing_number.go)

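An editorial sketch of the linear case (illustrative, not repository code); nothing about an unsorted slice lets us rule out elements without looking at them, so the worst case inspects all n:

```go
package main

import "fmt"

// linearSearch must examine every element in the worst case: O(n) time.
func linearSearch(list []int, target int) int {
	for i, v := range list {
		if v == target {
			return i
		}
	}
	return -1
}

func main() {
	fmt.Println(linearSearch([]int{4, 2, 9, 1}, 9)) // 2
}
```
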
- ### O(n Log n)
+ ### O(n*Log n)

- The time complexity of O(n log n) is commonly observed when it is necessary to iterate through all inputs, and can yield an out come at the same time through an efficient operation. Sorting is a common example. It's not possible to sort items faster than O(n log n). Examples:
+ The time complexity of O(n*Log n) is commonly observed when it is necessary to iterate through all inputs while performing an efficient operation at each step. Sorting is a common example; no comparison-based sort can run faster than O(n*Log n). Examples:

- * [Merge Sort](../dnc) and [Heap Sort](../heap)
- * In order traversal of a [Binary Search Tree](../tree)
+ * [Merge Sort](./dnc/merge_sort.go) and [Heap Sort](./heap/README.md)
+ * [Knapsack](./greedy/knapsack.go)
+ * [Find Anagrams](./hashtable/find_anagrams.go)
+ * In order traversal of a [Binary Search Tree](./tree/README.md)

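A compact editorial merge sort sketch (the repository's dnc/merge_sort.go may differ in detail); the input is halved O(Log n) times and each level performs O(n) merge work, giving O(n*Log n):

```go
package main

import "fmt"

// mergeSort halves the input O(Log n) times; each level merges O(n)
// elements, for O(n*Log n) total time and O(n) space.
func mergeSort(list []int) []int {
	if len(list) <= 1 {
		return list
	}
	mid := len(list) / 2
	return merge(mergeSort(list[:mid]), mergeSort(list[mid:]))
}

// merge combines two sorted slices into one sorted slice in O(n).
func merge(a, b []int) []int {
	out := make([]int, 0, len(a)+len(b))
	i, j := 0, 0
	for i < len(a) && j < len(b) {
		if a[i] <= b[j] {
			out = append(out, a[i])
			i++
		} else {
			out = append(out, b[j])
			j++
		}
	}
	out = append(out, a[i:]...)
	return append(out, b[j:]...)
}

func main() {
	fmt.Println(mergeSort([]int{5, 2, 4, 1, 3})) // [1 2 3 4 5]
}
```
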
### Polynomial - O(n^2)

Polynomial time complexity marks the initial threshold of problematic time complexity for algorithms. This complexity often arises when an algorithm includes nested loops, involving both an inner loop and an outer loop. Examples:

- * Bubble sort rehearsal problem in [array](../array)
- * Naive way of searching an unsorted [array](../array) for duplicates by using nested loops
+ * [Bubble Sort](./array/bubble_sort.go)
+ * [Cheapest Flight](./graph/cheapest_flights.go)
+ * [Remove Invalid Parenthesis](./graph/remove_invalid_parentheses.go)

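An editorial sketch of the nested-loop pattern behind O(n^2) (illustrative, not repository code); the naive duplicate check is a classic instance, with the inner loop running up to n times for each of the outer loop's n iterations:

```go
package main

import "fmt"

// hasDuplicates compares every pair: roughly n*(n-1)/2 comparisons, O(n^2).
func hasDuplicates(list []int) bool {
	for i := 0; i < len(list); i++ { // outer loop: n iterations
		for j := i + 1; j < len(list); j++ { // inner loop: up to n-1
			if list[i] == list[j] {
				return true
			}
		}
	}
	return false
}

func main() {
	fmt.Println(hasDuplicates([]int{1, 2, 3, 2})) // true
}
```
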
### Exponential O(2^n)

Exponential complexity is considered highly undesirable; however, it represents only the second-worst complexity scenario. Examples:

- * Basic [Recursive](../recursion) implementation of Fibonacci
- * Tower of Hanoi rehearsal in [divide and conquer](../dnc)
+ * [Climbing Stairs](./recursion/climbing_stairs.go)
+ * [Tower of Hanoi](./dnc/towers_of_hanoi.go)
+ * [Generate Parenthesis](./backtracking/generate_parenthesis.go)
+ * Basic [Recursive](./recursion/README.md) implementation of Fibonacci

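The textbook illustration of exponential growth, sketched editorially (not the repository's Fibonacci code): each call fans out into two more, so the call tree grows exponentially with depth, bounded by O(2^n):

```go
package main

import "fmt"

// fib makes two recursive calls per invocation; the resulting call
// tree grows exponentially with n, so the running time is O(2^n).
func fib(n int) int {
	if n < 2 {
		return n
	}
	return fib(n-1) + fib(n-2)
}

func main() {
	fmt.Println(fib(10)) // 55
}
```
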
### Factorial O(n!)

Factorial time complexity represents the most severe time complexity for an algorithm. Understanding the scale of factorials is crucial: even the estimated total number of atoms in the universe, which is approximately 10^80, is smaller than the factorial of 59. Examples:

- * Permutations rehearsal in [back tracking](../backtracking)
+ * [N queens](./backtracking/n_queens.go)
+ * [Permutations](./backtracking/permutations.go)
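
An editorial backtracking sketch of permutation generation (the repository's backtracking/permutations.go may differ): position one has n choices, position two has n-1, and so on, producing n! results:

```go
package main

import "fmt"

// permute emits all n! orderings of list by swapping each candidate
// into position i and recursing on the remainder.
func permute(list []int, i int, out *[][]int) {
	if i == len(list) {
		*out = append(*out, append([]int(nil), list...))
		return
	}
	for j := i; j < len(list); j++ {
		list[i], list[j] = list[j], list[i]
		permute(list, i+1, out)
		list[i], list[j] = list[j], list[i] // undo the swap (backtrack)
	}
}

func main() {
	var out [][]int
	permute([]int{1, 2, 3}, 0, &out)
	fmt.Println(out) // 3! = 6 permutations
}
```
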
2 changes: 1 addition & 1 deletion dnc/README.md
@@ -84,7 +84,7 @@ func search(list []int, target int) int {

## Complexity

- If used inappropriately, DNC algorithms can lead to an exponential number of unnecessary recursive calls, resulting in a time complexity of O(2^n). However, if an appropriate dividing strategy and base case that can be solved directly are identified, DNC algorithms can be very effective, with a time complexity as low as O(log n) in the case of binary search. As DNC algorithms are recursive in nature, their complexity analysis is analogous to that of [recursive](../recursion) algorithms.
+ If used inappropriately, DNC algorithms can lead to an exponential number of unnecessary recursive calls, resulting in a time complexity of O(2^n). However, if an appropriate dividing strategy and base case that can be solved directly are identified, DNC algorithms can be very effective, with a time complexity as low as O(Log n) in the case of binary search. As DNC algorithms are recursive in nature, their complexity analysis is analogous to that of [recursive](../recursion) algorithms.

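As an editorial illustration of both extremes: binary search obeys the recurrence `T(n) = T(n/2) + O(1)`, which unrolls across Log n halvings to O(Log n), whereas a divide step that spawns two nearly full-size subproblems, such as `T(n) = 2*T(n-1) + O(1)`, unrolls to O(2^n).
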
## Application

2 changes: 1 addition & 1 deletion dnc/binary_search.go
@@ -1,6 +1,6 @@
package dnc

- // BinarySearch solves the problem in O(log n) time and O(1) space.
+ // BinarySearch solves the problem in O(Log n) time and O(1) space.
func BinarySearch(list []int, search int) int {
return binarySearchRecursive(list, 0, len(list), search)
}
2 changes: 1 addition & 1 deletion dnc/merge_sort.go
@@ -1,6 +1,6 @@
package dnc

- // MergeSort solves the problem in O(n log n) time and O(n) space.
+ // MergeSort solves the problem in O(n*Log n) time and O(n) space.
func MergeSort(list []int) []int {
if len(list) <= 1 {
return list
2 changes: 1 addition & 1 deletion dnc/square_root.go
@@ -1,6 +1,6 @@
package dnc

- // SquareRoot solves the problem in O(log n) time and O(1) space.
+ // SquareRoot solves the problem in O(Log n) time and O(1) space.
func SquareRoot(number, precision int) float64 {
start := 0
end := number
2 changes: 1 addition & 1 deletion graph/network_delay_time.go
@@ -12,7 +12,7 @@ const (
edgeDestination, edgeCost = 0, 1
)

- // NetworkDelayTime solves the problem in O(n log n) time and O(n) space.
+ // NetworkDelayTime solves the problem in O(n*Log n) time and O(n) space.
func NetworkDelayTime(n, k int, edges [][3]int) int {
var (
verticesMap, edgesHeap = verticesAndEdges(edges, k)
2 changes: 1 addition & 1 deletion greedy/knapsack.go
@@ -8,7 +8,7 @@ type KnapsackItem struct {
Value int
}

- // Knapsack solves the problem in O(n*log(n)) time and O(1) space.
+ // Knapsack solves the problem in O(n*Log n) time and O(1) space.
func Knapsack(items []KnapsackItem, capacity int) int {
sort.Slice(items, func(i, j int) bool {
return items[i].Value/items[i].Weight > items[j].Value/items[j].Weight
2 changes: 1 addition & 1 deletion greedy/task_scheduling.go
@@ -8,7 +8,7 @@ type Event struct {
EndTime int
}

- // Solves the problem in O(n*log(n)) time and O(1) space.
+ // Solves the problem in O(n*Log n) time and O(1) space.
func ScheduleEvents(events []Event) []Event {
sort.Slice(events, func(i, j int) bool {
return events[i].EndTime < events[j].EndTime
2 changes: 1 addition & 1 deletion hashtable/find_anagrams.go
@@ -4,7 +4,7 @@ import "sort"

type sortRunes []rune

- // FindAnagrams solves the problem in O(n*log(n)) time and O(n) space.
+ // FindAnagrams solves the problem in O(n*Log n) time and O(n) space.
func FindAnagrams(words []string) [][]string {
anagrams := make(map[string][]string)
for _, word := range words {
6 changes: 3 additions & 3 deletions heap/README.md
@@ -7,7 +7,7 @@ A heap must satisfy two conditions:
1. The structure property requires that the heap be a complete binary [tree](../tree), where each level is filled left to right, and all levels except the bottom are full.
2. The heap property requires that the children of a node be larger than or equal to the parent node in a min heap and smaller than or equal to the parent in a max heap, meaning that the root is the minimum in a min heap and the maximum in a max heap.

- As a result, if you push elements to the min or max heap and then pop them one by one, you will obtain a list that is sorted in ascending or descending order, respectively. This sorting technique is also an O(NLogN) algorithm known as heap sort. Although there are other sorting algorithms available, none of them are faster than O(NLogN).
+ As a result, if you push elements to the min or max heap and then pop them one by one, you will obtain a list that is sorted in ascending or descending order, respectively. This sorting technique is also an O(n*Log n) algorithm known as heap sort. Although there are other sorting algorithms available, no comparison-based sort is faster than O(n*Log n).

When pushing a new element to a heap, because of the structure property we always add the new element to the first available position on the lowest level of the heap, filling from left to right. Then to maintain the heap property, if the newly inserted element is smaller than its parent in a min heap (larger in a max heap), then we swap it with its parent. We continue swapping the swapped element with its parent until the heap property is achieved.

@@ -81,11 +81,11 @@ In Go, the heap implementation is based on slices. The heap property is maintain

## Complexity

- The time complexity of pushing and popping heap elements is O(LogN). On the other hand, initializing a heap, which involves pushing N elements, has a time complexity of O(NLogN).
+ The time complexity of pushing and popping heap elements is O(Log n). On the other hand, initializing a heap, which involves pushing n elements, has a time complexity of O(n*Log n).

The insertion strategy entails percolating the new element up the heap until the correct location is identified. Similarly, the deletion strategy involves percolating down the heap.

- Pushing and Popping heap elements are all O(LogN) operations. The strategy for inserting is the new element is percolating up the heap until the correct location is found. similarly the strategy for deletion is to percolate down.
+ Pushing and Popping heap elements are O(Log n) operations. The strategy for insertion is to percolate the new element up the heap until the correct location is found; similarly, the strategy for deletion is to percolate down.

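An editorial container/heap sketch (illustrative; not the repository's heap implementation): Push and Pop each percolate a single element along a root-to-leaf path, which in a complete binary tree has length O(Log n):

```go
package main

import (
	"container/heap"
	"fmt"
)

// minHeap implements heap.Interface; container/heap performs the
// O(Log n) percolation on every Push and Pop.
type minHeap []int

func (h minHeap) Len() int            { return len(h) }
func (h minHeap) Less(i, j int) bool  { return h[i] < h[j] }
func (h minHeap) Swap(i, j int)       { h[i], h[j] = h[j], h[i] }
func (h *minHeap) Push(x interface{}) { *h = append(*h, x.(int)) }
func (h *minHeap) Pop() interface{} {
	old := *h
	n := len(old)
	x := old[n-1]
	*h = old[:n-1]
	return x
}

func main() {
	h := &minHeap{5, 2, 8}
	heap.Init(h)             // establishes the heap property
	heap.Push(h, 1)          // percolates up: O(Log n)
	fmt.Println(heap.Pop(h)) // 1, percolates down: O(Log n)
}
```
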
## Application

2 changes: 1 addition & 1 deletion heap/k_closest_points_to_origin.go
@@ -13,7 +13,7 @@ type (
pointsHeap []*point
)

- // KClosestPointToOrigin solves the problem in O(nlogk) time and O(k) space.
+ // KClosestPointToOrigin solves the problem in O(n*Log k) time and O(k) space.
func KClosestPointToOrigin(points [][]int, k int) [][]int {
if len(points) <= 1 {
return points
2 changes: 1 addition & 1 deletion heap/median_in_a_stream.go
@@ -15,7 +15,7 @@ func newMedianKeeper() medianKeeper {
return medianKeeper{&maxHeap{}, &minHeap{}}
}

- // addNumber solves the problem in O(log n) time and O(n) space.
+ // addNumber solves the problem in O(Log n) time and O(n) space.
func (m *medianKeeper) addNumber(num int) {
if m.len()%2 == 0 {
if m.len() == 0 {
2 changes: 1 addition & 1 deletion heap/merge_sorted_list.go
@@ -10,7 +10,7 @@ type (
priorityQueue []*linkedlist.Node
)

- // MergeSortedLists solves the problem in O(nlogk) time and O(k) space.
+ // MergeSortedLists solves the problem in O(n*Log k) time and O(k) space.
func MergeSortedLists(lists []*linkedlist.Node) *linkedlist.Node {
pq := new(priorityQueue)

2 changes: 1 addition & 1 deletion heap/sliding_maximum.go
@@ -4,7 +4,7 @@ import "container/heap"

type slidingWindow []int

- // MaxSlidingWindow solves the problem in O(nlogk) time and O(k) space.
+ // MaxSlidingWindow solves the problem in O(n*Log k) time and O(k) space.
func MaxSlidingWindow(numbers []int, k int) []int {
output := []int{}
if len(numbers) <= 1 || len(numbers) < k {
4 changes: 0 additions & 4 deletions recursion/README.md
@@ -73,10 +73,6 @@ Given n the number of steps, return in how many ways you can climb these stairs

Given x and n, return x raised to the power of n in an efficient manner. [Solution](exponentiation.go) [Test](exponentiation_test.go)

- ### Permutations
-
- Given a set of integers like `{1,2}`, return all possible permutations like `{1,2},{2,1}`. [Solution](permutations.go) [Test](permutations_test.go)

### Regular Expressions Matching

Given an input and a regular expression pattern where `.` denotes to any character and `*` denotes to zero or more of the proceeding characters, write a recursive function to return true if the input matches the pattern and false otherwise. [Solution](regular_expressions.go) [Test](regular_expressions_test.go)
20 changes: 0 additions & 20 deletions recursion/permutations.go

This file was deleted.

32 changes: 0 additions & 32 deletions recursion/permutations_test.go

This file was deleted.

8 changes: 4 additions & 4 deletions tree/README.md
@@ -40,21 +40,21 @@ There are many types of trees. Some important tree types include:

A Binary Search Tree (BST) is a type of sorted tree where, for every node n, the values of all nodes in its left subtree are less than n and the values of all nodes in its right subtree are greater than n.

- Performing an In-Order traversal of a binary search tree and outputting each visited node results in a sorted (In-Order) list of nodes. This is known as the tree sort algorithm, which has a time complexity of O(NLogN). While there are other sorting algorithms available, none are more efficient than O(NLogN).
+ Performing an In-Order traversal of a binary search tree and outputting each visited node results in a sorted (In-Order) list of nodes. This is known as the tree sort algorithm, which has a time complexity of O(n*Log n). While there are other sorting algorithms available, no comparison-based sort is more efficient than O(n*Log n).

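An editorial in-order traversal sketch (illustrative, not the repository's tree code): visiting the left subtree, then the node, then the right subtree emits a BST's values in ascending order:

```go
package main

import "fmt"

type node struct {
	value       int
	left, right *node
}

// inOrder visits left subtree, node, then right subtree; on a binary
// search tree this calls visit on the values in ascending order.
func inOrder(n *node, visit func(int)) {
	if n == nil {
		return
	}
	inOrder(n.left, visit)
	visit(n.value)
	inOrder(n.right, visit)
}

func main() {
	root := &node{2, &node{1, nil, nil}, &node{3, nil, nil}}
	inOrder(root, func(v int) { fmt.Print(v, " ") }) // 1 2 3
}
```
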
### BST Complexity

The time complexity of operations such as Search, Deletion, Insertion, and finding the minimum and maximum values in a binary search tree is O(h), where h represents the height of the tree.

## AVL - Height Balanced BST

- A height balanced binary search tree has a height of O(log n) and its left and right subtrees of all nodes have equal heights.
+ A height balanced binary search tree has a height of O(Log n), and for every node the heights of its left and right subtrees differ by at most one.

In order to maintain balance after an insertion, a single rotation is needed if the insertion was on the outer side, either left-left or right-right, while a double rotation is required if the insertion was on the inner side, either left-right or right-left.

### AVL Complexity

- Same as a Binary Search Tree except that the height of the tree is known. So Search, Deletion, Insertion, and finding Min and Max in an AVL tree are all O(LogN) operations.
+ Same as a Binary Search Tree except that the height of the tree is known. So Search, Deletion, Insertion, and finding Min and Max in an AVL tree are all O(Log n) operations.

## Trie

@@ -66,7 +66,7 @@ Insertion and Search are done in O(K), where K is the length of the word.

## Application

- Trees, such as Binary Search Trees (BSTs), can offer a time complexity of O(log n) for searches, as opposed to the linear access time of linked lists. Trees are widely employed in search systems, and operating systems can represent file information using tree structures.
+ Trees, such as Binary Search Trees (BSTs), can offer a time complexity of O(Log n) for searches, as opposed to the linear access time of linked lists. Trees are widely employed in search systems, and operating systems can represent file information using tree structures.

## Rehearsal
