
Algorithms And Complexity

Coin Exchange

You are given a set of coins {1, 3, 20, 25} and an amount. Design a dynamic programming algorithm that returns the minimum number of coins needed to make up this amount.
For example, assume you have the coins given above and the amount is 68.
Then the minimum number of coins needed to make up this amount is:
• 1x25 + 2x20 + 1x3 = 25+40+3 = 68 -> 4 coins
Notice that the greedy algorithm would return:
• 2x25 + 6x3 = 50 + 18 = 68 -> 8 coins
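
A minimal bottom-up sketch of this DP in C (function and variable names are illustrative, not taken from the repository):

```c
#include <stdio.h>
#include <limits.h>

/* Bottom-up DP: dp[v] = minimum number of coins that sum to v. */
int min_coins_dp(const int *coins, int n, int amount) {
    int dp[amount + 1];
    dp[0] = 0;
    for (int v = 1; v <= amount; v++) {
        dp[v] = INT_MAX;
        for (int i = 0; i < n; i++) {
            if (coins[i] <= v && dp[v - coins[i]] != INT_MAX && dp[v - coins[i]] + 1 < dp[v])
                dp[v] = dp[v - coins[i]] + 1;
        }
    }
    return dp[amount];            /* INT_MAX means the amount is unreachable */
}

int main(void) {
    int coins[] = {1, 3, 20, 25};
    printf("%d\n", min_coins_dp(coins, 4, 68));   /* prints 4 */
    return 0;
}
```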

Edit Distance

A dynamic programming algorithm that determines the minimum edit distance between X and Y. The edit distance between X and Y is defined as the minimum number of single-character insertions, deletions, and replacements applied to X to make it equal to Y. For example, if X = backache and Y = sackrace, then the edit distance is 3.
The sequence of changes is:

• Replace x_1 with y_1('s')
• Insert y_5('r') after x_4
• Delete x_7('h')
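
A compact C sketch of the standard Levenshtein-style DP (names are illustrative); D[i][j] holds the edit distance between the first i characters of X and the first j characters of Y:

```c
#include <stdio.h>
#include <string.h>

int edit_distance(const char *x, const char *y) {
    int m = strlen(x), n = strlen(y);
    int D[m + 1][n + 1];                        /* D[i][j]: distance x[0..i-1] -> y[0..j-1] */

    for (int i = 0; i <= m; i++) D[i][0] = i;   /* delete all of x */
    for (int j = 0; j <= n; j++) D[0][j] = j;   /* insert all of y */

    for (int i = 1; i <= m; i++) {
        for (int j = 1; j <= n; j++) {
            int cost = (x[i - 1] == y[j - 1]) ? 0 : 1;
            int best = D[i - 1][j - 1] + cost;                    /* replace (or match) */
            if (D[i - 1][j] + 1 < best) best = D[i - 1][j] + 1;   /* delete from x */
            if (D[i][j - 1] + 1 < best) best = D[i][j - 1] + 1;   /* insert into x */
            D[i][j] = best;
        }
    }
    return D[m][n];
}

int main(void) {
    printf("%d\n", edit_distance("backache", "sackrace"));   /* prints 3 */
    return 0;
}
```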

Find Smallest Interval

Given a list of n integers A = [a0, a1, …, an-1] and an integer k >= 2, finds the smallest interval [x..y] that contains at least k of the numbers in the list (the size of the interval [x..y] is y - x).
The algorithm prints the interval at the end.
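
One way to do this, sketched below, is sort-based (an assumption about the approach, not necessarily what the repository uses): sort the numbers, then among every window of k consecutive sorted values keep the one with the smallest spread.

```c
#include <stdio.h>
#include <stdlib.h>

static int cmp_int(const void *a, const void *b) {
    return (*(const int *)a > *(const int *)b) - (*(const int *)a < *(const int *)b);
}

/* Prints the smallest interval [x..y] containing at least k of the n numbers. */
void smallest_interval(int *a, int n, int k) {
    qsort(a, n, sizeof(int), cmp_int);
    int best_x = a[0], best_y = a[k - 1];
    for (int i = 1; i + k - 1 < n; i++) {
        if (a[i + k - 1] - a[i] < best_y - best_x) {   /* smaller spread found */
            best_x = a[i];
            best_y = a[i + k - 1];
        }
    }
    printf("[%d..%d] (size %d)\n", best_x, best_y, best_y - best_x);
}

int main(void) {
    int a[] = {4, 19, 7, 2, 25, 8};
    smallest_interval(a, 6, 3);   /* sorted: 2 4 7 8 19 25 -> prints [4..8] (size 4) */
    return 0;
}
```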

Huffman Coder

In computer science and information theory, a Huffman code is a particular type of optimal prefix code that is commonly used for lossless data compression.

You are given an array of N unsigned chars (symbols), where each symbol is represented by 8 bits. Design a Huffman coder that returns the length of the codeword for each of the 256 possible symbols and the total number of bits required to encode the data. Requirement: after computing the frequency of occurrence of each symbol, you must use a min-heap (priority queue) to build the Huffman code.
For example, given Data = [‘a’, ‘b’, ‘b’, ‘a’, ‘c’, ‘d’, ‘a’, ‘a’, ‘c’, ‘a’, ‘c’]
Codeword lengths: [1, 3, 2, 3]
a=0
b=110
c=10
d=111
Total number of bits required to encode the data: 20 (5x1 + 2x3 + 3x2 + 1x3)
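
A compact sketch of the min-heap construction in C (all names are illustrative; it computes the codeword lengths and the total bit count, not the actual bit strings):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef struct Node {
    unsigned long freq;
    int symbol;                          /* -1 for internal nodes */
    struct Node *left, *right;
} Node;

/* Minimal 1-indexed array min-heap of Node pointers, keyed on frequency. */
static Node *heap[257];
static int heap_size = 0;

static void heap_push(Node *n) {
    int i = ++heap_size;
    while (i > 1 && heap[i / 2]->freq > n->freq) { heap[i] = heap[i / 2]; i /= 2; }
    heap[i] = n;
}

static Node *heap_pop(void) {
    Node *top = heap[1], *last = heap[heap_size--];
    int i = 1;
    while (2 * i <= heap_size) {
        int c = 2 * i;
        if (c + 1 <= heap_size && heap[c + 1]->freq < heap[c]->freq) c++;
        if (heap[c]->freq >= last->freq) break;
        heap[i] = heap[c];
        i = c;
    }
    heap[i] = last;
    return top;
}

/* A leaf at depth d gets a d-bit codeword (1 bit if the tree is a single leaf). */
static void assign_lengths(const Node *n, int depth, int lengths[256]) {
    if (!n->left) { lengths[n->symbol] = depth > 0 ? depth : 1; return; }
    assign_lengths(n->left, depth + 1, lengths);
    assign_lengths(n->right, depth + 1, lengths);
}

int main(void) {
    unsigned char data[] = "abbacdaacac";          /* the example above */
    unsigned long freq[256] = {0};
    int lengths[256] = {0};
    size_t len = strlen((const char *)data);

    for (size_t i = 0; i < len; i++) freq[data[i]]++;

    for (int s = 0; s < 256; s++)                  /* one leaf per symbol that occurs */
        if (freq[s]) {
            Node *leaf = calloc(1, sizeof(Node));
            leaf->freq = freq[s]; leaf->symbol = s;
            heap_push(leaf);
        }

    while (heap_size > 1) {                        /* merge the two least frequent trees */
        Node *a = heap_pop(), *b = heap_pop();
        Node *parent = calloc(1, sizeof(Node));
        parent->freq = a->freq + b->freq; parent->symbol = -1;
        parent->left = a; parent->right = b;
        heap_push(parent);
    }
    assign_lengths(heap_pop(), 0, lengths);

    unsigned long total = 0;
    for (int s = 0; s < 256; s++)
        if (freq[s]) {
            printf("%c: codeword length %d\n", s, lengths[s]);
            total += freq[s] * (unsigned long)lengths[s];
        }
    printf("Total bits: %lu\n", total);            /* 20 for the example data */
    return 0;
}
```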

Long Integer Multiplication

In this project, the divide-and-conquer long integer multiplication algorithm, whose running time is O(n^1.58), is applied.

The program first prompts for N, the number of digits in each integer. It then prompts for the name of the file where the first integer is stored, and then for the name of the file where the second integer is stored. It multiplies the two integers using the divide-and-conquer long integer multiplication algorithm and writes the result to a file named "result.txt".

Here is a typical interaction between the user and the program:
Enter N: 1000
Enter first filename: A.txt
Enter second filename: B.txt
The resulting integer is written to 'result.txt'
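
In outline this is Karatsuba's method: a product of two n-digit numbers is reduced to three products of roughly n/2-digit numbers, giving T(n) = 3T(n/2) + O(n) = O(n^1.58). The sketch below only illustrates the recurrence on machine integers; the project itself operates on digit arrays read from the two files (names are illustrative):

```c
#include <stdio.h>

/* Karatsuba on machine integers, splitting by decimal digits.
 * x = a*10^m + b,  y = c*10^m + d
 * x*y = ac*10^(2m) + ((a+b)(c+d) - ac - bd)*10^m + bd   -- only 3 recursive products */
long long karatsuba(long long x, long long y) {
    if (x < 10 || y < 10)                /* base case: single-digit factor */
        return x * y;

    int n = 0;
    for (long long t = (x > y ? x : y); t > 0; t /= 10) n++;    /* digit count */
    long long pow_m = 1;
    for (int i = 0; i < n / 2; i++) pow_m *= 10;                /* 10^(n/2) */

    long long a = x / pow_m, b = x % pow_m;
    long long c = y / pow_m, d = y % pow_m;

    long long ac = karatsuba(a, c);
    long long bd = karatsuba(b, d);
    long long mid = karatsuba(a + b, c + d) - ac - bd;

    return ac * pow_m * pow_m + mid * pow_m + bd;
}

int main(void) {
    printf("%lld\n", karatsuba(1234, 5678));   /* prints 7006652 */
    return 0;
}
```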

Minimum Coin Exchange

Given a set of coins that includes 1, returns the minimum number of coins necessary to represent the amount.
The coins are not given in sorted order. This is a greedy algorithm and runs in O(n log n) time, dominated by sorting the coins.

For example, assume you are given a set of coins {1, 5, 10, 25} and the amount is 68. Then the minimum number of coins needed to make up this amount is:
Two 25 coins
One 10 coin
One 5 coin
Three 1 coins
Total: 7 coins (2x25+1x10+1x5+3x1 = 68).
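
A greedy sketch in C (names are illustrative): sort the coins in descending order, then repeatedly take as many of the largest remaining coin as still fit.

```c
#include <stdio.h>
#include <stdlib.h>

static int cmp_desc(const void *a, const void *b) {
    return *(const int *)b - *(const int *)a;
}

/* Greedy coin exchange: O(n log n) for the sort plus a linear pass. */
int greedy_coins(int *coins, int n, int amount) {
    qsort(coins, n, sizeof(int), cmp_desc);       /* largest coin first */
    int count = 0;
    for (int i = 0; i < n && amount > 0; i++) {
        count += amount / coins[i];               /* take as many as fit */
        amount %= coins[i];
    }
    return count;                                 /* amount reaches 0 since 1 is in the set */
}

int main(void) {
    int coins[] = {10, 1, 25, 5};                 /* unsorted on purpose */
    printf("%d\n", greedy_coins(coins, 4, 68));   /* prints 7 */
    return 0;
}
```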

Minimum Cost Problem

You are given an NxN matrix, with each slot containing a number indicating the cost of going through that slot.
Initially you are at (0, 0). You are only allowed to move Down or Right. You want to reach (N-1, N-1). Find the path that has the minimum total cost.
In this problem, you are asked to give a bottom-up DP implementation of the minimum cost path problem.
The algorithm must return not only the cost of the minimum-cost path, but also the indices of the matrix slots on that path.
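
A bottom-up sketch in C on a small hard-coded matrix (the matrix values and names are illustrative): fill a cost table, then walk it backwards from (N-1, N-1) to recover the path.

```c
#include <stdio.h>

#define N 3

int main(void) {
    int cost[N][N] = { {1, 3, 5},
                       {2, 1, 2},
                       {4, 3, 1} };
    int dp[N][N];                       /* dp[i][j]: cheapest cost of reaching (i, j) */

    dp[0][0] = cost[0][0];
    for (int j = 1; j < N; j++) dp[0][j] = dp[0][j - 1] + cost[0][j];   /* first row: only Right */
    for (int i = 1; i < N; i++) dp[i][0] = dp[i - 1][0] + cost[i][0];   /* first column: only Down */
    for (int i = 1; i < N; i++)
        for (int j = 1; j < N; j++) {
            int from_up   = dp[i - 1][j];
            int from_left = dp[i][j - 1];
            dp[i][j] = cost[i][j] + (from_up < from_left ? from_up : from_left);
        }

    printf("Minimum cost: %d\n", dp[N - 1][N - 1]);

    /* Recover the path by walking back from (N-1, N-1) toward (0, 0). */
    int path[2 * N][2], len = 0, i = N - 1, j = N - 1;
    while (i > 0 || j > 0) {
        path[len][0] = i; path[len][1] = j; len++;
        if (i == 0)                               j--;
        else if (j == 0)                          i--;
        else if (dp[i - 1][j] < dp[i][j - 1])     i--;   /* cheaper to have come from above */
        else                                      j--;
    }
    path[len][0] = 0; path[len][1] = 0; len++;

    printf("Path:");
    for (int k = len - 1; k >= 0; k--) printf(" (%d,%d)", path[k][0], path[k][1]);
    printf("\n");
    return 0;
}
```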

Recursive Insertion Sort

Insertion sort can be expressed as a recursive procedure as follows:
In order to sort A[0..n-1], we recursively sort A[0..n-2] and then insert A[n-1] into the sorted array A[0..n-2].
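
A direct translation of that description into C (the function name is illustrative):

```c
#include <stdio.h>

/* Recursively sort a[0..n-2], then insert a[n-1] into the sorted prefix. */
void insertion_sort_rec(int a[], int n) {
    if (n <= 1) return;
    insertion_sort_rec(a, n - 1);
    int key = a[n - 1], i = n - 2;
    while (i >= 0 && a[i] > key) {    /* shift larger elements right */
        a[i + 1] = a[i];
        i--;
    }
    a[i + 1] = key;
}

int main(void) {
    int a[] = {5, 2, 9, 1, 7};
    insertion_sort_rec(a, 5);
    for (int i = 0; i < 5; i++) printf("%d ", a[i]);   /* 1 2 5 7 9 */
    printf("\n");
    return 0;
}
```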

Sorting Algorithms

Bubble Sort:
Time complexity: average case is O(n²). Bubble sort repeatedly goes through the list comparing adjacent pairs of elements and swapping a pair whenever it is out of order.
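
A function-level C sketch (the function name is illustrative):

```c
/* Bubble sort: repeatedly swap adjacent out-of-order pairs. */
void bubble_sort(int a[], int n) {
    for (int pass = 0; pass < n - 1; pass++)
        for (int i = 0; i < n - 1 - pass; i++)
            if (a[i] > a[i + 1]) {
                int tmp = a[i]; a[i] = a[i + 1]; a[i + 1] = tmp;
            }
}
```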

Selection Sort:
Time complexity: O(n²) in the average, best, and worst cases. Selection sort repeatedly finds the minimum element of the unsorted portion and places it at the beginning of that portion, growing a sorted prefix one element at a time.
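
A function-level C sketch (the function name is illustrative):

```c
/* Selection sort: swap the minimum of the unsorted suffix into position i. */
void selection_sort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++)
            if (a[j] < a[min]) min = j;
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
    }
}
```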

Insertion Sort:
Time complexity: average case is O(n²) and best case is O(n). Iterate over the list from first to last and insert each element into its correct position within the already-sorted prefix.
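
The iterative version as a C sketch (the function name is illustrative); the recursive variant is shown earlier in this README:

```c
/* Insertion sort: insert each element into its place within the sorted prefix. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i], j = i - 1;
        while (j >= 0 && a[j] > key) {   /* shift larger elements right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}
```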

Merge Sort:
Time complexity: O(n log n) in all cases. Recursively split the array in half until only single-element (trivially sorted) subarrays remain, then merge the sorted subarrays back together in the proper order.
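
A C sketch (names are illustrative; a temporary buffer is used for the merge step):

```c
#include <string.h>

/* Merge sort: sort each half recursively, then merge the two sorted halves. */
void merge_sort(int a[], int n) {
    if (n < 2) return;
    int mid = n / 2;
    merge_sort(a, mid);
    merge_sort(a + mid, n - mid);

    int tmp[n];                           /* merged output */
    int i = 0, j = mid, k = 0;
    while (i < mid && j < n) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < n)   tmp[k++] = a[j++];
    memcpy(a, tmp, n * sizeof(int));
}
```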

Quick Sort:
Time complexity: average case is O(n log n), worst case O(n²). A pivot element is chosen and the array is partitioned so that everything smaller than the pivot ends up on its left and everything larger on its right; the pivot is then in its final position, and the two partitions are sorted recursively. This is a "divide and conquer" algorithm because it turns the larger list into smaller lists that are easier to sort.
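
A C sketch using the Lomuto partition scheme with the last element as pivot (that particular scheme is an assumption; names are illustrative). Call it as quick_sort(a, 0, n - 1):

```c
/* Quick sort: partition around a pivot, then sort the two sides recursively. */
void quick_sort(int a[], int lo, int hi) {
    if (lo >= hi) return;
    int pivot = a[hi], i = lo;                  /* Lomuto partition, last element as pivot */
    for (int j = lo; j < hi; j++)
        if (a[j] < pivot) {
            int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
        }
    int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;  /* pivot lands at its final position i */
    quick_sort(a, lo, i - 1);
    quick_sort(a, i + 1, hi);
}
```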

Heap Sort:
Time complexity: O(n log n) in all cases. The array is treated as a binary heap (a tree-based data structure); the maximum element is repeatedly moved from the root to the end of the array and the heap property is restored, yielding a sorted array in place.
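
A C sketch of an in-place heap sort built on a max-heap (names are illustrative):

```c
/* Sift a[i] down within a max-heap made of the first `size` elements. */
static void sift_down(int a[], int size, int i) {
    for (;;) {
        int largest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < size && a[l] > a[largest]) largest = l;
        if (r < size && a[r] > a[largest]) largest = r;
        if (largest == i) return;
        int tmp = a[i]; a[i] = a[largest]; a[largest] = tmp;
        i = largest;
    }
}

/* Heap sort: build a max-heap, then repeatedly move the root to the end. */
void heap_sort(int a[], int n) {
    for (int i = n / 2 - 1; i >= 0; i--) sift_down(a, n, i);   /* build the heap */
    for (int end = n - 1; end > 0; end--) {
        int tmp = a[0]; a[0] = a[end]; a[end] = tmp;           /* largest goes last */
        sift_down(a, end, 0);                                  /* re-heapify the rest */
    }
}
```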

Radix Sort:
Time complexity: O(nk), where k is the number of digits. Radix sort processes the keys digit by digit, starting with either the least or the most significant digit, and uses a stable sort (typically counting sort) for each digit.
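
A C sketch of the least-significant-digit variant for non-negative integers (that variant choice is an assumption; names are illustrative):

```c
/* LSD radix sort for non-negative ints: counting-sort by each decimal digit. */
void radix_sort(int a[], int n) {
    int max = a[0];
    for (int i = 1; i < n; i++) if (a[i] > max) max = a[i];

    int out[n];
    for (int exp = 1; max / exp > 0; exp *= 10) {                /* one pass per digit */
        int count[10] = {0};
        for (int i = 0; i < n; i++) count[(a[i] / exp) % 10]++;
        for (int d = 1; d < 10; d++) count[d] += count[d - 1];   /* prefix sums */
        for (int i = n - 1; i >= 0; i--)                         /* stable placement */
            out[--count[(a[i] / exp) % 10]] = a[i];
        for (int i = 0; i < n; i++) a[i] = out[i];
    }
}
```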

Counting Sort:
Time complexity: O(n + k), where k is the range of the values. Counting sort keeps track of how many times each value appears in the input list and uses those counts to write the values back out in sorted order.
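
A C sketch for values known to lie in [0, k) (names are illustrative):

```c
/* Counting sort: count occurrences of each value, then rewrite the array in order. */
void counting_sort(int a[], int n, int k) {
    int count[k];
    for (int v = 0; v < k; v++) count[v] = 0;
    for (int i = 0; i < n; i++) count[a[i]]++;
    for (int v = 0, i = 0; v < k; v++)
        while (count[v]-- > 0) a[i++] = v;
}
```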
