
Commit 9aae18b (2 parents: 208748a + 03fb3db)

Merge pull request #10 from codebygarrysingh/lin-regression-toolkit-py

Added README file for regression toolkit

File tree: 1 file changed, 57 additions, 0 deletions

# Linear Regression Toolkit

**Description:**

This library provides a set of functions for implementing simple linear regression. Linear regression is a widely used method for modeling the relationship between a dependent variable (the target) and one or more independent variables (the features). This library focuses on simple linear regression, which involves a single independent variable and a single dependent variable. It includes functions for making predictions, computing the cost function, and performing gradient descent to optimize the model parameters.
6+
7+
**Table of Contents:**

- [Usage](#usage)
- [Functions](#functions)
- [Example Usage](#example-usage)
- [Notes](#notes)
## Usage

This library contains the following functions:

1. `compute_dep_var(x, w, b)`

This function computes the dependent variable's predicted values from the input feature data `x`, the weight parameter `w`, and the bias parameter `b`. It returns an array of predicted values.
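The README doesn't show the implementation; a minimal sketch of what this function could look like, assuming the standard linear model `y_hat = w * x + b` over NumPy arrays:

```python
import numpy as np

def compute_dep_var(x, w, b):
    """Predict the dependent variable for each sample: y_hat = w * x + b."""
    x = np.asarray(x, dtype=float)
    return w * x + b
```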
2. `compute_cost_fn(x, y, w, b)`

This function calculates the cost function, which quantifies the error in the model's predictions. It uses the sum of squared errors between the predicted and actual dependent variable values.
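A sketch consistent with that description. The `1/(2m)` scaling is an assumption (a common convention that simplifies the gradient); the toolkit may scale the sum of squared errors differently.

```python
import numpy as np

def compute_cost_fn(x, y, w, b):
    """Squared-error cost of predictions w * x + b against targets y.

    Uses the 1/(2m) scaling common in gradient-descent formulations;
    the toolkit's exact convention may differ.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    m = x.shape[0]
    errors = (w * x + b) - y
    return np.sum(errors ** 2) / (2 * m)
```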
3. `compute_deriv_fn(x, y, w, b)`

This function computes the partial derivatives of the cost with respect to the weight and bias parameters, which are used in gradient descent to update the model parameters.
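A sketch of these derivatives, assuming the `1/(2m)`-scaled squared-error cost above, for which the gradients reduce to simple means of the prediction errors:

```python
import numpy as np

def compute_deriv_fn(x, y, w, b):
    """Partial derivatives of the squared-error cost w.r.t. w and b.

    Assumes the 1/(2m)-scaled cost, so dJ/dw = mean(err * x) and
    dJ/db = mean(err), where err = (w * x + b) - y.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    errors = (w * x + b) - y
    dj_dw = np.mean(errors * x)
    dj_db = np.mean(errors)
    return dj_dw, dj_db
```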
4. `compute_gradient_descent(x, y, n, a, w, b)`

This function performs gradient descent to optimize the weight and bias parameters of the linear regression model, where `n` is the number of iterations and `a` is the learning rate. It returns the final optimized weight and bias, as well as a history of cost function values across iterations.
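A minimal sketch matching the signature described above, with the cost and gradient computations inlined for self-containment. The `1/(2m)` cost scaling is again an assumption:

```python
import numpy as np

def compute_gradient_descent(x, y, n, a, w, b):
    """Run `n` iterations of gradient descent with learning rate `a`.

    Starting from weight `w` and bias `b`, returns the optimized weight,
    the optimized bias, and a history of cost values per iteration.
    Uses the 1/(2m)-scaled squared-error cost; conventions may differ.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    m = x.shape[0]
    cost_history = []
    for _ in range(n):
        errors = (w * x + b) - y
        w -= a * np.mean(errors * x)   # dJ/dw
        b -= a * np.mean(errors)       # dJ/db
        cost_history.append(np.sum(((w * x + b) - y) ** 2) / (2 * m))
    return w, b, cost_history
```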
## Example Usage

Here's an example of how you can use these functions to perform linear regression:

```python
import numpy as np

# The toolkit's functions (e.g. compute_gradient_descent) are assumed
# to be defined or imported in the current scope.

# Sample data
x = np.array([1, 2, 3, 4, 5])
y = np.array([2, 4, 5, 4, 5])

# Initial weight and bias
w_initial = 1
b_initial = 0

# Number of iterations and learning rate
iterations = 100
learning_rate = 0.01

# Perform gradient descent
final_w, final_b, cost_history = compute_gradient_descent(x, y, iterations, learning_rate, w_initial, b_initial)

print("Optimized weight:", final_w)
print("Optimized bias:", final_b)
```
