# Linear Algebra
Linear algebra is the study of vectors and linear functions.

- Explanation 1:
  - It is a branch of mathematics that lets you concisely describe coordinates and interactions of planes in higher dimensions and perform operations on them.
- Explanation 2:
  - Linear algebra is a branch of mathematics that deals with vectors (quantities with both magnitude and direction), matrices (rectangular arrays of numbers), and linear transformations (functions that preserve addition and scalar multiplication).
- Explain like I am 5:
  - Imagine you have a bunch of LEGO blocks. Each block is like a number, and you can stack them in different ways. If you line them up in rows and columns, that’s like a matrix. If you push or stretch them in a certain direction, that’s like a transformation. Linear algebra helps us understand how things change when we add, move, or stretch these blocks in a straight and predictable way. It’s like playing with numbers in an organized way.
- It is one of the main building blocks of machine learning.
## Applications in Machine Learning
1. Data Sets and Data Files
   - In ML, we fit the model on a data set
   - This data set is represented as a matrix or a vector
   - e.g.: our model could be a fitness-related model that predicts the quality of sleep
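The data-set-as-matrix idea above can be sketched with NumPy; the fitness features, their values, and the sleep scores below are invented purely for illustration:

```python
import numpy as np

# Hypothetical fitness data set: each row is one night, each column
# is a feature (hours of exercise, cups of coffee, hours of screen time).
X = np.array([
    [1.0, 2.0, 3.5],
    [0.0, 3.0, 5.0],
    [2.0, 1.0, 1.5],
])

# The target (a sleep-quality score per night) is a vector.
y = np.array([8.0, 5.5, 9.0])

print(X.shape)  # (3, 3) -> the data set is a matrix
print(y.shape)  # (3,)   -> the target is a vector
```

Fitting a model then amounts to finding a function that maps rows of `X` to entries of `y`.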
2. Images and Photographs
   - Computer vision application
   - You cannot send a raw image to a model and expect it to understand it
   - Each image is made of pixels, which are colored squares of varying intensities
   - In a black and white image, each pixel is a single intensity value
   - In a colored image, each pixel has three values, one each for red, green, and blue (RGB)
   - All images are stored as matrices
   - Each operation performed on the image (e.g.: cropping, scaling, et cetera) is described using the notation and operations of linear algebra
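A minimal NumPy sketch of the points above (the pixel values are made up): a grayscale image is a matrix of intensities, an RGB image adds a third axis, cropping is matrix slicing, and dimming is scalar multiplication:

```python
import numpy as np

# A tiny 4x4 grayscale "image": one intensity value (0-255) per pixel.
gray = np.array([
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [ 64, 128, 192, 255],
    [  0,  32,  64,  96],
], dtype=np.uint8)

# A colored image has three values per pixel: shape (height, width, 3).
rgb = np.zeros((4, 4, 3), dtype=np.uint8)

# Cropping is just slicing the matrix.
crop = gray[1:3, 1:3]

# Halving the brightness is scalar multiplication of the matrix.
dimmed = (gray * 0.5).astype(np.uint8)
```

Libraries such as OpenCV and Pillow expose images in exactly this array form.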
3. Data Preparation
   - Dimensionality reduction
     - We often come across data made up of thousands of variables, which makes our model extremely complicated
     - This is when dimensionality reduction comes into play
     - Data sets are represented as matrices, and we can then use matrix factorization methods to reduce a matrix into its constituent parts
   - One-hot encoding
     - It is used when working with categorical data
     - Such as class labels for classification problems or categorical input variables
     - It is common to encode categorical variables to make them easier to work with
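One-hot encoding can be sketched in a few lines of NumPy; the class labels here are hypothetical. Each label becomes a row of an identity matrix, so each row contains a single 1:

```python
import numpy as np

# Hypothetical categorical class labels.
labels = ["cat", "dog", "bird", "dog"]

# Map each distinct class to a column index.
classes = sorted(set(labels))            # ['bird', 'cat', 'dog']
index = {c: i for i, c in enumerate(classes)}

# One-hot encode: pick the matching row of the identity matrix.
one_hot = np.eye(len(classes))[[index[l] for l in labels]]
print(one_hot)
```

The result is a matrix with one row per sample and one column per class, which models can consume directly.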
4. Linear Regression
   - Used for predicting numerical values in simple regression problems
   - The most common way of solving linear regression is via least squares optimization, which is solved using matrix factorization methods from linear algebra
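Least squares is a one-liner in NumPy; the toy data below (roughly y = 2x + 1 with small deviations) is invented for illustration. `np.linalg.lstsq` solves the problem internally via matrix factorization:

```python
import numpy as np

# Toy data: y is approximately 2*x + 1.
# The second column of ones lets the model learn an intercept.
X = np.array([[1.0, 1.0], [2.0, 1.0], [3.0, 1.0], [4.0, 1.0]])
y = np.array([3.1, 4.9, 7.2, 9.0])

# Solve the least squares problem min ||X w - y||^2.
coef, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
slope, intercept = coef
print(slope, intercept)
```

For these points the fit recovers a slope of 2.0 and an intercept of 1.05.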
5. Regularization
   - Overfitting is one of the greatest obstacles in ML
     - It occurs when a model fits the available data so closely that it does not perform well on any new or outside data
   - Regularization is a concept from linear algebra that is used to prevent the model from overfitting
   - Simple models are models that have smaller coefficient values
   - Regularization is a technique often used to encourage a model to minimize the size of its coefficients while it is being fit on data
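One concrete form of this idea, not named above, is ridge (L2) regularization, which adds a penalty on the squared coefficient norm. Its closed-form solution is plain linear algebra; the data here is made up:

```python
import numpy as np

def ridge(X, y, alpha):
    """Ridge regression in closed form: w = (X^T X + alpha*I)^-1 X^T y.
    alpha = 0 recovers ordinary least squares; larger alpha shrinks
    the coefficients toward zero, encouraging a simpler model."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

# Invented toy data.
X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.0], [4.0, 3.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])

w_plain = ridge(X, y, alpha=0.0)    # ordinary least squares
w_shrunk = ridge(X, y, alpha=10.0)  # penalized: smaller coefficients
```

Comparing the two solutions shows the penalty shrinking the coefficient vector, which is exactly the "smaller coefficients" notion in the notes above.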
6. Principal Component Analysis (PCA)
   - Modeling data with many features is challenging, and it is hard to know which features of the data are relevant and which are not
   - One of the methods for automatically reducing the number of columns of a data set is principal component analysis
   - This method is used in ML to create projections of high-dimensional data, both for visualization and for training models
   - The core of the PCA method is a matrix factorization method
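A minimal PCA sketch using the singular value decomposition, one of the matrix factorizations commonly used at its core. The synthetic data has 5 columns but is built so that its variance lies along only 2 directions:

```python
import numpy as np

rng = np.random.default_rng(0)
# 100 samples, 5 features, but the data secretly lives in 2 dimensions.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))

# PCA: center the columns, then factorize with the SVD.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top 2 principal components: 5 columns become 2.
X2 = Xc @ Vt[:2].T
print(X2.shape)
```

The third singular value is essentially zero, confirming that two components capture all the structure.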
7. Latent Semantic Analysis (LSA)
   - It is a form of data preparation used in natural language processing, a subfield of ML for working with text data
   - In this case, documents are usually represented as large matrices of word occurrences
   - We can then apply matrix factorization methods to them in order to easily compare, query, and use them as the basis for an ML model
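A tiny LSA sketch: the word-occurrence matrix below (vocabulary and counts invented) is factorized with a truncated SVD, giving each document a low-dimensional vector that makes similarity comparisons easy:

```python
import numpy as np

# Term-document matrix: rows = documents, columns = word counts
# for the invented vocabulary ["linear", "algebra", "cat", "dog"].
A = np.array([
    [3.0, 2.0, 0.0, 0.0],   # a document about linear algebra
    [2.0, 3.0, 0.0, 0.0],   # another document about linear algebra
    [0.0, 0.0, 4.0, 1.0],   # a document about pets
])

# Truncated SVD: keep the top k "topics".
U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
docs_2d = U[:, :k] * S[:k]   # each row is a document's topic vector

def cos(a, b):
    """Cosine similarity between two document vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
```

In the reduced space, the two linear-algebra documents end up far more similar to each other than to the pets document.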
8. Recommender Systems
   - They are at work each time you buy something on Amazon or a similar shop and get recommendations of products based on your previous purchases
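One common linear-algebra technique behind such systems, though not spelled out above, is low-rank factorization of a user-by-product rating matrix; the ratings here are entirely invented. The low-rank reconstruction fills in scores for products a user has not rated yet:

```python
import numpy as np

# Hypothetical user x product rating matrix (0 = not yet rated).
R = np.array([
    [5.0, 4.0, 0.0, 1.0],
    [4.0, 5.0, 1.0, 0.0],
    [1.0, 0.0, 5.0, 4.0],
])

# Low-rank approximation via truncated SVD.
U, S, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
R_hat = U[:, :k] @ np.diag(S[:k]) @ Vt[:k]   # rank-2 reconstruction

# Entries of R_hat at previously-zero positions act as predicted
# preference scores, from which recommendations can be ranked.
```

Production systems use more elaborate factorizations (e.g. alternating least squares), but the matrix view is the same.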
9. Deep Learning (DL)
   - It is a specific subfield of ML
   - Scaled up to multiple dimensions, DL methods work with vectors, matrices, and tensors of inputs and coefficients
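The vector/matrix/tensor progression above can be sketched with NumPy shapes (all sizes here are arbitrary examples), and a dense layer reduces to a matrix multiply plus a bias vector:

```python
import numpy as np

# The same data idea at increasing dimensions:
v = np.zeros(8)                  # a vector: one feature row
M = np.zeros((32, 8))            # a matrix: a batch of 32 rows
T = np.zeros((32, 28, 28, 3))    # a tensor: a batch of 32 RGB images

# A dense (fully connected) layer is a matrix multiply plus a bias.
W = np.ones((8, 4))   # weight matrix: 8 inputs -> 4 outputs
b = np.zeros(4)       # bias vector
out = M @ W + b
print(out.shape)      # batch of 32 rows, 4 outputs each
```

Frameworks like TensorFlow and PyTorch generalize exactly these array operations, which is why "tensor" appears in their names.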
## Vectors