A Machine Learning Repository, and its Algorithms
"Life is a Set of random variables of opportunities
"
- A
Random Variable
is:
- Neither a
Random
: always follow adistribution
- Nor a
Variable
: it is a function ofProbability Density
(i.e. frequency)
The Random Variable
is Quoted from Professor Krishna Jaganaathan
, IIT Madras
(from a Probability Foundation for Electrical Engineers
Course )
The objective is to utilize the most of it, effectively, ~90% of the time:
1. Focused: the best learning is the one where you push yourself to face off the poly-shaped spaghetti-monster of complexity (like this one)
2. Relaxed: keep space in mind, and call for relaxation at the early signs:
   - Creativity starts lacking, and new ideas become transparent
   - Novelty vanishes into thin air, and the cloud of routine takes over (its shadows appear everywhere you go)
- Use an RNG (Random Number Generator) (StableRNG)
- Divide & Conquer algorithm (see repository: Cause & Effect)
- Outline a Widrow-Hoff algorithm
- Outline a Green's function algorithm
- How to build a neural-network model (learning algorithm, error function erf, optimization) - which differentiation module to pick? 🤔
- How could we implement an optimization function?
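On the RNG item: the repository's own code is in Julia, where StableRNG provides seeded, version-stable random streams. As a language-agnostic sketch of the same idea (the seed value 42 is an arbitrary choice, not from the repo), here is seeded, reproducible random generation in Python:

```python
import random

# A seeded RNG: the same seed always yields the same sequence, which makes
# experiments reproducible. (Julia's StableRNG serves the same purpose, with
# the extra guarantee that streams stay stable across Julia versions.)
rng_a = random.Random(42)
rng_b = random.Random(42)

draws_a = [rng_a.random() for _ in range(5)]
draws_b = [rng_b.random() for _ in range(5)]

assert draws_a == draws_b  # identical sequences from identical seeds
```

This is why a fixed-seed RNG matters for a machine learning repo: anyone re-running an experiment sees exactly the same random draws.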
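On divide & conquer: the pattern is to split the problem, solve the halves recursively, and merge the partial answers. A classic textbook sketch of the pattern is merge sort (this is an illustration of the pattern only, not the specific code in the Cause & Effect repository):

```python
def merge_sort(xs):
    """Divide & conquer: split, sort each half recursively, then merge."""
    if len(xs) <= 1:
        return list(xs)           # base case: already sorted
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])   # conquer the left half
    right = merge_sort(xs[mid:])  # conquer the right half
    merged, i, j = [], 0, 0       # merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```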
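On the Widrow-Hoff outline: the rule (also called least mean squares, LMS) updates a linear model's weights sample-by-sample as w ← w + η(d − w·x)x, where d is the desired output. A minimal Python sketch (the learning rate, epoch count, and toy data below are illustrative assumptions, not taken from the repository):

```python
def widrow_hoff(samples, targets, eta=0.05, epochs=200):
    """Widrow-Hoff (LMS) rule: w <- w + eta * (d - w.x) * x, per sample."""
    w = [0.0] * len(samples[0])
    for _ in range(epochs):
        for x, d in zip(samples, targets):
            y = sum(wi * xi for wi, xi in zip(w, x))     # linear prediction w.x
            err = d - y                                  # instantaneous error
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
    return w

# Toy data generated by d = 1 + 2*x, with the bias folded in as a constant
# first input of 1, so the learned weights should approach (1, 2).
samples = [(1.0, 0.0), (1.0, 1.0), (1.0, 2.0), (1.0, 3.0)]
targets = [1.0, 3.0, 5.0, 7.0]
w = widrow_hoff(samples, targets)
```

Note the stability condition: η must be small relative to the input power (roughly η < 2/λmax of the input correlation matrix), otherwise the updates diverge.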
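On the Green's function outline: the idea can be illustrated on the simplest boundary-value problem. For −u″ = f on [0, 1] with u(0) = u(1) = 0, the Green's function is G(x, s) = x(1 − s) for x ≤ s and s(1 − x) for x ≥ s, and the solution is u(x) = ∫₀¹ G(x, s) f(s) ds. A numerical sketch in Python (the grid size is an arbitrary choice, and this toy problem is my own illustration, not the repository's):

```python
def greens_solve(f, n=400):
    """Solve -u'' = f on [0,1] with u(0)=u(1)=0 via u(x) = ∫ G(x,s) f(s) ds."""
    h = 1.0 / n
    xs = [i * h for i in range(n + 1)]

    def G(x, s):
        # Green's function of -d²/dx² with homogeneous Dirichlet conditions
        return x * (1.0 - s) if x <= s else s * (1.0 - x)

    # midpoint-rule quadrature in s, evaluated at every grid point x
    u = []
    for x in xs:
        s_mid = ((j + 0.5) * h for j in range(n))
        u.append(h * sum(G(x, s) * f(s) for s in s_mid))
    return xs, u

# For f ≡ 1 the exact solution is u(x) = x(1 - x)/2, e.g. u(0.5) = 0.125.
xs, u = greens_solve(lambda s: 1.0)
```

The "algorithm" part is that once G is known, solving for any new right-hand side f is just a quadrature, with no new linear solve.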
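On "how could we implement an optimization function": one bare-bones answer is gradient descent driven by a central-difference numerical gradient, which also ties into the "which differentiation module to pick" question (an automatic-differentiation module would replace `numerical_grad`). The step size, iteration count, and test function below are illustrative assumptions:

```python
def numerical_grad(f, x, h=1e-6):
    """Central-difference gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def gradient_descent(f, x0, lr=0.1, steps=500):
    """Plain gradient descent: x <- x - lr * grad f(x)."""
    x = list(x0)
    for _ in range(steps):
        g = numerical_grad(f, x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Minimize the quadratic bowl f(x, y) = (x - 3)² + (y + 1)²; minimum at (3, -1).
xmin = gradient_descent(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
```

Finite differences are the simplest choice but trade accuracy (cancellation error in `f(xp) - f(xm)`) for simplicity; that trade-off is exactly why the choice of differentiation module matters.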
Prof. Srinivasa Sengupta, Department of Electronics and Electrical Communication Engineering, IIT Kharagpur.
To the heroes behind the scenes, a list of human beings and tutors without whom I couldn't have done this project, in alphabetical order:
- Professor Steven G. Johnson @stevengj: for providing the MIT OCW Introduction to Numerical Methods (18.335J / 6.337J) Julia course (a usefully concise course) on GitHub
- Tamas K. Papp [PhD] @tpapp: useful code help on the Julia Discourse
- Professor Tim Holy (Neuroscience) @timholy: help with Arrays (as featured in the JuliaCon 2016 keynote, an overview of Arrays)
  - Opens up the possibility for a statistical repo 💡
  - For more info, please visit the Discussion
- The Ziggurat paper, by Christopher D. McFarland: A Modified Ziggurat Algorithm for Generating Exponentially and Normally Distributed Pseudorandom Numbers (including the author's view on the tail-generation issue).
  A good point of this paper: it does not use rejection regions (which makes it interesting for application) (please review p. 3, if you will):
  "We do not improve upon these approaches here and, instead, reuse previous techniques ... Overall, the ZA is ideal for distributions with infrequent sampling from the tail, i.e. not heavy-tailed distributions."
  The paper can be found here: A modified ziggurat algorithm for generating exponentially- and normally-distributed pseudorandom numbers
A great neural-network course, easily explained by the humble Prof. S. Sengupta (S: Srinivasa) [R.I.P.], by the NPTEL of India, right here.
This project is a sown seed in the ground 🌱
If you do not mind, have free time, and can help and give a hand 🤝, please step in: your help would be much appreciated. Thank you 🙏
The author won't be held responsible for any immature actions and/or any signs of code abuse, at all costs.