A pure Java implementation of the AdaGrad backpropagation algorithm for dense, layered neural networks with sigmoid activations.
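For reference, here is a minimal sketch of the per-weight AdaGrad update applied during backpropagation. The class and names (`AdaGradUpdater`, `gradHistory`, `learningRate`, `epsilon`) are illustrative, not the identifiers used in this repo:

```java
public final class AdaGradUpdater {
    private final double learningRate;
    private final double epsilon;
    private final double[] gradHistory; // running sum of squared gradients, one slot per weight

    public AdaGradUpdater(int numWeights, double learningRate, double epsilon) {
        this.learningRate = learningRate;
        this.epsilon = epsilon;
        this.gradHistory = new double[numWeights];
    }

    /** One in-place AdaGrad step: w_i -= lr * g_i / (sqrt(sum of g_i^2) + eps). */
    public void step(double[] weights, double[] gradients) {
        for (int i = 0; i < weights.length; i++) {
            gradHistory[i] += gradients[i] * gradients[i];
            weights[i] -= learningRate * gradients[i]
                    / (Math.sqrt(gradHistory[i]) + epsilon);
        }
    }

    public static void main(String[] args) {
        // Toy demo: minimize f(w) = w^2, whose gradient is 2w.
        double[] w = {5.0};
        AdaGradUpdater opt = new AdaGradUpdater(1, 0.5, 1e-8);
        for (int t = 0; t < 100; t++) {
            double[] g = {2 * w[0]};
            opt.step(w, g);
        }
        System.out.println("w after 100 steps: " + w[0]); // approaches 0
    }
}
```

The key property, which distinguishes AdaGrad from plain gradient descent, is that each weight keeps its own accumulated squared-gradient history, so frequently updated weights get progressively smaller steps.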
This was the first project that got me hooked on studying deep learning and machine learning on my own.
I built it for my grade 7 science fair, and I'm posting it now for its potential educational value.
The crux of the task is for the neural network to guess the next number in an arbitrarily complex arithmetic series. My implementation beat human performance by a margin of several seconds, demonstrating how AI can help humans solve complex tasks more efficiently.
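As an illustration, this is one hypothetical way to frame that task as supervised learning (a sketch, not necessarily the exact data pipeline of this repo): slide a fixed-size window over the series and train the network to predict the element that follows each window.

```java
import java.util.Arrays;

public class SeriesWindows {
    public static void main(String[] args) {
        // Example arithmetic series with common difference 4.
        double[] series = {3, 7, 11, 15, 19, 23};
        int window = 3; // number of preceding terms the network sees

        for (int i = 0; i + window < series.length; i++) {
            double[] input = Arrays.copyOfRange(series, i, i + window);
            double target = series[i + window];
            System.out.println(Arrays.toString(input) + " -> " + target);
            // network.train(input, target); // hypothetical training call
        }
    }
}
```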
This project was completed in 2015, right around the time TensorFlow was released, shortly after I presented my project at the school science fair, and when the Adam paper started popping up in my Google feed. I remember laughing bitterly when I read the news article that Google had released TensorFlow, which somehow made my planned next project, an easy-to-use deep learning API, obsolete.
This GitHub repo is an Eclipse Java project. You can see the algorithm running in my recorded demonstration here: Link
I will post the exact steps to run the algorithm.