Add logo
rounakdatta committed Apr 13, 2024
1 parent e546385 commit 4c9c295
Showing 2 changed files with 4 additions and 0 deletions.
4 changes: 4 additions & 0 deletions README.md
@@ -1,4 +1,7 @@
## smolgrad.ml

![smolgrad.ml-logo](smolgrad.ml_logo.png)

Smolgrad is a very simple, educational project around the constructs of a neural network. It implements algorithmic (or automatic) differentiation for some common binary operators, can dynamically build a computation graph out of an algebraic expression, and then applies backpropagation to that graph. It also abstracts the concept of a neuron, stacks those neurons into layers, and propagates input through the network. It is written in OCaml, for the fun (and pun) of it.
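As an illustration of the idea described above, here is a minimal reverse-mode autodiff sketch in OCaml. This is not smolgrad's actual API; the type and function names (`v`, `value`, `add`, `mul`, `backprop`) are made up for the example.

```ocaml
(* A differentiable scalar value: a node in the computation graph. *)
type v = {
  data : float;
  mutable grad : float;
  mutable backward : unit -> unit;  (* pushes this node's grad to its parents *)
  parents : v list;
}

(* A leaf node with no parents. *)
let value data = { data; grad = 0.0; backward = (fun () -> ()); parents = [] }

let add a b =
  let out = { data = a.data +. b.data; grad = 0.0;
              backward = (fun () -> ()); parents = [ a; b ] } in
  out.backward <- (fun () ->
    a.grad <- a.grad +. out.grad;
    b.grad <- b.grad +. out.grad);
  out

let mul a b =
  let out = { data = a.data *. b.data; grad = 0.0;
              backward = (fun () -> ()); parents = [ a; b ] } in
  out.backward <- (fun () ->
    a.grad <- a.grad +. b.data *. out.grad;
    b.grad <- b.grad +. a.data *. out.grad);
  out

(* Topologically order the DAG (physical equality via memq, since the
   records contain closures), then propagate gradients from the root. *)
let backprop root =
  let visited = ref [] in
  let order = ref [] in
  let rec visit node =
    if not (List.memq node !visited) then begin
      visited := node :: !visited;
      List.iter visit node.parents;
      order := node :: !order
    end
  in
  visit root;
  root.grad <- 1.0;
  List.iter (fun n -> n.backward ()) !order
```

With this sketch, `z = x * y + x` at `x = 3, y = 4` gives `dz/dx = y + 1 = 5` and `dz/dy = x = 3` after calling `backprop z`.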

Needless to say, it is very rudimentary at the moment and doesn't yet implement training of the weights and biases. However, that is planned, along with exploration of some other architectures! Of course, it draws inspiration from legendary projects like [micrograd](https://github.com/karpathy/micrograd), [tinygrad](https://github.com/tinygrad/tinygrad) and such.
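The neuron/layer abstraction mentioned earlier can be sketched as below. For brevity this operates on plain floats rather than differentiable values, and all names (`neuron`, `forward_neuron`, `forward_layer`, `forward_network`) are hypothetical, not smolgrad's API.

```ocaml
(* A neuron: a weight per input, plus a bias. *)
type neuron = { weights : float list; bias : float }

(* Weighted sum of inputs plus bias, squashed through tanh. *)
let forward_neuron n inputs =
  let weighted =
    List.fold_left2 (fun acc w x -> acc +. (w *. x)) n.bias n.weights inputs
  in
  tanh weighted

(* A layer is a stack of neurons, each seeing the same inputs. *)
type layer = neuron list

let forward_layer l inputs = List.map (fun n -> forward_neuron n inputs) l

(* A network is a list of layers; each layer's outputs feed the next. *)
let forward_network layers inputs =
  List.fold_left (fun acc l -> forward_layer l acc) inputs layers
```

Swapping `float` for a differentiable value type is what lets backpropagation flow through the whole network.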
@@ -14,3 +17,4 @@ This is built as a library, but not yet published. To experiment, you can tweak
- [ ] Implement training.
- [ ] Use a plotting library like [oplot](https://github.com/sanette/oplot) to visualize training and output for simple classification scenarios.
- [ ] Use a graph library like [ocamlgraph](https://anwarmamat.github.io/ocaml/ocamlgraph) to visualize the DAG generated out of variable operations.
- [ ] Explore inputs beyond scalar values.
Binary file added smolgrad.ml_logo.png
