
init release #1

Merged
merged 1 commit into from
Feb 19, 2024
11 changes: 11 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,11 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates

version: 2
updates:
  - package-ecosystem: "cargo" # See documentation for possible values
    directory: "/" # Location of package manifests
    schedule:
      interval: "weekly"
22 changes: 22 additions & 0 deletions .github/workflows/rust.yml
@@ -0,0 +1,22 @@
name: Rust

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

env:
  CARGO_TERM_COLOR: always

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3
      - name: Build
        run: cargo build --verbose
      - name: Run tests
        run: cargo test --verbose
25 changes: 25 additions & 0 deletions Cargo.toml
@@ -0,0 +1,25 @@
[package]
name = "tinygrad"
version = "0.1.0"
edition = "2021"
description = "You like pytorch? You like micrograd? You love tinygrad! ❤️"
license = "MIT"
keywords = ["pytorch", "machine-learning", "deep-learning", "tinygrad"]
categories = ["science"]
repository = "https://github.com/wiseaidev/tinygrad"
documentation = "https://docs.rs/tinygrad"
authors = ["Mahmoud Harmouch <[email protected]>"]

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
ndarray = "0.15.6"

[lib]
crate-type = ["cdylib"]

[profile.release]
codegen-units = 1
opt-level = "z"
lto = "thin"
strip = "symbols"
118 changes: 116 additions & 2 deletions README.md
@@ -1,2 +1,116 @@
# tinygrad
You like pytorch? You like micrograd? You love tinygrad! ❤️
# ✨️ tinygrad

[![Crates.io](https://img.shields.io/crates/v/tinygrad.svg)](https://crates.io/crates/tinygrad)
[![docs](https://docs.rs/tinygrad/badge.svg)](https://docs.rs/tinygrad/)
[![License](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)

A Rust crate for building and training neural networks. `tinygrad` provides a simple interface for defining tensors, performing forward and backward passes, and implementing basic operations such as dot products and summation.

## 🚀 Quick Start

Get started with the `tinygrad` library by following these simple steps:

1. Install the `tinygrad` crate by adding the following line to your `Cargo.toml` file:

```toml
[dependencies]
tinygrad = "0.1.0"
```

2. Use the `Tensor` type and the `TensorTrait` trait to create and work with tensors:

```rust
use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait};

fn main() {
    // Create a tensor
    let value = array![1.0, 2.0, 3.0];
    let tensor = Tensor::new(value);

    // Perform forward and backward passes
    let mut ctx = Context::new();
    let result = tensor.forward(&mut ctx, vec![tensor.get_value()]);
    tensor.backward(&mut ctx, array![1.0, 1.0, 1.0].view());
    println!("Forward result: {:?}", result);
}
```

3. Implement custom operations by defining structs that implement the `ForwardBackward` trait:

```rust
use ndarray::ArrayView1;
use tinygrad::{ForwardBackward, Context};

// Example operation: Dot product
struct Dot;

impl ForwardBackward for Dot {
    fn forward(&self, _ctx: &mut Context, inputs: Vec<ArrayView1<f64>>) -> f64 {
        let input = &inputs[0];
        let weight = &inputs[1];
        input.dot(weight)
    }

    fn backward(&self, _ctx: &mut Context, _grad_output: ArrayView1<f64>) {
        // Implement backward pass
        // ...
    }
}
```
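The elided `backward` body above has to produce the gradients of the dot product. As a standalone sketch in plain std Rust (deliberately not using the tinygrad `Context` API, since the stub leaves it unspecified how the forward inputs are saved): for `y = x · w`, the partials are `∂y/∂xᵢ = wᵢ` and `∂y/∂wᵢ = xᵢ`, each scaled by the upstream gradient `g`.

```rust
// Standalone illustration of the dot-product backward pass.
// `g` is the upstream scalar gradient dL/dy flowing into the op.
fn dot_backward(x: &[f64], w: &[f64], g: f64) -> (Vec<f64>, Vec<f64>) {
    // dL/dx_i = g * w_i
    let grad_x: Vec<f64> = w.iter().map(|wi| g * wi).collect();
    // dL/dw_i = g * x_i
    let grad_w: Vec<f64> = x.iter().map(|xi| g * xi).collect();
    (grad_x, grad_w)
}

fn main() {
    let (gx, gw) = dot_backward(&[1.0, 2.0, 3.0], &[4.0, 5.0, 6.0], 1.0);
    println!("grad_x = {:?}", gx); // [4.0, 5.0, 6.0]
    println!("grad_w = {:?}", gw); // [1.0, 2.0, 3.0]
}
```

A real implementation would fetch `x` and `w` from whatever the forward pass stashed in the context rather than taking them as arguments.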

## 🔧 Usage Example

```rust
use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait, ForwardBackward};

// `Dot` is the custom operation defined in step 3 above.

fn main() {
    let input = array![1.0, 2.0, 3.0];
    let weight = array![4.0, 5.0, 6.0];

    let input_tensor = Box::new(Tensor::new(input));
    let weight_tensor = Box::new(Tensor::new(weight));

    let dot_fn = Dot;
    let mut ctx = Context::new();

    let inputs = vec![
        input_tensor.get_value(),
        weight_tensor.get_value(),
    ];
    let output = dot_fn.forward(&mut ctx, inputs);

    println!("Dot product: {:?}", output);

    let grad_output = array![1.0, 1.0, 1.0];
    dot_fn.backward(&mut ctx, grad_output.view());

    let grad_input = input_tensor.grad.clone();
    let grad_weight = weight_tensor.grad.clone();

    println!("Gradient for input: {:?}", grad_input);
    println!("Gradient for weight: {:?}", grad_weight);
}
```
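As a quick sanity check independent of the tinygrad API, the dot product the example should print can be verified with plain std Rust: `1·4 + 2·5 + 3·6 = 32`.

```rust
fn main() {
    let input = [1.0, 2.0, 3.0];
    let weight = [4.0, 5.0, 6.0];

    // Elementwise multiply and sum: 1*4 + 2*5 + 3*6 = 32
    let dot: f64 = input.iter().zip(weight.iter()).map(|(a, b)| a * b).sum();
    println!("Dot product: {dot}"); // prints 32
}
```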

# 🧪 Testing

Run tests for the `tinygrad` crate using:

```bash
cargo test
```

## 🌐 GitHub Repository

You can access the source code for the `tinygrad` crate on [GitHub](https://github.com/wiseaidev/tinygrad).

## 🤝 Contributing

Contributions and feedback are welcome! If you'd like to contribute, report an issue, or suggest an enhancement, please engage with the project on [GitHub](https://github.com/wiseaidev/tinygrad). Your contributions help improve this crate for the community.

## 📘 Documentation

Full documentation for `tinygrad` is available on [docs.rs](https://docs.rs/tinygrad/).

## 📄 License

This project is licensed under the [MIT License](LICENSE).