MIND

[Main Image]

Welcome to MIND, a repository designed to help you explore and understand the core components of Large Language Models (LLMs). Here, you will find diverse materials, including key research papers, architectures, in-depth conceptual explanations, and implementations.

For additional notes and resources, visit: www.nonhuman.site.


🚀 Roadmap

Below is the roadmap for this repository. Each topic is structured to guide your learning and includes a checklist to track progress:

1. Architecture

  • 1.1 Attention Is All You Need (a minimal code sketch follows this roadmap)
  • 1.2 GPT-2

2. Training

  • 2.1 Data (Tokenizer, types of datasets used)
  • 2.2 LoRA
  • 2.3 RLHF
  • 2.4 DPO
  • 2.5 Scaling Laws

3. Validation

  • 3.1 Benchmarks

4. New Architectures

  • 4.1 MoE (Mixture of Experts)
  • 4.2 VLM (Vision-Language Models)
  • 4.3 Speech-to-Speech

5. Roadmap to O(1) (Test Time Compute)

  • 5.1 Optimization Techniques
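
As a taste of the kind of implementation this repository aims for, here is a minimal sketch of scaled dot-product attention, the core operation behind topic 1.1 (Attention Is All You Need). It is illustrative only and is not the repository's implementation; the function name, tensor shapes, and the use of PyTorch are assumptions made for this example.

# Minimal scaled dot-product attention sketch (illustrative, not the repository's code).
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: tensors of shape (batch, heads, seq_len, d_k) -- assumed shapes for this example.
    d_k = q.size(-1)
    # Query-key similarity scores, scaled by sqrt(d_k) to keep the softmax well-behaved.
    scores = q @ k.transpose(-2, -1) / d_k**0.5
    if mask is not None:
        # Positions where mask is False do not receive attention.
        scores = scores.masked_fill(~mask, float("-inf"))
    weights = F.softmax(scores, dim=-1)  # attention distribution over key positions
    return weights @ v                   # weighted sum of value vectors

# Tiny usage example with random tensors.
q = k = v = torch.randn(1, 2, 4, 8)  # (batch=1, heads=2, seq_len=4, d_k=8)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 2, 4, 8])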

🛠 Installation

To get started, follow these steps:

# Clone the repository
git clone https://github.com/NONHUMAN-SITE/MIND

# Navigate to the project folder
cd MIND

# Create a virtual environment (include Python so that pip installs into this environment)
conda create --name mind python

# Activate the environment
conda activate mind

# Install the required dependencies
pip install -r requirements.txt

✅ Updates

Current Progress

We are currently working on:

  • 1.1 Attention Is All You Need

Completed Tasks

  • Initial setup and structure

Thank you for your interest in MIND! Feel free to contribute and share your insights as we continue to explore the fascinating world of Large Language Models.
