Mixture of Agents using Groq with Godot (not a wrapper; can export to Web)
This is not based on LangChain; however, I think the behavior is almost the same.
https://huggingface.co/spaces/Akjava/llm-moa
I used Godot 4.3 rc1 for my web export, but the project was developed under 4.2.2.
Python version (not my project): https://github.com/skapadia3214/groq-moa
The paper seems to say to reference all previous versions (cycle 1 through the last), but that would consume too many tokens.
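As a rough illustration, here is a minimal sketch of the two aggregation strategies; the helper and its prompt wording are hypothetical, not this project's actual code:

```gdscript
# Minimal sketch (hypothetical helper) of the two aggregation strategies.
# `cycles` is an Array of cycles, each cycle an Array of agent response Strings.
func build_aggregator_prompt(cycles: Array, include_all_cycles: bool) -> String:
	var prompt := "You have been provided with responses from several models.\n"
	# Paper-style: feed every cycle's responses; cheaper: only the last cycle.
	var selected: Array = cycles if include_all_cycles else [cycles[-1]]
	for cycle in selected:
		for response in cycle:
			prompt += "---\n%s\n" % response
	return prompt
```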
All the code is MIT licensed.
The prompts are Apache 2.0. I'm not sure about prompt licensing, but I made my prompts based on the groq-moa version; that's why they are released under Apache 2.0.
I recommend using Llama 3.1 because of its token limits.
Responses sometimes broke with other reference models, so I stopped using references in the code.
I feel bigger LLMs are needed: the main (aggregator) model needs around 405B, and the agent models need around 70B.
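For reference, a sketch of what that model setup could look like; the Groq model IDs below are my assumptions from the Llama 3.1 lineup, so check the Groq console for the current names:

```gdscript
# Assumed Groq model IDs for the Llama 3.1 lineup; verify against the Groq console.
const MAIN_MODEL := "llama-3.1-405b-reasoning"  # aggregator, ~405B (assumed ID)
const AGENT_MODELS := [
	"llama-3.1-70b-versatile",  # agent, ~70B (assumed ID)
	"llama-3.1-8b-instant",  # lighter agent option (assumed ID)
]
```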
I'll support Ollama, Gemini, and ChatGPT before next month.
For characters, I'll add more variant images or remove them depending on the response.
This project implements the Mixture-of-Agents architecture proposed in the following paper:
@article{wang2024mixture,
  title={Mixture-of-Agents Enhances Large Language Model Capabilities},
  author={Wang, Junlin and Wang, Jue and Athiwaratkun, Ben and Zhang, Ce and Zou, James},
  journal={arXiv preprint arXiv:2406.04692},
  year={2024}
}
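To make the flow concrete, here is a minimal sketch of one Mixture-of-Agents run, under these assumptions: a hypothetical `query_model(model, prompt)` coroutine that calls the Groq chat completion endpoint and returns the reply text, and the latest-cycle-only referencing described above:

```gdscript
# Minimal sketch of the Mixture-of-Agents loop from the paper.
# `query_model(model, prompt)` is a hypothetical coroutine that calls the
# Groq chat completion endpoint and returns the reply text.
func run_moa(user_prompt: String, agent_models: Array, main_model: String, num_cycles: int) -> String:
	var last_responses: Array = []
	for i in range(num_cycles):
		var responses: Array = []
		for model in agent_models:
			# Each agent sees the user prompt plus the previous cycle's answers
			# (latest cycle only, to keep token usage down).
			var prompt := user_prompt
			if not last_responses.is_empty():
				prompt += "\n\nPrevious responses:\n" + "\n---\n".join(last_responses)
			responses.append(await query_model(model, prompt))
		last_responses = responses
	# The main (aggregator) model synthesizes the final answer from the last cycle.
	var final_prompt := user_prompt + "\n\nCandidate responses:\n" + "\n---\n".join(last_responses)
	return await query_model(main_model, final_prompt)
```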
For more information about the Mixture-of-Agents concept, please refer to the original research paper and the Together AI blog post.