
Add LocalAI #91

Merged
kelvins merged 2 commits into kelvins:main on Nov 29, 2023
Conversation

@mudler (Contributor) commented on Nov 28, 2023

What is this tool for?

LocalAI is the free, open-source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that is compatible with the OpenAI API specifications for inferencing. It allows you to run LLMs, generate images, audio, and more, locally or on-prem with consumer-grade hardware, supporting multiple model architectures.
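
Because the API is OpenAI-compatible, existing OpenAI client code can usually be pointed at a LocalAI instance unchanged. Below is a minimal sketch using the official `openai` Python client (v1+); the base URL, port, and model name are placeholders for whatever a particular local deployment exposes, not values fixed by this PR.

```python
from openai import OpenAI

# Point the standard OpenAI client at a local LocalAI instance.
# The host, port, and model name here are assumptions for illustration;
# use whatever your LocalAI deployment is actually configured with.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="my-local-model",  # placeholder: a model alias configured in LocalAI
    messages=[{"role": "user", "content": "Summarize what LocalAI does in one sentence."}],
)
print(response.choices[0].message.content)
```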

What's the difference between this tool and similar ones?

  • LocalAI's approach encompasses multiple backends to run inference, such as llama.cpp or vLLM, to name a few
  • It is multi-model (see the sketch after this list)
  • It is container- and Kubernetes-friendly (Helm charts and local deployment)
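
As a quick illustration of the multi-model point, the same OpenAI-compatible endpoint also serves a model listing. This is again a sketch assuming a LocalAI instance at a placeholder localhost address; which IDs come back depends entirely on the models and backends configured locally.

```python
from openai import OpenAI

# Same assumed local endpoint as the sketch above.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

# List whatever models this LocalAI instance has been configured to serve.
for model in client.models.list().data:
    print(model.id)
```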

Anyone who agrees with this pull request can submit an approving review.

README.md (outdated review suggestion, resolved)
Co-authored-by: Kelvin S. do Prado <[email protected]>
@kelvins merged commit eb001d5 into kelvins:main on Nov 29, 2023
2 checks passed
2 participants