Large Language Model API interface. This is a simple API interface for large language models served by Ollama or Anthropic. The module supports:
- Maintaining a session of messages
- Tool calling support
- Streaming responses
The module includes a command-line tool which can be used to interact with the API. For example:

```bash
# Display help
docker run ghcr.io/mutablelogic/go-llm:latest --help

# Interact with Claude to retrieve news headlines, assuming
# you have API keys for Anthropic and NewsAPI
docker run \
  --interactive -e ANTHROPIC_API_KEY -e NEWSAPI_KEY \
  ghcr.io/mutablelogic/go-llm:latest \
  chat claude-3-5-haiku-20241022
```
See the documentation for integration into your own Go programs. To create an Ollama agent:

```go
import (
	"github.com/mutablelogic/go-llm/pkg/ollama"
)

func main() {
	// Create a new agent
	agent, err := ollama.New("https://ollama.com/api/v1/")
	if err != nil {
		panic(err)
	}
	// ...
}
```
To create an Anthropic agent:

```go
import (
	"os"

	"github.com/mutablelogic/go-llm/pkg/anthropic"
)

func main() {
	// Create a new agent
	agent, err := anthropic.New(os.Getenv("ANTHROPIC_API_KEY"))
	if err != nil {
		panic(err)
	}
	// ...
}
```
You create a chat session with a model as follows:

```go
import (
	"context"
	"fmt"

	"github.com/mutablelogic/go-llm"
)

func session(ctx context.Context, agent llm.Agent) error {
	// Create a new chat session
	session := agent.Model("claude-3-5-haiku-20241022").Context()

	// Repeat forever
	for {
		if err := session.FromUser(ctx, "hello"); err != nil {
			return err
		}

		// Print the response
		fmt.Println(session.Text())
	}
}
```
This module is currently in development and subject to change. Please file feature requests and bugs here. The license is Apache 2, so feel free to redistribute. Redistributions in either source or binary form must reproduce the copyright notice, and please link back to this repository for more information:
go-llm
https://github.com/mutablelogic/go-llm/
Copyright (c) 2025 David Thorpe, All rights reserved.