Add docs for Aider support #30

Merged 2 commits on Jan 13, 2025
7 changes: 7 additions & 0 deletions docs/about/changelog.md
@@ -11,6 +11,13 @@ Major features and changes are noted here. To review all updates, see the

:::

Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)

- **Aider support** - 13 Jan, 2025\
CodeGate version 0.1.6 adds support for [Aider](https://aider.chat/), an LLM
pair programmer in your terminal. See the
[how-to guide](../how-to/use-with-aider.mdx) to learn more.

- **Semantic versioning for container image** - 8 Jan, 2025\
Starting with v0.1.4, the CodeGate container image is published with semantic
version tags corresponding to
7 changes: 4 additions & 3 deletions docs/about/faq.md
@@ -10,7 +10,8 @@ sidebar_position: 10
No, CodeGate works _with_ your AI code assistant, as a local intermediary
between your client and the LLM it's communicating with.

-### Does CodeGate work with any plugins other than Copilot and Continue?
+### Does CodeGate work with any other IDE plugins or coding assistants?

-Currently, CodeGate works with GitHub Copilot and Continue. We are actively
-exploring additional integrations based on user feedback.
+We are actively exploring additional integrations based on user feedback.
[Join the community on Discord](https://discord.gg/stacklok) to let us know
about your favorite AI coding tool!
4 changes: 3 additions & 1 deletion docs/how-to/install.md
@@ -42,7 +42,7 @@ application settings, see [Configure CodeGate](./configure.md)

### Alternative run commands {#examples}

-Run with minimal functionality for use with **Continue**:
+Run with minimal functionality for use with **Continue** or **Aider**:

```bash
docker run -d -p 8989:8989 -p 9090:9090 --restart unless-stopped ghcr.io/stacklok/codegate:latest
@@ -152,6 +152,7 @@ Now that CodeGate is running, proceed to configure your IDE integration.

- [Use CodeGate with GitHub Copilot](./use-with-copilot.mdx)
- [Use CodeGate with Continue](./use-with-continue.mdx)
- [Use CodeGate with Aider](./use-with-aider.mdx)

## Remove CodeGate

@@ -160,3 +161,4 @@ integration:

- [Remove CodeGate - GitHub Copilot](./use-with-copilot.mdx#remove-codegate)
- [Remove CodeGate - Continue](./use-with-continue.mdx#remove-codegate)
- [Remove CodeGate - Aider](./use-with-aider.mdx#remove-codegate)
73 changes: 73 additions & 0 deletions docs/how-to/use-with-aider.mdx
@@ -0,0 +1,73 @@
---
title: Use CodeGate with Aider
description: Configure Aider to work with CodeGate
sidebar_label: Use with Aider
sidebar_position: 90
---

import AiderProviders from '../partials/_aider-providers.mdx';

[Aider](https://aider.chat/) is an open source AI coding assistant that lets you
pair program with LLMs in your terminal.

CodeGate works with the following AI model providers through Aider:

- Local / self-managed:
- [Ollama](https://ollama.com/)
- Hosted:
- [OpenAI](https://openai.com/api/)

:::note

This guide assumes you have already installed Aider using their
[installation instructions](https://aider.chat/docs/install.html).

:::

## Configure Aider to use CodeGate

To configure Aider to send requests through CodeGate:

<AiderProviders />

## Verify configuration

To verify that you've successfully connected Aider to CodeGate, type
`/ask codegate-version` into the Aider chat in your terminal. You should receive
a response like "CodeGate version 0.1.0".
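
As a concrete sketch (assuming you configured the OpenAI provider as described
below), the check looks like this in your terminal:

```shell
# Start Aider; the OPENAI_API_BASE variable set earlier routes
# its requests through CodeGate
aider

# At the Aider chat prompt, type:
#   /ask codegate-version
# A reply such as "CodeGate version 0.1.0" confirms requests are
# flowing through CodeGate.
```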

## Next steps

Learn more about CodeGate's features:

- [Access the dashboard](./dashboard.md)
- [CodeGate features](../features/index.mdx)

## Remove CodeGate

If you decide to stop using CodeGate, follow these steps to remove it and revert
your environment.

1. Stop Aider and unset the environment variables you set during the
configuration process:

**OpenAI:** `unset OPENAI_API_BASE` (macOS/Linux) or
`setx OPENAI_API_BASE ""` (Windows)

**Ollama:** `unset OLLAMA_API_BASE` (macOS/Linux) or
`setx OLLAMA_API_BASE ""` (Windows)

1. Re-launch Aider.

1. Stop and remove the CodeGate container:

```bash
docker stop codegate && docker rm codegate
```

1. If you launched CodeGate with a persistent volume, delete it to remove the
CodeGate database and other files:

```bash
docker volume rm codegate_volume
```
22 changes: 12 additions & 10 deletions docs/index.md
@@ -37,25 +37,27 @@ CodeGate supports several development environments and AI providers.

AI coding assistants / IDEs:

-- **[GitHub Copilot](https://github.com/features/copilot)** with Visual Studio
-  Code
+- **[GitHub Copilot](./how-to/use-with-copilot.mdx)** with Visual Studio Code
+  (JetBrains coming soon!)

-- **[Continue](https://www.continue.dev/)** with Visual Studio Code and
+- **[Continue](./how-to/use-with-continue.mdx)** with Visual Studio Code and
JetBrains IDEs

CodeGate supports the following AI model providers with Continue:

- Local / self-managed:
-    - [Ollama](https://ollama.com/)
-    - [llama.cpp](https://github.com/ggerganov/llama.cpp)
-    - [vLLM](https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html)
+    - Ollama
+    - llama.cpp
+    - vLLM
- Hosted:
-    - [OpenRouter](https://openrouter.ai/)
-    - [Anthropic](https://www.anthropic.com/api)
-    - [OpenAI](https://openai.com/api/)
+    - OpenRouter
+    - Anthropic
+    - OpenAI

**[Aider](./how-to/use-with-aider.mdx)** with Ollama and OpenAI

As the project evolves, we plan to add support for more IDE assistants and AI
-models.
+model providers.

## How to get involved

120 changes: 120 additions & 0 deletions docs/partials/_aider-providers.mdx
@@ -0,0 +1,120 @@
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

<Tabs groupId="aider-provider">
<TabItem value="openai" label="OpenAI" default>

You need an [OpenAI API](https://openai.com/api/) account to use this provider.

Before you run Aider, set environment variables for your API key and for the
API base URL, pointing the base URL at CodeGate's API port. Alternatively, use
one of Aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
to set the corresponding values.

<Tabs groupId="os">
<TabItem value="macos" label="macOS / Linux" default>

```bash
export OPENAI_API_KEY=<YOUR_API_KEY>
export OPENAI_API_BASE=http://localhost:8989/openai
```

:::note

To persist these variables, add them to your shell profile (e.g., `~/.bashrc` or
`~/.zshrc`).

:::

</TabItem>
<TabItem value="windows" label="Windows">

```bash
setx OPENAI_API_KEY <YOUR_API_KEY>
setx OPENAI_API_BASE http://localhost:8989/openai
```

:::note

Restart your shell after running `setx`.

:::

</TabItem>
</Tabs>

Replace `<YOUR_API_KEY>` with your
[OpenAI API key](https://platform.openai.com/api-keys).

Then run `aider` as normal. For more information, see the
[Aider docs for connecting to OpenAI](https://aider.chat/docs/llms/openai.html).

</TabItem>
<TabItem value="ollama" label="Ollama">

You need Ollama installed on your local system with the server running
(`ollama serve`) to use this provider.

CodeGate connects to `http://host.docker.internal:11434` by default. If you
changed the default Ollama server port or want to connect to a remote Ollama
instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
set to the correct URL. See [Configure CodeGate](/how-to/configure.md).
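
For example, a sketch of launching CodeGate against a remote Ollama server;
the hostname below is a placeholder you'd replace with your own server's
address:

```shell
# Placeholder remote Ollama URL; substitute your server's address and port
docker run -d -p 8989:8989 -p 9090:9090 \
  -e CODEGATE_OLLAMA_URL=http://my-ollama-host:11434 \
  --restart unless-stopped ghcr.io/stacklok/codegate:latest
```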

Before you run Aider, set the Ollama base URL to CodeGate's API port using an
environment variable. Alternatively, use one of Aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
to set the corresponding value.

<Tabs groupId="os">
<TabItem value="macos" label="macOS / Linux" default>

```bash
export OLLAMA_API_BASE=http://localhost:8989/ollama
```

:::note

To persist this setting, add it to your shell profile (e.g., `~/.bashrc` or
`~/.zshrc`) or use one of Aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html).

:::

</TabItem>
<TabItem value="windows" label="Windows">

```bash
setx OLLAMA_API_BASE http://localhost:8989/ollama
```

:::note

Restart your shell after running `setx`.

:::

</TabItem>
</Tabs>

Then run Aider:

```bash
aider --model ollama/<MODEL_NAME>
```

Replace `<MODEL_NAME>` with the name of a coding model you have installed
locally using `ollama pull`.

We recommend the [Qwen2.5-Coder](https://ollama.com/library/qwen2.5-coder)
series of models. Our minimum recommendation for quality results is the 7
billion parameter (7B) version, `qwen2.5-coder:7b`.

This model balances performance and quality for typical systems with at least 4
CPU cores and 16GB of RAM. If you have more compute resources available, our
experimentation shows that larger models do yield better results.
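
For example, pulling the recommended model and launching Aider with it might
look like:

```shell
# One-time download of the recommended 7B coding model
ollama pull qwen2.5-coder:7b

# Run Aider against it; the OLLAMA_API_BASE variable set above
# routes the traffic through CodeGate
aider --model ollama/qwen2.5-coder:7b
```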

For more information, see the
[Aider docs for connecting to Ollama](https://aider.chat/docs/llms/ollama.html).

</TabItem>
</Tabs>