Harmonize readme and dev guide (fixes #700) (#701)
* Remove redundancy and differences between README and Development Guide

- Removed information from README that repeated or conflicted with the Development Guide (docs/development.md).
- Moved development guidance that was only in the README into the Development Guide
- Added a note to clarify context needed for running integration tests.

* Add information to integration test instructions for creating the necessary test data

* Fix guidance to remove explicit venv creation and use poetry properly throughout

---------

Co-authored-by: Luke Hinds <[email protected]>
wright-io and lukehinds authored Jan 23, 2025
1 parent acd4025 commit a72d86d
Showing 2 changed files with 74 additions and 82 deletions.
81 changes: 0 additions & 81 deletions README.md
@@ -135,87 +135,6 @@ Check out the developer reference guides:
- [Configuration system](./docs/configuration.md)
- [Logging system](./docs/logging.md)

### Local setup

```bash
# Get the code
git clone https://github.com/stacklok/codegate.git
cd codegate

# Set up virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate

# Install dev dependencies
pip install -e ".[dev]"
```

### Testing

To run the unit tests, execute this command:

```bash
pytest
```

To run the integration tests, create a `.env` file in the repo root directory
and add the following properties to it:

```plain
ENV_OPENAI_KEY=<YOUR_KEY>
ENV_VLLM_KEY=<YOUR_KEY>
ENV_ANTHROPIC_KEY=<YOUR_KEY>
```

Then the integration tests can be executed by running:

```bash
python tests/integration/integration_tests.py
```

## 🐳 Docker deployment

### Build the image

```bash
make image-build
```

### Run the container

```bash
# Basic usage with local image
docker run -p 8989:8989 -p 9090:9090 codegate:latest

# With pre-built pulled image
docker pull ghcr.io/stacklok/codegate:latest
docker run --name codegate -d -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest

# It will mount a volume to /app/codegate_volume
# The directory supports storing Llama CPP models under subdirectory /models
# A sqlite DB with the messages and alerts is stored under the subdirectory /db
docker run --name codegate -d -v /path/to/volume:/app/codegate_volume -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest
```

### Exposed parameters

- CODEGATE_VLLM_URL: URL for the inference engine (defaults to
[https://inference.codegate.ai](https://inference.codegate.ai))
- CODEGATE_OPENAI_URL: URL for OpenAI inference engine (defaults to
[https://api.openai.com/v1](https://api.openai.com/v1))
- CODEGATE_ANTHROPIC_URL: URL for Anthropic inference engine (defaults to
[https://api.anthropic.com/v1](https://api.anthropic.com/v1))
- CODEGATE_OLLAMA_URL: URL for Ollama inference engine (defaults to
[http://localhost:11434/api](http://localhost:11434/api))
- CODEGATE_APP_LOG_LEVEL: Level of debug desired when running the codegate
server (defaults to WARNING, can be ERROR/WARNING/INFO/DEBUG)
- CODEGATE_LOG_FORMAT: Type of log formatting desired when running the codegate
server (defaults to TEXT, can be JSON/TEXT)

```bash
docker run -p 8989:8989 -p 9090:9090 -e CODEGATE_OLLAMA_URL=http://1.2.3.4:11434/api ghcr.io/stacklok/codegate:latest
```

## 🤝 Contributing

We welcome contributions! Whether it's bug reports, feature requests, or code
75 changes: 74 additions & 1 deletion docs/development.md
@@ -147,7 +147,8 @@ The project uses several tools to maintain code quality:

### 3. Testing

Run the test suite with coverage:
#### Unit Tests
To run the unit test suite with coverage:

```bash
poetry run pytest
```

@@ -156,6 +157,35 @@
Tests are located in the `tests/` directory and follow the same structure as the
source code.
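
As a hedged illustration of that mirrored layout (the file and function names here are hypothetical, not part of the codebase), a helper defined under `src/codegate/` would be exercised from a matching module under `tests/`:

```python
# Hypothetical example of the mirrored layout: if the helper below lived in
# src/codegate/example.py, this test would live in tests/test_example.py.

def normalize_provider_name(name: str) -> str:
    # Stand-in for a function under test: trim whitespace and lowercase.
    return name.strip().lower()


def test_normalize_provider_name() -> None:
    assert normalize_provider_name("  OpenAI ") == "openai"
```

Pytest discovers `test_*.py` files and `test_*` functions automatically, so a file like this would be collected by a plain `poetry run pytest`.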

#### Integration Tests
To run the integration tests, create a `.env` file in the repo root directory and add the
following properties to it:
```plain
ENV_OPENAI_KEY=<YOUR_KEY>
ENV_VLLM_KEY=<YOUR_KEY>
ENV_ANTHROPIC_KEY=<YOUR_KEY>
```
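
These properties are plain `KEY=VALUE` lines. A minimal sketch of how a file of this shape can be parsed (an illustration only, not CodeGate's actual loader; the key values are placeholders):

```python
# Minimal .env-style parser sketch: KEY=VALUE lines, with blank lines
# and '#' comments skipped. Illustrative only; not CodeGate's loader.
def parse_env_file(text: str) -> dict[str, str]:
    values: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values


sample = """\
# placeholder keys, not real credentials
ENV_OPENAI_KEY=sk-example
ENV_VLLM_KEY=vllm-example
ENV_ANTHROPIC_KEY=anthropic-example
"""
keys = parse_env_file(sample)  # three placeholder entries
```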

Next, run the `import_packages` script to create the necessary integration test data:
```bash
poetry run python scripts/import_packages.py
```

Then start the CodeGate server:
```bash
poetry run codegate serve --log-level DEBUG --log-format TEXT
```

Then the integration tests can be executed by running:
```bash
poetry run python tests/integration/integration_tests.py
```

You can include additional properties to specify test scope and other information. For instance, to execute the tests for the Copilot provider, run:
```bash
CODEGATE_PROVIDERS=copilot CA_CERT_FILE=./codegate_volume/certs/ca.crt poetry run python tests/integration/integration_tests.py
```

### 4. Make commands

The project includes a Makefile for common development tasks:
@@ -168,6 +198,49 @@
- `make build`: build distribution packages
- `make all`: run all checks and build (recommended before committing)

## 🐳 Docker deployment

### Build the image

```bash
make image-build
```

### Run the container

```bash
# Basic usage with local image
docker run -p 8989:8989 -p 9090:9090 codegate:latest

# With pre-built pulled image
docker pull ghcr.io/stacklok/codegate:latest
docker run --name codegate -d -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest

# It will mount a volume to /app/codegate_volume
# The directory supports storing Llama CPP models under subdirectory /models
# A sqlite DB with the messages and alerts is stored under the subdirectory /db
docker run --name codegate -d -v /path/to/volume:/app/codegate_volume -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest
```

### Exposed parameters

- CODEGATE_VLLM_URL: URL for the inference engine (defaults to
[https://inference.codegate.ai](https://inference.codegate.ai))
- CODEGATE_OPENAI_URL: URL for OpenAI inference engine (defaults to
[https://api.openai.com/v1](https://api.openai.com/v1))
- CODEGATE_ANTHROPIC_URL: URL for Anthropic inference engine (defaults to
[https://api.anthropic.com/v1](https://api.anthropic.com/v1))
- CODEGATE_OLLAMA_URL: URL for Ollama inference engine (defaults to
[http://localhost:11434/api](http://localhost:11434/api))
- CODEGATE_APP_LOG_LEVEL: Level of debug desired when running the codegate
server (defaults to WARNING, can be ERROR/WARNING/INFO/DEBUG)
- CODEGATE_LOG_FORMAT: Type of log formatting desired when running the codegate
server (defaults to TEXT, can be JSON/TEXT)

```bash
docker run -p 8989:8989 -p 9090:9090 -e CODEGATE_OLLAMA_URL=http://1.2.3.4:11434/api ghcr.io/stacklok/codegate:latest
```
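
The resolution order sketched below is an assumption for illustration (use the environment variable if set, otherwise fall back to the documented default); the variable names mirror the list above, but the `resolve_setting` helper is hypothetical:

```python
import os

# Documented defaults from the parameter list above.
DEFAULTS = {
    "CODEGATE_VLLM_URL": "https://inference.codegate.ai",
    "CODEGATE_OPENAI_URL": "https://api.openai.com/v1",
    "CODEGATE_ANTHROPIC_URL": "https://api.anthropic.com/v1",
    "CODEGATE_OLLAMA_URL": "http://localhost:11434/api",
    "CODEGATE_APP_LOG_LEVEL": "WARNING",
    "CODEGATE_LOG_FORMAT": "TEXT",
}


def resolve_setting(name: str) -> str:
    # Hypothetical helper: prefer the environment, else the default,
    # matching the `-e` override behavior shown above.
    return os.environ.get(name, DEFAULTS[name])


config = {name: resolve_setting(name) for name in DEFAULTS}
```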

## Configuration system

CodeGate uses a hierarchical configuration system with the following priority
