Enable the integration tests for Ollama #802

Merged (3 commits, Jan 29, 2025)
39 changes: 37 additions & 2 deletions .github/workflows/integration-tests.yml
@@ -58,10 +58,10 @@ jobs:
echo "Running container from image: $DOCKER_IMAGE"

# Run the container
- docker run --name $CONTAINER_NAME -d -p 8989:8989 -p 9090:9090 \
- -p 8990:8990 \
+ docker run --name $CONTAINER_NAME -d --network host \
-v "$(pwd)"/codegate_volume:/app/codegate_volume \
-e CODEGATE_APP_LOG_LEVEL=DEBUG \
+ -e CODEGATE_OLLAMA_URL=http://localhost:11434 \
--restart unless-stopped $DOCKER_IMAGE

# Confirm the container started
@@ -146,6 +146,41 @@ jobs:
run: |
poetry run python tests/integration/integration_tests.py

+ - name: Run Ollama
+ run: |
+ docker run -d -v ollama:/root/.ollama --network host --name ollama ollama/ollama
+ docker ps -f name=ollama
+ echo "Loop until the endpoint responds successfully"
+ while ! curl --silent --fail --get "http://localhost:11434" >/dev/null; do
+ echo "Ollama not available yet. Retrying in 2 seconds..."
+ sleep 2
+ done
+ echo "Ollama is now available!"
+ docker exec -d ollama ollama run qwen2.5-coder:0.5b
+
+ sleep 120 # Sleep for 2 minutes to allow Ollama to download the model. TODO: Improve this
+ docker logs ollama
+
+ # Verify the Ollama API is working
+ curl http://localhost:11434/api/generate -d '{
+ "model": "qwen2.5-coder:0.5b",
+ "prompt": "Why is the sky blue?",
+ "stream": false
+ }'
+
+ docker logs ollama
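The fixed `sleep 120` in the step above is already flagged with a TODO. A less brittle option would be to poll Ollama's `/api/tags` endpoint, which lists a model once it has been pulled. A minimal sketch, assuming that endpoint's JSON shape; the `model_listed` helper is illustrative, not part of the workflow:

```shell
# Hypothetical helper: succeeds when the given /api/tags JSON lists the model.
model_listed() {
  printf '%s' "$1" | grep -qF "\"name\":\"$2\""
}

# Sketch of replacing the fixed two-minute sleep with a readiness poll:
# until model_listed "$(curl --silent http://localhost:11434/api/tags)" "qwen2.5-coder:0.5b"; do
#   echo "Model not pulled yet. Retrying in 5 seconds..."
#   sleep 5
# done
```

Polling would let the job continue as soon as the pull finishes instead of always paying the full two minutes.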

+ - name: Run integration tests - Ollama
+ env:
+ CODEGATE_PROVIDERS: "ollama"
+ run: |
+ poetry run python tests/integration/integration_tests.py
+
+ - name: Print the Ollama container logs (useful for debugging)
+ if: always()
+ run: |
+ docker logs ollama

- name: Print the container logs (useful for debugging)
if: always()
run: |
7 changes: 4 additions & 3 deletions tests/integration/testcases.yaml
@@ -4,6 +4,7 @@ headers:
openai:
Authorization: Bearer ENV_OPENAI_KEY
ollama:
+ Content-Type: application/json
llamacpp:
anthropic:
x-api-key: ENV_ANTHROPIC_KEY
@@ -275,20 +276,20 @@ testcases:
"role":"user"
}
],
"model":"qwen2.5-coder:latest",
"model":"qwen2.5-coder:0.5b",
"stream":true,
"temperature":0
}
likes: |
- Hello! How can I assist you today? If you have any questions or need guidance on secure coding practices, software security, package analysis, or anything else related to cybersecurity, feel free to ask!
+ Hello! How can I assist you today?

ollama_fim:
name: Ollama FIM
provider: ollama
url: http://127.0.0.1:8989/ollama/api/generate
data: |
{
"model": "qwen2.5-coder:latest",
"model": "qwen2.5-coder:0.5b",
"max_tokens": 4096,
"temperature": 0,
"stream": true,
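Because the testcase bodies in `testcases.yaml` are raw JSON strings, a malformed payload only surfaces once CI runs. One cheap local check is to validate the body before committing; a minimal sketch using `python3 -m json.tool` as the validator (the payload below is abridged from the `ollama_fim` testcase, without its prompt fields):

```shell
# Validate a testcase request body locally before committing it.
# Fields abridged from the ollama_fim testcase above.
PAYLOAD='{
  "model": "qwen2.5-coder:0.5b",
  "max_tokens": 4096,
  "temperature": 0,
  "stream": true
}'
printf '%s' "$PAYLOAD" | python3 -m json.tool >/dev/null && echo "payload is valid JSON"
```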