From 6cef1dcc84c3e9d68333d9860bce939fd8f98b0a Mon Sep 17 00:00:00 2001
From: Wessel Bruinsma
Date: Mon, 20 Jan 2025 15:35:13 +0100
Subject: [PATCH] Improve description

---
 docs/_toc.yml              |  2 +-
 docs/foundry/intro.md      |  6 ++----
 docs/foundry/server.md     |  7 ++++++-
 docs/foundry/submission.md | 26 ++++++++++++++++----------
 4 files changed, 25 insertions(+), 16 deletions(-)

diff --git a/docs/_toc.yml b/docs/_toc.yml
index bf6dc58..99aa861 100644
--- a/docs/_toc.yml
+++ b/docs/_toc.yml
@@ -19,6 +19,6 @@ parts:
     chapters:
     - file: foundry/intro
     - file: foundry/submission
-    - file: foundry/server
     - file: foundry/demo_hres_t0
+    - file: foundry/server
     - file: foundry/api
diff --git a/docs/foundry/intro.md b/docs/foundry/intro.md
index 92e1ceb..d3b570f 100644
--- a/docs/foundry/intro.md
+++ b/docs/foundry/intro.md
@@ -1,6 +1,4 @@
 # Aurora on Azure AI Foundry
 
-Aurora can be hosted on [Azure AI Foundry](https://learn.microsoft.com/en-us/azure/ai-studio/what-is-ai-studio).
-
-This part of the documentation describes how you can produce predictions with Aurora running on a Foundry endpoint,
-and how you can launch a Foundry endpoint that hosts Aurora.
+Aurora [is available as a model on Azure AI Foundry](https://ai.azure.com/explore/models/Aurora/version/1/registry/azureml)!
+This part of the documentation describes how you can produce predictions with Aurora running on a Foundry endpoint.
diff --git a/docs/foundry/server.md b/docs/foundry/server.md
index ea95da9..9f1adfd 100644
--- a/docs/foundry/server.md
+++ b/docs/foundry/server.md
@@ -1,5 +1,10 @@
-# Hosting Aurora
+# Creating an Endpoint
 
+Likely you don't need to create an endpoint yourself,
+because Aurora is already available in the [Azure AI model catalog](https://ai.azure.com/explore/models).
+
+Nevertheless, should you want to create an endpoint,
+then you can follow these instructions.
 The model is served via [MLflow](https://mlflow.org/).
 
 First, make sure that `mlflow` is installed:
diff --git a/docs/foundry/submission.md b/docs/foundry/submission.md
index 76c0183..7a2c62b 100644
--- a/docs/foundry/submission.md
+++ b/docs/foundry/submission.md
@@ -1,20 +1,26 @@
 # Submitting Predictions
 
-To produce predictions on Azure AI Foundry, the client will communicate with the host through
-a blob storage container.
-
-First, create a client that can communicate with your Azure AI Foundry endpoint:
+To produce Aurora predictions on Azure AI Foundry,
+you need an endpoint that hosts Aurora.
+To create such an endpoint, find Aurora in the [Azure AI Foundry model catalog](https://ai.azure.com/explore/models),
+click "Deploy", and follow the instructions.
+Once the endpoint has been deployed,
+it will have an endpoint URL and access token.
+Then create a `FoundryClient` using this URL and token:
 
 ```python
 from aurora.foundry import FoundryClient
 
 foundry_client = FoundryClient(
-    endpoint="https://endpoint/",
+    endpoint="https://endpoint_url/",
     token="TOKEN",
 )
 ```
 
-Then set up a blob storage container for communication with the host:
+You will communicate with the endpoint through a blob storage container.
+You need to create this blob storage container yourself.
+Create one, and generate a URL that includes a SAS token _with both read and write rights_.
+Then create a `BlobStorageChannel` from the blob storage container URL with the SAS token appended:
 
 ```python
 from aurora.foundry import BlobStorageChannel
@@ -24,9 +30,8 @@ channel = BlobStorageChannel(
 )
 ```
 
-The SAS token needs both read and write rights.
-The blob storage container will be used to send the initial condition to the host and to retrieve
-the predictions from the host.
+This blob storage container will be used to send the initial condition to the endpoint
+and to retrieve the predictions from the endpoint.
 
 ```{warning}
 It is important that the SAS token has both read and write rights.
@@ -35,7 +40,8 @@ To generate a SAS token with read and write rights, navigate to the container in
 go to "Shared access tokens", and select both "Read" and "Write" under "Permissions".
 ```
 
-You can now submit requests in the following way:
+You're all done now!
+You can submit requests for predictions in the following way:
 
 ```python
 from datetime import datetime
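
Note on the submission docs touched by this patch: they ask for the blob storage container URL with the SAS token appended before it is passed to `BlobStorageChannel`. A minimal sketch of composing that URL, assuming the Portal-generated token starts with `?`; the helper name and all values below are illustrative, not part of the `aurora.foundry` API:

```python
from urllib.parse import urlsplit, urlunsplit

def container_url_with_sas(container_url: str, sas_token: str) -> str:
    # Hypothetical helper: attach the SAS token (the "?sv=..." string that
    # the Azure Portal generates under "Shared access tokens") as the query
    # string of the blob storage container URL.
    parts = urlsplit(container_url)
    return urlunsplit(
        (parts.scheme, parts.netloc, parts.path, sas_token.lstrip("?"), "")
    )

# Illustrative values only; substitute your own container URL and SAS token.
url = container_url_with_sas(
    "https://myaccount.blob.core.windows.net/mycontainer",
    "?sv=2022-11-02&sp=rw&sig=REDACTED",
)
```

The resulting `url` is then what you would hand to `BlobStorageChannel`.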