Improve Foundry docs #62

Merged 1 commit on Jan 21, 2025
2 changes: 1 addition & 1 deletion docs/_toc.yml
@@ -19,6 +19,6 @@ parts:
chapters:
- file: foundry/intro
- file: foundry/submission
- file: foundry/server
- file: foundry/demo_hres_t0
- file: foundry/server
- file: foundry/api
6 changes: 2 additions & 4 deletions docs/foundry/intro.md
@@ -1,6 +1,4 @@
# Aurora on Azure AI Foundry

Aurora can be hosted on [Azure AI Foundry](https://learn.microsoft.com/en-us/azure/ai-studio/what-is-ai-studio).

This part of the documentation describes how you can produce predictions with Aurora running on a Foundry endpoint,
and how you can launch a Foundry endpoint that hosts Aurora.
Aurora [is available as a model on Azure AI Foundry](https://ai.azure.com/explore/models/Aurora/version/1/registry/azureml)!
This part of the documentation describes how you can produce predictions with Aurora running on a Foundry endpoint.
7 changes: 6 additions & 1 deletion docs/foundry/server.md
@@ -1,5 +1,10 @@
# Hosting Aurora
# Creating an Endpoint

You likely don't need to create an endpoint yourself,
because Aurora is already available in the [Azure AI model catalog](https://ai.azure.com/explore/models).

Should you nevertheless want to create an endpoint,
you can follow these instructions.
The model is served via [MLflow](https://mlflow.org/).
First, make sure that `mlflow` is installed:

26 changes: 16 additions & 10 deletions docs/foundry/submission.md
@@ -1,20 +1,26 @@
# Submitting Predictions

To produce predictions on Azure AI Foundry, the client will communicate with the host through
a blob storage container.

First, create a client that can communicate with your Azure AI Foundry endpoint:
To produce Aurora predictions on Azure AI Foundry,
you need an endpoint that hosts Aurora.
To create such an endpoint, find Aurora in the [Azure AI Foundry model catalog](https://ai.azure.com/explore/models),
click "Deploy", and follow the instructions.
Once the endpoint has been deployed,
you will obtain an endpoint URL and an access token.
Then create a `FoundryClient` using this URL and token:

```python
from aurora.foundry import FoundryClient

foundry_client = FoundryClient(
endpoint="https://endpoint/",
endpoint="https://endpoint_url/",
token="TOKEN",
)
```

Then set up a blob storage container for communication with the host:
You will communicate with the endpoint through a blob storage container.
You need to create this blob storage container yourself.
Create one, and generate a URL for it that includes a SAS token _with both read and write rights_.
Then create a `BlobStorageChannel` from the container URL with the SAS token appended:

```python
from aurora.foundry import BlobStorageChannel
@@ -24,9 +30,8 @@ channel = BlobStorageChannel(
)
```
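Concretely, the URL you pass to the channel is the container URL followed by `?` and the SAS token. A minimal sketch, using hypothetical account, container, and token values:

```python
# Hypothetical values for illustration only:
container_url = "https://myaccount.blob.core.windows.net/mycontainer"
sas_token = "sv=2023-01-03&sp=rw&sr=c&sig=REDACTED"  # "sp=rw" grants read and write

# The channel expects the container URL with the SAS token appended:
url_with_sas = f"{container_url}?{sas_token}"
print(url_with_sas)
```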

The SAS token needs both read and write rights.
The blob storage container will be used to send the initial condition to the host and to retrieve
the predictions from the host.
This blob storage container will be used to send the initial condition to the endpoint
and to retrieve the predictions from the endpoint.

```{warning}
It is important that the SAS token has both read and write rights.
Expand All @@ -35,7 +40,8 @@ To generate a SAS token with read and write rights, navigate to the container in
go to "Shared access tokens", and select both "Read" and "Write" under "Permissions".
```
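To sanity-check a SAS URL before using it, you can inspect its `sp` (signed permissions) query parameter, which must contain both `r` and `w`. A stdlib-only sketch, assuming the usual SAS query-string format:

```python
from urllib.parse import parse_qs, urlparse


def sas_has_read_write(url_with_sas: str) -> bool:
    # The "sp" query parameter of a SAS token lists the granted permissions,
    # e.g. "rw" for read and write.
    params = parse_qs(urlparse(url_with_sas).query)
    perms = params.get("sp", [""])[0]
    return "r" in perms and "w" in perms


# Hypothetical example URL:
print(sas_has_read_write(
    "https://myaccount.blob.core.windows.net/mycontainer?sv=2023-01-03&sp=rw&sig=abc"
))  # → True
```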

You can now submit requests in the following way:
You're all done now!
You can submit requests for predictions in the following way:

```python
from datetime import datetime
```