access to the blob storage via service account.

The service account must exist and contain the credentials for the blob storage. First, create a secret with the credentials that the logger agent will use to access the blob storage. The secret must be in the same namespace as the InferenceService.

For S3:
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: agent-logger-secret
  namespace: default
type: Opaque
data:
  AWS_SECRET_ACCESS_KEY: [YOUR_SECRET_ACCESS_KEY]
```
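Kubernetes expects values under a Secret's `data` field to be base64-encoded. A minimal sketch of producing an encoded value before pasting it into the manifest (the credential string below is a made-up placeholder, not a real key):

```python
import base64

# Base64-encode a credential for the Secret "data" field.
# The value here is a placeholder, not a real key.
secret_value = "my-secret-access-key"
encoded = base64.b64encode(secret_value.encode()).decode()
print(encoded)  # bXktc2VjcmV0LWFjY2Vzcy1rZXk=
```

Alternatively, `kubectl create secret generic` accepts raw values and performs the encoding for you.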

For Azure:
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: agent-logger-secret
  namespace: default
type: Opaque
data:
  AZURE_SERVICE_URL: [YOUR_SERVICE_URL]
  AZURE_ACCESS_TOKEN: [YOUR_TOKEN]
  AZURE_TENANT_ID: [YOUR_TENANT_ID]
```

For GCS:
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: agent-logger-secret
  namespace: default
type: Opaque
stringData:
  gcloud-application-credentials.json: |
    {
      "type": "service_account",
      "project_id": "[YOUR_PROJECT_ID]",
      "private_key_id": "[YOUR_PRIVATE_KEY_ID]",
      "private_key": "[YOUR_PRIVATE_KEY]",
      "client_email": "[YOUR_CLIENT_EMAIL]",
      "client_id": "[YOUR_CLIENT_ID]",
      "auth_uri": "https://accounts.google.com/o/oauth2/auth",
      "token_uri": "https://oauth2.googleapis.com/token",
      "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
      "client_x509_cert_url": "[YOUR_X509_CERT_URL]",
      "universe_domain": "googleapis.com"
    }
```


Next, create a service account that references the secret.

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: agent-logger-sa
  namespace: default
secrets:
- name: agent-logger-secret
```

Create the inference service and configure the blob storage logger.

When specifying the logger configuration you must specify the bucket URL where the data will be stored. The URL scheme determines the cloud storage provider:
- `s3://` - S3-compatible
- `abfs://` - Azure
- `gs://` - Google Cloud Storage
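As a sketch, the scheme-to-provider mapping above can be expressed as a small helper. The function name and the error handling are this document's invention for illustration, not part of the KServe API:

```python
from urllib.parse import urlparse

# Illustrative mapping mirroring the scheme list above.
SCHEME_TO_PROVIDER = {
    "s3": "S3-compatible",
    "abfs": "Azure",
    "gs": "Google Cloud Storage",
}

def provider_for(url: str) -> str:
    """Return the storage provider implied by a logger bucket URL."""
    scheme = urlparse(url).scheme
    try:
        return SCHEME_TO_PROVIDER[scheme]
    except KeyError:
        raise ValueError(f"unsupported logger URL scheme: {scheme!r}")

print(provider_for("s3://my-bucket"))  # S3-compatible
```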

When specifying the logger configuration you must also specify the format in which the log data will be stored. Valid values are:
- `json` - The log messages will be stored as JSON files.

Additional supported formats are planned for future releases, such as `parquet` and `csv`.

:::note
Currently, the blob storage implementation is limited to S3 storage.
:::

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: [YOUR_INFERENCE_SERVICE_NAME]
spec:
  predictor:
    # (model configuration omitted)
    logger:
      mode: all
      url: [YOUR_BUCKET_URL]
      storage:
        path: /logs
        parameters:
          format: json
```
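As a purely hypothetical illustration, a logged payload ends up under the bucket URL joined with the configured storage path; the object name shown here is made up, and the actual naming is up to the logger implementation:

```python
from urllib.parse import urlparse

# Hypothetical sketch of where a logged payload might be written,
# combining the logger url and storage.path from the manifest above.
# The file name "request-123.json" is illustrative only.
url = "s3://my-bucket"        # corresponds to [YOUR_BUCKET_URL]
path = "/logs"                # storage.path
filename = "request-123.json"

parsed = urlparse(url)
dest = f"{parsed.scheme}://{parsed.netloc}{path}/{filename}"
print(dest)  # s3://my-bucket/logs/request-123.json
```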