diff --git a/docs/cluster-management/config.md b/docs/cluster-management/config.md
index 05b7640085..3b530c13ba 100644
--- a/docs/cluster-management/config.md
+++ b/docs/cluster-management/config.md
@@ -75,17 +75,17 @@ The docker images used by the Cortex cluster can also be overriden, although thi
 
 ```yaml
 # docker image paths
-image_operator: cortexlabs/operator:0.16.0
-image_manager: cortexlabs/manager:0.16.0
-image_downloader: cortexlabs/downloader:0.16.0
-image_request_monitor: cortexlabs/request-monitor:0.16.0
-image_cluster_autoscaler: cortexlabs/cluster-autoscaler:0.16.0
-image_metrics_server: cortexlabs/metrics-server:0.16.0
-image_nvidia: cortexlabs/nvidia:0.16.0
-image_fluentd: cortexlabs/fluentd:0.16.0
-image_statsd: cortexlabs/statsd:0.16.0
-image_istio_proxy: cortexlabs/istio-proxy:0.16.0
-image_istio_pilot: cortexlabs/istio-pilot:0.16.0
-image_istio_citadel: cortexlabs/istio-citadel:0.16.0
-image_istio_galley: cortexlabs/istio-galley:0.16.0
+image_operator: cortexlabs/operator:0.16.1
+image_manager: cortexlabs/manager:0.16.1
+image_downloader: cortexlabs/downloader:0.16.1
+image_request_monitor: cortexlabs/request-monitor:0.16.1
+image_cluster_autoscaler: cortexlabs/cluster-autoscaler:0.16.1
+image_metrics_server: cortexlabs/metrics-server:0.16.1
+image_nvidia: cortexlabs/nvidia:0.16.1
+image_fluentd: cortexlabs/fluentd:0.16.1
+image_statsd: cortexlabs/statsd:0.16.1
+image_istio_proxy: cortexlabs/istio-proxy:0.16.1
+image_istio_pilot: cortexlabs/istio-pilot:0.16.1
+image_istio_citadel: cortexlabs/istio-citadel:0.16.1
+image_istio_galley: cortexlabs/istio-galley:0.16.1
 ```
diff --git a/docs/deployments/system-packages.md b/docs/deployments/system-packages.md
index dcdcabf7cb..a3273801e9 100644
--- a/docs/deployments/system-packages.md
+++ b/docs/deployments/system-packages.md
@@ -48,11 +48,11 @@ mkdir my-api && cd my-api && touch Dockerfile
 
 Cortex's base Docker images are listed below. Depending on the Cortex Predictor and compute type specified in your API configuration, choose one of these images to use as the base for your Docker image:
 
-* Python Predictor (CPU): `cortexlabs/python-predictor-cpu-slim:0.16.0`
-* Python Predictor (GPU): `cortexlabs/python-predictor-gpu-slim:0.16.0`
-* TensorFlow Predictor (CPU and GPU): `cortexlabs/tensorflow-predictor-slim:0.16.0`
-* ONNX Predictor (CPU): `cortexlabs/onnx-predictor-cpu-slim:0.16.0`
-* ONNX Predictor (GPU): `cortexlabs/onnx-predictor-gpu-slim:0.16.0`
+* Python Predictor (CPU): `cortexlabs/python-predictor-cpu-slim:0.16.1`
+* Python Predictor (GPU): `cortexlabs/python-predictor-gpu-slim:0.16.1`
+* TensorFlow Predictor (CPU and GPU): `cortexlabs/tensorflow-predictor-slim:0.16.1`
+* ONNX Predictor (CPU): `cortexlabs/onnx-predictor-cpu-slim:0.16.1`
+* ONNX Predictor (GPU): `cortexlabs/onnx-predictor-gpu-slim:0.16.1`
 
 Note: the images listed above use the `-slim` suffix; Cortex's default API images are not `-slim`, since they have additional dependencies installed to cover common use cases. If you are building your own Docker image, starting with a `-slim` Predictor image will result in a smaller image size.
 
@@ -62,7 +62,7 @@ The sample Dockerfile below inherits from Cortex's Python CPU serving image and
 
 ```dockerfile
 # Dockerfile
-FROM cortexlabs/python-predictor-cpu-slim:0.16.0
+FROM cortexlabs/python-predictor-cpu-slim:0.16.1
 
 RUN apt-get update \
     && apt-get install -y tree \
diff --git a/get-cli.sh b/get-cli.sh
index ef8f6478a0..a5ef072fdc 100755
--- a/get-cli.sh
+++ b/get-cli.sh
@@ -16,7 +16,7 @@
 
 set -e
 
-CORTEX_VERSION_BRANCH_STABLE=0.16.0
+CORTEX_VERSION_BRANCH_STABLE=0.16.1
 
 case "$OSTYPE" in
 darwin*) parsed_os="darwin" ;;