Deploying KServe on OKE
This post demonstrated how to deploy an XGBoost model with KServe on Oracle Kubernetes Engine (OKE). After uploading the model to OCI Object Storage, we served it with KServe and exposed it through an Istio gateway. Rather than deploying the frontend in-cluster, we built a Streamlit app hosted on Streamlit Community Cloud that sends requests to the public inference endpoint. This end-to-end setup separates the model-serving and user-interface layers, yielding a scalable, cloud-native ML deployment pipeline on OCI that is flexible and easy to maintain.
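The Streamlit-to-endpoint hop described above can be sketched with KServe's v1 REST protocol (`{"instances": [...]}` in, `{"predictions": [...]}` out). The gateway host, model name, and feature layout below are placeholder assumptions, not values from the post:

```python
import json
import urllib.request

# Hypothetical public endpoint exposed via the Istio gateway; replace the
# host and model name with your own deployment's values.
ENDPOINT = "http://<gateway-ip>/v1/models/xgb-model:predict"


def build_payload(rows):
    """Wrap feature rows in the KServe v1 prediction request format."""
    return {"instances": rows}


def predict(rows, endpoint=ENDPOINT):
    """POST a prediction request, as the hosted Streamlit frontend would."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(build_payload(rows)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]
```

Because the frontend lives outside the cluster, it needs nothing beyond this public endpoint, which is what lets the serving and UI layers evolve independently.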