| Overview |
|---|
| 1 - Setup the work environment |
| 2 - Install Knative |
| 3 - Deploy a Knative Service |
| 4 - Create a Knative Revision |
| 5 - Traffic Management |
| 6 - Auto-Scaling |
| Debugging Tips |
This workshop is an adaptation of the Knative Hands-on Workshop I created for IBM.
The original IBM workshop uses pre-provisioned Kubernetes or OpenShift clusters on IBM Cloud, based on the IBM Cloud Kubernetes Service (IKS) or Red Hat OpenShift on IBM Cloud (ROKS).
This modified version of the workshop is based on Minikube running on your own workstation.
Knative is a framework running on top of Kubernetes that makes it easier to perform common tasks such as scaling up and down, routing traffic, and performing canary deployments. According to the Knative website, it is “abstracting away the complex details and enabling developers to focus on what matters. It solves the ‘boring but difficult’ parts of deploying and managing cloud native services so you don’t have to.”
How does deploying an application with Knative compare to deploying it on plain Kubernetes?
Knative is an additional layer installed on top of Kubernetes.
It has two distinct components, Knative Serving and Knative Eventing; originally there were three, but the third, Knative Build, has since become a project of its own: Tekton.
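To give a first impression of what that additional layer looks like in practice, here is a minimal sketch of a Knative Service manifest. It uses the public Knative helloworld-go sample image rather than anything from this workshop; the later sections deploy their own services.

```yaml
# Minimal Knative Service (sketch). From this single resource, Knative Serving
# creates the underlying Deployment, Revision, and route, and autoscales the
# service; on plain Kubernetes you would define and wire these objects yourself.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: helloworld                 # example name, not used elsewhere in this workshop
spec:
  template:
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-go   # public Knative sample image
          env:
            - name: TARGET
              value: "World"
```

Applied with `kubectl apply -f service.yaml`, this gives you a routable, revisioned service that Knative scales down to zero when it receives no traffic.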
This workshop focuses on Knative Serving and covers the topics listed in the Overview above. Work through them in sequence.
To complete this workshop, a basic understanding of Kubernetes and of deploying applications on Kubernetes is essential.
You can find detailed information and learn more about Knative on the Knative website: https://knative.dev
The sample code used in this workshop is located in the `code/cloud-native-starter` directory.