This is the last installment of the 4-part series (Part 1) (Part 2) (Part 3), where we configure and deploy an artificial intelligence of things (AIoT) workload running at the edge on cloud native infrastructure. The application performs predictive maintenance of turbines. For the background and an explanation of the use case, refer to the previous parts of the tutorial.

Start by cloning the GitHub repository that contains the code, configuration, and Kubernetes deployment manifests. Feel free to explore the source code, configuration, and manifests. You can build the container images for each of the services and store them in a private registry.
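
Below is a minimal sketch of that clone-and-build workflow. The repository URL, service directory, image name, and registry are placeholders rather than the actual values used in the tutorial.

```bash
# Clone the tutorial repository (URL is a placeholder).
git clone https://github.com/<your-account>/<aiot-repo>.git
cd <aiot-repo>

# Build a container image for one of the microservices and push it
# to a private registry (paths and tags are hypothetical).
docker build -t registry.example.com/aiot/prediction-service:0.1 ./prediction-service
docker push registry.example.com/aiot/prediction-service:0.1
```

Repeat the build-and-push step for each service, or skip it entirely if you plan to use the prebuilt images referenced in the manifests.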

As we start deploying each microservice, I will discuss the design decisions and configuration choices. If you have a K3s cluster configured with Project Calico and Portworx, you can deploy the entire application without building the images.
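
If the cluster is already in place, applying the manifests from the repository is enough to bring the application up. Here is a minimal sketch, assuming the manifests live in a deploy/ directory and use a dedicated namespace; both the path and the namespace name are placeholders, not the tutorial's actual layout.

```bash
# Confirm the K3s cluster, Calico, and Portworx components are healthy.
kubectl get nodes
kubectl get pods -A

# Create a namespace and apply the deployment manifests
# (directory and namespace are hypothetical).
kubectl create namespace aiot
kubectl apply -n aiot -f deploy/

# Watch the microservices come up.
kubectl get pods -n aiot -w
```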

Read the entire article at The New Stack.

Janakiram MSV is an analyst, advisor, and architect. Follow him on Twitter, Facebook, and LinkedIn.