Docker & Kubernetes

Containers help ensure that your analyses and models are reproducible across different environments. While containers are useful for keeping dependencies clean on a single machine, their main benefit is that they enable data scientists to write model endpoints without worrying about how the container will be hosted. This separation of concerns makes it easier to partner with engineering teams to deploy models to production, or, using the approaches shown in this chapter, data and applied science teams can own the deployment of models to production themselves.
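To make this separation of concerns concrete, here is a minimal sketch of a model endpoint written with only the Python standard library. The "model" is a stand-in fixed-weight logistic scorer rather than a real trained artifact, and the port and route are illustrative assumptions; the point is that this code knows nothing about where the container will run.

```python
import json
import math
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in "model": a fixed-weight logistic scorer. A real service
# would load a trained model artifact at startup instead.
WEIGHTS = {"feature_a": 0.8, "feature_b": -0.4}
BIAS = 0.1

def predict(features):
    """Score one observation with a logistic function."""
    z = BIAS + sum(WEIGHTS.get(name, 0.0) * value
                   for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

class ModelHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Parse a JSON body of feature name -> value pairs.
        length = int(self.headers.get("Content-Length", 0))
        features = json.loads(self.rfile.read(length))
        body = json.dumps({"score": predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve(port=8080):
    # The container only needs to expose this port; how the container
    # is hosted (ECS, GKE, a laptop) is a separate concern.
    HTTPServer(("0.0.0.0", port), ModelHandler).serve_forever()
```

Calling `serve()` starts the endpoint; the same image can then be handed to whichever hosting platform the team uses, without code changes.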

The best approach for serving models depends on your deployment environment and expected workload. Typically, you are constrained to a specific cloud platform when working at a company, because your model service may need to interface with other components in the cloud, such as a database or cloud storage. Within AWS there are multiple options for hosting containers, while GCP has consolidated on GKE as its primary solution.

The main question to ask is whether it is more cost-effective to serve your model using serverless function technologies or elastic container technologies. The correct answer will depend on the volume of traffic you need to handle, the amount of latency that is tolerable for end users, and the complexity of the models that you need to host. Containerized solutions are great for serving complex models and for making sure that you can meet latency requirements, but they may require more DevOps overhead than serverless functions.
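One way to frame this cost question is with a back-of-envelope calculation. The sketch below compares an assumed per-request serverless price against an assumed flat monthly container cost; all of the prices and workload numbers are illustrative assumptions, not current cloud quotes, so you would substitute your provider's actual rates.

```python
# Back-of-envelope comparison of serverless vs. always-on container
# serving cost. All prices here are illustrative assumptions.

REQUESTS_PER_MONTH = 5_000_000
SECONDS_PER_REQUEST = 0.2           # assumed model inference time
MEMORY_GB = 1.0                     # assumed memory per invocation

# Assumed serverless pricing: a per-request fee plus a GB-second fee.
PRICE_PER_MILLION_REQUESTS = 0.20
PRICE_PER_GB_SECOND = 0.0000167

# Assumed cost of one always-on container instance per month.
CONTAINER_MONTHLY_COST = 35.0

def serverless_cost(requests, seconds_per_request, memory_gb):
    """Estimated monthly serverless bill for a steady workload."""
    request_fee = requests / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    compute_fee = requests * seconds_per_request * memory_gb * PRICE_PER_GB_SECOND
    return request_fee + compute_fee

monthly = serverless_cost(REQUESTS_PER_MONTH, SECONDS_PER_REQUEST, MEMORY_GB)
print(f"serverless: ${monthly:.2f}/month vs container: ${CONTAINER_MONTHLY_COST:.2f}/month")
```

Under these assumptions the serverless bill stays in the same range as a single small container, but the comparison flips quickly as traffic, memory, or inference time grows, which is why the traffic volume and latency tolerance mentioned above drive the decision.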


Courtesy of Medium®

Copyright © 2018 Bigdatamatica Solutions Private Limited