OpenSource For You

Google develops TensorFlow Serving library


Google has released a stable version of TensorFlow Serving. The new open source library is designed to serve machine-learned models in a production environment, offering out-of-the-box integration with TensorFlow models.

First released in beta this February, TensorFlow Serving aims to ease the deployment of new algorithms and experiments while keeping the same server architecture and APIs. The library lets developers push multiple versions of a machine learning model into production and even roll them back.
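To illustrate the versioning idea: TensorFlow Serving watches a model base directory whose numeric subdirectories are treated as model versions, and by default it serves the highest version present, so a rollback amounts to removing (or re-pointing) a version directory. The helper below is a minimal sketch of that version-selection rule only; `latest_version` is a hypothetical name, not part of the TensorFlow Serving API.

```python
import os
import tempfile

def latest_version(base_dir):
    """Mimic the default policy: serve the highest numeric version dir."""
    versions = [int(d) for d in os.listdir(base_dir) if d.isdigit()]
    return max(versions) if versions else None

with tempfile.TemporaryDirectory() as base:
    # Push three versions of a model by creating numbered directories.
    for v in ("1", "2", "3"):
        os.mkdir(os.path.join(base, v))
    print(latest_version(base))        # the server would pick version 3
    # "Roll back" by removing the newest version directory.
    os.rmdir(os.path.join(base, "3"))
    print(latest_version(base))        # the server would now pick version 2
```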

While TensorFlow Serving supports TensorFlow models out of the box, developers can also extend it to serve other model types. On non-Linux systems, you need a Docker container to install the server binary. Notably, the complete TensorFlow package comes bundled with a pre-built binary of TensorFlow Serving.

TensorFlow Serving 1.0 comes with servables, loaders, sources and managers. Servables are the central abstraction: the underlying objects that clients use to perform computation. Loaders standardise the APIs for loading and unloading a single servable version. Sources are plugin modules that find and provide servables, while managers handle the complete life cycle of servables, from loading through serving to unloading.
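The relationship between these four concepts can be sketched in a few lines. The toy classes below are hypothetical and only mirror the roles described above (the real implementation is a set of C++ APIs): a Loader knows how to load and unload one servable version, a source would hand loaders to a Manager, and the manager serves whichever version a client asks for, defaulting to the newest.

```python
class Loader:
    """Toy loader: owns loading/unloading of one servable version."""
    def __init__(self, version, data):
        self.version = version
        self._data = data
        self.servable = None          # populated only while loaded

    def load(self):
        self.servable = dict(self._data)

    def unload(self):
        self.servable = None

class Manager:
    """Toy manager: tracks loaders and hands out servables."""
    def __init__(self):
        self._loaders = {}

    def accept(self, loader):
        # In TF Serving, a source emits loaders; the manager decides
        # when to load them. Here we load immediately for simplicity.
        loader.load()
        self._loaders[loader.version] = loader

    def servable(self, version=None):
        version = version if version is not None else max(self._loaders)
        return self._loaders[version].servable

manager = Manager()
manager.accept(Loader(1, {"weights": [0.1, 0.2]}))
manager.accept(Loader(2, {"weights": [0.3, 0.4]}))
print(manager.servable()["weights"])   # newest version's data: [0.3, 0.4]
```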

The major benefit of TensorFlow Serving is its set of C++ libraries, which offer standard support for learning and serving TensorFlow models. The generic core platform is not tied to TensorFlow itself. You can also use the library as a hosted service through the Google Cloud ML platform.
