Google develops TensorFlow Serving library

OpenSource For You / Fossbytes

Google has released a stable version of TensorFlow Serving. The new open source library is designed to serve machine-learned models in a production environment by offering out-of-the-box integration with TensorFlow models.

First released in beta this February, TensorFlow Serving is aimed at facilitating the deployment of algorithms and experiments while maintaining the same server architecture and APIs. The library can help developers push multiple versions of machine learning models and even roll them back.

Developers can use TensorFlow Serving to serve other model types alongside TensorFlow models. On non-Linux systems, the server binary must be installed via a Docker container. Notably, the complete TensorFlow package comes bundled with a pre-built binary of TensorFlow Serving.
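
The Docker-based workflow mentioned above can be sketched roughly as follows. Note that the exact image tags, flags, and ports have varied across TensorFlow Serving releases, and the model path and name here are placeholders, not values from the article:

```shell
# Hedged sketch of running the TensorFlow Serving model server in Docker.
# /path/to/my_model and my_model are placeholder values.
docker pull tensorflow/serving
docker run -p 8500:8500 \
  --mount type=bind,source=/path/to/my_model,target=/models/my_model \
  -e MODEL_NAME=my_model \
  tensorflow/serving
```

Once the container is running, clients can reach the model server over gRPC on the published port.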

TensorFlow Serving 1.0 comes with servables, loaders, sources and managers. Servables are the underlying objects that serve as the central abstraction for computation in TensorFlow Serving. Loaders standardize how an individual servable version is loaded and unloaded. Sources are plugin modules that discover and provide servables, while managers handle the full life cycle of servables, including which versions are served.
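
To make the four concepts above concrete, here is a toy sketch in plain Python. This is purely illustrative: TensorFlow Serving implements these abstractions in C++, and every class and method name below is an assumption for demonstration, not the library's actual API:

```python
# Toy illustration (NOT the real TensorFlow Serving API) of the four concepts:
# a Servable, a Loader for one version, a Source that emits loaders,
# and a Manager that controls which versions are loaded and served.

class Servable:
    """The underlying object clients compute with (e.g. a model version)."""
    def __init__(self, version, predict_fn):
        self.version = version
        self.predict_fn = predict_fn

class Loader:
    """Standardizes loading and unloading a single servable version."""
    def __init__(self, version, predict_fn):
        self.version = version
        self._predict_fn = predict_fn
        self.servable = None
    def load(self):
        self.servable = Servable(self.version, self._predict_fn)
    def unload(self):
        self.servable = None

class Source:
    """Discovers new servable versions and emits a loader for each."""
    def emit(self, version, predict_fn):
        return Loader(version, predict_fn)

class Manager:
    """Handles the servable life cycle: load, serve, roll back, unload."""
    def __init__(self):
        self.versions = {}
    def load(self, loader):
        loader.load()
        self.versions[loader.version] = loader
    def serve(self, inputs):
        # Serve with the highest loaded version.
        latest = self.versions[max(self.versions)]
        return latest.servable.predict_fn(inputs)
    def rollback(self, version):
        # Unload every version except the one rolled back to.
        for v in list(self.versions):
            if v != version:
                self.versions[v].unload()
                del self.versions[v]

source = Source()
manager = Manager()
manager.load(source.emit(1, lambda x: x * 2))
manager.load(source.emit(2, lambda x: x * 3))
print(manager.serve(10))   # prints 30 (version 2 is newest)
manager.rollback(1)
print(manager.serve(10))   # prints 20 (rolled back to version 1)
```

The rollback call mirrors the version-rollback capability the article describes: the manager simply stops serving the newer version and routes requests to the older one.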

The major benefit of TensorFlow Serving is its set of C++ libraries, which offer standard support for loading and serving TensorFlow models. The generic core platform is not tied to TensorFlow. However, you can also use the library as a hosted service via the Google Cloud ML platform.
