Machine learning: the future is now

Mac|Life – FEATURE

>>> Another new framework introduced at WWDC was Core ML (Machine Learning). It provides developers with a means to add intelligence to their software – perhaps to analyze how you use your device, as iOS's Siri App Suggestions widget does, or even to analyze images and sound.
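At its simplest, a developer bundles a trained model with the app and asks Core ML for predictions on-device. The sketch below assumes a hypothetical compiled model named "FlowerClassifier.mlmodelc"; in practice Xcode also generates a typed wrapper class for each .mlmodel file, which is the more common way to call it.

```swift
import CoreML

// A minimal sketch of running a Core ML model (iOS 11+).
// "FlowerClassifier" is a hypothetical model bundled with the app.
func classify(features: MLFeatureProvider) throws -> MLFeatureProvider {
    guard let url = Bundle.main.url(forResource: "FlowerClassifier",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let model = try MLModel(contentsOf: url)
    // prediction(from:) runs entirely on-device; no network call is made.
    return try model.prediction(from: features)
}
```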

One of the best-known examples of machine learning today is computer vision: the ability to detect and track faces, text, or a barcode, and even to identify objects. This is already used in Photos' search engine to enable you to search for specific objects and scenes without having to tag them.
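Developers get at this capability through the Vision framework. A minimal sketch of the face-detection case, assuming `image` is supplied elsewhere in the app:

```swift
import UIKit
import Vision

// A sketch of face detection with the Vision framework (iOS 11+).
func detectFaces(in image: UIImage,
                 completion: @escaping ([VNFaceObservation]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }
    // The request's handler is called once detection finishes.
    let request = VNDetectFaceRectanglesRequest { request, _ in
        completion(request.results as? [VNFaceObservation] ?? [])
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    // perform(_:) is synchronous, so dispatch off the main thread in real code.
    try? handler.perform([request])
}
```

Each `VNFaceObservation` carries a normalized bounding box; swapping in `VNDetectBarcodesRequest` or `VNRecognizeTextRequest`-style requests follows the same pattern.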

Apple used the classic example of translating a phrase from one language to another to demonstrate natural language processing (NLP). iOS provides APIs to help identify what language you're speaking, to discern entities among your words, and to sort through words to work out which should be grouped as a single item for analysis. Perhaps soon your iPhone will be your voice when you visit a place that uses another language.
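In the iOS 11 SDK these NLP capabilities surface through `NSLinguisticTagger`. A sketch of the three tasks described above – language identification, tokenization, and named-entity recognition – using an illustrative sentence:

```swift
import Foundation

let text = "Tim Cook introduced Core ML at WWDC in San Jose."

// 1. Identify the dominant language (returns a code such as "en").
let language = NSLinguisticTagger.dominantLanguage(for: text)

// 2. Tokenize and tag named entities in one pass. The .joinNames option
//    groups multi-word names like "Tim Cook" into a single item.
let tagger = NSLinguisticTagger(tagSchemes: [.nameType], options: 0)
tagger.string = text
let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitWhitespace,
                                           .omitPunctuation,
                                           .joinNames]
tagger.enumerateTags(in: range, unit: .word,
                     scheme: .nameType, options: options) { tag, tokenRange, _ in
    let entityTags: [NSLinguisticTag] = [.personalName, .placeName,
                                         .organizationName]
    if let tag = tag, entityTags.contains(tag) {
        let entity = (text as NSString).substring(with: tokenRange)
        print("\(entity): \(tag.rawValue)")
    }
}
```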
