Machine learning: the future is now
Another new framework introduced at WWDC was Core ML (Machine Learning). It provides developers with a means to add intelligence to their software, perhaps to analyze how you use your device – an example being iOS’s Siri App Suggestions widget – or even to analyze images and sound.
One of the best-known examples of machine learning today is computer vision: the ability to detect and track faces, text, or barcodes, and even to identify objects. This is already used in Photos’ search engine to enable you to search for specific objects and scenes without having to tag them.
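On iOS 11 this sort of detection is exposed to developers through the Vision framework, which works hand in hand with Core ML. As a rough illustration of the face-detection capability described above, here is a minimal sketch; the function name `detectFaces` and the use of a `UIImage` input are our own choices, not anything prescribed by Apple.

```swift
import UIKit
import Vision

// A minimal sketch: ask the Vision framework to find face
// rectangles in an image and print where each one was found.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalized coordinates (0...1),
            // relative to the image's lower-left corner.
            print("Found a face at \(face.boundingBox)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Vision offers sibling request types for the other tasks mentioned here, such as text and barcode detection, all following the same request/handler pattern.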
Apple used the classic example of translating a phrase from one language to another to demonstrate natural language processing (NLP). iOS provides APIs to help identify what language you’re speaking, to discern entities among your words, and to sort through words to work out which should be grouped as a single item for analysis. Perhaps soon your iPhone will be your voice when you visit a place that uses another language.
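The language identification and word grouping described above are available through Foundation's NSLinguisticTagger, which gained new per-word tagging units in iOS 11. The sketch below is our own illustration, with a sample French string chosen for the example.

```swift
import Foundation

// A minimal sketch of NSLinguisticTagger: identify the language
// of a string, then break it into word tokens for analysis.
let text = "Bonjour tout le monde"
let tagger = NSLinguisticTagger(tagSchemes: [.language, .tokenType], options: 0)
tagger.string = text

// Identify the dominant language of the text.
if let language = tagger.dominantLanguage {
    print("Detected language: \(language)")   // e.g. "fr" for French
}

// Enumerate word tokens, skipping whitespace and punctuation.
let range = NSRange(location: 0, length: text.utf16.count)
tagger.enumerateTags(in: range, unit: .word, scheme: .tokenType,
                     options: [.omitWhitespace, .omitPunctuation]) { _, tokenRange, _ in
    let word = (text as NSString).substring(with: tokenRange)
    print(word)
}
```

The same tagger can also be asked for named entities or parts of speech by swapping in a different tag scheme, which is how an app might discern the entities among your words.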