Mint Bangalore

Language technologies are still shaping global culture

SREERAMAN THIAGARAJAN is chief executive officer at Agrahyah Technologies and adjunct professor for digital transformation at IIM Trichy.

In February 2024, stock prices on three continents hit new highs in a rally that should be of interest to linguists, philologists and others who study languages. Barely 15-18 months earlier, Web3, comprising blockchain and crypto technology, was touted as the future and had the attention of venture capitalists, until Generative AI crashed the party. This wave saw Nvidia’s shares soar, as it is a key supplier of the chips needed for advanced AI processing. But what does linguistics have to do with all this?

From the dawn of civilization, every major leap in language processing, whether printing, transmitting, recording or retrieving information, has significantly altered our world. In modern history, it starts with Gutenberg’s printing press, which accelerated the dissemination of information. In the 19th century, we deciphered the Rosetta Stone, written partly in Egyptian hieroglyphics, to understand an entire civilization. By the century’s end, we had functional radio and telephone technology, which laid the foundation of the communications we take for granted today. From pigeon mail, this was quite a leap. Think also of Edison’s phonograph, which recorded and replayed Mary Had a Little Lamb, marking another advancement. Similarly, Alan Turing’s decryption of Germany’s Enigma code during World War II was a linguistic feat as much as a mathematical one.

The birth of modern computer science was greatly influenced by linguistics and philology. In the 1950s, Noam Chomsky coined the phrase “Colourless green ideas sleep furiously” to illustrate the difference between syntax and semantics, challenging the then-dominant statistical approaches to grammar. Notable linguists such as Leonard Bloomfield and Charles Hockett have variously agreed with, refuted and built upon approaches to understanding the syntactic structure of languages, a debate that paved the way for the Cognitive Revolution, which integrated psychology, linguistics, computer science, anthropology, neuroscience and philosophy.

At a Dartmouth conference in 1956, which sought to explore “how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves,” John McCarthy introduced the term ‘artificial intelligence.’ Fast forward about 65 years, past the manned moon landing, the rise of the internet and robotic rovers on Mars, and we saw ChatGPT emerge in November 2022.

The task of getting AI tools to write an essay about an imaginary island or create an image of a puppy juxtaposed with a teddy bear has taken longer than putting a man on the moon. The challenge has always rested with making computers understand human languages, context, hidden structures, surface structures and nuances.

With advances in Natural Language Processing (NLP), we got Apple’s Siri as far back as 2011, and then Amazon’s Alexa in 2014. They were reasonably capable of understanding the intent behind spoken language, but they could not imagine, articulate or reason (or at least simulate reasoning) until now. We have used NLP in various forms for years: think of voice search, autocorrect, grammar checks, predictive text and the suggestions websites offer. From helping us become more productive to making us squirm over autocorrect errors, language technologies have already impacted us. They have mostly favoured English, amplifying its dominance and affecting our careers as well as national economies.

In 1973, a 16-year-old boy from Kyushu in Japan took a flight to Tokyo to meet his idol, Den Fujita, then president of McDonald’s Japan, who advised him to learn English and computer science. So he moved to California and, with some help from his professors, built a first-of-its-kind electronic translator, which he sold to Sharp for $1.7 million. This became the founding capital of what would come to be known as SoftBank. That kid was Masayoshi Son.

By 1999, Son would invest $20 million in the e-commerce venture of a self-taught English teacher and translator turned dotcom entrepreneur from Hangzhou in China. The value of this investment would balloon to about $108 billion by 2018. That translator-turned-entrepreneur was Jack Ma, founder of Alibaba. Both Ma and Son have publicly attributed their success to learning the lingua franca of the business world. English dominates the digital realm, dictating who participates in globalization and who remains on the periphery.

Consider the business process outsourcing sector. India had a head-start as early as the late 1980s, but we conceded it to the Philippines by 2010. Industry experts reckon that the cost of running a call centre is similar in both nations, but according to the English Proficiency Index (EPI), India ranks No. 60, while the Philippines is No. 20.

In 2017, I saw two major instances of how English enables access to the globalized world. The first was in Kazakhstan (EPI rank No. 104), which held a hackathon to find ways to enhance English literacy and adoption among its people and thereby enable them to participate in globalization. The second was in Indonesia (EPI rank No. 79), where I learnt that software engineers have poor job prospects beyond its four local unicorns, attributable to their weak English skills.

Whether by design or default, English’s ascendancy in technology continues to reshape the world and influence cultures globally. This is a testament to the enduring impact of language technologies.

