Toronto Star

AI boom raises questions of exaggerated savvy

Engineer.ai raised $29.5 million last year from investors, including a subsidiary of SoftBank.

- NEWLEY PURNELL AND PARMY OLSON

Startup Engineer.ai says it uses artificial-intelligence technology to largely automate the development of mobile apps, but several current and former employees say the company exaggerates its AI capabilities to attract customers and investors.

The competing claims reflect a growing challenge in the tech world of assessing a company’s proficiency in artificial intelligence, which refers to technologies that can allow computers to learn or perform tasks typically requiring human decision makers—in many cases helping companies save money or better target consumers.

Because AI technology is complex and loosely defined, nonexperts can find it hard to discern when it is being deployed. Still, money is flowing into the sector, and many startups can say they use AI as a way to lure investments or corporate clients even when such claims are difficult to vet.

Venture firms nearly doubled their funding of AI startups to $31 billion last year from 2017, according to an analysis by data firm PitchBook, which also found the number of funded startups that have “.ai” in their domain names has grown more than twofold in recent years. The domain name extension, which is available for a fee, is associated with Anguilla but popular among tech startups world-wide.

Last month, Japanese tech conglomerate SoftBank Group Corp. unveiled an AI-focused investment fund with $108 billion in expected capital.

London- and Los Angeles-based Engineer.ai raised $29.5 million last year from investors including Deepcore Inc., a wholly owned subsidiary of SoftBank. Other backers include Zurich-based venture-capital firm Lakestar—an early investor in Facebook Inc. and Airbnb Inc.—and Singapore-based Jungle Ventures.

Engineer.ai was spun out of an earlier company in 2016, the company has said. When announcing its funding last year, it said it had notched $24 million in revenue while self-funding its operations.

Engineer.ai says its “human-assisted AI” allows anyone to create a mobile app by clicking through a menu on its website. Users can then choose existing apps similar to their idea, such as Uber’s or Facebook’s. Then Engineer.ai creates the app largely automatically, it says, making the process cheaper and quicker than conventional app development.

“We’ve built software and an AI called Natasha that allows anyone to build custom software like ordering pizza,” Engineer.ai founder Sachin Dev Duggal said in an onstage interview in India last year. Since much of the code underpinning popular apps is similar, the company’s “human-assisted AI” can help assemble new ones automatically, he said.

Roughly 82% of an app the company had recently developed “was built autonomously, in the first hour” by Engineer.ai’s technology, Mr. Duggal said at the time.

Documents reviewed by The Wall Street Journal and several people familiar with the company’s operations, including current and former staff, suggest Engineer.ai doesn’t use AI to assemble code for apps as it claims. They indicated that the company relies on human engineers in India and elsewhere to do most of that work, and that its AI claims are inflated even in light of the fake-it-’til-you-make-it mentality common among tech startups.

Engineer.ai only started to build the technology needed to automate app-building in the last two months, a person familiar with the company’s operations said, adding that the company was more than a year away from being able to use any AI for its core service.

A spokesman for Engineer.ai and Mr. Duggal, who describes himself as the company’s “Chief Wizard,” said he is “pretty clear in anything he does” to stress that the company employs technology accurately characterized as human-assisted AI.

A SoftBank spokeswoman declined to comment.

In a previously unreported lawsuit, Engineer.ai’s former chief business officer, Robert Holdheim, cast doubt on the company’s technical prowess. According to his wrongful-termination complaint, filed in February in Los Angeles Superior Court, Mr. Duggal told Mr. Holdheim: “Every tech startup exaggerates to get funding—it’s the money that allows us to develop the technology.”

Mr. Holdheim added in the complaint that Mr. Duggal “was telling investors that Engineer.ai was 80% done with developing a product that, in truth, he had barely even begun to develop.”

Mr. Holdheim declined to comment on the suit, which claims he was dismissed after confronting Mr. Duggal about potential misuse of investor funds, among other alleged issues.

Engineer.ai disputed the allegations in a subsequent filing, and a spokesman said the company is defending the matter vigorously but couldn’t comment on pending litigation.

Asked for an example of how the company uses AI, the spokesman said Engineer.ai calculates prices and timelines for customers entirely “autonomously,” with part of that process using natural language processing, an AI technology designed to recognize and understand text or speech. The company also uses a decision tree—a graph or model based on choices—to allocate tasks to developers, the spokesman said.

Several current and former employees said that some pricing and timeline calculations are generated by conventional software—not AI—and most of the work overall is performed manually by staff. These people said the company lacks natural language processing technology, and that decision trees used within the company shouldn’t be considered AI.

Calling a decision tree AI generally “is a stretch,” said Luka Crnkovic-Friis, chief executive of Peltarion, a Swedish company that sells software for building deep-learning AI systems. “If you are telling customers that you are using AI, they will likely not expect 1950s technology. Decision trees are really old and simple technology.”
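To see why experts draw that line, consider a minimal, hypothetical sketch (the function name, task categories and rules below are illustrative assumptions, not a description of Engineer.ai’s actual system): a hand-written decision tree for allocating work is ordinary conditional logic, with its rules fixed by programmers rather than learned from data.

```python
# Hypothetical illustration only: a hand-written "decision tree" for routing
# app-building tasks is just nested if/else rules written by programmers,
# the kind of conventional logic Mr. Crnkovic-Friis distinguishes from
# systems that learn their behavior from data.

def assign_task(task_type: str, estimated_hours: float, needs_design: bool) -> str:
    """Route a task to a team using fixed, hand-authored rules."""
    if needs_design:
        return "design team"
    if task_type == "payment_integration":
        return "senior backend developer"
    if estimated_hours <= 4:
        return "junior developer pool"
    return "standard developer pool"

# Example: a small interface tweak with no design work goes to junior developers.
print(assign_task("ui_tweak", estimated_hours=3, needs_design=False))
```

Nothing in such rules is learned; changing the routing means a programmer editing the code, which is why critics say calling it AI stretches the term.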

Engineer.ai pointed to a statement posted on its website after the Journal inquired about its technology, which said “about 60% on average” of the reusable software it uses for building apps is machine-produced, with the rest generated by humans. The statement didn’t explain how that share of its products was machine-produced. A spokesman said those details were proprietary and declined a request to elaborate.

Engineer.ai lacks a bench of senior staff with significant machine-learning or AI expertise, according to the people and documents reviewed by the Journal.

When first asked to identify a senior employee with AI expertise, the company pointed to one.

In the subsequent statement on its website, Engineer.ai said that AI experts were difficult to find and hire, and that some recent hires had studied machine learning and AI. In a separate statement, the company detailed three team members’ experience in data science and other disciplines but didn’t identify them by name.

A spokeswoman for Deepcore said it has complete confidence in Mr. Duggal’s vision and team. A spokesman for Jungle Ventures said it is a proud investor in Engineer.ai and its technology, adding that “the AI landscape is a varied spectrum.”

A Lakestar spokeswoman said it also has confidence in Engineer.ai and its team, adding that “growth in the AI space does not happen overnight.”

It said Engineer.ai had been very careful in presenting its technology to Lakestar and other investors.

In Europe, startups with AI in their descriptions have raised 15% to 50% more funding than other software startups, according to an analysis of 2,830 technology startups by London-based investment fund MMC Ventures. It said that about 40% of the companies classified as AI startups showed no evidence of using AI in their products, based on an examination of the companies’ product descriptions.

“I think the percentage is even bigger,” said Vasile Foca, a managing partner at London-based venture firm Talis Capital, which has backed AI startups. “You get three to four times [more] interest from venture capitalists and investors if you claim that you have AI, or your solution is AI-driven,” he said.

Mr. Crnkovic-Friis, who isn’t affiliated with Engineer.ai, said many startups typically discover that building AI is harder than expected. Among other issues, it can take years to gather data on which to train the machine-learning algorithms underpinning such technology.

To train new algorithms, an app maker like Engineer.ai would need to collect thousands of requests from customers and combine them with code that human engineers build in response, Mr. Crnkovic-Friis said. Several people familiar with Engineer.ai’s operations say it hasn’t collected the necessary data.
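As a rough illustration of what that data-gathering involves (the record structure below is an assumption made for clarity, not a description of any company’s actual dataset), the raw material would be a large corpus pairing each customer request with the code human engineers wrote to fulfill it:

```python
# Hypothetical sketch of the kind of training corpus Mr. Crnkovic-Friis
# describes: each record pairs a customer's plain-language request with the
# code human engineers ultimately delivered. A model could only begin to learn
# the mapping from request to code after thousands of such pairs accumulate.

from dataclasses import dataclass

@dataclass
class TrainingExample:
    customer_request: str  # what the customer asked for, in plain language
    delivered_code: str    # the code human engineers wrote in response

corpus: list[TrainingExample] = [
    TrainingExample(
        customer_request="An app like Uber, but for dog walkers",
        delivered_code="# ...thousands of lines written by human developers...",
    ),
    # ...thousands more pairs, gathered over years of client work...
]

print(f"Examples collected so far: {len(corpus)}")
```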

The Engineer.ai spokesman said the company had collected over 600 million records from interactions with clients, among other data, to help build its AI.

Some companies use cheap human labor as a temporary stopgap before rolling out real machine-learning algorithms, according to Mr. Crnkovic-Friis. He said that one startup he consulted—which he declined to name—told customers it was using AI software to read receipts when humans were actually doing that work.

Photo credit: Koji Sasahara/The Associated Press
