Wanted: ‘Perfect babysitter.’ Must pass AI checkup
When Jessie Battaglia started looking for a new babysitter for her 1-year-old son, she wanted more information than she could get from a criminal-background check, parent comments and a face-to-face interview.
So she turned to Predictim, an online service that uses “advanced artificial intelligence” to assess a babysitter’s personality, and aimed its scanners at one candidate’s thousands of Facebook, Twitter and Instagram posts.
The system offered an automated “risk rating” of a 24-year-old candidate, saying she was at a “very low risk” of being a drug abuser. But it gave a slightly higher risk assessment – a 2 out of 5 – for bullying, harassment, being “disrespectful” and having a “bad attitude.”
The system didn’t explain why it had made that decision. But Battaglia, who had believed the sitter was trustworthy, suddenly felt pangs of doubt.
“Social media shows a person’s character,” said Battaglia, 29, who lives near Los Angeles. “So why did she come in at a 2 and not a 1?”
Predictim is offering parents the same playbook that dozens of other tech firms are selling to employers around the world: artificial-intelligence systems that analyze a person’s speech, facial expressions and online history with promises of revealing the hidden aspects of their private lives.
The technology is reshaping how some companies approach recruiting, hiring and reviewing workers, offering employers an unrivaled look at job candidates through a new wave of invasive psychological assessment and surveillance.
The tech firm Fama says it uses AI to police workers’ social media for “toxic behavior” and alert their bosses. And the recruitment-technology firm Hirevue, which works with companies such as Geico, Hilton and Unilever, offers a system that automatically analyzes applicants’ tone, word choice and facial movements during video interviews to predict their skill and demeanor on the job. (Candidates are encouraged to smile for best results.)
But critics say Predictim and similar systems present their own dangers by making automated and possibly life-altering decisions virtually unchecked.
The systems depend on black-box algorithms that give little detail about how they reduced the complexities of a person’s inner life into a calculation of virtue or harm. And even as Predictim’s technology influences parents’ thinking, it remains entirely unproven, largely unexplained and vulnerable to quiet biases over how an appropriate babysitter should share, look and speak.
There’s this “mad rush to seize the power of AI to make all kinds of decisions without ensuring it’s accountable to human beings,” said Jeff Chester, the executive director of the Center for Digital Democracy, a tech advocacy group. “It’s like people have drunk the digital Kool-Aid and think this is an appropriate way to govern our lives.”
Predictim’s scans analyze the entire history of a babysitter’s social media, which, for many of the youngest sitters, can cover most of their lives. And the sitters are told they will be at a disadvantage for these competitive jobs if they refuse.
Predictim’s chief and co-founder, Sal Parsa, said the company, launched last month as part of the University of California at Berkeley’s Skydeck tech incubator, takes ethical questions about its use of the technology seriously. Parents, he said, should see the ratings as a companion that “may or may not reflect the sitter’s actual attributes.”
But the danger of hiring a problematic or violent babysitter, he added, makes the AI a necessary tool for any parent hoping to keep his or her child safe.