The Denver Post

Giving tech too much influence in schools

By Cathy O’Neil

Silicon Valley tech moguls are conducting an enormous experiment on the nation’s children. We should not be so trusting that they’ll get it right.

Alphabet unit Google has taken a big role in public education, offering low-cost laptops and free apps. Mark Zuckerberg of Facebook Inc. is investing heavily in educational technology, largely through the Chan Zuckerberg Initiative. Netflix head Reed Hastings has been tinkering with expensive and algorithmic ed-tech tools.

Encouraging as all this may be, the technologists might be getting ahead of themselves, both politically and ethically. Also, there’s not a lot of evidence that what they’re doing works.

Like it or not, education is political. People on opposite sides of the spectrum read very different science books, and can’t seem to agree on fundamental principles. It stands to reason that what we choose to teach our children will vary, depending on our beliefs. That’s to acknowledge, not defend, anti-scientific curricula.

Zuckerberg and Bill Gates learned this the hard way last year when the Ugandan government ordered the closure of 60 schools — part of a network providing highly scripted, low-cost education in Africa — amid allegations that they had been “teaching pornography” and “conveying the gospel of homosexuality” in sex-ed classes. Let’s face it, something similar could easily happen here if tech initiatives expand beyond the apolitical math subjects on which they have so far focused.

Beyond that, there are legitimate reasons to be worried about letting tech companies wield so much influence in the classroom. They tend to offer “free services” in return for access to data, a deal that raises some serious privacy concerns — particularly if you consider that it can involve tracking kids’ every click, keystroke and backspace from kindergarten on.

My oldest son is doing extremely well as a junior in school right now, but he was a late bloomer who didn’t learn to read until third grade. Should that be a part of his permanent record, data that future algorithms could potentially use to assess his suitability for credit or a job? Or what about a kid whose “persistence score” on dynamic, standardized tests waned in 10th grade? Should colleges have access to that information in making their admissions decisions?

These are not far-fetched scenarios. Consider the fate of nonprofit education venture InBloom, which sought to collect and integrate student records in a way that would allow lessons to be customized. The venture shut down a few years ago amid concerns about how sensitive information — including tags identifying students as “tardy” or “autistic” — would be protected from theft and shared with outside vendors.

Google and others are collecting similar data and using it internally to improve their software. Only after some prompting did Google agree to comply with the privacy law known as FERPA, which had been weakened for the purpose of third-party sharing. It’s not clear how the data will ultimately be used, how long the current crop of students will be tracked, or to what extent their futures will depend on their current performance.

Nobody really knows to what educational benefit we are bearing such uncertainties. What kinds of kids will the technological solutions reward? Will they be aimed toward producing future Facebook engineers? How will they serve children in poverty, with disabilities or with different learning styles? As far as I know, there’s no standard audit that would allow us to answer such questions. We do know, though, that the companies and foundations working on educational technology have a lot of control over the definition of success. That’s already too much power.

In short, blindly trusting the tech guys is no way to improve our educational system. Although they undoubtedly mean well, we should demand more accountability.
