New York Post

Scarier Than Gemini: ‘Bias-Free’ AI Quotas

- WILLIAM A. JACOBSON & KEMBERLEE KAYE

William A. Jacobson is a clinical professor of law at Cornell and founder of the Equal Protection Project, where Kemberlee Kaye is operations and editorial director.

EVERYONE is laughing at the Google Gemini AI rollout. But it’s no joke. The problem is more nefarious than historically inaccurate generated images. The manipulation of AI is just one aspect of broader “discrimination by algorithm” being built into corporate America, and it could cost you job opportunities and more.

When Gemini was asked to produce pictures of white people, it refused, saying it couldn’t fulfill the request because it “reinforces harmful stereotypes and generalizations about people based on their race.”

But it had no trouble generating pictures of a female pope, non-white Vikings and a black George Washington.

Microsoft’s AI image generator has its own problems, producing sexually explicit and violent images.

Clearly AI imaging has gone off the rails.

While Google’s CEO admitted Gemini’s results were “biased” and “unacceptable,” that’s not a bug but a feature, much as “anti-racism” theory gave rise to openly racist diversity, equity and inclusion practices.

As one of us (William) recently explained to The Post: “In the name of anti-bias, actual bias is being built into the systems. This is a concern not just for search results, but real-world applications where ‘bias free’ algorithm testing actually is building bias into the system by targeting end results that amount to quotas.”

Our Equal Protection Project (EqualProtect.org) sounded the alarm almost a year ago, when we exposed the use of algorithms to manipulate pools of job applicants in LinkedIn’s “Diversity in Recruiting” function.

LinkedIn justified the racial and other identity-group manipulation as necessary “to make sure people have equal access” to job opportunities, but what it meant by “equal access” was actually preferential treatment.

Such bias operates in the shadows. Job candidates don’t see how the algorithms affect their prospects.

Algorithms can be — and are — used to elevate certain groups over others.

But it’s not limited to LinkedIn. The Biden administration has issued an executive order requiring bias-free algorithms, but under the progressive DEI rubric built into this policy, the absence of bias is demonstrated not by equal treatment but by “equity.” Equity is a codeword for quotas. In the world of “bias-free” algorithm testing, bias is built in to achieve equity.

What happened with Gemini is an example of such programming.

It’s one thing to get a bad search result; it’s quite another thing to lose a job opportunit­y.

As attorney Stewart Baker, an expert on such deck-stacking, explained at an EPP event, “preventing bias . . . in artificial intelligence is almost always going to be code for imposing stealth quotas.”

The insidious reach of “bias-free” bias will grow.

Discrimination by algorithm has the potential to manipulate every major detail of our lives in order to obtain group results and group quotas.

These algorithms are designed to take the scourge of DEI and secretly bring it into every facet of life and the economy.

People are purposely “teaching” AI that images of black Vikings are a more equitable result than the truth.

Because Big Tech already knows a lot about you, including your race and ethnicity, it’s not hard to imagine discrimination by algorithm manipulating access to a host of goods and services.

Get turned down for a job, a loan, an apartment, or college admission? Could be a “bias-free” algorithm at work.

But you’ll almost never be able to prove it: the algorithms operate out of sight and undercover, certified as “bias-free” precisely because they build bias into the system to achieve quotas.

You get the picture. Discrimination by algorithm is a threat to equality and must be stopped.
