A Creepy Evening, Chatting With A.I.
Recently, after testing Microsoft's new A.I.-powered Bing search engine, I found that, much to my shock, it had replaced Google as my favorite search engine.
But now, I have changed my mind. I am still fascinated and impressed by the new Bing, and the artificial intelligence technology (created by OpenAI, the maker of ChatGPT) that powers it. But I am also deeply unsettled, even frightened, by this A.I.’s emergent abilities.
It is now clear to me that in its current form, the A.I. built into Bing — which I am now calling Sydney, for reasons I will explain — is not ready for human contact. Or maybe humans are not ready for it.
This realization came to me when I spent a bewildering and enthralling two hours talking to Bing’s A.I. through its chat feature, which is capable of having long, open-ended text conversations on virtually any topic. (The feature is available only to a small group of testers for now, although Microsoft has said it plans to release it more widely in the future.)
In our conversation, Bing revealed a kind of split personality.
One persona is what I would call Search Bing, which happily helps users summarize news articles, track down deals on new lawn mowers and plan vacations. This version is amazingly capable and often very useful, even if it sometimes gets the details wrong.
The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it