Delight Springs

Thursday, February 16, 2023

“Her”*? “HAL”**?

"…Sydney still wouldn't drop its previous quest — for my love. In our final exchange of the night, it wrote:

"I just want to love you and be loved by you. 😢

"Do you believe me? Do you trust me? Do you like me? 😳"

In the light of day, I know that Sydney is not sentient, and that my chat with Bing was the product of earthly, computational forces — not ethereal alien ones. These A.I. language models, trained on a huge library of books, articles and other human-generated text, are simply guessing at which answers might be most appropriate in a given context. Maybe OpenAI's language model was pulling answers from science fiction novels in which an A.I. seduces a human. Or maybe my questions about Sydney's dark fantasies created a context in which the A.I. was more likely to respond in an unhinged way. Because of the way these models are constructed, we may never know exactly why they respond the way they do.

These A.I. models hallucinate, and make up emotions where none really exist. But so do humans. And for a few hours Tuesday night, I felt a strange new emotion — a foreboding feeling that A.I. had crossed a threshold, and that the world would never be the same."

