
Don’t fall in love with ChatGPT

  • August 11, 2024

Once again (because it’s not the first time), we have to relate ChatGPT to the movie Her. As you may remember, we were very impressed with the GPT-4o presentation last May, and one of its key points was the speech synthesis model they showed us, at a level we had never seen before (although Google’s Project Astra presentation the other day also surprised us in that sense), and it reminded us a lot of Spike Jonze’s movie.

If you haven’t seen it (and if you plan to, skip to the next paragraph now, because I’m going to talk about its plot), the film is set in the near future, where an operating system fully managed by a general artificial intelligence goes on sale. This system was designed to provide a type of interaction similar to a conversation with another human being, and in a social environment that does little to foster interpersonal relationships, this leads many users of the OS to end up forming emotional relationships of various kinds with it.

When the film was released in theaters in 2013, the state of development of artificial intelligence, and especially how it was perceived by ordinary people, was very far from what was shown in Jonze’s film. However, when we learned about the new model powering ChatGPT, it was absolutely impossible for those of us who saw the film at the time not to remember Her (helped along by the striking similarity of the chatbot’s voice to that of Scarlett Johansson, the actress who voiced the operating system in the film).

What seemed like science fiction suddenly became something much closer. So much so that, as we read on Wccftech, OpenAI had to advise ChatGPT users not to form emotional ties with the chatbot. Yes, you read that right. The company hypothesizes that this is happening because chatbot interactions are becoming more human-like in certain respects, and it has identified patterns that would lead some users to take ChatGPT “one step further”.

Establishing a personal connection with a chatbot poses several risks: for these users, such interactions may translate into less socialization, or into a loss of objectivity when evaluating the chatbot’s responses. That is why OpenAI said it would implement a system to monitor these patterns, so that if a problem were detected, the chatbot’s behavior with that user could be reset before things go any further.

Source: Muy Computer
