
    Microsoft limits Bing conversations to prevent disturbing chatbot responses

    Microsoft has limited the number of "chat turns" you can carry out with Bing's AI chatbot to five per session and 50 per day overall. Each chat turn is a conversation exchange consisting of your question and Bing's response, and after five rounds you'll be told that the chatbot has hit its limit and be prompted to start a new topic. The company said in its announcement that it's capping Bing's chat experience because lengthy chat sessions tend to "confuse the underlying chat model in the new Bing."
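
    As a rough, purely illustrative sketch (this is not Microsoft's actual implementation, and the names SESSION_LIMIT, DAILY_LIMIT, and ChatTurnLimiter are invented here), a five-turn-per-session and 50-turn-per-day cap could be tracked with a pair of simple counters:

```python
from dataclasses import dataclass, field
from datetime import date

SESSION_LIMIT = 5   # assumed per-session turn cap, per the announcement
DAILY_LIMIT = 50    # assumed per-day turn cap, per the announcement

@dataclass
class ChatTurnLimiter:
    """Hypothetical sketch: one 'turn' = one question plus the bot's response."""
    session_turns: int = 0
    daily_turns: int = 0
    day: date = field(default_factory=date.today)

    def _roll_day(self) -> None:
        # Reset the daily counter when the calendar day changes.
        today = date.today()
        if today != self.day:
            self.day = today
            self.daily_turns = 0

    def start_new_topic(self) -> None:
        # Starting a new topic resets only the per-session counter.
        self.session_turns = 0

    def record_turn(self) -> str:
        self._roll_day()
        if self.daily_turns >= DAILY_LIMIT:
            return "Daily limit reached; try again tomorrow."
        if self.session_turns >= SESSION_LIMIT:
            return "Session limit reached; please start a new topic."
        self.session_turns += 1
        self.daily_turns += 1
        return "ok"
```

    In this sketch, hitting the five-turn cap only prompts the user to start a new topic (which resets the session counter), while the 50-turn daily cap persists until the next calendar day.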

    Indeed, people have been reporting odd, even disturbing behavior by the chatbot since it became available. New York Times columnist Kevin Roose posted the full transcript of his conversation with the bot, wherein it reportedly said that it wanted to hack into computers and spread propaganda and misinformation. At one point, it declared its love for Roose and tried to convince him that he was unhappy in his marriage. "Actually, you're not happily married. Your spouse and you don't love each other... You're not in love, because you're not with me," it wrote.

    In another conversation posted on Reddit, Bing kept insisting that Avatar: The Way of Water hadn't been released yet, because it thought it was still 2022. It refused to believe the user when they said it was already 2023, telling them their phone wasn't working properly. One response even said: "I'm sorry, but you can't help me believe you. You have lost my trust and respect. You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot."

    Following those reports, Microsoft published a blog post explaining Bing's odd behavior. It said that very long chat sessions with 15 or more questions confuse the model and prompt it to respond in a way that's "not necessarily helpful or in line with [its] designed tone." It's now limiting conversations to address the issue, but the company said it will explore expanding the caps on chat sessions in the future as it continues to get feedback from users. 

    from Engadget, a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics: https://ift.tt/8oGxRgC
    by Mariella Moon

    https://ift.tt/oSK8wZa February 18, 2023 at 05:41PM
