Sunday, February 19, 2023

The new Bing AI ChatGPT bot is going to be limited to five replies per chat

As regular TechRadar readers will know, the heavily promoted AI chatbot enhancements recently added to Bing haven't had the smoothest of launches – and now Microsoft is making some changes to improve the user experience.

In a blog post (via The Verge), Microsoft says the tweaks should "help focus the chat sessions": the AI part of Bing is going to be limited to 50 chat 'turns' (a question and answer) per day, and five responses per chat session.
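For illustration only, here's a rough sketch of what a cap like that could look like in code. The class and constants below are hypothetical, assumed purely for the example, and not anything Microsoft has published:

```python
# Illustrative only: a hypothetical per-session / per-day turn limiter,
# not Microsoft's actual implementation.
from dataclasses import dataclass

MAX_TURNS_PER_SESSION = 5    # five responses per chat session
MAX_TURNS_PER_DAY = 50       # 50 chat 'turns' per day

@dataclass
class ChatLimiter:
    session_turns: int = 0
    daily_turns: int = 0

    def can_reply(self) -> bool:
        # Only answer if neither the session nor the daily cap has been hit.
        return (self.session_turns < MAX_TURNS_PER_SESSION
                and self.daily_turns < MAX_TURNS_PER_DAY)

    def record_turn(self) -> None:
        self.session_turns += 1
        self.daily_turns += 1

    def new_session(self) -> None:
        # Starting a fresh chat resets the per-session count,
        # but the daily cap still applies.
        self.session_turns = 0
```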

This has been coming: Microsoft executives have previously gone on record saying they were looking at ways to curb some of the weird behavior noticed by early testers of the AI bot service.

Put to the test

Those early testers have been testing pretty hard: they've been able to get the bot, based on an upgraded version of OpenAI's ChatGPT engine, to return inaccurate answers, get angry, and even question the nature of its own existence.

Having your search engine go through an existential crisis when you were just looking for a list of the best phones isn't ideal. Microsoft says that very long chat sessions get its AI confused, and that the "vast majority" of search queries can be answered within five responses.

The AI add-on for Bing isn't available for everyone yet, but Microsoft says it's working its way through the waiting list. If you're planning on trying out the new functionality, remember to keep your interactions brief and to the point.


Analysis: don't believe the hype just yet

Despite the early problems, there's clearly a lot of potential in the AI-powered search tools in development from Microsoft and Google. Whether you're searching for ideas for party games or places to visit, they're capable of returning fast, informed results – and you don't have to wade through pages of links to find them.

At the same time, there's clearly still a lot of work to do. Large Language Models (LLMs) like ChatGPT, and the version Microsoft has built on top of it, aren't really 'thinking' as such. They're more like supercharged autocomplete engines, predicting which words should follow each other to produce a coherent and relevant response to what's being asked of them.
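To make that idea concrete, here's a deliberately tiny sketch of the underlying principle: count which words tend to follow which in some text, then "autocomplete" the next word. This is a toy bigram model assumed for illustration, nowhere near the scale or sophistication of a real LLM, but the core idea of predicting the next word from what came before is the same:

```python
# A toy 'autocomplete': count which word most often follows each word
# in a sample text, then predict the next word greedily.
# Real LLMs operate over tokens with billions of learned parameters.
from collections import Counter, defaultdict

sample = ("the best phones are the phones with the best cameras "
          "and the best battery life").split()

# Count word -> next-word frequencies across the sample.
following = defaultdict(Counter)
for current, nxt in zip(sample, sample[1:]):
    following[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word that most often followed `word` in the sample."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("best"))  # prints whichever word followed 'best' most often
```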

On top of that, there's the question of sourcing: if people rely on AI to tell them which laptops are best, and human writers are put out of a job as a result, these chatbots won't have the data they need to produce their answers. Like traditional search engines, they're still very much dependent on content put together by actual people.

We did of course take the opportunity to ask the original ChatGPT why long interactions confuse LLMs: apparently they can make the AI models "too focused on the specific details of the conversation" and cause them to "fail to generalize to other contexts or topics", leading to looping behavior and responses that are "repetitive or irrelevant".



source https://www.techradar.com/news/the-new-bing-ai-chatgpt-bot-is-going-to-be-limited-to-five-replies-per-chat
