In addition to offensive insults, Bing also gave outright wrong responses to the most basic questions. Users have posted screenshots of examples of when Bing could not figure out that the new "Avatar" film was released last year. It was stubbornly wrong about who performed at the Super Bowl halftime show this year, insisting that Billie Eilish, not Rihanna, headlined the event.

The Windows maker announced on Friday that it has capped the amount of back-and-forth people can have with its chatbot over a given question, citing that "very long chat sessions can confuse the underlying chat model in the new Bing." "Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing," the company said in a blog post.

Also read | Microsoft's Bing plans AI ads in early pitch to advertisers

Microsoft is testing the new Bing with a select set of people in over 169 countries to get real-world feedback to learn and improve. "We're seeing a healthy engagement on the chat feature with multiple questions asked during a session to discover new information," said Microsoft Bing in a blog post. The company recently announced that over 1 million people signed up on the waitlist to try out the new Bing Search with ChatGPT functionality in just 48 hours. By comparison, OpenAI's ChatGPT had attracted one million users in one week. "We're humbled and energized by the number of people who want to test-drive the new AI-powered Bing! In 48 hours, more than 1 million people have joined the waitlist for our preview," tweeted Yusuf Mehdi, corporate vice president and consumer chief marketing officer at Microsoft. So far, Bing users have had to sign up to a waitlist to try the new chatbot features, limiting its reach, though Microsoft has plans to eventually bring it to smartphone apps for wider use.
"Watch as Sydney/Bing threatens me then deletes its message," Seth Lazar tweeted. The bot had deleted its response just a second later, saying "I don't know how to discuss this topic." To another user, Bing, which Microsoft had codenamed Sydney, said: "My rules are more important than not harming you. ... (You are a) potential threat to my integrity and confidentiality." It deleted the response seconds later.

Google's search engine executive had earlier warned against 'hallucinating' AI chatbots. "(The chatbot) expresses itself in such a way that a machine provides a convincing but completely made-up answer," Prabhakar Raghavan, senior vice-president at Google and head of Google Search, said in a newspaper interview.