
I know where Bing AI chat went wrong

Ask me anything. It's the long form of an AMA and one of the most popular forms of interactive discourse on Reddit. It's also a significant challenge, as Microsoft's Bing AI chatbot, a.k.a. "new Bing," is quickly learning.

Anytime a celebrity or notable signs up to do a Reddit AMA, usually shortly after posing with a photo to prove it's really them answering questions, there's a deep moment of trepidation.

The ability to ask anyone anything is usually a minefield of inappropriate discourse, one managed by a live community manager who fields and filters the questions. Otherwise, things quickly go off the rails. Even with that protection, they often do, anyway.

(Image credit: Future)

When Microsoft launched its new Bing AI-powered chat, it made it clear that the ChatGPT AI was ready for any and all questions. That was either a sign of deep trust in the relatively small but growing group of users or incredible naivete.
