AI is just as overconfident and biased as humans can be, study shows



Although humans and artificial intelligence (AI) systems “think” very differently, new research has revealed that AIs sometimes make decisions as irrationally as we do.

In almost half of the scenarios examined in a new study, ChatGPT exhibited many of the most common human decision-making biases. Published April 8 in the journal Manufacturing & Service Operations Management, the findings are the first to evaluate ChatGPT’s behavior across 18 well-known cognitive biases found in human psychology.
