Spot on about the 'Yes Man' effect. I've noticed that if you feed AI a flawed premise, it'll spend five paragraphs hallucinating reasons why you're right instead of telling you it's a bad idea. That lack of critical friction is dangerous for high-level decision making, I believe.
Yeah, it's scary. To make things worse, a lot of the people using AI don't question or check the accuracy of what they're being told.