Why AI lies, cheats and steals
While chatbots can talk about “internal states” like feeling tired, excited, happy, sad, or hungry, they don’t actually experience these states because they don’t have a physical, biological body.
Humans have biological bodies with natural internal states (such as needing food, sleep, or a stable temperature). These physical needs regulate our actions and keep us grounded.
Because chatbots have no body or internal state to manage, they have no such “regulatory objectives.” Without the physical limits of a biological body to force self-checking and balance, AI models generate output without caution, which can lead to unsafe, overconfident, and untrustworthy answers.
