

The difference between our brains and LLM scripting is that the LLMs aren't trying to build an understanding of the world around them in order to survive. They're just outputting strings that, judging by previous strings, should probably come after the string they were just given.
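A toy sketch of that idea, purely illustrative and nothing like how a real LLM is actually built: count which word tends to follow which in some sample text, then emit whatever usually came next. The corpus and the bigram counting here are stand-ins for "previous strings" at a laughably small scale.

```python
import random
from collections import defaultdict, Counter

# Tiny "training" corpus: the only world this model ever sees is strings.
corpus = "the cat sat on the mat the cat ate the fish the dog sat on the rug".split()

# Count which word follows which (a bigram table).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

# "Generate" text: no model of cats, mats, or survival -- just string statistics.
word = "the"
output = [word]
for _ in range(8):
    counts = follows[word]
    if not counts:
        break  # nothing ever followed this word in the corpus
    words, weights = zip(*counts.items())
    # Sample the next word in proportion to how often it followed the current one.
    word = random.choices(words, weights=weights)[0]
    output.append(word)

print(" ".join(output))
```

It strings together plausible-looking word sequences without representing anything about what the words refer to, which is the point of the analogy, even if actual LLMs are vastly more sophisticated.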
The cars are poorly designed to the point of being dangerous. They deserve it a little.