Daniel D. Johnson


Uncertain Simulators Don't Always Simulate Uncertain Agents

I argue that hallucinations are a natural consequence of the language modeling objective, which focuses on simulating confident behavior even when that behavior is hard to predict, rather than predictable behavior that takes uncertainty into account. I also discuss five strategies for avoiding this mismatch.