Daniel D. Johnson

Uncertain Simulators Don't Always Simulate Uncertain Agents

I argue that hallucinations are a natural consequence of the language modeling objective, which focuses on simulating confident behavior even when that behavior is hard to predict, rather than predictable behavior that takes uncertainty into account. I also discuss five strategies for avoiding this mismatch.

Introducing the GGT-NN

What worked, and what didn't work, for my ICLR 2017 paper "Learning Graphical State Transitions".