Daniel D. Johnson


I am a PhD student at the University of Toronto, working with David Duvenaud and Chris Maddison. I'm also a research software engineer at Google. My research interests include:
  • applying deep learning and probabilistic inference to compositional and structured data (such as trees, sets, and graphs),
  • continuously relaxing classical algorithms into end-to-end differentiable layers, and
  • designing more expressive "differentiable" programming languages and frameworks for non-traditional machine learning research.

Recently, I've been working on both sides of the intersection between programming languages and machine learning: using machine learning to help people write code more easily and with fewer bugs, and designing programming languages that reduce the cognitive load of building complex probabilistic models. See my research page for more information.

In 2018–2019, I worked on applied machine learning at Cruise. Before that, I was an undergraduate CS/Math joint major at Harvey Mudd College, where I did research on applying deep learning to music, and worked as a math tutor in the Academic Excellence tutoring program at HMC.

In my free time, I enjoy playing board games, trying out indie video games (current recommendations: Baba Is You, Outer Wilds), rock climbing, cooking, and working on a variety of side projects.


Updates

  • I presented "Beyond In-Place Corruption: Insertion and Deletion In Denoising Probabilistic Models" (arXiv) at the INNF+ workshop at ICML 2021.
  • The paper "Getting to the Point. Index Sets and Parallelism-Preserving Autodiff for Pointful Array Programming" (arXiv), describing the Dex programming language, has been accepted to ICFP 2021!
  • I presented "Learning Graph Structure With A Finite-State Automaton Layer" (arXiv) as a spotlight presentation at NeurIPS 2020 (and, earlier, at the Graph Representation Learning and Beyond workshop at ICML 2020).
  • In August 2020, I added extensible record and variant types to dex-lang (PRs 1, 2). I'm excited to continue exploring connections between records and named axes in strongly typed multidimensional-array code.
  • In October 2019, I moved to Montréal for the AI Residency.
  • In December 2018, my HMC Mathematics Clinic group presented a method for sound separation using automatic differentiation at NeurIPS 2018.