
What is Bayesian Machine Learning?

Last month, while discussing Toy Models of Superposition, we briefly touched upon phase transitions in the Bayesian posterior (specifically, via MCMC posterior sampling).

This week Blaine will cover the basics of Bayesian machine learning, including: Bayesian probability from first principles, the Bayesian vs. frequentist approaches, what Bayesianism means for neural networks, and the current meta of tools & techniques for Bayesian ML.

In this report, I describe the Bayesian approach to machine learning, detailing how it treats statistical parameters as random variables and uses Bayes’ rule to compute posterior distributions. I explain basic techniques like Maximum Likelihood Estimation (MLE) and introduce more advanced methods such as Variational Inference and Markov Chain Monte Carlo (MCMC) sampling. Variational Inference is highlighted for its performance and as an improvement over frequentist methods, while MCMC is presented as the gold standard for posterior sampling.
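To make the contrast concrete, here is a minimal sketch (an illustrative example, not material from the report) of the two viewpoints on a single problem: estimating the bias of a coin from observed flips. The MLE gives one point estimate, while a simple Metropolis-Hastings MCMC loop treats the parameter as a random variable and draws samples from its full posterior. All names and settings (prior, proposal scale, sample counts) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.binomial(1, 0.7, size=50)   # 50 flips of a coin with true bias 0.7 (simulated data)

# Maximum Likelihood Estimation: a single point estimate of the parameter.
mle = data.mean()

def log_posterior(theta):
    """Log of prior * likelihood, up to a constant (uniform prior on [0, 1])."""
    if not 0.0 < theta < 1.0:
        return -np.inf
    heads = data.sum()
    return heads * np.log(theta) + (len(data) - heads) * np.log(1.0 - theta)

# Metropolis-Hastings: treat theta as a random variable and sample its
# posterior distribution instead of committing to one value.
samples, theta = [], 0.5
for _ in range(20_000):
    proposal = theta + rng.normal(scale=0.05)          # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal                               # accept the proposed move
    samples.append(theta)

posterior = np.array(samples[5_000:])                  # discard burn-in
print(f"MLE:            {mle:.3f}")
print(f"Posterior mean: {posterior.mean():.3f} +/- {posterior.std():.3f}")
```

The posterior samples carry uncertainty information (a spread, not just a point), which is the practical payoff of the Bayesian treatment; Variational Inference targets the same posterior but approximates it with a tractable family instead of sampling.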

"Bayesian Machine Learning", Rogers 2023.
