Notebook: lectures/intro.ipynb
- JupyterLab
- Debugging
- Version control
- Python packages
- Use JupyterLab
- Monte Carlo estimate of pi
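A minimal sketch of the Monte Carlo pi exercise (sample size and seed are illustrative choices, not prescribed by the notebook):

```python
import numpy as np

# Draw points uniformly in the unit square and count the fraction that
# falls inside the quarter circle of radius 1; that fraction estimates pi/4.
rng = np.random.default_rng(0)
n = 1_000_000
x, y = rng.uniform(size=(2, n))
pi_estimate = 4.0 * np.mean(x**2 + y**2 <= 1.0)
print(pi_estimate)  # close to 3.1416 for large n
```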
Notebook: lectures/probabilities.ipynb
- Different definitions of probability
- Set notation
- Outcomes, events
- Kolmogorov axioms
- Conditional probabilities and independence
- Bayes theorem
- Birthday problem
- Monty Hall problem
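A possible simulation check of the Monty Hall problem (the helper name and trial count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def monty_hall_trial(switch):
    """Play one game; return True if the player ends up with the car."""
    doors = [0, 1, 2]
    car = rng.integers(3)
    pick = rng.integers(3)
    # The host opens a door that is neither the player's pick nor the car.
    host = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != host)
    return pick == car

n = 100_000
print("stay  :", np.mean([monty_hall_trial(False) for _ in range(n)]))  # ~1/3
print("switch:", np.mean([monty_hall_trial(True) for _ in range(n)]))   # ~2/3
```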
Notebook: lectures/random_variables_and_probability_distributions.ipynb
- Random variables
- Probability distributions: discrete and continuous
- PDF and CDF
- Change of variables
- Inverse transform sampling
- Expectation
- Mean, variance, moments
- Joint, conditional, and marginal distributions
- Common probability distributions
- Uniform
- Binomial, multinomial
- Poisson
- Gaussian
- Chi-squared
- Cauchy
- Power law
- Central limit theorem
- Inverse transform sampling (see the sketch after this list)
- Derive Poisson from binomial distribution
- Distribution of a sum of Gaussians
- General sum of independent RVs
- Derivation of the chi-squared distribution
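A possible sketch of the inverse transform sampling exercise, using the exponential distribution as the target (the distribution and seed are illustrative):

```python
import numpy as np

# Inverse transform sampling: if U ~ Uniform(0, 1) and F is the target CDF,
# then X = F^{-1}(U) is distributed according to F.  For Exponential(rate),
# F^{-1}(u) = -log(1 - u) / rate.
rng = np.random.default_rng(2)
rate = 1.5
u = rng.uniform(size=100_000)
x = -np.log(1.0 - u) / rate

print(x.mean(), 1.0 / rate)    # sample mean vs. true mean 1/rate
print(x.var(), 1.0 / rate**2)  # sample variance vs. true variance 1/rate^2
```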
Notebook: lectures/intro_to_bayes.ipynb
- Bayes theorem
- Likelihood, prior, posterior
- Updating priors
- Prior and posterior predictive distributions
- Model comparison: evidences and Bayes ratio
- Bayesian line fitting (see the sketch after this list)
- MAP
- Posterior sampling
- Computing predictive distributions
- Fitting data
- Misspecified likelihood
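A minimal grid-based sketch of Bayesian line fitting with flat priors and known Gaussian noise (the simulated data and grid ranges are illustrative):

```python
import numpy as np

# Simulate data from y = m*x + c with known Gaussian noise.
rng = np.random.default_rng(3)
true_m, true_c, sigma = 2.0, 1.0, 0.5
x = np.linspace(0, 1, 20)
y = true_m * x + true_c + rng.normal(0, sigma, x.size)

# Log-posterior (flat priors, so proportional to the log-likelihood) on a grid.
m_grid = np.linspace(0, 4, 200)
c_grid = np.linspace(-1, 3, 200)
M, C = np.meshgrid(m_grid, c_grid, indexing="ij")
resid = y - (M[..., None] * x + C[..., None])          # shape (200, 200, 20)
log_post = -0.5 * np.sum(resid**2, axis=-1) / sigma**2

# The MAP estimate is the grid point with the highest posterior.
i, j = np.unravel_index(np.argmax(log_post), log_post.shape)
print("MAP estimate:", m_grid[i], c_grid[j])
```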
Notebook: lectures/sampling.ipynb
- Monte Carlo estimates of integrals
- Rejection sampling
- Markov chain Monte Carlo
- Metropolis-Hastings
- Implement rejection sampling
- Implement Metropolis-Hastings in n-d (see the sketch after this list)
- Show that Metropolis-Hastings satisfies detailed balance
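A hedged sketch of the n-dimensional Metropolis-Hastings exercise with a symmetric Gaussian proposal (the target, step size, and chain length are illustrative):

```python
import numpy as np

def metropolis_hastings(log_prob, x0, n_steps, step_size, rng):
    """Random-walk Metropolis sampler with an isotropic Gaussian proposal."""
    x = np.asarray(x0, dtype=float)
    logp = log_prob(x)
    chain = np.empty((n_steps, x.size))
    n_accept = 0
    for i in range(n_steps):
        proposal = x + step_size * rng.normal(size=x.size)
        logp_prop = log_prob(proposal)
        # Accept with probability min(1, p(proposal)/p(x)); the symmetric
        # proposal density cancels in the Metropolis ratio.
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = proposal, logp_prop
            n_accept += 1
        chain[i] = x
    return chain, n_accept / n_steps

# Example target: a standard 3-d Gaussian.
rng = np.random.default_rng(4)
chain, acc = metropolis_hastings(lambda x: -0.5 * np.sum(x**2),
                                 np.zeros(3), 20_000, 0.8, rng)
print("acceptance rate:", acc)
print("sample mean:", chain.mean(axis=0))  # should be close to zero
```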
Notebook: lectures/sampling_2.ipynb
- Burn-in, convergence, and auto-correlation
- Slice sampling
- Nested sampling
- Application to model selection using Bayes' ratio on supernovae data
- Implement nested or slice sampling
- Use emcee and dynesty
- Use dynesty to compare models
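A hedged sketch of comparing two models with dynesty, assuming its NestedSampler interface; the toy data, priors, and models are illustrative:

```python
import numpy as np
import dynesty

# Toy data and two competing models: a constant versus a straight line.
rng = np.random.default_rng(5)
x = np.linspace(0, 1, 30)
sigma = 0.2
y = 0.5 + 1.0 * x + rng.normal(0, sigma, x.size)

def make_loglike(model):
    def loglike(theta):
        resid = y - model(theta, x)
        return -0.5 * np.sum(resid**2 / sigma**2 + np.log(2 * np.pi * sigma**2))
    return loglike

def prior_transform(u):
    # Map the unit cube to flat priors on [-5, 5] for each parameter.
    return 10.0 * u - 5.0

models = {
    "constant": (lambda t, x: t[0] + 0.0 * x, 1),
    "line":     (lambda t, x: t[0] + t[1] * x, 2),
}

logz = {}
for name, (model, ndim) in models.items():
    sampler = dynesty.NestedSampler(make_loglike(model), prior_transform, ndim)
    sampler.run_nested(print_progress=False)
    logz[name] = sampler.results.logz[-1]   # log-evidence estimate

# The Bayes ratio follows from the difference of log-evidences.
print("log Bayes ratio (line vs constant):", logz["line"] - logz["constant"])
```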
Notebook: lectures/model_checking.ipynb
- Chi-square goodness-of-fit
- Posterior predictive checks
- Model comparison:
- DIC
- WAIC
- Cross-validation
- Implement chi-square and posterior predictive checks (see the sketch after this list)
- Use DIC, WAIC, and Bayes ratio for model comparison
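A minimal sketch of a chi-square goodness-of-fit check on a toy straight-line fit (data, noise level, and model are illustrative):

```python
import numpy as np
from scipy import stats

# Toy data with known Gaussian errors, fitted with a straight line.
rng = np.random.default_rng(6)
x = np.linspace(0, 1, 25)
sigma = 0.3
y = 1.0 + 2.0 * x + rng.normal(0, sigma, x.size)
y_model = np.polyval(np.polyfit(x, y, 1), x)

# Chi-square statistic and its p-value, with degrees of freedom equal to
# (number of data points) - (number of fitted parameters).
chi2 = np.sum((y - y_model) ** 2 / sigma**2)
dof = x.size - 2
p_value = stats.chi2.sf(chi2, dof)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p-value = {p_value:.3f}")
```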
Notebook: lectures/estimators_and_data_exploration.ipynb
- Statistics and estimators
- Estimator bias and variance
- Statistics and their sampling distributions
- Sample mean
- Sample variance
- Sample covariance
- Correlation coefficient
- Correlation
- Malmquist bias
- PCA
- Bootstrap
- Show that the sample variance estimator is unbiased
- Compute posterior on the correlation coefficient
- Check bootstrap on a case where the exact sampling distribution is known
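A possible sketch of that bootstrap check: compare the bootstrap standard error of the sample mean with the exact value sigma/sqrt(n) for Gaussian data (sample sizes and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
sigma, n = 2.0, 50
data = rng.normal(0.0, sigma, n)

# Bootstrap: resample the data with replacement and recompute the statistic.
n_boot = 10_000
idx = rng.integers(0, n, size=(n_boot, n))
boot_means = data[idx].mean(axis=1)

print("bootstrap s.e. of the mean :", boot_means.std())
print("exact s.e. (sigma/sqrt(n)) :", sigma / np.sqrt(n))
```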
Notebook: lectures/fisher_hmc_and_jax.ipynb
- Fisher information matrix
- Cramer-Rao bound
- Jeffreys prior
- JAX
- Hamiltonian Monte Carlo
- Use JAX to get the Fisher information (see the sketch after this list)
- Experiment with HMC settings
- Use the NUTS implementation in tensorflow-probability
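A hedged sketch of getting the Fisher information with JAX autodiff, as the negative Hessian of a Gaussian log-likelihood at the true parameters (the toy linear model is illustrative):

```python
import jax
import jax.numpy as jnp

# Toy model: y = m*x + c with known Gaussian noise sigma.
x = jnp.linspace(0.0, 1.0, 20)
sigma = 0.5
true_params = jnp.array([2.0, 1.0])  # (m, c)
key = jax.random.PRNGKey(0)
y = true_params[0] * x + true_params[1] + sigma * jax.random.normal(key, x.shape)

def log_likelihood(params):
    resid = y - (params[0] * x + params[1])
    return -0.5 * jnp.sum(resid**2) / sigma**2

# Fisher information as the negative Hessian of the log-likelihood;
# its inverse is the Cramer-Rao lower bound on the parameter covariance.
fisher = -jax.hessian(log_likelihood)(true_params)
print(fisher)
print(jnp.linalg.inv(fisher))
```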
Notebook: lectures/simulation_based_inference.ipynb
- Approximate Bayesian computation
- Neural density estimation
- Kullback-Leibler divergence
- Gaussian mixture models
- Loss functions and posteriors
- MLPs
- L_2, L_1, negative log likelihood loss
- Implement rejection ABC (see the sketch after this list)
- Show that the function that minimises the L_1 loss is the median
- Implement neural density estimation
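A hedged sketch of rejection ABC for the mean of a Gaussian with known variance, where the exact posterior is available for comparison (prior, summary statistic, and tolerance are illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)

# "Observed" data: Gaussian with unknown mean mu and known sigma.
sigma, n, true_mu = 1.0, 50, 0.7
data = rng.normal(true_mu, sigma, n)
obs_summary = data.mean()

# Rejection ABC: draw mu from the prior, simulate the summary statistic,
# and keep mu whenever it lands within epsilon of the observed summary.
# (The sample mean of n points is itself Gaussian, so it is simulated directly.)
n_draws, epsilon = 200_000, 0.05
mu_prior = rng.uniform(-5, 5, n_draws)                  # flat prior on mu
sim_summary = rng.normal(mu_prior, sigma / np.sqrt(n))
accepted = mu_prior[np.abs(sim_summary - obs_summary) < epsilon]

print("ABC posterior mean/std  :", accepted.mean(), accepted.std())
# Exact posterior under the flat prior is N(data.mean(), sigma^2 / n).
print("exact posterior mean/std:", obs_summary, sigma / np.sqrt(n))
```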
Notebook: lectures/recap.ipynb
- How to interpret and summarise posteriors
- Credible intervals (see the sketch after this list)
- Projection effects
- Recap of the course
- Worked example: cosmology inference on Type Ia supernovae data
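A minimal sketch of summarising a posterior with an equal-tailed credible interval; the samples stand in for an MCMC chain and are synthetic:

```python
import numpy as np

# Stand-in for posterior samples of one parameter (e.g. from an MCMC chain).
rng = np.random.default_rng(9)
samples = rng.normal(1.2, 0.3, 10_000)

# Posterior mean and a 68% equal-tailed credible interval.
lo, hi = np.percentile(samples, [16, 84])
print(f"mean = {samples.mean():.2f}, 68% interval = [{lo:.2f}, {hi:.2f}]")
```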