1. Evidence lower bound (ELBO): Lower bound on the log marginal likelihood of a generative model, maximized in variational inference (see the formulas after this list).
2. KL divergence: Measure of the difference between two probability distributions; in generative models it is minimized to encourage the model's distribution to match the training data.
3. Markov chain Monte Carlo (MCMC) methods: Algorithms for drawing samples from complex probability distributions in generative models.
4. Langevin dynamics: Stochastic process used to draw samples from complex probability distributions in diffusion models (a sampling sketch follows this list).
5. Score matching: Technique for estimating the gradient of the log probability density (the score) in some variants of the diffusion model.
6. Normalizing flows: Models that transform a simple base distribution into a more complex one through a sequence of invertible mappings; they can be used in conjunction with diffusion models to learn a flexible family of distributions.
7. Bayesian neural networks: Neural networks that model uncertainty in their parameters; in generative modeling they are used to learn a posterior distribution over the network's parameters.
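
For items 1 and 2, the two quantities can be written out explicitly. For a latent-variable model with joint density p(x, z) and an approximate posterior q(z), the ELBO lower-bounds the log marginal likelihood, and the gap is exactly a KL divergence:

$$
\log p(x) \;=\; \underbrace{\mathbb{E}_{q(z)}\big[\log p(x, z) - \log q(z)\big]}_{\mathrm{ELBO}(q)} \;+\; \mathrm{KL}\big(q(z)\,\|\,p(z \mid x)\big),
\qquad
\mathrm{KL}\big(q\,\|\,p\big) = \mathbb{E}_{q}\!\left[\log \frac{q(z)}{p(z \mid x)}\right] \;\ge\; 0.
$$

Because the KL term is non-negative, the ELBO never exceeds log p(x), and maximizing the ELBO over q is equivalent to minimizing the KL divergence to the true posterior.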
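
As a concrete illustration of item 4, here is a minimal sketch of unadjusted Langevin dynamics for a target whose score is known in closed form (a standard Gaussian). In a score-based diffusion model the score function would instead be a learned network; the function names, step size, and number of steps below are illustrative assumptions, not taken from the original text.

```python
import numpy as np

def score(x):
    # Score (gradient of the log-density) of N(0, 1): d/dx log p(x) = -x.
    # In a diffusion model this would be a trained score network.
    return -x

def langevin_sample(n_steps=1000, step_size=0.01, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    x = rng.normal(scale=3.0)  # start far from the target mode
    for _ in range(n_steps):
        noise = rng.normal()
        # Unadjusted Langevin update: drift along the score plus injected noise.
        x = x + step_size * score(x) + np.sqrt(2 * step_size) * noise
    return x

samples = np.array([langevin_sample(rng=np.random.default_rng(i)) for i in range(500)])
print("sample mean (should be near 0):", samples.mean())
print("sample std  (should be near 1):", samples.std())
```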

Variational inference is a technique for approximating a complex or intractable probability distribution with a simpler one, such as a Gaussian. The best approximation is found by minimizing the KL divergence from the approximating distribution to the true distribution, which is equivalent to maximizing the ELBO. Variational inference is commonly used in Bayesian inference to approximate the posterior distribution over model parameters, and in generative models such as the variational autoencoder (VAE) to learn a latent-space representation of the input data.
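
The following sketch makes this concrete on a toy model chosen purely for illustration (the model and all numbers are assumptions, not from the text above): a Gaussian prior p(z) = N(0, 1), a Gaussian likelihood p(x|z) = N(z, 1), and one observation x = 2, so the exact posterior is N(1, 0.5). It fits q(z) = N(mu, sigma^2) by maximizing a Monte Carlo estimate of the ELBO over a small grid, which is equivalent to minimizing the KL divergence to the exact posterior.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x_obs = 2.0  # single observed data point (illustrative)

def elbo(mu, sigma, n_samples=5000):
    # Monte Carlo estimate of E_q[log p(x, z) - log q(z)] with z ~ q.
    z = rng.normal(mu, sigma, size=n_samples)
    log_joint = norm.logpdf(z, 0, 1) + norm.logpdf(x_obs, z, 1)  # log p(z) + log p(x|z)
    log_q = norm.logpdf(z, mu, sigma)
    return np.mean(log_joint - log_q)

# Crude grid search over the variational parameters (mu, sigma).
grid = [(mu, sigma)
        for mu in np.linspace(0.0, 2.0, 21)
        for sigma in np.linspace(0.3, 1.5, 25)]
best_mu, best_sigma = max(grid, key=lambda p: elbo(*p))
print("best mu, sigma:", best_mu, best_sigma)  # typically close to 1.0 and sqrt(0.5) ~= 0.71
```

In practice the variational parameters are optimized with stochastic gradients (e.g. the reparameterization trick in a VAE) rather than a grid search; the grid is used here only to keep the example short and dependency-free.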