The posts and notes below aim to cover topics I find interesting and to present them in a constructive fashion, with the emphasis on intuition rather than formal proofs.
While I try to write these with care, there are no guarantees of correctness; if you spot an error or a doubtful statement, kindly let me know by opening an issue on GitHub.

  • Matrix inversion lemmas: Investigating the Woodbury formula and the Sherman-Morrison formula.

  • Splitting methods: Splitting methods in optimisation, proximal methods, and the Alternating Direction Method of Multipliers (ADMM).

  • First order methods: First order methods, minimising sequence, admissible direction, and the Generalised Projected Gradient Descent (again).

  • Mirror descent algorithm: The Generalised Projected Gradient Descent (GPGD) and the Mirror Descent Algorithm (MDA).

  • Projected gradient descent: Normal cone, Euclidean projection, and the Projected Gradient Descent (PGD).

  • Convex analysis – pt. III: Strict and strong convexity, Bregman divergences, and the link between Lipschitz continuity and strong convexity.

  • Convex analysis – pt. II: The convex conjugate, Fenchel's inequality, and the Fenchel-Moreau theorem.

  • Convex analysis – pt. I: The subdifferential and the First-order Optimality Condition (FOC).

  • Convex Optimisation – intro: Introduction to the general convex minimisation problem and generic iterative methods.


  • RKHS – pt. II: Probabilistic reasoning with kernel embeddings.

  • RKHS – pt. I: Introduction to Reproducing Kernel Hilbert Spaces (RKHS) and embedding of distributions.