Matrix Derivative
Machine learning involves plenty of matrices, vectors, and their associated operations, approximations, and derivatives. They are worth summarizing for later reference. One important thing to watch out for is the layout convention: numerator layout and denominator layout arrange the same derivative as transposes of each other, so mixing them silently produces shape and sign mistakes.
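To make the layout issue concrete, here is a minimal sketch (using NumPy, with a hypothetical linear map y = A x) that builds the Jacobian numerically and compares it against both conventions: numerator layout gives dy/dx = A, while denominator layout stores the transpose.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # linear map y = A x, m=3 outputs, n=4 inputs
x = rng.standard_normal(4)

# Numerical Jacobian of y = A x, one column per input perturbation.
eps = 1e-6
J = np.zeros((3, 4))
for j in range(4):
    dx = np.zeros(4)
    dx[j] = eps
    J[:, j] = (A @ (x + dx) - A @ x) / eps

# Numerator layout: dy/dx has shape (m, n) and equals A itself.
assert np.allclose(J, A, atol=1e-4)
# Denominator layout: the same derivative is written as A^T, shape (n, m).
assert np.allclose(J.T, A.T, atol=1e-4)
```

Because the map is linear, the finite difference is exact up to floating-point error; for nonlinear maps the same check only holds approximately.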
The Wikipedia page on Matrix calculus is a good place to start. A more comprehensive collection can be found in The Matrix Cookbook by Kaare Brandt Petersen and Michael Syskind Pedersen.
As the Cookbook's own introduction puts it: "What is this? These pages are a collection of facts (identities, approximations, inequalities, relations, ...) about matrices and matters relating to them. It is collected in this form for the convenience of anyone who wants a quick desktop reference."
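Identities of the kind the Cookbook collects are easy to sanity-check numerically. A minimal sketch, assuming the standard quadratic-form identity d(x^T A x)/dx = (A + A^T) x (written here in denominator layout), verified by central finite differences:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))   # a generic (not necessarily symmetric) matrix
x = rng.standard_normal(4)

def f(v):
    # scalar quadratic form v^T A v
    return v @ A @ v

# Central finite differences for the gradient, one component per basis vector.
eps = 1e-6
g = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
              for e in np.eye(4)])

# The standard identity: grad of x^T A x is (A + A^T) x.
assert np.allclose(g, (A + A.T) @ x, atol=1e-4)
```

Central differences are exact for quadratics up to floating-point error, which makes this a cheap way to catch layout and transpose mistakes before using an identity in a derivation.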
Feels like the tensor analysis from electrodynamics is back. :(
References
[1] Matrix Calculus
[2] The Matrix Cookbook
[3] Matrix and Vector Derivatives in Machine Learning (机器学习中的矩阵、向量求导)
[4] The Matrix Calculus You Need For Deep Learning
[5] Mathematics for Machine Learning