→ A primer on the fundamental theorem of information geometry
→ Dually flat spaces of exponential families and mixture families, smooth strictly convex functions, and the dual generalized Pythagorean theorems
→ A first read introducing the Fisher information matrix, the CRLB, and the Fisher-Rao differential-geometrization of statistics (aka 0-geometry)
→ Skew the Jensen-Shannon divergence with a skewing vector (instead of an ordinary scalar α)
→ Calculate the Jensen-Shannon centroid of a set of normalized histograms using the Convex-ConCave Procedure (CCCP)
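As a concrete illustration of the last item, here is a minimal NumPy sketch of the CCCP iteration for the Jensen-Shannon centroid of normalized histograms. It relies on the known closed-form CCCP update (the centroid becomes the normalized geometric mean of the mid-histograms (p_i + c)/2); the function names and the choice of initializing at the arithmetic mean are my own, and the code assumes strictly positive histogram bins.

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence between two normalized histograms."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0  # 0 * log(0) is taken as 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_centroid_cccp(hists, iters=100):
    """CCCP iteration for the Jensen-Shannon centroid.

    Each step replaces the centroid by the normalized geometric mean
    of the mid-histograms (p_i + c)/2, which solves the convexified
    subproblem in closed form; the objective decreases monotonically.
    """
    hists = np.asarray(hists, dtype=float)
    c = hists.mean(axis=0)              # initialize at the arithmetic mean
    for _ in range(iters):
        mids = 0.5 * (hists + c)        # mid-histograms m_i = (p_i + c)/2
        c = np.exp(np.mean(np.log(mids), axis=0))  # bin-wise geometric mean
        c /= c.sum()                    # renormalize onto the simplex
    return c
```

For example, `js_centroid_cccp([p1, p2])` returns a normalized histogram whose total Jensen-Shannon divergence to the inputs is no larger than that of the arithmetic mean it starts from.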