- categories: Linear algebra, Optimization, Theorem
Theorem:
The function $f(A) = -\log \det(A)$ is a convex function over the domain of positive definite matrices, $A \in \mathbb{S}^n_{++}$, where $\mathbb{S}^n_{++}$ is the set of $n \times n$ symmetric positive definite matrices.
Proof Outline
Step 1: Domain and Feasibility
- The determinant $\det(A)$ is positive for all $A \in \mathbb{S}^n_{++}$.
- $\log \det(A)$ is therefore well-defined and finite over $\mathbb{S}^n_{++}$.
- $f(A) = -\log \det(A)$ is thus a valid function on $\mathbb{S}^n_{++}$.
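To make Step 1 concrete, here is a minimal numerical sketch (the construction of $A$ and the use of NumPy are illustrative assumptions, not part of the source): for a symmetric positive definite $A$, $\det(A) > 0$ and $f(A) = -\log\det(A)$ is finite.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)             # a symmetric positive definite matrix

sign, logabsdet = np.linalg.slogdet(A)  # numerically stable log-determinant
assert sign > 0                         # det(A) > 0 on S^n_++
f_A = -logabsdet                        # f(A) = -log det(A) is finite
print(f_A)
```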
Step 2: First and Second Derivatives
To show convexity, we compute the Hessian of $f$ and verify that it is positive semidefinite.
- Gradient of $f$: Using matrix calculus, $\nabla f(A) = -A^{-1}$.
- Hessian of $f$: The gradient is a matrix-valued function. Differentiating it once more with respect to $A$ gives the Hessian as a quadratic form, $D^2 f(A)[V, V] = \operatorname{tr}\left(A^{-1} V A^{-1} V\right)$, for any symmetric perturbation matrix $V$. Both formulas are checked numerically in the sketch after this list.
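The following sketch (illustrative only; the step size and random construction are assumptions) compares finite differences of $f$ against the analytic expressions $\operatorname{tr}(-A^{-1} V)$ for the first directional derivative and $\operatorname{tr}(A^{-1} V A^{-1} V)$ for the second.

```python
import numpy as np

def f(A):
    return -np.linalg.slogdet(A)[1]      # f(A) = -log det(A)

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
A = B @ B.T + 3 * np.eye(3)              # symmetric positive definite
V = rng.standard_normal((3, 3))
V = (V + V.T) / 2                        # symmetric perturbation direction

A_inv = np.linalg.inv(A)
eps = 1e-4

# First directional derivative should equal tr(-A^{-1} V).
num_d1 = (f(A + eps * V) - f(A - eps * V)) / (2 * eps)
ana_d1 = np.trace(-A_inv @ V)

# Second directional derivative should equal tr(A^{-1} V A^{-1} V).
num_d2 = (f(A + eps * V) - 2 * f(A) + f(A - eps * V)) / eps**2
ana_d2 = np.trace(A_inv @ V @ A_inv @ V)

print(num_d1, ana_d1)   # agree to several decimal places
print(num_d2, ana_d2)
```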
Step 3: Positive Semidefiniteness
- The quadratic form of the Hessian is $D^2 f(A)[V, V] = \operatorname{tr}\left(A^{-1} V A^{-1} V\right)$.
- Since $A$ is positive definite (as $A \in \mathbb{S}^n_{++}$) and $V$ is symmetric, the matrix $A^{-1/2} V A^{-1/2}$ is also symmetric, and the cyclic property of the trace gives $\operatorname{tr}\left(A^{-1} V A^{-1} V\right) = \operatorname{tr}\left(\left(A^{-1/2} V A^{-1/2}\right)^{2}\right)$. The trace of the square of a symmetric matrix is non-negative: $\operatorname{tr}\left(\left(A^{-1/2} V A^{-1/2}\right)^{2}\right) = \left\lVert A^{-1/2} V A^{-1/2} \right\rVert_F^{2} \ge 0$.
- Thus, the Hessian is positive semidefinite (see the numerical check below).
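The non-negativity argument can also be spot-checked numerically; the sketch below (an illustration under assumed random inputs, not part of the proof) confirms that the quadratic form equals $\lVert A^{-1/2} V A^{-1/2} \rVert_F^2$ for random symmetric $V$.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))
A = B @ B.T + 3 * np.eye(3)                      # symmetric positive definite

# Inverse and inverse square root of A via its eigendecomposition.
w, Q = np.linalg.eigh(A)
A_inv = Q @ np.diag(1.0 / w) @ Q.T
A_inv_sqrt = Q @ np.diag(w ** -0.5) @ Q.T

for _ in range(5):
    V = rng.standard_normal((3, 3))
    V = (V + V.T) / 2                            # symmetric perturbation
    quad = np.trace(A_inv @ V @ A_inv @ V)       # Hessian quadratic form
    W = A_inv_sqrt @ V @ A_inv_sqrt              # symmetric by construction
    assert np.isclose(quad, np.sum(W * W))       # equals ||W||_F^2
    assert quad >= -1e-12                        # hence non-negative
```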
Step 4: Convexity
The positive semidefiniteness of the Hessian implies that $f(A) = -\log \det(A)$ is convex over $\mathbb{S}^n_{++}$.
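Equivalently, the defining convexity inequality $f(tA + (1-t)B) \le t f(A) + (1-t) f(B)$ can be checked directly along a segment; the snippet below is a minimal sketch with randomly generated SPD matrices (the helper `random_spd` is a hypothetical name, not from the source).

```python
import numpy as np

def f(A):
    return -np.linalg.slogdet(A)[1]              # f(A) = -log det(A)

def random_spd(n, seed):
    rng = np.random.default_rng(seed)
    B = rng.standard_normal((n, n))
    return B @ B.T + n * np.eye(n)               # symmetric positive definite

A, B = random_spd(4, 3), random_spd(4, 4)
for t in np.linspace(0.0, 1.0, 11):
    lhs = f(t * A + (1 - t) * B)                 # value on the segment
    rhs = t * f(A) + (1 - t) * f(B)              # value on the chord
    assert lhs <= rhs + 1e-12                    # function lies below the chord
```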
Intuition
- The determinant $\det(A)$ measures the volume of the parallelepiped defined by the rows (or columns) of $A$. Minimizing $-\log \det(A)$ encourages $A$ to “spread out” the volume.
- The function $-\log \det(A)$ grows rapidly as $A$ approaches singularity (where $\det(A) \to 0$), discouraging near-singular configurations, as illustrated in the short example below.
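A quick illustration of this barrier behaviour (the diagonal example is an assumption chosen for demonstration): shrinking one eigenvalue of $A$ toward zero sends $f(A) = -\log\det(A)$ to infinity like $-\log(\varepsilon)$.

```python
import numpy as np

for eps in [1e-1, 1e-3, 1e-6, 1e-9]:
    A = np.diag([1.0, 1.0, eps])                  # nearly singular as eps -> 0
    print(eps, -np.linalg.slogdet(A)[1])          # grows like -log(eps)
```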