PhyslibSearch

QuantumInfo.ForMathlib.Matrix

92 declarations

theorem

$\text{rank}(A) = 0 \implies A = 0$

#zero_rank_eq_zero

Let $A$ be an $n \times n$ matrix over a field $\mathbb{k}$, where the index set $n$ is finite. If the rank of the matrix $A$ is $0$, then $A$ is the zero matrix.
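A plausible Lean 4 rendering of this statement, as a sketch only (the typeclass assumptions are a guess at what the library requires; `Matrix.rank` is Mathlib's matrix rank):

```lean
import Mathlib

-- Sketch: rank zero forces the zero matrix. Proof elided.
example {n 𝕜 : Type*} [Fintype n] [DecidableEq n] [Field 𝕜]
    {A : Matrix n n 𝕜} (h : A.rank = 0) : A = 0 := by
  sorry
```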

theorem

$(c \cdot A)$ is Hermitian if $c$ is self-adjoint

#smul_selfAdjoint

Let $A$ be a Hermitian matrix and $c$ be a scalar in a field $\mathbb{k}$ (where $\mathbb{k}$ is $\mathbb{R}$ or $\mathbb{C}$). If $c$ is self-adjoint (meaning $c = \bar{c}$, or equivalently, $c$ is real-valued), then the scalar product $c \cdot A$ is also a Hermitian matrix.

theorem

$c \cdot A$ is Hermitian if $\text{Im}(c) = 0$ and $A$ is Hermitian

#smul_im_zero

Let $A$ be a Hermitian matrix and $c$ be a scalar in a field $\mathbb{k}$ (where $\mathbb{k}$ is $\mathbb{R}$ or $\mathbb{C}$). If the imaginary part of $c$ is zero, then the scalar product $c \cdot A$ is also a Hermitian matrix.

theorem

Real Scalar Multiplication Preserves Hermitian Property of Matrices

#smul_real

Let $A$ be a Hermitian matrix and $c$ be a real number. Then the scalar product $c \cdot A$ is also a Hermitian matrix. Here, a matrix $A$ is Hermitian if it equals its conjugate transpose ($A = A^*$), and $c \cdot A$ denotes the multiplication of each entry of $A$ by the real scalar $c$.

definition

The $\mathbb{R}$-subspace of $n \times n$ Hermitian matrices

#HermitianSubspace

Given a finite set $n$ and a field $\mathbb{k}$ that is either the real numbers $\mathbb{R}$ or the complex numbers $\mathbb{C}$ (an `RCLike` field), the set of all $n \times n$ Hermitian matrices $A$ (where $A = A^*$) forms a subspace of the vector space of all $n \times n$ matrices over $\mathbb{k}$, when considered as a vector space over the real numbers $\mathbb{R}$.
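A sketch of how such a definition might look in Lean 4, assuming Mathlib's `Matrix.IsHermitian` API; the real-scalar closure proof is elided, since it is exactly the `smul_real`-style lemmas listed above:

```lean
import Mathlib

-- Sketch: the ℝ-submodule of Hermitian matrices. Field names follow
-- Mathlib's `Submodule` structure; proofs are illustrative or elided.
example (n 𝕜 : Type*) [Fintype n] [RCLike 𝕜] :
    Submodule ℝ (Matrix n n 𝕜) where
  carrier := {A | A.IsHermitian}
  add_mem' := fun hA hB => hA.add hB  -- sums of Hermitian matrices are Hermitian
  zero_mem' := Matrix.isHermitian_zero
  smul_mem' := by intros; sorry       -- closure under real scalars (cf. `smul_real`)
```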

theorem

The trace of a Hermitian matrix is real ($\operatorname{Re}(\operatorname{Tr}(A)) = \operatorname{Tr}(A)$)

#re_trace_eq_trace

Let $A$ be a Hermitian matrix. Then the real part of the trace of $A$ is equal to the trace of $A$ itself, so the trace of $A$ is a real number: $\operatorname{Re}(\operatorname{Tr}(A)) = \operatorname{Tr}(A)$.

theorem

$\sum \text{eigenvalues}(A) = \text{tr}(A)$ for Hermitian $A$

#sum_eigenvalues_eq_trace

Let $A$ be a Hermitian matrix. The sum of the eigenvalues of $A$, denoted $\sum_{i} \lambda_i$, is equal to the trace of the matrix, $\text{tr}(A)$. Here, $i$ ranges over the index set of the matrix dimensions.

theorem

If $\text{eigenvalues}(A) = 0$ for Hermitian $A$, then $A = 0$

#eigenvalues_zero_eq_zero

Let $A$ be a Hermitian matrix. If all the eigenvalues $\lambda_i$ of $A$ are equal to zero, then $A$ is the zero matrix ($A = 0$).

theorem

$(A \otimes B)^\text{H} = A^\text{H} \otimes B^\text{H}$

#kroneckerMap_conjTranspose

Let $A$ and $B$ be two matrices. The conjugate transpose (Hermitian conjugate) of the Kronecker product of $A$ and $B$, denoted $(A \otimes B)^\text{H}$, is equal to the Kronecker product of their respective conjugate transposes, $A^\text{H} \otimes B^\text{H}$.

theorem

The Kronecker product of Hermitian matrices is Hermitian

#kroneckerMap_IsHermitian

Let $A$ and $B$ be Hermitian matrices. Then their Kronecker product $A \otimes B$ is also a Hermitian matrix.

theorem

$\text{tr}(A) = 0 \implies A = 0$ for positive semidefinite matrices

#trace_zero

Let $A$ be a positive semidefinite matrix. If the trace of $A$ is zero ($\text{tr}(A) = 0$), then $A$ must be the zero matrix ($A = 0$).

theorem

$\text{Tr}(A) = 0 \iff A = 0$ for positive semidefinite $A$

#trace_zero_iff

Let $A$ be a positive semidefinite matrix. Then the trace of $A$ is zero if and only if $A$ is the zero matrix, i.e., $\text{Tr}(A) = 0 \iff A = 0$.

theorem

$|z|^2 = \bar{z} z$

#normSq_eq_conj_mul_self

Let $z$ be an element of a field $\mathbb{K}$ that is either the real numbers $\mathbb{R}$ or the complex numbers $\mathbb{C}$. The squared norm of $z$, denoted $|z|^2$ (or `normSq z`), is equal to the product of its complex conjugate and itself, i.e., $|z|^2 = \bar{z} z$.

theorem

Finsupp sum as a conditional sum over a finite type

#sum_eq_ite

Let $\alpha$ be a finite set and $M, N$ be types such that $M$ has a zero element and $N$ is an additive commutative monoid. For any finitely supported function $f: \alpha \to M$ (denoted by `f : α →₀ M`) and any function $g: \alpha \to M \to N$, the sum of $g$ over the support of $f$ is equal to the sum over all $i \in \alpha$ of the conditional expression: if $f(i) \neq 0$ then $g(i, f(i))$, otherwise $0$. That is, $$\sum_{i \in \text{supp}(f)} g(i, f(i)) = \sum_{i \in \alpha} \begin{cases} g(i, f(i)) & \text{if } f(i) \neq 0 \\ 0 & \text{if } f(i) = 0 \end{cases}$$
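In Lean terms the statement might look as follows (a sketch; the `DecidableEq` assumption is my addition to make the `if` well-formed, and the proof is elided):

```lean
import Mathlib

-- Sketch: a `Finsupp.sum` rewritten as a sum over the whole finite type,
-- guarded by whether `f i` is zero.
example {α M N : Type*} [Fintype α] [Zero M] [DecidableEq M] [AddCommMonoid N]
    (f : α →₀ M) (g : α → M → N) :
    f.sum g = ∑ i, if f i ≠ 0 then g i (f i) else 0 := by
  sorry
```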

theorem

The outer product $v v^*$ is positive semidefinite.

#outer_self_conj

Let $v$ be a vector in $\mathbb{K}^n$, where $\mathbb{K}$ is either the real or complex numbers. Let $v^*$ denote the complex conjugate vector of $v$ (given by $\text{conj } v$). Then the outer product of $v$ and its conjugate, the matrix $v v^*$ (represented by `vecMulVec v (conj v)`), is a positive semidefinite matrix.
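A hedged Lean sketch of the statement, using Mathlib's `Matrix.vecMulVec` and `Matrix.PosSemidef` (here written with `star v` for the conjugate vector):

```lean
import Mathlib

-- Sketch: `vecMulVec v (star v)` is the outer product v vᴴ; positivity
-- follows since xᴴ (v vᴴ) x = |vᴴ x|² ≥ 0. Proof elided.
example {n 𝕜 : Type*} [Fintype n] [RCLike 𝕜] (v : n → 𝕜) :
    (Matrix.vecMulVec v (star v)).PosSemidef := by
  sorry
```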

theorem

A non-negative linear combination of positive semidefinite matrices is positive semidefinite

#convex_cone

Let $A$ and $B$ be positive semidefinite matrices and let $c_1, c_2 \in \mathbb{K}$ be scalars such that $0 \le c_1$ and $0 \le c_2$. Then the linear combination $c_1 A + c_2 B$ is also a positive semidefinite matrix.

theorem

A positive standard basis matrix is positive semidefinite iff row equals column

#stdBasisMatrix_iff_eq

Let $M$ be a matrix whose entry at row $i$ and column $j$ is $c$, where $c$ is a positive constant ($0 < c$), and all other entries are zero. Then $M$ is positive semidefinite if and only if $i = j$.

theorem

The Kronecker product of positive semidefinite matrices is positive semidefinite

#PosSemidef_kronecker

Let $A$ and $B$ be positive semidefinite matrices. Then their Kronecker product $A \otimes_k B$ is also a positive semidefinite matrix.

theorem

$x^* A x = 0$ for all $x$ iff $A = 0$

#zero_dotProduct_zero_iff

Let $A$ be a positive semidefinite matrix. Then the quadratic form $x^* A x$ vanishes for every vector $x$ if and only if $A = 0$, where $x^*$ denotes the conjugate transpose of $x$.

theorem

If $c \cdot A$ is positive semidefinite and $c > 0$, then $A$ is positive semidefinite

#pos_smul

Let $A$ be a matrix and $c$ be a scalar in a field $\mathbb{k}$. If the scalar product $c \cdot A$ is a positive semidefinite matrix and $c > 0$, then $A$ is also a positive semidefinite matrix.

theorem

$A \ge 0$ and $-A \ge 0$ if and only if $A = 0$

#zero_posSemidef_neg_posSemidef_iff

Let $A$ be a matrix. Then $A$ is positive semidefinite and its negation $-A$ is also positive semidefinite if and only if $A$ is the zero matrix ($A = 0$).

theorem

The kernel of a positive definite matrix is trivial ($\ker(A) = \{0\}$)

#toLin_ker_eq_bot

Let $A$ be a positive definite matrix. Then the kernel of the linear map `A.toLin'` associated with $A$ is trivial, i.e., $\ker(A) = \{0\}$ (represented as $\text{ker } A = \bot$).

theorem

A positive semidefinite matrix with trivial kernel is positive definite

#of_toLin_ker_eq_bot

Let $A$ be a matrix. If the kernel of the linear map defined by $A$ is trivial (i.e., $\ker(A) = \{0\}$) and $A$ is positive semidefinite, then $A$ is positive definite.

theorem

$\ker(A) \subseteq \ker(B) \iff \text{range}(B) \subseteq \text{range}(A)$ for Hermitian Matrices

#ker_range_antitone

Let $d$ be a finite type with decidable equality, and let $A$ and $B$ be $d \times d$ Hermitian matrices over the complex numbers $\mathbb{C}$. The kernel of the linear map associated with $A$ is contained in the kernel of the linear map associated with $B$ if and only if the range of the linear map associated with $B$ is contained in the range of the linear map associated with $A$. That is, $\ker(A) \subseteq \ker(B)$ if and only if $\text{range}(B) \subseteq \text{range}(A)$.

theorem

An additive map $f$ is monotone if it maps positive semidefinite matrices to non-negative values

#le_of_nonneg_imp

Let $R$ be a type that forms an ordered additive abelian group (specifically, an `AddCommGroup` with a `PartialOrder` such that addition is monotone). Let $f: \text{Matrix}_{n \times n}(\mathbb{k}) \to R$ be an additive group homomorphism. If $f$ maps every positive semidefinite matrix $A$ to a non-negative element of $R$ (i.e., $A \ge 0 \implies f(A) \ge 0$), then $f$ is monotone with respect to the Loewner order: for any $n \times n$ matrices $A$ and $B$, if $A \le B$, then $f(A) \le f(B)$.

theorem

$x \le y \implies f(x) \le f(y)$ for Positive-Preserving Homomorphisms into Matrices

#le_of_nonneg_imp'

Let $R$ be an ordered additive abelian group and let $f: R \to \text{Mat}_{n \times n}(\mathbb{k})$ be an additive group homomorphism. Suppose that for every $x \in R$, if $0 \le x$, then the matrix $f(x)$ is positive semidefinite. Then for any $x, y \in R$ such that $x \le y$, it follows that $f(x) \le f(y)$ in the Loewner order (i.e., $f(y) - f(x)$ is positive semidefinite).

theorem

$A \le B \implies CAC^* \le CBC^*$ for matrices

#mul_mul_conjTranspose_mono

Let $A, B \in \text{Mat}_{n \times n}(\mathbb{k})$ be Hermitian matrices and $C \in \text{Mat}_{m \times n}(\mathbb{k})$ be an arbitrary matrix. If $A \le B$ in the Loewner order (meaning $B - A$ is a positive semidefinite matrix), then $CAC^* \le CBC^*$, where $C^*$ denotes the conjugate transpose of $C$.

theorem

$A \le B \implies C^* A C \le C^* B C$ for matrices

#conjTranspose_mul_mul_mono

Let $A, B \in \text{Mat}_{n \times n}(\mathbb{k})$ be Hermitian matrices and $C \in \text{Mat}_{n \times m}(\mathbb{k})$ be an arbitrary matrix. If $A \le B$ in the Loewner order (meaning $B - A$ is a positive semidefinite matrix), then $C^* A C \le C^* B C$, where $C^*$ denotes the conjugate transpose of $C$.

theorem

$0 \le A \iff \forall i,\ 0 \le \lambda_i$

#nonneg_iff_eigenvalue_nonneg

Let $A$ be a Hermitian matrix in $\text{Mat}_{n \times n}(\mathbb{k})$ and let $hA$ be the proof of its Hermiticity. The matrix $A$ is positive semidefinite ($0 \le A$) if and only if all of its eigenvalues are non-negative, i.e., $0 \le \lambda_i$ for all $i \in \{1, \dots, n\}$, where the $\lambda_i$ are the eigenvalues of $A$ provided by $hA$.
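A plausible Lean shape for this statement, assuming Mathlib's `Matrix.PosSemidef` and the eigenvalues attached to an `IsHermitian` proof (sketch only):

```lean
import Mathlib

-- Sketch: positive semidefiniteness ↔ nonnegativity of the eigenvalues
-- supplied by the Hermiticity proof `hA`. Proof elided.
example {n 𝕜 : Type*} [Fintype n] [DecidableEq n] [RCLike 𝕜]
    {A : Matrix n n 𝕜} (hA : A.IsHermitian) :
    A.PosSemidef ↔ ∀ i, 0 ≤ hA.eigenvalues i := by
  sorry
```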

theorem

The Diagonal Map is Monotone with Respect to the Loewner Order

#diag_monotone

Let $\mathbb{k}$ be a field with a partial order (specifically one where matrices can be positive semidefinite, such as $\mathbb{R}$ or $\mathbb{C}$). The diagonal map $\text{diag} : M_n(\mathbb{k}) \to \mathbb{k}^n$, which associates a square matrix $A$ with the vector of its diagonal entries $A_{ii}$, is monotone with respect to the Loewner order on matrices and the entrywise order on vectors. That is, if $A \le B$ in the sense that $B - A$ is a positive semidefinite matrix, then for every index $i$, the diagonal entries satisfy $A_{ii} \le B_{ii}$.

theorem

$A \le B \implies \text{diag}(A)_i \le \text{diag}(B)_i$

#diag_mono

Let $A$ and $B$ be square matrices. If $A \le B$ with respect to the Loewner order (meaning $B - A$ is a positive semidefinite matrix), then for every index $i$, the diagonal entries satisfy $A_{ii} \le B_{ii}$, where $A_{ii}$ and $B_{ii}$ denote the $i$-th elements of the diagonal vectors of $A$ and $B$ respectively.

theorem

The trace function is monotone with respect to the Loewner order

#trace_monotone

The trace function $\text{tr}: \text{Mat}_{n \times n}(\mathbb{k}) \to \mathbb{k}$ is monotone with respect to the Loewner order. Specifically, if $A$ and $B$ are square matrices over $\mathbb{k}$ such that $A \le B$ (meaning $B - A$ is positive semidefinite), then $\text{tr}(A) \le \text{tr}(B)$.

theorem

$A \le B \implies \text{tr}(A) \le \text{tr}(B)$ for Positive Semidefinite Matrices

#trace_mono

Let $A$ and $B$ be positive semidefinite matrices of the same size with entries in a field $\mathbb{k}$. If $A \le B$ with respect to the Loewner order (meaning $B - A$ is positive semidefinite), then the trace of $A$ is less than or equal to the trace of $B$, i.e., $\text{tr}(A) \le \text{tr}(B)$.

theorem

The `diagonal` matrix construction is monotone with respect to the Loewner order

#diagonal_monotone

The function `diagonal`, which maps a vector $d \in \mathbb{K}^n$ to the diagonal matrix with entries $d$, is monotone. That is, for any vectors $d_1, d_2 \in \mathbb{K}^n$, if $d_1 \le d_2$ component-wise, then the corresponding diagonal matrices satisfy $\text{diag}(d_1) \le \text{diag}(d_2)$ in the Loewner order (i.e., their difference $\text{diag}(d_2) - \text{diag}(d_1)$ is positive semidefinite).

theorem

$d_1 \le d_2 \implies \text{diag}(d_1) \le \text{diag}(d_2)$ for Positive Semidefinite Matrices

#diagonal_mono

Let $d_1$ and $d_2$ be vectors of length $n$ with entries in a field $\mathbb{k}$. If $d_1 \le d_2$ (entrywise), then the diagonal matrix formed from $d_1$ is less than or equal to the diagonal matrix formed from $d_2$ in the Loewner order (i.e., $\text{diag}(d_2) - \text{diag}(d_1)$ is a positive semidefinite matrix).

theorem

$d_1 \le d_2 \iff \text{diag}(d_1) \le \text{diag}(d_2)$

#diagonal_le_iff

For any two vectors $d_1, d_2 \in \mathbb{k}^n$, the inequality $d_1 \leq d_2$ holds component-wise if and only if the corresponding diagonal matrices satisfy the Loewner partial order $\text{diag}(d_1) \leq \text{diag}(d_2)$. Here, $\text{diag}(d)$ is the square matrix with the elements of $d$ on the main diagonal and zeros elsewhere, and the inequality between matrices refers to the positive semidefiniteness of their difference.

theorem

$(\forall i,\ \lambda_i \le c) \iff A \le cI$ for Hermitian Matrices

#le_smul_one_of_eigenvalues_iff

Let $A \in \text{Mat}_{n \times n}(\mathbb{k})$ be a Hermitian matrix and $c \in \mathbb{R}$ be a scalar. Then all the eigenvalues $\lambda_i$ of $A$ satisfy $\lambda_i \le c$ if and only if $A \le cI$ in the Loewner order, where $I$ is the $n \times n$ identity matrix and $A \le cI$ means that $cI - A$ is a positive semidefinite matrix.

theorem

$(\forall i,\ c \le \lambda_i) \iff c I \le A$

#smul_one_le_of_eigenvalues_iff

Let $A$ be an $n \times n$ Hermitian matrix over a field $\mathbb{k}$, and let $c$ be a real number. Then $c \le \lambda_i$ for all eigenvalues $\lambda_i$ of $A$ if and only if $c I \le A$ in the Loewner order, where $I$ is the $n \times n$ identity matrix and the inequality $c I \le A$ indicates that $A - c I$ is a positive semidefinite matrix.

definition

Left partial trace of a matrix

#traceLeft

Given a matrix $m$ with row indices in $d \times d_1$ and column indices in $d \times d_2$ over a ring $R$, the left partial trace $\text{traceLeft}(m)$ is a matrix in $M_{d_1 \times d_2}(R)$. Each entry of the resulting matrix at position $(i_1, j_1)$ is defined as the sum over the common first index $i_2 \in d$: $$(\text{traceLeft}(m))_{i_1, j_1} = \sum_{i_2 \in d} m_{(i_2, i_1), (i_2, j_1)}$$ This operation corresponds to taking the partial trace over the first subsystem in a tensor product space.
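A minimal Lean sketch of this definition (the binder style and typeclass assumptions are a guess; only the summation formula is taken from the entry above):

```lean
import Mathlib

-- Sketch: sum the diagonal over the first tensor factor.
def traceLeft {d d₁ d₂ R : Type*} [Fintype d] [AddCommMonoid R]
    (m : Matrix (d × d₁) (d × d₂) R) : Matrix d₁ d₂ R :=
  Matrix.of fun i₁ j₁ => ∑ i₂, m (i₂, i₁) (i₂, j₁)
```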

definition

Right partial trace of a matrix

#traceRight

Given a matrix $m$ with row indices in $d_1 \times d$ and column indices in $d_2 \times d$ over a ring $R$, the right partial trace $\text{traceRight}(m)$ is a matrix in $M_{d_1 \times d_2}(R)$. Each entry of the resulting matrix at position $(i_2, j_2)$ is defined as the sum over the common second index $i_1 \in d$: $$(\text{traceRight}(m))_{i_2, j_2} = \sum_{i_1 \in d} m_{(i_2, i_1), (j_2, i_1)}$$ This operation corresponds to taking the partial trace over the second subsystem in a tensor product space.
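The mirror-image Lean sketch for the right partial trace, under the same assumptions as the left version (binder style is illustrative):

```lean
import Mathlib

-- Sketch: sum the diagonal over the second tensor factor.
def traceRight {d d₁ d₂ R : Type*} [Fintype d] [AddCommMonoid R]
    (m : Matrix (d₁ × d) (d₂ × d) R) : Matrix d₁ d₂ R :=
  Matrix.of fun i j => ∑ k, m (i, k) (j, k)
```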

theorem

$\text{tr}(\text{traceLeft}(A)) = \text{tr}(A)$

#traceLeft_trace

Let $A$ be a matrix with row and column indices in $d_1 \times d_2$ over a ring $R$. The trace of the left partial trace of $A$ is equal to the trace of $A$: $\text{tr}(\text{traceLeft}(A)) = \text{tr}(A)$, where $\text{traceLeft}(A)$ is the partial trace over the first subsystem.

theorem

$\text{tr}(\text{traceRight}(A)) = \text{tr}(A)$

#traceRight_trace

Let $A$ be a matrix with row and column indices in $d_1 \times d_2$ over a ring $R$. The trace of the right partial trace of $A$ is equal to the trace of $A$, i.e., $\text{tr}(\text{traceRight}(A)) = \text{tr}(A)$, where $\text{traceRight}(A)$ is the partial trace over the second subsystem.

theorem

The Partial Trace over the Left Subsystem of a Hermitian Matrix is Hermitian

#traceLeft

Let $A$ be a matrix of shape $(d \times d_1) \times (d \times d_1)$ over a ring $R$ with an involution. If $A$ is a Hermitian matrix, then its partial trace over the left subsystem, denoted by $A.\text{traceLeft}$, is also a Hermitian matrix.

theorem

The Partial Trace over the Right Subsystem of a Hermitian Matrix is Hermitian

#traceRight

Let $A$ be a matrix of shape $(d_1 \times d) \times (d_1 \times d)$ over a ring $R$. If $A$ is a Hermitian matrix, then its partial trace over the right subsystem, denoted by $A.\text{traceRight}$, is also a Hermitian matrix.

theorem

$\text{Tr}(M(A \otimes I)) = \text{Tr}(\text{Tr}_B(M)\,A)$

#trace_mul_kron_one_right

Let $R$ be a ring, and let $M \in \text{Mat}_{d_A \times d_B, d_A \times d_B}(R)$ be a block matrix. For any matrix $A \in \text{Mat}_{d_A, d_A}(R)$, the trace of the product of $M$ with the Kronecker product of $A$ and the identity matrix $I_{d_B}$ is equal to the trace of the product of the partial trace of $M$ over the second subsystem with $A$. That is, $\text{Tr}(M(A \otimes I)) = \text{Tr}(\text{Tr}_B(M)\,A)$, where $\text{Tr}_B$ denotes the partial trace `traceRight`.
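A hedged Lean sketch of the statement, using Mathlib's scoped Kronecker notation `⊗ₖ` and assuming the library's `Matrix.traceRight` (the instance arguments are my guess):

```lean
import Mathlib

open scoped Kronecker

-- Sketch: Tr(M (A ⊗ I)) = Tr(Tr_B(M) A). Proof elided.
example {dA dB R : Type*} [Fintype dA] [Fintype dB] [DecidableEq dB]
    [CommRing R] (M : Matrix (dA × dB) (dA × dB) R) (A : Matrix dA dA R) :
    (M * (A ⊗ₖ (1 : Matrix dB dB R))).trace = (M.traceRight * A).trace := by
  sorry
```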

theorem

$\text{Tr}(M (I \otimes B)) = \text{Tr}(\text{Tr}_A(M)\,B)$

#trace_mul_one_kron_right

Let $R$ be a ring, and let $M$ be a matrix of shape $(d_A \times d_B) \times (d_A \times d_B)$ over $R$. For any matrix $B$ of shape $d_B \times d_B$, the trace of the product of $M$ with the Kronecker product of the identity matrix $I_{d_A}$ and $B$ is equal to the trace of the product of the partial trace of $M$ over the first subsystem with $B$. That is, $\text{Tr}(M (I_{d_A} \otimes B)) = \text{Tr}(\text{Tr}_A(M)\,B)$, where $\text{Tr}_A$ denotes the partial trace over the left subsystem (`traceLeft`).

theorem

The partial trace over the first subsystem preserves positive semidefiniteness.

#traceLeft

Let $A$ be a positive semidefinite block matrix in $\text{Mat}_{d \times d_1, d \times d_1}(R)$. Then its partial trace over the first subsystem, denoted by $A.\text{traceLeft} \in \text{Mat}_{d_1, d_1}(R)$, is also positive semidefinite.

theorem

The Right Partial Trace of a Positive Semidefinite Matrix is Positive Semidefinite

#traceRight

Let $A$ be a matrix with row and column indices in $d_1 \times d$ over a field $\mathbb{k}$. If $A$ is positive semidefinite, then its right partial trace $\text{traceRight}(A)$, the $d_1 \times d_1$ matrix defined by $(\text{traceRight}(A))_{i, j} = \sum_{k \in d} A_{(i, k), (j, k)}$, is also positive semidefinite.

theorem

The Kronecker product of two positive definite matrices is positive definite.

#kron

Let $d_1$ and $d_2$ be finite types with decidable equality, and let $\mathbb{k}$ be an `RCLike` field (such as $\mathbb{R}$ or $\mathbb{C}$). For any two square matrices $A \in \text{Mat}_{d_1 \times d_1}(\mathbb{k})$ and $B \in \text{Mat}_{d_2 \times d_2}(\mathbb{k})$, if $A$ is positive definite and $B$ is positive definite, then their Kronecker product $A \otimes_k B$ is also positive definite.

theorem

An injective submatrix of a positive definite matrix is positive definite

#submatrix

Let $\mathbb{k}$ be a field which is either the real numbers $\mathbb{R}$ or the complex numbers $\mathbb{C}$. Let $d_1$ and $d_2$ be finite types, and let $M \in \text{Matrix}(d_1, d_1, \mathbb{k})$ be a square matrix. If $M$ is positive definite and $e : d_2 \to d_1$ is an injective function, then the submatrix $(M_{e(i), e(j)})_{i, j \in d_2}$ is also positive definite.

theorem

If $M$ is positive definite, then $M.\text{reindex}(e, e)$ is positive definite

#reindex

Let $\mathbb{k}$ be a field which is either the real numbers $\mathbb{R}$ or the complex numbers $\mathbb{C}$. Let $d_1$ and $d_2$ be finite types with decidable equality. For any square matrix $M \in \text{Matrix}(d_1, d_1, \mathbb{k})$ and any equivalence (bijection) $e : d_1 \simeq d_2$, if $M$ is positive definite, then the reindexed matrix $(M \circ (e^{-1} \times e^{-1})) \in \text{Matrix}(d_2, d_2, \mathbb{k})$ is also positive definite.

theorem

$M$ is positive definite $\iff$ $M.\text{reindex}(e, e)$ is positive definite

#reindex_iff

Let $\mathbb{k}$ be a field which is either the real numbers $\mathbb{R}$ or the complex numbers $\mathbb{C}$. Let $d_1$ and $d_2$ be finite types. For any square matrix $M \in \text{Matrix}(d_1, d_1, \mathbb{k})$ and any equivalence (bijection) $e : d_1 \simeq d_2$, the matrix $M$ is positive definite if and only if the reindexed matrix $(M \circ (e^{-1} \times e^{-1})) \in \text{Matrix}(d_2, d_2, \mathbb{k})$ is positive definite.

theorem

$M \succeq 0$ and $c \ge 0$ implies $cM \succeq 0$ for complex matrices

#rsmul

Let $M$ be a complex matrix of size $n \times n$ indexed by a finite type. If $M$ is positive semidefinite ($M \succeq 0$) and $c$ is a non-negative real number ($c \ge 0$), then the scalar product $cM$ is also positive semidefinite.

theorem

The set of positive definite matrices is convex

#Convex

Let $\mathbb{k}$ be either the real numbers $\mathbb{R}$ or the complex numbers $\mathbb{C}$, and let $n$ be a finite indexing type. The set of positive definite square matrices of size $n \times n$ over $\mathbb{k}$ is a convex set over the real numbers. That is, for any two positive definite matrices $M_1, M_2$ and any $t \in [0, 1]$, the linear combination $(1 - t)M_1 + tM_2$ is also positive definite.

theorem

$M \succ 0$ iff $M$ is Hermitian and has strictly positive eigenvalues

#PosDef_iff_eigenvalues'

Let $M$ be a square matrix of size $d \times d$ over a field $\mathbb{k}$ (where $\mathbb{k}$ is either $\mathbb{R}$ or $\mathbb{C}$). $M$ is positive definite ($M \succ 0$) if and only if $M$ is Hermitian and all of its eigenvalues are strictly positive.

theorem

Eigenvalues of $f(M)$ are $f$ applied to eigenvalues of Hermitian matrix $M$

#cfc_eigenvalues

Let $M$ be a Hermitian matrix in $\text{Mat}_{d \times d}(\mathbf{k})$, where $\mathbf{k}$ is an `RCLike` field (such as $\mathbb{R}$ or $\mathbb{C}$), and let $f: \mathbb{R} \to \mathbb{R}$ be a function. Let $f(M)$ denote the matrix obtained by applying the continuous functional calculus to $M$ with respect to $f$. Then there exists a permutation $\sigma$ of the index set $d$ such that the eigenvalues of $f(M)$ are given by the composition $f \circ \lambda \circ \sigma$, where $\lambda$ denotes the eigenvalues of $M$. In other words, the set of eigenvalues of $f(M)$ is $\{f(\lambda_i) \mid i \in d\}$.

theorem

Eigenvalues of a unitarily diagonalizable Hermitian matrix are a permutation of its diagonal entries

#eigenvalues_eq_of_unitary_similarity_diagonal

Let $d$ be a finite index set and $\mathbb{k}$ be a field ($\mathbb{R}$ or $\mathbb{C}$). Let $A$ be a Hermitian $d \times d$ matrix over $\mathbb{k}$. If there exists a unitary matrix $U$ and a diagonal matrix $D = \text{diag}(f_i)$ with real entries $f: d \to \mathbb{R}$ such that $A$ is unitarily similar to $D$, i.e., $A = U \text{diag}(f_i) U^*$, then there exists a permutation $\sigma$ of the index set $d$ such that the eigenvalues of $A$ are given by the entries of $f$ permuted by $\sigma$, specifically: $\text{eigenvalues}(A) \circ \sigma = f$.

theorem

The linear map of the identity matrix is the identity operator

#toEuclideanLin_one

Let $I$ be the $n \times n$ identity matrix over a ring $\alpha$. Then the linear map associated with $I$ acting on the Euclidean space $\alpha^n$, denoted by $\text{Matrix.toEuclideanLin}(I)$, is the identity linear operator $\text{id}$.

theorem

Functional calculus for diagonal matrices: $\text{cfc}(f, \text{diag}(g)) = \text{diag}(f \circ g)$

#cfc_diagonal

Let $d$ be a finite index set and $\mathbf{k}$ be either $\mathbb{R}$ or $\mathbb{C}$. Given a function $g: d \to \mathbb{R}$ and a real-valued function $f: \mathbb{R} \to \mathbb{R}$, let $D = \text{diag}(g)$ be the diagonal matrix whose entries are the values of $g$ cast into $\mathbf{k}$. Then the continuous functional calculus (cfc) of $f$ applied to $D$ is equal to the diagonal matrix whose entries are the values of $f \circ g$, specifically: $\text{cfc}(f, \text{diag}(g)) = \text{diag}(f \circ g)$.

theorem

Eigenvalues of a Positive Semidefinite Matrix are Non-negative

#pos_of_mem_spectrum

Let $A$ be a $d \times d$ matrix over a field $\mathbb{K}$ (where $\mathbb{K}$ is $\mathbb{R}$ or $\mathbb{C}$). If $A$ is a positive semidefinite matrix, then every element $r$ in the spectrum of $A$, that is, every eigenvalue $r \in \mathbb{R}$, satisfies $r \ge 0$.

theorem

$\text{cfc}(r^{x+y}, A) = \text{cfc}(r^x \cdot r^y, A)$ for Positive Semidefinite $A$

#pow_add

Let $A$ be a $d \times d$ positive semidefinite matrix over $\mathbb{K}$ (where $\mathbb{K}$ is either $\mathbb{R}$ or $\mathbb{C}$). For any real numbers $x$ and $y$ whose sum is non-zero ($x + y \neq 0$), the continuous functional calculus of the power function $r \mapsto r^{x+y}$ applied to $A$ is equal to the continuous functional calculus of the product of power functions $r \mapsto r^x \cdot r^y$ applied to $A$. That is, $\text{cfc}(r \mapsto r^{x+y}, A) = \text{cfc}(r \mapsto r^x \cdot r^y, A)$.

theorem

The Law of Exponents $(r^x)^y = r^{xy}$ for the Continuous Functional Calculus of a Positive Semidefinite Matrix

#pow_mul

Let $A$ be a positive semidefinite $d \times d$ matrix over a field $\mathbb{K}$ (where $\mathbb{K}$ is $\mathbb{R}$ or $\mathbb{C}$). For any real numbers $x$ and $y$, the matrix constructed via the continuous functional calculus (cfc) using the function $f(r) = r^{xy}$ is equal to the matrix constructed using the function $g(r) = (r^x)^y$. That is, $\text{cfc}(r \mapsto r^{xy}, A) = \text{cfc}(r \mapsto (r^x)^y, A)$.

theorem

$\text{tr}(A_{e,e}) = \text{tr}(A)$ for any bijection $e$

#trace_submatrix

Let $A$ be a square matrix of size $d_1 \times d_1$ with entries in $\alpha$, and let $e: d_2 \simeq d_1$ be a bijection (index equivalence) between the index sets $d_2$ and $d_1$. Then the trace of the submatrix of $A$ formed by reindexing both rows and columns by $e$ is equal to the trace of $A$. That is, $\text{tr}(A_{e(i), e(j)}) = \text{tr}(A)$.

theorem

Spectrum of Kronecker Product of Hermitian Matrices is the Product of Spectra

#spectrum_prod

Let $d$ and $d_2$ be finite types with decidable equality. Let $A$ be a $d \times d$ matrix and $B$ be a $d_2 \times d_2$ matrix over a field $\mathbb{k}$. If $A$ and $B$ are both Hermitian matrices, then the spectrum of their Kronecker product $A \otimes_{\mathbb{k}} B$ is equal to the set-wise product of their individual spectra, i.e., $\text{spectrum}_{\mathbb{R}}(A \otimes_{\mathbb{k}} B) = \text{spectrum}_{\mathbb{R}}(A) \cdot \text{spectrum}_{\mathbb{R}}(B)$.

theorem

A positive definite matrix is strictly greater than zero

#zero_lt

Let $n$ be a finite non-empty type and $A$ be an $n \times n$ matrix over the complex numbers $\mathbb{C}$. If $A$ is positive definite, then $0 < A$ in the Loewner order, i.e., $A$ is strictly greater than the zero matrix.

theorem

The Spectrum of a Hermitian Matrix is the Set of its Eigenvalues

#spectrum_eq_image_eigenvalues

Let $n$ be a finite type and $A$ be an $n \times n$ Hermitian matrix over $\mathbb{C}$. The spectrum of $A$ (as a set of real numbers) is equal to the set of its eigenvalues, i.e., $\text{spectrum}_{\mathbb{R}}(A) = \{ \lambda_i \mid i \in n \}$, where $\lambda_i$ denotes the eigenvalues associated with the Hermitian matrix $A$.

theorem

Cardinality of spectrum equals the number of distinct eigenvalues for Hermitian $A$

#card_spectrum_eq_image

Let $n$ be a finite index set and $A \in M_n(\mathbb{C})$ be a Hermitian matrix. Assume that the spectrum of $A$ over $\mathbb{R}$, denoted $\sigma_{\mathbb{R}}(A)$, is a finite set. Then the cardinality of $\sigma_{\mathbb{R}}(A)$ is equal to the number of distinct values among the eigenvalues of $A$.

theorem

$A - (\inf \lambda_i) I$ is Positive Semidefinite for Hermitian $A$

#sub_iInf_eignevalues

Let $A$ be a Hermitian matrix. Then the matrix $A - (\inf \lambda_i) I$ is positive semidefinite, where $\inf \lambda_i$ denotes the infimum of the eigenvalues of $A$ and $I$ denotes the identity matrix.

theorem

$\min(\lambda(A)) \cdot \|v\|^2 \leq v^* A v$ for Hermitian matrix $A$

#iInf_eigenvalues_le_dotProduct_mulVec

Let $A$ be a Hermitian matrix and $v \in \mathbb{C}^d$ be a vector. Then the product of the infimum of the eigenvalues of $A$ and the squared norm of $v$ (computed as $v^* v$) is less than or equal to the quadratic form $v^* A v$, where $\inf \lambda_i$ denotes the smallest eigenvalue of $A$. In symbols, $(\inf \lambda_i)(v^* v) \le v^* A v$.

theorem

If $B - A$ is Positive Semidefinite, then $\inf \lambda(A) \le \inf \lambda(B)$

#iInf_eigenvalues_le_of_posSemidef

Let $A$ and $B$ be Hermitian matrices. If the difference $B - A$ is positive semidefinite, then the infimum of the eigenvalues of $A$ is less than or equal to the infimum of the eigenvalues of $B$, that is, $\inf \lambda(A) \le \inf \lambda(B)$.

theorem

$A \le B \implies \inf \lambda(A) \le \inf \lambda(B)$

#iInf_eigenvalues_le

Let $A$ and $B$ be Hermitian matrices. If $A \le B$ in the Loewner order, then the smallest eigenvalue of $A$ is less than or equal to the smallest eigenvalue of $B$, i.e., $\inf \lambda(A) \le \inf \lambda(B)$.

theorem

$(\inf \lambda(A))\, I \le A$

#iInf_eigenvalues_smul_one_le

Let $A$ be a Hermitian matrix and let $\inf \lambda(A)$ denote the smallest eigenvalue of $A$. Then the product of the smallest eigenvalue and the identity matrix is less than or equal to $A$ in the Loewner order, i.e., $(\inf_i \lambda_i)\, I \le A$.

theorem

If xAx - A is positive semidefinite, the spectrum of xx is bounded below by the minimum eigenvalue of AA

#spectrum_subset_Ici_of_sub

Let $d$ be a finite type and $\mathbb{k}$ be an `RCLike` field (such as $\mathbb{R}$ or $\mathbb{C}$). Let $A$ and $x$ be $d \times d$ matrices over $\mathbb{k}$. If $A$ is a Hermitian matrix and the matrix $x - A$ is positive semidefinite, then the real spectrum of $x$ is a subset of the interval $[\lambda_{\min}(A), \infty)$, where $\lambda_{\min}(A) = \inf_i \lambda_i(A)$ is the minimum eigenvalue of $A$.

theorem

If $A - x$ is positive semidefinite, the spectrum of $x$ is bounded above by the maximum eigenvalue of $A$

#spectrum_subset_Iic_of_sub

Let $d$ be a finite type and $\mathbb{k}$ be an `RCLike` field (such as $\mathbb{R}$ or $\mathbb{C}$). Let $A$ and $x$ be $d \times d$ matrices over $\mathbb{k}$. If $A$ is a Hermitian matrix and the matrix $A - x$ is positive semidefinite, then the real spectrum of $x$ is a subset of the interval $(-\infty, \lambda_{\max}(A)]$, where $\lambda_{\max}(A) = \sup_i \lambda_i$ is the maximum eigenvalue of $A$.

theorem

The spectrum of $x$ is contained in $[\lambda_{\min}(A), \lambda_{\max}(B)]$ if $x - A$ and $B - x$ are positive semidefinite

#spectrum_subset_of_mem_Icc

Let $d$ be a finite type and $\mathbb{k}$ be an `RCLike` field (such as $\mathbb{R}$ or $\mathbb{C}$). Let $A, B$, and $x$ be $d \times d$ matrices over $\mathbb{k}$. If $A$ and $B$ are Hermitian matrices, and the matrices $x - A$ and $B - x$ are both positive semidefinite, then the real spectrum of $x$ is contained in the closed interval $[\lambda_{\min}(A), \lambda_{\max}(B)]$, where $\lambda_{\min}(A) = \inf_i \lambda_i(A)$ is the minimum eigenvalue of $A$ and $\lambda_{\max}(B) = \sup_i \lambda_i(B)$ is the maximum eigenvalue of $B$.
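
A NumPy sketch of this two-sided bound (illustrative only; here $x$ is taken Hermitian, and $A$, $B$ are built so that $x - A$ and $B - x$ are positive semidefinite by construction):

```python
import numpy as np

rng = np.random.default_rng(2)

def rand_complex(shape, rng):
    return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

Mx = rand_complex((4, 4), rng)
x = (Mx + Mx.conj().T) / 2               # a Hermitian x, so its spectrum is real
C = rand_complex((4, 4), rng)
D = rand_complex((4, 4), rng)
A = x - C @ C.conj().T                   # x - A = C C* is positive semidefinite
B = x + D @ D.conj().T                   # B - x = D D* is positive semidefinite
spec_x = np.linalg.eigvalsh(x)
lo = np.linalg.eigvalsh(A).min()         # λ_min(A)
hi = np.linalg.eigvalsh(B).max()         # λ_max(B)
assert lo - 1e-9 <= spec_x.min() and spec_x.max() <= hi + 1e-9
```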

theorem

Right partial trace equals left partial trace after swapping tensor factors

#traceRight_eq_traceLeft_reindex

Let $R$ be an additive commutative monoid and $n, m$ be finite types. For any matrix $M$ of size $(n \times m) \times (n \times m)$ with entries in $R$, the right partial trace of $M$ is equal to the left partial trace of the matrix obtained by reindexing $M$ along the swap of the tensor factors (the equivalence $n \times m \cong m \times n$). That is, $\text{Tr}_R(M) = \text{Tr}_L(M^\sigma)$, where $M^\sigma$ denotes the reindexed matrix with entries $M^\sigma_{(j_1, i_1), (j_2, i_2)} = M_{(i_1, j_1), (i_2, j_2)}$.
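
A NumPy sketch (not part of the library), modeling $M$ as a 4-index tensor $T[i_1, j_1, i_2, j_2] = M_{(i_1,j_1),(i_2,j_2)}$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 2, 3
T = rng.standard_normal((n, m, n, m))    # T[i1, j1, i2, j2] = M_{(i1,j1),(i2,j2)}
trace_right = np.einsum('ajbj->ab', T)   # right partial trace: sum over the m factor
T_swapped = T.transpose(1, 0, 3, 2)      # reindex along n × m ≅ m × n
trace_left = np.einsum('jajb->ab', T_swapped)  # left partial trace of the swapped matrix
assert np.allclose(trace_right, trace_left)
```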

theorem

The trace of a nonzero positive semidefinite matrix is strictly positive

#trace_pos

Let $\mathbb{k}$ be an `RCLike` field (such as $\mathbb{R}$ or $\mathbb{C}$) and $n$ be a finite type. For any matrix $A \in \text{Mat}_{n \times n}(\mathbb{k})$, if $A$ is positive semidefinite and $A$ is not the zero matrix, then the trace of $A$ is strictly greater than zero, i.e., $\text{tr}(A) > 0$.
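
A NumPy sketch (illustrative only): the trace of a positive semidefinite matrix is the sum of its nonnegative eigenvalues, so it is strictly positive as soon as the matrix is nonzero:

```python
import numpy as np

rng = np.random.default_rng(4)
C = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = C @ C.conj().T                       # C C* is positive semidefinite, and nonzero here
assert not np.allclose(A, 0)
assert np.trace(A).real > 0              # trace = sum of nonnegative eigenvalues > 0
```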

theorem

Additivity of the left partial trace: $\text{Tr}_L(A + B) = \text{Tr}_L(A) + \text{Tr}_L(B)$

#traceLeft_add

Let $R$ be an additive commutative monoid, and let $d, d_1, d_2$ be finite types. For any matrices $A, B \in \text{Mat}_{(d \times d_1) \times (d \times d_2)}(R)$, the left partial trace of their sum is the sum of their individual left partial traces: $\text{Tr}_L(A + B) = \text{Tr}_L(A) + \text{Tr}_L(B)$.

theorem

The partial trace over the left subsystem is compatible with negation: $(-A)^{\text{traceLeft}} = -(A^{\text{traceLeft}})$

#traceLeft_neg

For any matrix $A$ whose row and column indices are structured as products $d \times d_1$ and $d \times d_2$ respectively, the partial trace over the left-hand subsystem $d$ satisfies $(-A)^{\text{traceLeft}} = -(A^{\text{traceLeft}})$. Here, $\text{traceLeft}$ denotes the operation of taking the trace over the first component of the index product.

theorem

The partial trace over the left subsystem is distributive over subtraction: $(A - B)^{\text{traceLeft}} = A^{\text{traceLeft}} - B^{\text{traceLeft}}$

#traceLeft_sub

For any matrices $A$ and $B$ whose row and column indices are structured as $d \times d_1$ and $d \times d_2$ respectively, the partial trace over the left-hand subsystem $d$ satisfies the linearity property $(A - B)^{\text{traceLeft}} = A^{\text{traceLeft}} - B^{\text{traceLeft}}$. Here, $\text{traceLeft}$ denotes the operation of taking the trace over the first component of the index product.

theorem

The partial trace over the right subsystem is additive: $(A + B).\text{traceRight} = A.\text{traceRight} + B.\text{traceRight}$

#traceRight_add

For any matrices $A$ and $B$ whose row and column indices are structured as $d_1 \times d$ and $d_2 \times d$ respectively, the partial trace over the right-hand subsystem $d$ satisfies the linearity property $(A + B)^{\text{traceRight}} = A^{\text{traceRight}} + B^{\text{traceRight}}$. Here, $\text{traceRight}$ denotes the operation of taking the trace over the second component of the index product.

theorem

The partial trace over the right subsystem is compatible with negation: $(-A).\text{traceRight} = -A.\text{traceRight}$

#traceRight_neg

For any matrix $A$ with row indices in $d_1 \times d$ and column indices in $d_2 \times d$, the right partial trace of its additive inverse is the additive inverse of its right partial trace, i.e., $(-A).\text{traceRight} = -(A.\text{traceRight})$.

theorem

The right partial trace is distributive over subtraction: $(A - B)^{\text{traceRight}} = A^{\text{traceRight}} - B^{\text{traceRight}}$

#traceRight_sub

For any matrices AA and BB whose row and column indices are structured as d1×dd_1 \times d and d2×dd_2 \times d over a ring RR, the right partial trace traceRight\text{traceRight} satisfies the distributive property over subtraction: (AB)traceRight=AtraceRightBtraceRight(A - B)^{\text{traceRight}} = A^{\text{traceRight}} - B^{\text{traceRight}} where (A)traceRight(A)^{\text{traceRight}} denotes the matrix obtained by summing over the second component of the index product.

theorem

$\text{Tr}_{\text{left}}(r \cdot A) = r \cdot \text{Tr}_{\text{left}}(A)$

#traceLeft_smul

Let $A$ be a square-block matrix of type $(m \times n) \times (m \times n)$ with entries in $\alpha$, and let $r$ be a scalar from a ring $R$. The partial trace of the scalar product $r \cdot A$ over the first subsystem (left trace) is equal to the scalar $r$ multiplied by the partial trace of $A$ over the first subsystem, i.e., $\text{Tr}_{\text{left}}(r \cdot A) = r \cdot \text{Tr}_{\text{left}}(A)$.

theorem

$\text{traceRight}(r \cdot A) = r \cdot \text{traceRight}(A)$

#traceRight_smul

Let $A$ be a matrix in $M_{(n \times m) \times (n \times m)}(\alpha)$ and $r$ be a scalar in $R$. The right partial trace of the scalar multiple $r \cdot A$ is equal to the scalar $r$ multiplied by the right partial trace of $A$, i.e., $\text{traceRight}(r \cdot A) = r \cdot \text{traceRight}(A)$.

theorem

Rows of a Unitary Matrix $U \in U(n)$ have Unit Norm

#unitaryGroup_row_norm

Let $n$ be a finite type and $U$ be an element of the unitary group $U(n)$ of $n \times n$ complex matrices. For any row index $i \in n$, the sum of the squared norms of the entries in that row is equal to $1$: $\sum_{j \in n} \|U_{ij}\|^2 = 1$. (Under the opposite indexing convention this reads as the statement that each column has unit norm; for a unitary matrix both hold.)
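
A NumPy sketch (illustrative only), using the Q factor of a QR decomposition as a random unitary:

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(M)                   # the Q factor of a QR decomposition is unitary
row_norms = (np.abs(U) ** 2).sum(axis=1) # Σ_j |U_ij|² for each row i
col_norms = (np.abs(U) ** 2).sum(axis=0) # Σ_i |U_ij|² for each column j
assert np.allclose(row_norms, 1.0) and np.allclose(col_norms, 1.0)
```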

definition

Kronecker product of a family of matrices $\bigotimes_i A_i$

#piProd

Let $R$ be a commutative monoid and let $\{A_i\}_{i \in I}$ be a family of square matrices, where each $A_i$ is a $d_i \times d_i$ matrix with entries in $R$. The function `Matrix.piProd` constructs a new matrix $M$ of size $(\prod_i d_i) \times (\prod_i d_i)$, indexed by the Cartesian product of the original index sets. The entries of this matrix are defined by the product of the entries of the individual matrices: $M_{j, k} = \prod_{i \in I} (A_i)_{j_i, k_i}$, where $j$ and $k$ are index tuples with $j_i, k_i \in d_i$. This construction corresponds to the Kronecker product (tensor product) of the family of matrices $(A_i)$.
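
A NumPy sketch (illustrative only) checking that the entrywise definition agrees with the usual Kronecker product for a two-member family:

```python
import numpy as np

rng = np.random.default_rng(6)
A0 = rng.standard_normal((2, 2))
A1 = rng.standard_normal((3, 3))
# entrywise definition: M[(j0,j1),(k0,k1)] = A0[j0,k0] * A1[j1,k1]
M = np.einsum('ac,bd->abcd', A0, A1).reshape(6, 6)
assert np.allclose(M, np.kron(A0, A1))   # agrees with the usual Kronecker product
```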

theorem

The Kronecker product of Hermitian matrices is Hermitian

#piProd

Let $R$ be a commutative semiring with a star-ring structure (an involution $*$), and let $A_i$ be a family of square matrices over $R$. If every matrix $A_i$ in the family is Hermitian (i.e., $A_i^* = A_i$), then their Kronecker product $\bigotimes_i A_i$ (denoted as `piProd A`) is also a Hermitian matrix.

theorem

$\text{trace}(\text{piProd}_i A_i) = \prod_i \text{trace}(A_i)$

#trace_piProd

Let $R$ be a commutative semiring and let $A_i$ be a family of square matrices over $R$ indexed by $i$. Then the trace of the Kronecker (tensor) product of these matrices is the product of the traces of the individual matrices: $\text{trace}(\text{piProd}_i A_i) = \prod_i \text{trace}(A_i)$, where $\text{piProd}$ denotes the Kronecker product of the family as constructed by `Matrix.piProd`.
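
A NumPy sketch of the trace identity (illustrative only), building the family's Kronecker product by iteration:

```python
import numpy as np

rng = np.random.default_rng(7)
mats = [rng.standard_normal((k, k)) for k in (2, 3, 2)]
K = mats[0]
for Ai in mats[1:]:
    K = np.kron(K, Ai)                   # ⨂_i A_i via iterated Kronecker products
traces = [np.trace(Ai) for Ai in mats]
assert np.isclose(np.trace(K), np.prod(traces))
```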

theorem

The Kronecker product of positive semidefinite matrices is positive semidefinite

#piProd

Let $R$ be an `RCLike` field (such as $\mathbb{R}$ or $\mathbb{C}$) and let $A_i$ be a family of square matrices over $R$. If every matrix $A_i$ in the family is positive semidefinite, then their Kronecker product $\bigotimes_i A_i$ (denoted `piProd A`) is also positive semidefinite. In Lean: `[RCLike R] (hA : ∀ i, (A i).PosSemidef) : (piProd A).PosSemidef`.

theorem

Submatrix $A_{e,f}$ Equals Product of Indexing Matrices and $A$

#submatrix_eq_mul_mul

Let $R$ be a semiring and let $d, d_2, d_3$ be types such that $d$ is finite and has decidable equality. Let $A$ be a square matrix of size $d \times d$ over $R$. Given two indexing maps $e : d_2 \to d$ and $f : d_3 \to d$, the submatrix of $A$ formed by these maps is equal to the product of an indexing matrix, the matrix $A$, and another indexing matrix: $A_{e, f} = P_e \cdot A \cdot P_f^T$. More precisely, $A.\text{submatrix } e\ f = (\text{submatrix } I\ e\ \text{id}) \cdot A \cdot (\text{submatrix } I\ \text{id}\ f)$, where $I$ denotes the identity matrix over $R$ and $\text{id}$ is the identity map on $d$.
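
A NumPy sketch (illustrative only; the index values chosen for $e$ and $f$ are arbitrary), where the indexing matrices are rows and columns selected from the identity:

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((4, 4))
e = np.array([2, 0, 3])                  # an indexing map e : d2 → d (arbitrary values)
f = np.array([1, 1, 0])                  # an indexing map f : d3 → d
I = np.eye(4)
P_e = I[e, :]                            # (submatrix I e id): rows of I selected by e
P_f = I[:, f]                            # (submatrix I id f): columns of I selected by f
assert np.allclose(A[np.ix_(e, f)], P_e @ A @ P_f)
```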

theorem

$(C \otimes D)(A \otimes B)(C \otimes D)^* = (C A C^*) \otimes (D B D^*)$

#kronecker_conj_eq

Let $m, n, p, q$ be finite types and $\alpha$ be a commutative star-ring. Given matrices $A \in \text{Mat}_{m \times m}(\alpha)$, $B \in \text{Mat}_{n \times n}(\alpha)$, $C \in \text{Mat}_{p \times m}(\alpha)$, and $D \in \text{Mat}_{q \times n}(\alpha)$, the conjugation of the Kronecker product $A \otimes B$ by $C \otimes D$ satisfies $(C \otimes D)(A \otimes B)(C \otimes D)^* = (C A C^*) \otimes (D B D^*)$, where $\otimes$ denotes the Kronecker product and $M^*$ denotes the conjugate transpose (Hermitian adjoint) of a matrix $M$.
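
A NumPy sketch of this conjugation identity (illustrative only; it follows from the mixed-product property of the Kronecker product):

```python
import numpy as np

rng = np.random.default_rng(9)

def rc(shape, rng):
    return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

A, B = rc((3, 3), rng), rc((2, 2), rng)  # m = 3, n = 2
C, D = rc((4, 3), rng), rc((2, 2), rng)  # p = 4, q = 2
CD = np.kron(C, D)
lhs = CD @ np.kron(A, B) @ CD.conj().T
rhs = np.kron(C @ A @ C.conj().T, D @ B @ D.conj().T)
assert np.allclose(lhs, rhs)
```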