When we stopped last time, we were in the middle of proving the Hammersley-Clifford Theorem. Today we will finish the proof, and talk more about undirected models.

1. Theorem: Hammersley-Clifford

A positive distribution $p(x) > 0$ satisfies the CIs (conditional independencies) of an undirected graph iff $p$ can be represented as

$$ p(x) = \frac{1}{Z} \prod_{c\in C} \phi_c(x_c), $$

where $C$ is the set of all (maximal) cliques in the graph and

$$ Z = \sum_{x} \prod_{c \in C} \phi_c(x_c). $$
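
As a concrete illustration (not from the lecture notes), the sketch below evaluates this factorization by brute force for a hypothetical binary chain $x_1 - x_2 - x_3$, whose maximal cliques are $\{x_1, x_2\}$ and $\{x_2, x_3\}$; the potential tables are arbitrary positive numbers chosen purely for illustration.

```python
# A minimal sketch (illustration only, not from the lecture): brute-force
# evaluation of p(x) = (1/Z) * prod_c phi_c(x_c) for a hypothetical binary
# chain x1 - x2 - x3 with maximal cliques {x1, x2} and {x2, x3}.
import itertools

# phi[(i, j)][(a, b)]: potential of clique (x_i, x_j) at (x_i = a, x_j = b);
# the values are arbitrary positive numbers chosen for illustration.
phi = {
    (0, 1): {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0},
    (1, 2): {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 2.0, (1, 1): 1.0},
}

def unnormalized(x):
    """Product of clique potentials evaluated at the configuration x."""
    value = 1.0
    for clique, table in phi.items():
        value *= table[tuple(x[i] for i in clique)]
    return value

# Partition function Z: sum the unnormalized product over all configurations.
Z = sum(unnormalized(x) for x in itertools.product([0, 1], repeat=3))

def prob(x):
    return unnormalized(x) / Z

print(Z, prob((1, 0, 1)))
```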

2. Proof of Hammersley-Clifford Theorem (Cont'd)

In the last lecture we covered the first three steps of the proof of the Hammersley-Clifford Theorem.

Setup:

Define $x^* = (0,0,\dots,0)$ and $Q(x) = \ln(p(x) / p(x^*)).$

Step 1 (revisit):

We can write $Q$ uniquely as

$$
Q(x) = \sum_i x_i G_i(x_i) + \sum_{i<j} x_i x_j G_{i,j}(x_i, x_j) + \sum_{i<j<k} x_i x_j x_k G_{i,j,k}(x_i, x_j, x_k) + \dots + x_1 x_2 \cdots x_D G_{1,2,\dots,D}(x_1, x_2, \dots, x_D).
$$

Claim: We can choose the functions $G$ uniquely so that the left-hand side (LHS) and the right-hand side (RHS) of the equation are equal. Note that we are assuming the variables are discrete.
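
As a small sanity check (not part of the original notes), take $D = 2$ binary variables. The expansion reduces to

$$ Q(x) = x_1 G_1(x_1) + x_2 G_2(x_2) + x_1 x_2 G_{1,2}(x_1, x_2), $$

and evaluating it at the four configurations pins the $G$'s down uniquely: $Q(0,0) = 0$ by the definition of $Q$, $G_1(1) = Q(1,0)$, $G_2(1) = Q(0,1)$, and $G_{1,2}(1,1) = Q(1,1) - Q(1,0) - Q(0,1)$. The same peeling argument, applied order by order, gives uniqueness for general $D$.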

Step 2 (revisit):

Let's define

$$ x^i = (x_1, \dots, x_{i-1}, 0, x_{i+1}, \dots, x_D), $$

which means that only the $i^{th}$ component of $x$ is set to zero. Then we have

$$ \exp(Q(x) - Q(x^i)) = \frac{p(x)}{p(x^i)} = \frac{p(\mathsf{x}_i = x_i \mid x_{-i})}{p(\mathsf{x}_i = 0 \mid x_{-i})}. $$
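
This identity is easy to verify numerically. The sketch below (an illustration, not code from the lecture) builds an arbitrary strictly positive joint over three binary variables, forms $Q$ from its definition, and checks that $\exp(Q(x) - Q(x^i))$ matches the ratio of conditionals computed directly from the joint table.

```python
# A minimal numeric check (illustration only, not from the lecture) of the
# Step 2 identity, using an arbitrary strictly positive joint distribution
# over D = 3 binary variables.  x* = (0, 0, 0) as in the setup above.
import itertools
import math
import random

D = 3
configs = list(itertools.product([0, 1], repeat=D))
weights = {x: random.uniform(0.5, 2.0) for x in configs}  # positive, arbitrary
total = sum(weights.values())
p = {x: w / total for x, w in weights.items()}            # joint p(x) > 0

x_star = (0,) * D
Q = {x: math.log(p[x] / p[x_star]) for x in configs}

def conditional(a, x, i):
    """p(x_i = a | x_{-i}), computed by summing the joint over the i-th coordinate."""
    num = p[x[:i] + (a,) + x[i + 1:]]
    den = sum(p[x[:i] + (b,) + x[i + 1:]] for b in (0, 1))
    return num / den

x = (1, 0, 1)
i = 2
x_i = x[:i] + (0,) + x[i + 1:]   # x^i: the i-th component of x set to zero

lhs = math.exp(Q[x] - Q[x_i])
rhs = conditional(x[i], x, i) / conditional(0, x, i)
print(lhs, rhs)                  # agree up to floating-point error
```

The marginal $p(x_{-i})$ cancels in the ratio of conditionals, which is exactly why the right-hand side reduces to $p(x)/p(x^i)$.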

Step 3 (revisit):