In this lecture, we start with Markov Random Fields (MRFs) and how to condition them. We then formally discuss Conditional Random Fields (CRFs) and how to choose between MRFs and CRFs. Finally, we begin the discussion of exact inference via message passing.
A Markov Random Field is a distribution that factorizes over an undirected graph such that its nodes represent the random variables $\mathsf{x}_i$ and
$$ p(x_1, x_2, \dots, x_n) = \frac{1}{Z} \prod_{c\in C} \phi_c(x_c), $$
where the normalization constant is
$$ Z = \sum_x \prod_{c\in C} \phi_c(x_c), $$
and $C$ is the set of all maximal cliques.
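To make the definition concrete, here is a minimal sketch (not from the lecture) that computes $Z$ and the joint distribution by brute-force enumeration for a toy three-node chain MRF $x_1 - x_2 - x_3$ with binary variables; the potential tables are arbitrary positive values chosen for illustration:

```python
import itertools
import numpy as np

# Maximal cliques of the chain x1 - x2 - x3 are the edges {1,2} and {2,3}.
# The entries below are arbitrary positive values (illustrative assumption).
phi_12 = np.array([[4.0, 1.0],
                   [1.0, 3.0]])
phi_23 = np.array([[2.0, 5.0],
                   [5.0, 2.0]])

# Z sums the product of clique potentials over all joint configurations.
Z = sum(phi_12[x1, x2] * phi_23[x2, x3]
        for x1, x2, x3 in itertools.product([0, 1], repeat=3))

def p(x1, x2, x3):
    """Joint probability p(x1, x2, x3) = (1/Z) * phi_12 * phi_23."""
    return phi_12[x1, x2] * phi_23[x2, x3] / Z

# Dividing by Z makes the probabilities sum to one by construction.
total = sum(p(*x) for x in itertools.product([0, 1], repeat=3))
```

Brute-force enumeration is exponential in the number of variables, which is exactly why the message-passing algorithms discussed later in the lecture matter.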
Question: What is the difference between Undirected Models and Markov Random Fields?
Answer: This distinction will become clear when we look at Conditional Random Fields (CRFs). The short answer is that undirected models are strictly more general than MRFs. Specifically, both MRFs and CRFs are undirected models.
Example 1: Let's look at an example before we start the discussion on Conditional Random Fields. Consider the following graph on six nodes, whose maximal cliques are the edges $(1,2), (1,3), (2,5), (3,4), (4,5), (5,6)$.
Applying the Hammersley-Clifford (HC) theorem, we can write the MRF for the above graph using the maximal cliques:
$$ p(x) = \frac{1}{Z} \phi_{1,2}(x_1,x_2)\phi_{1,3}(x_1,x_3) \phi_{2,5}(x_2,x_5)\phi_{3,4}(x_3,x_4) \phi_{4,5}(x_4,x_5) \phi_{5,6}(x_5,x_6). $$
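The factorization above can be evaluated numerically. Below is a sketch, assuming binary variables and randomly generated positive edge potentials (both assumptions are mine, not from the lecture), that builds this six-node MRF and computes $Z$ by enumerating all $2^6$ configurations:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Edge set of the example graph; each maximal clique is an edge.
edges = [(1, 2), (1, 3), (2, 5), (3, 4), (4, 5), (5, 6)]

# One arbitrary positive 2x2 potential table per edge (illustrative choice).
phi = {e: rng.uniform(0.5, 2.0, size=(2, 2)) for e in edges}

def unnorm(x):
    """Unnormalized measure: the product of edge potentials at configuration x,
    where x maps node index -> value in {0, 1}."""
    out = 1.0
    for (i, j), table in phi.items():
        out *= table[x[i], x[j]]
    return out

# Z sums the unnormalized measure over all 2^6 configurations.
configs = [dict(zip(range(1, 7), c)) for c in itertools.product([0, 1], repeat=6)]
Z = sum(unnorm(x) for x in configs)

def p(x):
    return unnorm(x) / Z
```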
Question: Now, how do we calculate a conditional distribution like $p(x_1, x_2, x_3 | x_4, x_5, x_6)$?
Answer: The factors $\phi_{4,5}(x_4,x_5)$ and $\phi_{5,6}(x_5,x_6)$ depend only on the conditioning variables, so they cancel between the numerator and the normalizer, leaving
$$ \begin{aligned} p(x_1, x_2, x_3 | x_4, x_5, x_6) &= \frac{1}{Z(x_4,x_5,x_6)} \phi_{1,2}(x_1,x_2)\phi_{1,3}(x_1,x_3)\phi_{3,4}(x_3,x_4)\phi_{2,5}(x_2,x_5), \end{aligned} $$
where
$$ Z(x_4,x_5,x_6) = \sum_{x_1}\sum_{x_2}\sum_{x_3}\phi_{1,2}(x_1,x_2)\phi_{1,3}(x_1,x_3) \phi_{3,4}(x_3,x_4) \phi_{2,5}(x_2,x_5). $$
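As a sanity check, this cancellation can be verified numerically. The sketch below (again assuming binary variables and random positive potentials, which are my illustrative choices) computes the conditional from the four surviving factors and compares it against brute-force conditioning of the full joint:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
edges = [(1, 2), (1, 3), (2, 5), (3, 4), (4, 5), (5, 6)]
phi = {e: rng.uniform(0.5, 2.0, size=(2, 2)) for e in edges}

def unnorm(x):
    """Product of all six edge potentials at configuration x."""
    out = 1.0
    for (i, j), t in phi.items():
        out *= t[x[i], x[j]]
    return out

# Condition on x4 = x5 = x6 = 0 (an arbitrary observed assignment).
obs = {4: 0, 5: 0, 6: 0}

def cond_unnorm(x1, x2, x3):
    """Only the factors touching x1, x2, x3 are needed; phi_45 and phi_56
    are constants given the observations and cancel in the ratio."""
    return (phi[(1, 2)][x1, x2] * phi[(1, 3)][x1, x3]
            * phi[(3, 4)][x3, obs[4]] * phi[(2, 5)][x2, obs[5]])

# Z(x4, x5, x6): sum over the free variables only.
Zc = sum(cond_unnorm(*c) for c in itertools.product([0, 1], repeat=3))

def p_cond(x1, x2, x3):
    return cond_unnorm(x1, x2, x3) / Zc

def full_cond(x1, x2, x3):
    """Reference: condition the full joint directly, all six factors included."""
    num = unnorm({1: x1, 2: x2, 3: x3, **obs})
    den = sum(unnorm({1: a, 2: b, 3: c, **obs})
              for a, b, c in itertools.product([0, 1], repeat=3))
    return num / den
```

The two computations agree configuration by configuration, confirming that dropping $\phi_{4,5}$ and $\phi_{5,6}$ changes nothing once we renormalize over $x_1, x_2, x_3$.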