It is a trivial matter to show that a Gibbs random field satisfies every Markov property. As an example of this fact, consider the following:
In the image to the right, a Gibbs random field over the provided graph has the form $\Pr(A,B,C,D,E,F) \propto f_1(A,B,D)\,f_2(A,C,D)\,f_3(C,D,F)\,f_4(C,E,F)$. If variables $C$ and $D$ are fixed, then the global Markov property requires that $\{A,B\} \perp \{E,F\} \mid \{C,D\}$ (see conditional independence), since $\{C,D\}$ forms a barrier between $\{A,B\}$ and $\{E,F\}$.

With $C$ and $D$ constant,

$$\Pr(A,B,E,F \mid C=c, D=d) \propto \big[f_1(A,B,d)\,f_2(A,c,d)\big]\,\big[f_3(c,d,F)\,f_4(c,E,F)\big] = g(A,B)\,h(E,F)$$

where $g(A,B) = f_1(A,B,d)\,f_2(A,c,d)$ and $h(E,F) = f_3(c,d,F)\,f_4(c,E,F)$. This implies that $\{A,B\} \perp \{E,F\} \mid \{C,D\}$.
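This barrier argument can be checked numerically. The sketch below is illustrative only: it assumes binary variables and randomly chosen positive potentials over the four cliques {A,B,D}, {A,C,D}, {C,D,F}, {C,E,F} of the example graph, and verifies that the resulting distribution satisfies the conditional independence of {A,B} and {E,F} given {C,D}.

```python
import itertools
import random

# Hypothetical positive potentials over binary variables (any positive values work).
random.seed(0)
f1 = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=3)}  # f1(A,B,D)
f2 = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=3)}  # f2(A,C,D)
f3 = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=3)}  # f3(C,D,F)
f4 = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=3)}  # f4(C,E,F)

def weight(a, b, c, d, e, f):
    # Unnormalized Gibbs weight: f1(A,B,D) f2(A,C,D) f3(C,D,F) f4(C,E,F)
    return f1[a, b, d] * f2[a, c, d] * f3[c, d, f] * f4[c, e, f]

Z = sum(weight(*v) for v in itertools.product([0, 1], repeat=6))

def pr(a, b, c, d, e, f):
    # Normalized joint probability Pr(A,B,C,D,E,F)
    return weight(a, b, c, d, e, f) / Z

# Check {A,B} ⊥ {E,F} | {C,D}: Pr(a,b,e,f|c,d) = Pr(a,b|c,d) · Pr(e,f|c,d)
for c, d in itertools.product([0, 1], repeat=2):
    p_cd = sum(pr(a, b, c, d, e, f)
               for a, b, e, f in itertools.product([0, 1], repeat=4))
    for a, b, e, f in itertools.product([0, 1], repeat=4):
        joint = pr(a, b, c, d, e, f) / p_cd
        p_ab = sum(pr(a, b, c, d, e2, f2)
                   for e2, f2 in itertools.product([0, 1], repeat=2)) / p_cd
        p_ef = sum(pr(a2, b2, c, d, e, f)
                   for a2, b2 in itertools.product([0, 1], repeat=2)) / p_cd
        assert abs(joint - p_ab * p_ef) < 1e-9
print("conditional independence verified")
```

Because the independence follows from the factorization alone, the assertions hold for any choice of positive potentials, not just the random ones drawn here.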
To establish that every positive probability distribution that satisfies the local Markov property is also a Gibbs random field, the following lemma, which provides a means for combining different factorizations, needs to be proved:
Lemma 1
Let $U$ denote the set of all random variables under consideration, and let $\Theta, \Phi_1, \Phi_2, \dots, \Phi_n \subseteq U$ denote arbitrary sets of variables. (Here, given an arbitrary set of variables $X$, $X$ will also denote an arbitrary assignment to the variables from $X$.)
If

$$\Pr(U) = f(\Theta) \prod_{i=1}^n g_i(\Phi_i)$$

for functions $f$ and $g_1, g_2, \dots, g_n$, then there exist functions $\tilde{f}$ and $\tilde{g}_1, \tilde{g}_2, \dots, \tilde{g}_n$ such that

$$\Pr(U) = \tilde{f}\big(\Theta \cap (\Phi_1 \cup \Phi_2 \cup \dots \cup \Phi_n)\big) \prod_{i=1}^n \tilde{g}_i(\Phi_i)$$

In other words, $\prod_{i=1}^n g_i(\Phi_i)$ provides a template for further factorization of $f(\Theta)$.
Proof of Lemma 1
In order to use $\prod_{i=1}^n g_i(\Phi_i)$ as a template to further factorize $f(\Theta)$, all variables outside of $\Theta$ need to be fixed. To this end, let $\bar{\theta}$ be an arbitrary fixed assignment to the variables from $U \setminus \Theta$ (the variables not in $\Theta$). For an arbitrary set of variables $X$, let $\bar{\theta}[X]$ denote the assignment $\bar{\theta}$ restricted to the variables from $X \setminus \Theta$ (the variables from $X$, excluding the variables from $\Theta$).
Moreover, to factorize only $f(\Theta)$, the other factors $g_i(\Phi_i)$ need to be rendered moot for the variables from $\Theta$. To do this, the factorization

$$\Pr(U) = f(\Theta) \prod_{i=1}^n g_i(\Phi_i)$$

will be re-expressed as

$$\Pr(U) = \left( f(\Theta) \prod_{i=1}^n g_i(\Phi_i \cap \Theta, \bar{\theta}[\Phi_i]) \right) \left( \prod_{i=1}^n \frac{g_i(\Phi_i)}{g_i(\Phi_i \cap \Theta, \bar{\theta}[\Phi_i])} \right)$$
For each $i$: $g_i(\Phi_i \cap \Theta, \bar{\theta}[\Phi_i])$ is $g_i(\Phi_i)$ where all variables outside of $\Theta$ have been fixed to the values prescribed by $\bar{\theta}$.
Let

$$f'(\Theta) = f(\Theta) \prod_{i=1}^n g_i(\Phi_i \cap \Theta, \bar{\theta}[\Phi_i])$$

and

$$g'_i(\Phi_i) = \frac{g_i(\Phi_i)}{g_i(\Phi_i \cap \Theta, \bar{\theta}[\Phi_i])}$$

for each $i$, so that

$$\Pr(U) = f'(\Theta) \prod_{i=1}^n g'_i(\Phi_i)$$
What is most important is that $g'_i(\Phi_i) = 1$ when the values assigned to $\Phi_i$ do not conflict with the values prescribed by $\bar{\theta}$, making $g'_i(\Phi_i)$ "disappear" when all variables not in $\Theta$ are fixed to the values from $\bar{\theta}$.
Fixing all variables not in $\Theta$ to the values from $\bar{\theta}$ gives

$$\Pr(\Theta, \bar{\theta}) = f'(\Theta) \prod_{i=1}^n g'_i(\Phi_i \cap \Theta, \bar{\theta}[\Phi_i])$$

Since $g'_i(\Phi_i \cap \Theta, \bar{\theta}[\Phi_i]) = 1$ for each $i$,

$$\Pr(\Theta, \bar{\theta}) = f'(\Theta)$$

Letting

$$\tilde{f}\big(\Theta \cap (\Phi_1 \cup \dots \cup \Phi_n)\big) = \Pr(\Theta, \bar{\theta})$$

gives:

$$\tilde{f}\big(\Theta \cap (\Phi_1 \cup \dots \cup \Phi_n)\big) = f'(\Theta)$$

which finally gives:

$$\Pr(U) = \tilde{f}\big(\Theta \cap (\Phi_1 \cup \dots \cup \Phi_n)\big) \prod_{i=1}^n g'_i(\Phi_i)$$
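The mechanics of this construction can be checked numerically on a toy instance. The sketch below uses hypothetical names: $U = \{X, Y, Z\}$, $\Theta = \{X, Y\}$, a single template factor with $\Phi_1 = \{Y, Z\}$, and binary variables. It verifies that dividing $g_1$ by its value at the fixed assignment $\bar{\theta}$ leaves the product unchanged everywhere, and that the new factor equals 1 wherever the assignment agrees with $\bar{\theta}$.

```python
import itertools
import random

# Toy instance: U = {X, Y, Z}, Θ = {X, Y}, Φ1 = {Y, Z}, so Pr(U) ∝ f(X,Y) g1(Y,Z).
random.seed(1)
f  = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=2)}
g1 = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=2)}

z_bar = 0  # θ̄: the arbitrary fixed assignment to U \ Θ = {Z}

# g1 with all variables outside Θ fixed to the values prescribed by θ̄ (Z := z_bar)
g1_fixed = lambda y: g1[y, z_bar]

f_prime  = lambda x, y: f[x, y] * g1_fixed(y)   # f'(Θ)  = f(Θ) · Π g_i(Φi ∩ Θ, θ̄[Φi])
g1_prime = lambda y, z: g1[y, z] / g1_fixed(y)  # g1'(Φ1) = g1(Φ1) / g1(Φ1 ∩ Θ, θ̄[Φ1])

for x, y, z in itertools.product([0, 1], repeat=3):
    # The re-expressed factorization agrees with the original everywhere ...
    assert abs(f_prime(x, y) * g1_prime(y, z) - f[x, y] * g1[y, z]) < 1e-12
    # ... and g1' "disappears" wherever Z agrees with θ̄.
    if z == z_bar:
        assert g1_prime(y, z) == 1.0
print("lemma construction verified")
```

The same two assertions hold for any number of template factors $g_i$, since each $g'_i$ is normalized by its own value at $\bar{\theta}$ independently.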
Lemma 1 provides a means of combining two different factorizations of $\Pr(U)$. The local Markov property implies that for any random variable $x \in U$, there exist factors $f_x$ and $g_x$ such that:

$$\Pr(U) = f_x(x, \partial x)\, g_x(U \setminus \{x\})$$

where $\partial x$ are the neighbors of node $x$. Applying Lemma 1 repeatedly eventually factors $\Pr(U)$ into a product of clique potentials (see the image on the right).
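The first step of this repeated application can be illustrated numerically. For a positive Gibbs field on a hypothetical three-node chain $X_1 - X_2 - X_3$ (so $\partial X_1 = \{X_2\}$), the local Markov property gives $\Pr(U) = \Pr(X_1 \mid \partial X_1)\, \Pr(U \setminus \{X_1\})$, which the sketch below verifies with arbitrary positive edge potentials.

```python
import itertools
import random

# Hypothetical chain X1 - X2 - X3 with positive edge potentials p12, p23.
random.seed(2)
p12 = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=2)}
p23 = {k: random.uniform(0.5, 2.0) for k in itertools.product([0, 1], repeat=2)}
Z = sum(p12[x1, x2] * p23[x2, x3]
        for x1, x2, x3 in itertools.product([0, 1], repeat=3))

def pr(x1, x2, x3):
    # Normalized joint probability of the chain Gibbs field
    return p12[x1, x2] * p23[x2, x3] / Z

def pr_rest(x2, x3):
    # Marginal Pr(U \ {X1}) = Pr(X2, X3)
    return sum(pr(x1, x2, x3) for x1 in [0, 1])

def pr1_given_2(x1, x2):
    # Pr(X1 | ∂X1) = Pr(X1 | X2); by the local Markov property this
    # equals Pr(X1 | X2, X3) for the chain.
    num = sum(pr(x1, x2, x3) for x3 in [0, 1])
    den = sum(pr(a, x2, x3) for a, x3 in itertools.product([0, 1], repeat=2))
    return num / den

# Local Markov factorization: Pr(U) = f_x(x, ∂x) · g_x(U \ {x})
for x1, x2, x3 in itertools.product([0, 1], repeat=3):
    assert abs(pr(x1, x2, x3) - pr1_given_2(x1, x2) * pr_rest(x2, x3)) < 1e-12
print("local factorization verified")
```

Here $f_x = \Pr(X_1 \mid X_2)$ and $g_x = \Pr(X_2, X_3)$ exhibit the factors whose existence the local Markov property guarantees; repeating the argument at each node is what drives the factorization down to clique potentials.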