Mutual knowledge is a fundamental concept about information in game theory, (epistemic) logic, and epistemology. An event is mutual knowledge if all agents know that the event occurred.[1]: 73 However, mutual knowledge by itself implies nothing about what agents know about other agents' knowledge: i.e. it is possible that an event is mutual knowledge but that each agent is unaware that the other agents know it has occurred.[2] Common knowledge is a related but stronger notion; any event that is common knowledge is also mutual knowledge.
The philosopher Stephen Schiffer, in his book Meaning, developed a notion he called "mutual knowledge" which functions quite similarly to David K. Lewis's "common knowledge".[3]
Communication (verbal or non-verbal) can turn mutual knowledge into common knowledge. For example, in the Muddy Children Puzzle with two children (Alice and Bob, so $n = 2$), if both have muddy faces (viz. $m_a \wedge m_b$), then each of them knows that there is at least one muddy face. Written formally, let $p = m_a \vee m_b$ ("there is at least one muddy face"); then we have $K_a p \wedge K_b p$, i.e. $Ep$. However, neither of them knows that the other child knows ($\neg K_a K_b p \wedge \neg K_b K_a p$), which makes $p$ mutual knowledge only. Now suppose Alice tells Bob that she knows $p$ (so that $K_a p$ becomes common knowledge, i.e. $C K_a p$), and then Bob tells Alice that he knows $p$ as well (so that $K_b p$ becomes common knowledge, i.e. $C K_b p$). This turns $p$ into common knowledge ($Cp$), which is equivalent to the effect of a public announcement "there is at least one muddy face".
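The announcement dynamics above can be sketched as a small Kripke-model simulation. The encoding below (worlds as pairs of booleans, agents who see the other child's face but not their own, and a public announcement that deletes worlds) is one illustrative formalization, not notation from the puzzle's standard presentations:

```python
from itertools import product

# Worlds: (alice_muddy, bob_muddy). Agent 0 = Alice, 1 = Bob.
# Each child sees the other's face but not their own, so an agent
# considers possible any world that agrees on the OTHER child's face.
WORLDS = list(product([True, False], repeat=2))

def accessible(agent, w, worlds):
    """Worlds the agent cannot distinguish from w."""
    other = 1 - agent
    return [v for v in worlds if v[other] == w[other]]

def p(w):
    """The proposition 'there is at least one muddy face'."""
    return w[0] or w[1]

def knows(agent, prop, w, worlds):          # K_agent prop
    return all(prop(v) for v in accessible(agent, w, worlds))

def everyone_knows(prop, w, worlds):        # E prop (mutual knowledge)
    return all(knows(a, prop, w, worlds) for a in (0, 1))

def common_knowledge(prop, w, worlds):      # C prop: true at all reachable worlds
    reached, frontier = {w}, {w}
    while frontier:
        frontier = {v for u in frontier for a in (0, 1)
                    for v in accessible(a, u, worlds)} - reached
        reached |= frontier
    return all(prop(v) for v in reached)

def announce(prop, worlds):
    """Public announcement: discard worlds where prop fails."""
    return [w for w in worlds if prop(w)]

actual = (True, True)  # both faces muddy

# Before any announcement: E p holds, but not K_a K_b p, so no C p.
print(everyone_knows(p, actual, WORLDS))                           # True
print(knows(0, lambda w: knows(1, p, w, WORLDS), actual, WORLDS))  # False
print(common_knowledge(p, actual, WORLDS))                         # False

# Alice announces that she knows p, then Bob announces that he knows p.
m1 = announce(lambda w: knows(0, p, w, WORLDS), WORLDS)
m2 = announce(lambda w: knows(1, p, w, m1), m1)
print(common_knowledge(p, actual, m2))                             # True
```

After the two announcements, every world either child still considers possible satisfies $p$, so $p$ has become common knowledge at the actual world.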
References
- ^ Osborne, Martin J., and Ariel Rubinstein. A Course in Game Theory. Cambridge, MA: MIT, 1994. Print.
- ^ Peter Vanderschraaf, Giacomo Sillari (2007). Common Knowledge. Stanford Encyclopedia of Philosophy. Accessed 18 November 2011.
- ^ Stephen Schiffer, Meaning, 2nd edition, Oxford University Press, 1988. The first edition was published by OUP in 1972. Also, David Lewis, Convention, Cambridge, MA: Harvard University Press, 1969. For a discussion of both Lewis's and Schiffer's notions, see Russell Dale, The Theory of Meaning (1996).