We propose a new semantics and pragmatics for epistemic statements which builds on the systems of Yalcin (2012) and Moss (2015), but offers several empirical advantages. The key improvements stem from (a) modeling information states using probabilistic graphical models, a framework for knowledge representation that is highly influential in psychology, AI, and philosophy; and (b) a new method of treating probabilities as ordinary random variables, making it possible to condition information states on probability statements such as Rain is likely [roughly, P(rain) > .5]. As a result, the dynamic effects of epistemic sentences can be accounted for while maintaining a thoroughgoing Bayesianism, with conditioning as the only update operation. Nested epistemic statements are also given a natural interpretation in terms of higher-order probability, which is implicitly defined once probabilities are treated as random variables. This approach simplifies Moss's account and avoids some of its less desirable features, but the simplification re-introduces certain empirical challenges, which are discussed in the conclusion.
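The core mechanism described above can be illustrated with a minimal computational sketch. This is not the paper's formalism, only an assumed toy model: the probability of rain is treated as a random variable theta with a discrete higher-order prior, and the epistemic statement Rain is likely [glossed as P(rain) > .5] is handled by ordinary Bayesian conditioning on that variable.

```python
import numpy as np

# Hypothetical toy model: theta ranges over candidate values of P(rain),
# with a uniform higher-order prior over a discrete grid.
theta = np.linspace(0.0, 1.0, 101)
prior = np.ones_like(theta) / theta.size

# Updating on "Rain is likely", glossed roughly as P(rain) > .5:
# zero out incompatible values of theta and renormalize.
# Conditioning is the only update operation used.
posterior = np.where(theta > 0.5, prior, 0.0)
posterior /= posterior.sum()

# The resulting first-order probability of rain is the expectation of theta.
p_rain = float(np.sum(posterior * theta))
```

After the update, all higher-order probability mass lies on values of theta above .5, so the hearer's first-order probability of rain rises accordingly; nested epistemic statements would condition on further constraints over theta in the same way.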
Daniel Lassiter is Assistant Professor of Linguistics at Stanford University.