We give a principled method for decomposing the predictive uncertainty of a model into aleatoric and epistemic components, with explicit semantics relating them to the real-world data distribution. While many works in the literature have proposed such decompositions, they lack the kind of formal guarantees we provide. Our method is based on the new notion of higher-order calibration, which generalizes ordinary calibration to the setting of higher-order predictors that predict mixtures over label distributions at every point. We show how to measure as well as achieve higher-order calibration using access to k-snapshots, namely examples where each point has k independent conditional labels. Under higher-order calibration, the estimated aleatoric uncertainty at a point is guaranteed to match the real-world aleatoric uncertainty averaged over all points where that prediction is made. To our knowledge, this is the first formal guarantee of this kind that places no assumptions whatsoever on the real-world data distribution. Importantly, higher-order calibration is also applicable to existing higher-order predictors such as Bayesian and ensemble models, and provides a natural evaluation metric for such models. We demonstrate through experiments that our method produces meaningful uncertainty decompositions for image classification.
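As a concrete illustration of the kind of decomposition described above, the sketch below computes a standard entropy-based split of a higher-order prediction (a mixture over label distributions, e.g. from an ensemble or Bayesian posterior samples) into aleatoric and epistemic parts: total uncertainty is the entropy of the averaged label distribution, aleatoric uncertainty is the average entropy of the mixture components, and epistemic uncertainty is their difference (a mutual information). This is a minimal sketch under assumed equal mixture weights; the function names are illustrative, not the paper's API, and the paper's contribution is the guarantee attached to such quantities under higher-order calibration rather than the formulas themselves.

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy (in nats) of a probability vector."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p + eps))

def decompose_uncertainty(mixture):
    """Entropy-based decomposition of a higher-order prediction.

    `mixture` is an (m, num_classes) array whose rows are label
    distributions, e.g. the m components predicted by an ensemble or
    drawn from a Bayesian posterior (assumed equally weighted here).
    Returns (total, aleatoric, epistemic), where total = aleatoric + epistemic.
    """
    mixture = np.asarray(mixture, dtype=float)
    mean_dist = mixture.mean(axis=0)             # induced first-order prediction
    total = entropy(mean_dist)                   # entropy of the mean distribution
    aleatoric = np.mean([entropy(p) for p in mixture])  # mean entropy of components
    epistemic = total - aleatoric                # mutual information, always >= 0
    return total, aleatoric, epistemic

# Confident but disagreeing components: uncertainty is mostly epistemic.
print(decompose_uncertainty(np.array([[0.9, 0.1], [0.1, 0.9]])))

# Components agree that the label is noisy: uncertainty is mostly aleatoric.
print(decompose_uncertainty(np.array([[0.5, 0.5], [0.5, 0.5]])))
```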