

Probability Models for Rater Agreement in Medical Diagnosis
49,00 €
The most popular measure of agreement is Cohen's kappa, an index of agreement between categorical ratings that is widely used as a reliability or validity measure. Although kappa can be interpreted as percentage agreement corrected for chance, its definition lacks direct probabilistic support. This book develops a probability framework for modeling rater agreement in medical diagnosis and proposes a new agreement index based on a probability model. The new index carries useful interpretations and provides information about intrinsic agreement not due to chance. In practice it takes the same form as Cohen's kappa and thus offers a novel way to understand kappa. Maximum likelihood estimation and statistical tests for hypotheses related to the new index are developed for inference. Monte Carlo simulations are used to confirm th…
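For reference, here is a minimal Python sketch of the classical Cohen's kappa, the "percentage agreement corrected for chance" mentioned above. It does not reproduce the book's probability-based index; the raters, category labels, and data are purely illustrative.

```python
# Sketch of Cohen's kappa: observed agreement corrected for chance agreement.
# kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of
# agreement and p_e the agreement expected if the raters were independent.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    # Observed proportion of agreement p_o
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Marginal category frequencies for each rater
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    categories = set(freq_a) | set(freq_b)
    # Expected chance agreement p_e under independence of the raters
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two clinicians classifying 10 cases
a = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "pos", "neg", "neg"]
b = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "pos", "pos", "neg"]
print(round(cohens_kappa(a, b), 3))  # 0.6: 80% raw agreement, 50% by chance
```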