The Equivalence of Weighted Kappa and the Intraclass Correlation Coefficient as Measures of Reliability - Joseph L. Fleiss, Jacob Cohen, 1973
Understanding the calculation of the kappa statistic: A measure of inter-observer reliability | Semantic Scholar

Frontiers | Intra-rater Kappa Accuracy of Prototype and ICD-10 Operational Criteria-Based Diagnoses for Mental Disorders: A Brief Report of a Cross-Sectional Study in an Outpatient Setting

Symmetry | Free Full-Text | An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters