1. Kappa statistics measured the intra- and inter-observer reliability.
2. The vast majority of CPA events were categorized using the ART Matrix with high inter-observer reliability.
3. Student's t-test and the intra-class correlation coefficient (ICC) were both used to assess inter-observer reliability.
4. Region analysis and individual review (the original recommendation) on light microscopy yielded the highest inter-observer reliability.
5. The inter-observer reliability between the two observers within one method (3D-CT or MR-arthro) was moderate to good.
6. Unweighted Cohen's kappa coefficients and Spearman's correlation values were calculated to assess inter-observer reliability and validity at each time point.
7. We statistically assessed agreement between the manual and the computer-assisted technique, as well as the intra- and inter-observer reliability of the computer-assisted technique.
8. The two highest inter-observer reliabilities were fair to moderate (ICC: 0.71 and 0.74), for two methods (region analysis and individual review) on light microscopy.
9. Results: Inter-observer reliability was similar for both methods.
10. Inter-observer reliability for status scores and change scores was determined by intraclass correlation coefficients and by the Smallest Detectable Change method.