To assess the quality of imaging procedure requests and radiologists' reports using an auditing tool, and to assess inter-observer agreement on the quality parameters.
In an audit using a standardized scoring system, three observers reviewed the request forms for 296 consecutive radiological examinations, and two observers reviewed a random sample of 150 of the corresponding radiologists' reports. We present descriptive statistics from the audit and pairwise inter-observer agreement, expressed as proportion agreement and kappa statistics.
The proportion of acceptable item scores (0 or +1) was above 70% for all items except the requesting physician's bleep or extension number, the legibility of the physician's name, and details about previous investigations. For pairs of observers, inter-observer agreement was generally high; however, the corresponding kappa values were consistently low, with only 14 of 90 ratings on the requests/reports >0.60 and only 6 >0.80. For the quality of the clinical information, the appropriateness of the request, and the requested priority/timing of the investigation, the mean percentage agreement ranged from 67% to 76%, whereas the corresponding kappa values ranged only from 0.08 to 0.24.
The inter-observer reliability of scores on the different items showed a high degree of agreement even though the kappa values were low, a well-known paradox of the kappa statistic. Current routines for requesting radiology examinations appeared satisfactory, although several problem areas were identified.
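The kappa paradox mentioned above arises because Cohen's kappa corrects observed agreement for the agreement expected by chance, and when nearly all items fall into one category (as when most requests are scored acceptable) the chance-expected agreement is itself very high, leaving little room for kappa to exceed zero. A minimal sketch with hypothetical counts (not the study's data) illustrates how 94% raw agreement can coexist with a kappa near 0.2:

```python
def cohen_kappa(table):
    """Cohen's kappa for a 2x2 rater-agreement table.

    table[i][j] = number of items given score i by observer 1
    and score j by observer 2.
    Returns (observed agreement, kappa).
    """
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of items on the diagonal.
    p_o = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement: product of the marginal proportions, summed over categories.
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    p_e = sum(row[i] * col[i] for i in range(len(table))) / n ** 2
    return p_o, (p_o - p_e) / (1 - p_e)

# Hypothetical, skewed counts: both observers call almost everything "acceptable".
table = [[93, 3],   # observer 1 acceptable:   obs 2 acceptable / unacceptable
         [3, 1]]    # observer 1 unacceptable: obs 2 acceptable / unacceptable
p_o, kappa = cohen_kappa(table)
print(f"agreement = {p_o:.0%}, kappa = {kappa:.2f}")  # agreement = 94%, kappa = 0.22
```

With both marginals at 96% "acceptable", chance agreement is already 0.92, so even near-perfect raw agreement yields a low kappa, matching the pattern of high percentage agreement and low kappa reported in the results.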