Accuracy of radiographic readings in the emergency department

Published: March 11, 2010. DOI: https://doi.org/10.1016/j.ajem.2009.07.011

      Abstract

      Objectives

      A review of radiology discrepancies of emergency department (ED) radiograph interpretations was undertaken to examine the types of error made by emergency physicians (EPs).

      Methods

      An ED quality assurance database containing all radiology discrepancies between the EP and radiology from June 1996 to May 2005 was reviewed. The discrepancies were categorized as bone, chest (CXR), and abdomen (AXR) radiographs and examined to identify abnormalities missed by EPs.

      Results

During the study period, the ED ordered approximately 151 693 radiographs. Of the total, 4605 studies were identified by radiology as having a total of 5308 abnormalities discordant from the EP interpretation. Three hundred fifty-nine of these abnormalities were not confirmed by the radiologist (false positives). The remainder of the discordant studies represented abnormalities identified by the radiologist and missed by the EP (false negatives). Of these false-negative studies, 1954 bone radiographs (2.4% of bone x-rays ordered) had missed findings, comprising 2050 abnormalities; the most common missed findings were fractures and dislocations. Of the 220 AXRs (3.7% of AXRs ordered) with missed findings, 240 abnormalities were missed; the most common of these was bowel obstruction. Of the 2431 CXRs (3.8% of CXRs ordered) with missed findings, 2659 abnormalities were missed; the most common were air-space disease and pulmonary nodules. The rate of discrepancies potentially needing emergent change in management based solely on a radiographic discrepancy was 85 of 151 693 x-rays (0.056%).

      Conclusions

      Approximately 3% of radiographs interpreted by EPs are subsequently given a discrepant interpretation by the radiology attending. The most commonly missed findings included fractures, dislocations, air-space disease, and pulmonary nodules. Continuing education should focus on these areas to attempt to further reduce this error rate.

      1. Introduction

Emergency medicine (EM) attending physicians (EPs) often order imaging studies as part of a patient's workup in the emergency department (ED). In most EDs, the interpretation of the radiograph is initially done by the treating EP, with a radiologist's (RAD) interpretation performed after the disposition of the patient [1-4]. Most EDs have a quality assurance system to ensure concordance between the EP and RAD interpretations. Should a discrepancy between the 2 interpretations be noted, a review of the ED chart typically determines whether the discrepancy is of any clinical importance and, if so, whether it needs to alter the clinical management of the patient. A number of studies have not only examined such discrepancies but also investigated the effect of interventions to minimize such events [5,6]. In the past, large discrepancy rates of up to 45% have been noted between internal medicine house staff and radiology attending radiograph interpretations [7-9]. More recent studies have indicated this rate to be as low as 1.1% in EDs [10]. The largest study to date reviewed 16 246 radiographs [11], although most studies have been smaller [12-16]. We undertook a review of plain radiographs at our institution to quantify and characterize the discrepancies between EM and radiology interpretations and to determine the accuracy of diagnostic radiographic interpretation by EPs.

      2. Methods

This study was a retrospective review of all plain radiographs ordered from June 1996 to May 2005 in the ED of an urban, university-affiliated level I trauma center with a residency training program in EM. Most of the patient population was older than 18 years. The study was approved by the institutional review board. Attending EPs routinely reviewed all plain radiographs ordered in the ED and documented a preliminary interpretation on a "wet read" form. The same radiograph was subsequently reviewed by an attending RAD. Any discrepancies were reported to the ED staff member responsible for follow-up, typically a third-year EM resident. Upon receipt, the discrepancy was entered into a database (Microsoft Access; Microsoft, Redmond, Wash). The EM resident reviewed the patient's medical record to determine whether the discrepancy was known to the EP attending (ie, flagged by the RAD as a discrepancy due to incomplete documentation only). All other studies were entered as false negative, false positive, questionable false negative, or questionable false positive (see Table 1 for definitions). The EM resident's assessment of the clinical importance of the discrepancy, based on the abnormality described, medical record review, and consultation with the on-duty EP attending, then led to at least one of the following follow-up actions: no action, call to the admitting team (if the patient was an inpatient), call to the patient's regular physician, call to the patient to change follow-up or instructions, or call to the patient to return to the ED. The EM resident typically documented the result of the follow-up action in the database, along with any patient information pertinent to that discrepancy, including whether the patient was admitted or discharged and what change in follow-up or treatment was conveyed to the patient. This quality assurance process remained unchanged throughout the study period.
Table 1. Case definitions
Radiograph: Any diagnostic radiograph, to exclude cross-sectional imaging and ultrasonography
Abnormality: Pathology identified on the radiograph (ASD, humeral head fracture, etc)
Discrepancy: Event where the ED attending interpretation of a radiograph does not match the radiology attending interpretation of the same radiograph
RAD: Radiologist (attending)
EP: Emergency physician (attending)
False negative: An abnormality missed by the EP
False positive: An abnormality noted by the EP which the RAD did not feel was present
Questionable false negative: A radiograph in which the EP noted no abnormalities and the RAD indicated the possibility of an abnormality
Questionable false positive: A radiograph in which the EP noted an abnormality and the RAD indicated the possibility that the abnormality did not exist
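The case definitions in Table 1 reduce to a simple decision rule based on which reader reported a finding and how certain the RAD was. A minimal sketch of that rule is shown below for illustration only; the function and field names are ours and do not reflect the study's actual quality assurance database.

```python
from enum import Enum

class Category(Enum):
    FALSE_NEGATIVE = "abnormality missed by the EP"
    FALSE_POSITIVE = "abnormality called by the EP but not felt to be present by the RAD"
    QUESTIONABLE_FALSE_NEGATIVE = "EP noted no abnormality; RAD indicated a possible abnormality"
    QUESTIONABLE_FALSE_POSITIVE = "EP noted an abnormality; RAD indicated it possibly did not exist"

def classify(ep_noted_abnormality: bool, rad_noted_abnormality: bool, rad_certain: bool) -> Category:
    """Classify one discordant reading per the Table 1 definitions (illustrative only)."""
    if rad_noted_abnormality and not ep_noted_abnormality:
        return Category.FALSE_NEGATIVE if rad_certain else Category.QUESTIONABLE_FALSE_NEGATIVE
    if ep_noted_abnormality and not rad_noted_abnormality:
        return Category.FALSE_POSITIVE if rad_certain else Category.QUESTIONABLE_FALSE_POSITIVE
    raise ValueError("not a discrepancy: EP and RAD readings agree")

# Example: EP read the film as normal, RAD definitely saw a fracture -> false negative.
assert classify(False, True, True) is Category.FALSE_NEGATIVE
```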
The inclusion criterion was any plain radiograph discrepancy entered into the quality assurance database. Cross-sectional imaging studies (computed tomography, magnetic resonance imaging, and ultrasound) were excluded, as were blank database entries containing no data. At least 2 investigators independently reviewed each discrepancy in the database and removed those in which a documentation error, rather than radiograph misinterpretation, led to the discrepancy. Disagreements between investigators were settled by consensus of all 4 investigators. If the investigators could not determine with certainty after review of the database whether the abnormality was known to the treating EP, the discrepancy was counted as a false negative. Investigators then reviewed the remaining discrepancies and further categorized them into bone (including soft tissue), abdomen (AXR), and chest (CXR). Some discrepancies contained multiple abnormalities, and each abnormality was counted individually. Of note, a single discrepancy could be classified as containing both false-negative and false-positive abnormalities if the EP overread one finding but missed another on the same study. When the RAD was not certain about a false-negative abnormality, it was called a questionable false negative. For the purposes of this study, these abnormalities were analyzed as false negatives, as they potentially represented pathology missed by EPs. The total number of radiographs ordered during the study period was extrapolated using available billing data from 2001 through 2005. See Table 1 for case definitions.
Data were extracted from Microsoft Access and processed in Microsoft Excel (Microsoft). The overall accuracy of EP radiographic interpretation was ascertained by comparing the number of abnormalities with the total number of radiographs ordered.
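As an illustration of this calculation, the short sketch below divides the counts reported in the Results by the extrapolated radiograph volumes. The numbers come from this study, but the script itself is only an illustrative reconstruction, not part of the original analysis.

```python
# Counts reported in this study; the calculation mirrors the simple rate
# arithmetic described in the Methods (the variable names are ours).
total_radiographs = 151_693          # extrapolated from 2001-2005 billing data
total_abnormalities = 5_308          # discordant abnormalities across the 4605 included studies

missed_by_category = {               # (studies with missed findings, radiographs ordered)
    "bone": (1_954, 82_557),
    "AXR": (220, 5_987),
    "CXR": (2_431, 63_149),
}

print(f"Overall discrepancy rate: {total_abnormalities / total_radiographs:.1%}")       # ~3.5%
for name, (missed, ordered) in missed_by_category.items():
    print(f"{name}: {missed / ordered:.1%} of {name} radiographs had missed findings")   # 2.4%, 3.7%, 3.8%
```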

      3. Results

The investigators reviewed all 5660 discrepancies in the database. Of these, 449 involved cross-sectional imaging and were ineligible for the study, 2 were test entries, and 23 were blank, containing no data fields, leaving 5186 discrepancies. On review, 18 entries were found to each represent 2 separate radiograph discordances, yielding 5204 total discrepancies. A total of 599 discrepancies were removed because review clearly indicated that the treating EP was aware of the finding reported as discrepant. Investigators then reviewed the remaining 4605 discrepancies and further categorized them into bone, AXR, and CXR. Because some discrepancies contained multiple abnormalities, a total of 5308 abnormalities were reported.
The total number of radiographs ordered during the study period, extrapolated using available billing data from 2001 through 2005, was 151 693. Of these, 82 557 were bone radiographs, 5987 were AXR, and 63 149 were CXR.
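The accounting above can be followed as simple arithmetic; the brief check below uses only the counts given in this section (the variable names are ours).

```python
# Reconciling the discrepancy accounting reported above (all figures from this study).
reviewed = 5_660                      # database entries reviewed
eligible = reviewed - 449 - 2 - 23    # remove cross-sectional imaging, test entries, and blanks
assert eligible == 5_186

split = eligible + 18                 # 18 entries each represented 2 separate radiograph discordances
assert split == 5_204

included = split - 599                # remove entries where the EP was clearly aware of the finding
assert included == 4_605

# The extrapolated radiograph volume also sums across categories.
assert 82_557 + 5_987 + 63_149 == 151_693
```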
Within each category, major groupings served to organize the data. The bone category was grouped into dislocations, fractures, old fractures, retained foreign bodies, inadequate films, soft tissue or bony lesions other than fractures, and other pathology. The bone-other grouping included above-the-diaphragm pathology such as pulmonary or vascular disease, as well as sinus disease and the presence of hardware. Abdomen discrepancies were placed into bowel obstruction, free air, calcifications, masses, chest, bony lesion, foreign body, and other pathology groupings. The AXR-other grouping included abnormal bowel gas pattern, pancreatitis, organomegaly, dilated bladder, ascites, and hernia, as well as questionable pneumobilia, hernia, and abscess. Chest discrepancies were placed into pulmonary (atelectasis/air-space disease [ASD]), structural (hilum/mass/nodule/pleural problems), vascular, bone, foreign body, abdominal, soft tissue, and other groupings. The CXR-other grouping included small heart size, poor inspiratory effort, incomplete studies, elevated diaphragm, and others.
Fig. 1 illustrates the breakdown of the abnormalities by major radiographic category. False positives represented 6.8% of all abnormalities (359/5308), of which 4 abnormalities were qualified by the RAD as questionable false positives. The remaining abnormalities were false negatives (93.2%), of which 17.4% (859/4949) were qualified by the RAD as questionable false negatives.
Table 2 summarizes the bone radiograph abnormalities. On bone radiographs, a total of 878 fractures and 263 questionable fractures were missed by EPs (see Table 3 for the types of fractures missed across all radiograph categories, including CXR, AXR, and bone). Twenty-four complete joint dislocations were missed on bone x-rays, most commonly of the hand, shoulder, finger, and wrist. There were an additional 78 questionable dislocations, including subluxations and/or questionable subluxations; 46 of these involved the cervical spine, and of these 46, half (23) were related to C1/C2 asymmetry. Fifty radiographs were judged to be inadequate due to imaging technique; of these, 36 involved the cervical spine. Foreign bodies on bone x-rays were missed in 27 studies. Soft tissue lesions included 180 cases of soft tissue swelling, 249 bony lesions (avulsions, degenerative joint disease, osteomyelitis, etc), 8 calcifications, and 116 missed joint effusions.
Table 2. Bone radiograph discrepancy summary: false negatives (n = 2050)
Grouping | False negatives | Questionable false negatives
Acute fractures | 878 | 263
Soft tissue lesions | 553 | 64
Old fractures | 49 | 0
Dislocations | 24 | 78
Foreign bodies | 27 | 14
Other | 36 | 14
Incomplete studies | 50 | 0
Totals | 1617 | 433
Table 3. Missed fractures (definite) across all radiograph categories
Fracture type | N
Ankle | 7
Calcaneus | 42
Cervical spine | 11
Clavicle | 26
Facial bones | 12
Femur | 26
Fibula | 68
Finger | 176
Foot | 52
Glenoid | 6
Hand | 35
Humerus | 40
Lumbar spine | 43
Nasal | 9
Patella | 14
Pelvis | 17
Radius | 83
Rib | 108
Sacrum | 4
Scaphoid | 1
Scapula | 6
Shoulder | 1
Skeleton not otherwise specified | 3
Spine not otherwise specified | 2
Sternum | 3
Thoracic spine | 66
Tibia | 57
Toe | 105
Ulna | 30
Total | 1053
Table 4 summarizes the AXR abnormalities. On AXR, 13 small bowel obstructions (SBOs) were missed, with an additional 43 qualified as possible SBOs. There were an additional 14 air/fluid levels, 11 partial SBOs, and 2 large bowel obstructions. There were 45 missed calcifications and 26 missed abdominal masses, including 3 missed questionable abdominal aortic aneurysms. Three abdominal series showed extraluminal contrast. Finally, there were 2 cases of missed free air.
Table 4. AXR radiograph discrepancy summary: false negatives (n = 240)
Grouping | False negatives | Questionable false negatives
Calcification | 45 | 0
Bowel obstruction | 40 | 43
Intra-abdominal—other | 30 | 5
Mass | 26 | 0
Chest | 25 | 1
Bone | 16 | 2
Foreign body | 5 | 0
Free air | 2 | 0
Total | 189 | 51
Table 5 summarizes the CXR abnormalities. On CXR, ASD was missed in 765 cases, with 90 additional questionable false negatives. The most commonly missed pulmonary pathology involved both lungs (433 cases, 35%), followed by the left lower lobe (236 cases). A total of 23 pneumothoraces (PTXs) and 9 questionable PTXs were missed, as were 2 cases of pneumomediastinum, 8 cases of free air under the diaphragm, and 9 cases of questionable free air under the diaphragm. Eighty-three aortic lesions and 16 questionable aortic lesions were missed. One hundred twenty-eight cases of pulmonary edema and 120 pleural effusions were missed by EPs. Three hundred eight pulmonary nodules or masses were missed, with 148 additional questionable misses. Table 6 summarizes the abnormalities missed by EPs (false negatives) across the 3 categories of radiographs.
Table 5. CXR radiograph discrepancy summary: false negatives (n = 2659)
Grouping | False negatives | Questionable false negatives
Pulmonary | 1098 | 130
Structural | 588 | 175
Vascular | 187 | 23
Bone | 272 | 23
Abdominal | 51 | 21
Foreign body | 43 | 1
Soft tissue | 15 | 2
Other | 30 | 0
Total | 2284 | 375
Table 6. Discrepancy summary across all categories (bone, CXR, and AXR): false negatives (n = 4949) (a)
Grouping | False negatives | Questionable false negatives
Bowel obstruction | 47 | 47
Calcification | 67 | 2
Dislocations | 26 | 80
Foreign body | 75 | 15
Fractures | 1053 | 280
Free air | 10 | 10
Incomplete studies | 50 | 0
Intra-abdominal—other | 41 | 8
Abdominal mass | 44 | 0
Old fractures | 82 | 0
Other | 25 | 0
Pulmonary | 1134 | 138
Soft tissue lesions | 637 | 76
Structural | 599 | 175
Vascular | 200 | 28
Total | 4090 | 859
(a) Values differ from the prior 3 tables as a result of abnormalities being picked up on a nondedicated film, that is, bowel obstruction being detected on chest x-ray.
The false positives totaled 359 abnormalities, of which 187 were related to bone, 6 to AXR, and 166 to CXR. Of the 187 false-positive bony abnormalities, there were 126 fractures, 27 cases of soft tissue swelling, and 1 dislocation. In addition, 29 questionable fractures and 4 questionable foreign body abnormalities were reported as false positives. The bony false-positive abnormalities most frequently involved the feet (34), wrists (29), hands (16), ankles (13), and fingers (12). On AXR, all 6 false-positive abnormalities related to SBO. On CXR, the 166 false-positive abnormalities included 96 ASDs, 11 cases of congestive heart failure, 9 masses/nodules, 6 fractures, 4 cases of cardiomegaly, 4 hilar lesions, 3 effusions, 3 PTXs, and 1 each of aortic lesion, pneumomediastinum, bony lesion, hardware, free air, other abdominal finding, and SBO. In addition, 12 questionable ASDs, 6 questionable masses/nodules, and 1 questionable finding each of congestive heart failure, effusion, fracture, and free air were reported. Twenty-six discrepancies with a false-positive abnormality also included a separate false-negative abnormality.
Of the 4605 discrepancies included in this study, 889 had no documented interpretation by the EP, thus triggering an automatic discrepancy; these were included if a database review did not clearly indicate, based on the documentation, that the treating EP was aware of the finding. All remaining discrepancies had an EP interpretation. Of the 4605 discrepancies, 1268 patients were admitted, 2444 were discharged, 892 did not have a disposition noted, and 2 patients expired in the ED.
Follow-up actions taken by the ED were noted for 3515 of the 4605 discrepancies. Of these, 1349 cases were judged by the quality assurance physician, upon review of the chart at the time of discovery of the discrepancy, to require no additional follow-up or treatment. For the discrepancies judged to be significant, 775 patients, 542 admitting physicians, and 292 primary care physicians were contacted and informed of the discrepancy. One hundred nine additional patients were called back to return to the ED for additional treatment, 5 of whom were told to return immediately by emergency medical services (EMS). Ten patients returned on their own before being contacted. Certified letters were sent to 260 patients to inform them of the discrepancy, and 178 patients (5% of those for whom there is a record of an ED contact attempt) had no contact information on record. Table 7 lists the radiographs and discrepancies in these more urgent cases.
Table 7. Discrepancies in 109 discharged patients called back to the ED or told to call EMS for follow-up (representing 113 discrepancies, with multiple discrepancies in 2 patients); counts are distributed across the CXR, AXR, and bone columns
Questionable abdominal aortic aneurysm 11
Questionable ASD 4
Questionable bowel obstruction 2
Questionable dislocation 6
Questionable foreign body 2
Questionable fracture 14
Questionable mass/nodule 1
Questionable pericardial effusion 1
Questionable pleural effusion 1
Questionable pneumomediastinum 2
Questionable PTX 1
Questionable wide mediastinum 1
Aorta 4
ASD 8
Bowel obstruction 3
Bone lesion, nonfracture 10
Calcification 1
Joint effusion 1
Foreign body 1
Fracture 227
Mass/nodule 13
Negative (a) 1
Pleural effusion 1
Pneumobilia 1
Pneumomediastinum 1
PTX 4
Soft tissue swelling 5
Wide mediastinum 2
(a) This patient was called back to the ED after a chart review raised the question of a pulmonary embolism after the initial ED diagnosis of pneumonia was questioned by radiology.

      4. Discussion

This retrospective study of an ED radiology quality assurance database represents the largest series of radiographic discrepancies reported from an ED to date. It includes all levels of illness and injury and differs from some prior studies that evaluated only subacute patient populations with a lower pretest probability of serious disease [17,18], which therefore may have underestimated the true significance of ED false negatives. Multiple studies have evaluated the effect of differing levels of physician training on the accuracy of interpretation [19,20]. This study differs from these prior studies by specifically assessing only the radiographic interpretation of the ED attending. The overall rate of 5308 discrepancies for an estimated 151 693 radiographs (3.5%) is in line with prior studies [21]. When only the discrepancies characterized by the RAD as definite are included, the rate drops to 2.9%. The false positives in our study (359/5308 discrepancies) represented a relatively small proportion of overall discrepancies. The most common false positives involved the "overcalling" of fractures or of ASD that were ultimately felt not to be present on RAD review.
A larger proportion of films were considered false negatives, although a significant number of false negatives were qualified by the RAD as "questionable." These 859 questionable false negatives represented radiographic uncertainties after RAD review requiring additional imaging studies or clinical correlation. The most common uncertainty was the possibility of a missed fracture, in 280 of 859 cases. How this translates to clinical care is difficult to ascertain from this data set because the EP uses the radiograph as a diagnostic adjunct in addition to the history and physical examination. Most EPs' practice takes into account the patient's clinical examination as well as the x-ray results, for example, immobilizing musculoskeletal injuries that are clinically severe and arranging orthopedic follow-up regardless of the radiographic findings. Also, some of the findings called false negatives by the RAD were undoubtedly noticed by the treating physician but disregarded because they were clearly not part of the clinical scenario; EPs have clinical information not available to the RAD, for instance, the 20-year-old bullet seen on an abdominal x-ray performed to rule out obstruction. These points are supported by the relatively high rate of patients not requiring any change in management or follow-up (1359/3515) despite a radiographic discrepancy. In addition, because questionable discrepancies represent a sizable part of the study, a question arises regarding the criterion standard of radiographic interpretation. For the purposes of this study, the radiology attending was assumed to represent the criterion standard, though multiple studies have demonstrated large variations in interpretation between RADs [22-26]. Also, in one study of the outcomes of 175 discrepancies, the ED interpretation proved to be the correct one in 39 cases [27]. Finally, in our study some unknown portion of the 892 discrepancies was generated because of a lack of documentation on the part of the EP, illustrating the need for good documentation both to record the medical care provided and to decrease medical liability [28-30].
Missed fractures have represented the largest proportion of errors in EP radiograph interpretation in prior literature [31], and a previous analysis showed a relatively large number of missed calcaneal fractures [32]. Digits have also been shown in other studies [11] to remain a frequent site of missed fracture diagnosis. In our study, fingers, ribs, and toes were the bones whose fractures were most frequently missed by EPs.
From an EM perspective, any discrepancy, across all categories, that changed the management and/or disposition of a patient is a significant one. It is worth noting that in 8 studies, potentially emergent findings were noted on nondedicated studies (eg, a large aortic aneurysm seen on a pelvis x-ray). Overall, in this study, missed PTXs (24), aneurysms (14: 10 thoracic and 4 abdominal), wide mediastinum (35), pneumomediastinum (2), and free air (10) represent potentially emergent disease presentations (n = 85) that were missed by the interpreting EP attending. There were an additional 1979 urgent radiographic discrepancies, including dislocations (26), ASD (778), fractures (1053), bowel obstruction (47), and foreign bodies (75, including 5 bullets), which may represent clinically significant discrepancies. These emergent and urgent radiographic disease presentations represent a total of 2064 (39%) of the 5308 discrepancies, or a 1.4% false-negative rate compared with the total number of radiographs ordered in the study period [33]. It is unclear how many of these discrepancies would have led to changes in the clinical management of the affected patients, though prior studies [34] have examined this issue and found a rather small number of clinical changes instituted [11,35,36]. The 85 discrepancies potentially requiring an emergent change in medical management represent only a small fraction of the estimated 151 693 x-rays ordered during our 9-year study period. This corresponds to 1 radiograph in 1785, or 0.056%, potentially needing an emergent change in management based solely on a radiographic discrepancy.
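The emergent and urgent rates quoted above follow directly from the reported counts; the brief check below uses only figures from this study (the grouping into dictionaries is ours, for illustration).

```python
# Emergent and urgent discrepancy rates, recomputed from the counts reported above.
emergent = {"PTX": 24, "aneurysm": 14, "wide mediastinum": 35,
            "pneumomediastinum": 2, "free air": 10}
urgent = {"dislocations": 26, "ASD": 778, "fractures": 1053,
          "bowel obstruction": 47, "foreign bodies": 75}

n_emergent = sum(emergent.values())                  # 85
n_urgent = sum(urgent.values())                      # 1979
total_abnormalities = 5_308
total_radiographs = 151_693

print(f"{n_emergent / total_radiographs:.3%}")                    # 0.056%, about 1 in 1785
print(f"{(n_emergent + n_urgent) / total_abnormalities:.0%}")     # 39% of discrepant abnormalities
print(f"{(n_emergent + n_urgent) / total_radiographs:.1%}")       # 1.4% of all radiographs ordered
```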
A limitation of this study is that the data set was at times incomplete. Also, no chart reviews were undertaken at the time of this study; thus, the only information regarding patient outcomes was the clinical detail entered into the database from the chart review performed at the time of the discrepancy. The database captured almost all radiographic discrepancies returned to the ED, with the following exception: because the RAD read these radiographs with a time delay in relation to patient care, on rare occasion a discrepancy might have been communicated to the treating physician in the ED before patient disposition. Such a discrepancy would not have been entered in the database, so the true number of discrepancies may be even higher than reported. Such real-time interpretation of ED radiographs at our facility is unusual, although anecdotally it is growing more frequent with increasing lengths of stay for patients awaiting inpatient beds. Also, a significant number of discrepancies were included in the database because incomplete documentation did not allow the investigators to conclusively determine whether the treating EP knew of the abnormality at the time of the ED visit. Unlike other studies [36-38], this study did not assess the EP's confidence level in the interpretation of each radiograph. The study period also spans a technology transition: in 2003, the hospital introduced a digital radiology system for processing and displaying radiographs. The impact of this transition was not evaluated, although it might have had an effect [39]. Lastly, our ED sees few pediatric patients, and our results cannot be extrapolated to this population.

      5. Conclusions

Plain radiograph interpretation has long been an integral part of ED patient management. Because real-time radiology attending overreads are not universally available during daylight hours and are rarely available during off hours, EPs are often called upon to determine clinical care based on their own interpretation of x-rays. Previous studies have shown wide variability in the rate of missed findings, and few specifically address EPs' interpretation of x-rays. This study represents the largest database review of radiographic discrepancies in the literature to date. Using the radiology overread as a criterion standard, EPs had interpretations definitely discrepant from radiology in 2.9% of cases. EPs infrequently missed clinically significant findings and rarely missed emergent findings. The most commonly missed findings included fractures, dislocations, ASD, and pulmonary nodules. Continuing education should focus on these areas to attempt to further reduce this error rate.

      References

1. O'Leary MR, Smith M, Olmsted WW. Physician assessments of practice patterns in emergency department radiograph interpretation. Ann Emerg Med. 1988;17:1019-1023.
2. Torreggiani WC, Nicolaou S, Lyburn ID, Harris AC, et al. Emergency radiology in Canada: a national survey. Can Assoc Radiol J. 2002;53:160-167.
3. James MR, Bracegirdle A, Yates DW. X-ray reporting in accident and emergency departments—an area for improvements in efficiency. Arch Emerg Med. 1991;8:266-270.
4. Hunter TB, Krupinski EA, Hunt KR, Erly WK. Emergency department coverage by academic departments of radiology. Acad Radiol. 2000;7:165-170.
5. Preston CA, Marr JJ, Amaraneni KK, Suthar BS. Reduction of 'callbacks' to the ED due to discrepancies in plain radiograph interpretation. Am J Emerg Med. 1998;16:160-162.
6. Espinosa JA, Nolan TW. Reducing errors made by emergency physicians in interpreting radiographs: longitudinal study. BMJ. 2000;320:737-740.
7. De Lacey G, Barker A, Harper J, et al. An assessment of the clinical effects of reporting accident and emergency radiographs. Br J Radiol. 1980;53:304-309.
8. Fleisher G, Ludwig S, McSorley M. Interpretation of pediatric x-ray films by emergency department pediatricians. Ann Emerg Med. 1983;12:153-158.
9. McLain PL, Kirkwood CR. The quality of emergency room radiograph interpretations. J Fam Pract. 1985;20:443-448.
10. Warren JS, Lara K, Connor PD, et al. Correlation of emergency department radiographs: results of a quality assurance review in an urban community hospital setting. J Am Board Fam Pract. 1993;6:255-259.
11. Thomas HG, Mason AC, Smith RM, Fergusson CM. Value of radiograph audit in an accident service department. Injury. 1992;23:47-50.
12. Nitowski LA, O'Connor RE, Reese CL. The rate of clinically significant plain radiograph misinterpretation by faculty in an emergency medicine residency program. Acad Emerg Med. 1996;3:782-789.
13. Berman L, de Lacey G, Twomey E, et al. Reducing errors in the accident department: a simple method using radiographers. Br Med J. 1985;290:421-422.
14. Quick G, Podgorny G. An emergency department radiology audit procedure. JACEP. 1977;6:247-250.
15. Klein EJ, Koenig M, Diekema DS, Winters W. Discordant radiograph interpretation between emergency physicians and radiologists in a pediatric emergency department. Pediatr Emerg Care. 1999;15:245-248.
16. Masel JP, Grant PJ. Accuracy of radiological diagnosis in the casualty department of a children's hospital. Aust Paediatr J. 1984;20:221-223.
17. Tachakra S, Mukherjee P, Smith C, Dutton D. Are accident and emergency consultants as accurate as consultant radiologists in interpreting plain skeletal radiographs taken at a minor injury unit? Eur J Emerg Med. 2002;9:131-134.
18. Snow DA. Clinical significance of discrepancies in roentgenographic film interpretation in an acute walk-in area. J Gen Intern Med. 1986;1:295-299.
19. Halvorsen JG, Kunian A, Gjerdingen D, Connolly J, et al. The interpretation of office radiographs by family physicians. J Fam Pract. 1989;28:426-432.
20. Nolan TM, Oberklaid F, Boldt D. Radiological services in a hospital emergency department—an evaluation of service delivery and radiograph interpretation. Aust Paediatr J. 1984;20:109-112.
21. O'Leary MR, Smith MS, O'Leary DS, Olmsted WW, et al. Application of clinical indicators in the emergency department. JAMA. 1989;262:3444-3447.
22. Robinson PJ, Wilson D, Coral A, et al. Variation between experienced observers in the interpretation of accident and emergency radiographs. Br J Radiol. 1999;72:323-330.
23. Siegle RL, Baram EM, Reuter SR, et al. Rates of disagreement in imaging interpretation in a group of community hospitals. Acad Radiol. 1998;5:148-154.
24. Berlin L. Does the 'missed' radiographic diagnosis constitute malpractice? Radiology. 1977;123:523-527.
25. Herman PG, Hessel SJ. Accuracy and its relationship to experience in the interpretation of chest radiographs. Invest Radiol. 1975;10:62-67.
26. Robinson PJ, Culpan G, Wiggins M. Interpretation of selected accident and emergency radiographic examinations by radiographers: a review of 11000 cases. Br J Radiol. 1999;72:546-551.
27. Benger JR, Lyburn ID. What is the effect of reporting all emergency department radiographs? Emerg Med J. 2003;20:40-43.
28. Guly HR. Missed diagnoses in an accident and emergency department. Injury. 1984;15:403-406.
29. Gwynne A, Barber P, Tavener F. A review of 105 negligence claims against accident and emergency departments. J Accid Emerg Med. 1997;14:243-245.
30. George JE, Espinosa JA, Quattrone MS. Legal issues in emergency radiology. Practical strategies to reduce risk. Emerg Med Clin North Am. 1992;10:179-203.
31. Guly HR. Diagnostic errors in an accident and emergency department. Emerg Med J. 2001;18:263-269.
32. Freed HA, Shields NN. Most frequently overlooked radiographically apparent fractures in a teaching hospital emergency department. Ann Emerg Med. 1984;13:900-904.
33. Walsh-Kelly CM, Melzer-Lange MD, Hennes HM, et al. Clinical impact of radiograph misinterpretation in a pediatric ED and the effect of physician training level. Am J Emerg Med. 1995;13:262-264.
34. Gatt ME, Spectre G, Paltiel O, et al. Chest radiographs in the emergency department: is the radiologist really necessary? Postgrad Med J. 2003;79:214-217.
35. Williams SM, Connelly DJ, Wadsworth S, Wilson DJ. Radiological review of accident and emergency radiographs: a 1-year audit. Clin Radiol. 2000;55:861-865.
36. Mayhue FE, Rust DD, Aldag JC, et al. Accuracy of interpretations of emergency department radiographs: effect of confidence levels. Ann Emerg Med. 1989;18:826-830.
37. Smith PD, Temte J, Beasley JW, Mundt M. Radiographs in the office: is a second reading always needed? J Am Board Fam Pract. 2004;17:256-263.
38. Lufkin KC, Smith SW, Matticks CA, Brunette DD. Radiologists' review of radiographs interpreted confidently by emergency physicians infrequently leads to changes in patient management. Ann Emerg Med. 1998;31:202-207.
39. Scott WW, Bluemke DA, Mysko WK, Weller GE, et al. Interpretation of emergency department radiographs by radiologists and emergency medicine physicians: teleradiology workstation versus radiograph readings. Radiology. 1995;195:223-229.