
Kim, Choi, Lee, Huh, Yi, Heo, and Choi: Effect of LCD monitor type and observer experience on diagnostic performance in soft-copy interpretations of the maxillary sinus on panoramic radiographs

Abstract

Purpose

The aim of this study was to evaluate the effect of liquid crystal display (LCD) monitor type and observer experience on the diagnostic performance in soft-copy interpretations of maxillary sinus inflammatory lesions on panoramic radiographs.

Materials and Methods

Ninety maxillary sinuses on panoramic images were grouped into negative and positive groups according to the presence of inflammatory lesions, using CT for confirmation. Monochrome and color LCDs were used. Six observers participated, and ROC analysis was performed to evaluate the diagnostic performance. The reading time, fatigue score, and inter- and intra-observer agreement were assessed.

Results

The interpretation of maxillary sinus inflammatory lesions was affected by both the LCD monitor type used and the experience of the observer. The reading time did not differ significantly between the two LCD monitors; however, the fatigue score did. Inter-observer agreement was relatively good among experienced observers, while intra-observer agreement for all observers was good with the monochrome LCD but not with the color LCD.

Conclusion

Less experienced observers showed lower diagnostic ability with a general color LCD.

Introduction

In picture archiving and communication system (PACS) environments, the PACS workstation is a vital tool for image interpretation because PACS equipment improves efficiency, provides image analysis tools, and enhances image storage and distribution.1-3 In a PACS environment, computed radiography (CR) with photostimulable phosphor (PSP) plates is a suitable technique for extra-oral radiographic examinations4 and has been applied to panoramic radiography,5 which is a useful and common examination of the oral and maxillofacial regions.
Soft-copy interpretation on liquid crystal display (LCD) and cathode ray tube (CRT) monitors is as accurate as hard-copy interpretation and is widely accepted in medical and dental practice.6-12 If several factors, including monitor resolution, monitor luminance, image resolution, image bit depth, the image receptor device, and ambient light, are well regulated, soft-copy interpretation is quite reliable.13-16
Dental implant therapy is an essential consideration for the rehabilitation of missing teeth. Implant surgery with maxillary sinus floor elevation has high implant survival rates.17,18 The assessment of anatomical structures and pathological lesions around the surgical site is one of the requirements for successful dental implant therapy.19,20 Therefore, diagnostic imaging is essential for dental implant therapy.
Previous studies have assessed the value of panoramic radiography for evaluation of the maxillary sinus.21,22 Those studies showed that more experienced observers were likely to be more proficient at interpreting panoramic radiographs and suggested that specialists in oral and maxillofacial radiology were best suited to evaluate the maxillary sinus on panoramic radiographs.
The aim of this study was to evaluate the effect of LCD monitor type and observer experience on diagnostic performance in the soft-copy interpretation of maxillary sinus inflammatory lesions on panoramic radiographs.

Materials and Methods

Image acquisition and collection

Fifty-two CR panoramic radiographs were collected from patients (mean age, 40.2±17.3 years) who had undergone CT (Somatom Sensation 10, Siemens AG, Forchheim, Germany) on the same day. The panoramic images had been taken by experienced oral and maxillofacial radiographers using two panoramic radiography units (1, Orthopantomograph OP100, Instrumentarium Corp., Tuusula, Finland; 2, Cranex 3+ PAN, Orion Corporation Soredex, Helsinki, Finland). All CR images had been obtained using PSP image plates (12×10 inch) and read by an FCR system (Fuji Computed Radiography 5000R, Fuji Photo Film Co. Ltd., Düsseldorf, Germany).
The acquired digital raw data (2,010×1,670 pixels, 10 bits, 6.7 pixel/mm) were sent to a PACS server and distributed to a PACS workstation. Then all images were downloaded onto the local hard disk drive of the PACS workstation as a DICOM (Digital Imaging and Communications in Medicine) format file before being read by observers. The average file size of each image was 3.66 MB, and the CR images were displayed in consecutive mode.
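As a supplementary illustration of the soft-copy workflow described above, the following minimal sketch, assuming the pydicom library and a hypothetical file name, shows how one of the exported DICOM panoramic images could be loaded and its reported properties checked before display:

```python
import pydicom  # assumed dependency for reading DICOM files

# Hypothetical file name for one of the exported CR panoramic images
ds = pydicom.dcmread("panoramic_case01.dcm")

# Check the image properties against those reported above
# (2,010 x 1,670 pixels, 10 bits of stored data)
print("Size (cols x rows):", ds.Columns, "x", ds.Rows)
print("Bits stored:", ds.BitsStored)

# The pixel data as a NumPy array, as a viewer would use it for display
pixels = ds.pixel_array
print("Array shape:", pixels.shape)
```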

Case selection

A total of 90 maxillary sinuses from 52 patients were classified into negative and positive groups. The negative and positive classifications were confirmed on CT images by an experienced oral and maxillofacial radiologist. Maxillary sinuses with previous surgical intervention, involvement by lesions such as tumors, cysts, or periapical infection, or congenital anomalies were excluded. On the basis of the CT findings, positive cases were defined as showing one of four types of inflammatory lesions on the sinus floor: 1, thickened mucosa (n=14); 2, polypoid mucosa (n=23); 3, air-fluid level (n=2); 4, totally mucosa-filled sinus (n=6). In this manner, 45 sinuses were classified as positive and the other 45 as negative.

Study performance

A high-resolution monochrome LCD (ME315L, Totoku Electric Co. Ltd., Tokyo, Japan) and a general color LCD (Nexview1830, Dicon Co. Ltd., Seoul, Korea) were used to display the CR images in this study. The detailed specifications are given in Table 1. To avoid the limited viewing angle of the LCDs, the observers performed the image interpretation at a constant distance of 50 cm from the center of the monitor surface.23
All observers read the images twice on each monitor. The study consisted of four sessions: the observers interpreted the 52 panoramic images on one monitor (session 1) and, a few minutes later, interpreted the images on the other monitor (session 2). The monitor for each session was randomly selected for each observer. The images were displayed in a different order on each monitor to minimize the memory effect between sessions 1 and 2. To avoid learning bias, sessions 3 and 4 were performed more than one week after sessions 1 and 2.
All observers were proficient with the PACS viewing software (m-View-ps, Marotech, Seoul, Korea), as they used it in their daily practice. They were allowed to adjust the brightness and contrast of the images and to magnify the images to their preference using the mouse. No limit was imposed on the reading time. The ambient light was lowered as much as possible to eliminate reflections on the monitor. All observers were fully adapted to the darkened room conditions before each session began.
The observers were given no information about the patients or the proportion of positive cases. They were instructed on the definitions of a positive case; however, no consensus discussion about negative or positive cases was allowed. The maxillary sinus above the molar region was defined as the region of interest (ROI). At each session, the absence or presence of inflammatory changes in the maxillary sinus was scored for each ROI using a five-point scale: 1, definitely negative; 2, probably negative; 3, intermediate; 4, probably positive; 5, definitely positive. The reading time was recorded for each observer during all sessions. After finishing each session, the observers were asked to report a subjective fatigue score on a ten-point scale from 1 (no feeling of fatigue) to 10 (extreme feeling of fatigue).

Observers

Six observers participated in this study. They were categorized into three groups of two observers each: group A, faculty members in oral and maxillofacial radiology with more than 15 years of interpreting experience; group B, residents in oral and maxillofacial radiology with more than three years of interpreting experience; group C, non-specialists with less than two years of interpreting experience. All observers had been using the PACS software for over two years.

Statistical analysis

To evaluate the diagnostic performance, receiver operating characteristic (ROC) analysis with calculation of the area under the ROC curve (Az) was performed using ROCKIT (version 0.9B beta, Metz CE, Department of Radiology, The University of Chicago). Overall diagnostic performance for the monitors and observers was determined by averaging the Az values.
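For illustration only, the sketch below shows how an empirical area under the ROC curve could be computed from five-point confidence ratings; it assumes scikit-learn and hypothetical data, and it yields a nonparametric AUC rather than the binormal-model Az that ROCKIT estimates by maximum likelihood.

```python
from sklearn.metrics import roc_auc_score

# Hypothetical data: CT-confirmed truth (0 = negative, 1 = positive sinus)
# and one observer's five-point confidence ratings for the same sinuses.
truth   = [0, 0, 1, 1, 0, 1, 1, 0, 1, 0]
ratings = [1, 2, 4, 5, 3, 5, 2, 1, 4, 2]

# The empirical AUC treats the ratings as an ordinal score;
# ROCKIT instead fits a binormal ROC model to the rating data.
auc = roc_auc_score(truth, ratings)
print(f"Empirical AUC: {auc:.3f}")
```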
Statistical testing was performed using two-way analysis of variance (ANOVA) with interaction between the two factors of LCD monitor type and observer experience, with the Az values treated as replicates. Statistical testing of the reading time and fatigue score was conducted using the Wilcoxon signed-rank test. The statistical analysis was performed using SPSS for Windows (version 12.0, SPSS Inc., Chicago, USA), and p<0.05 was considered statistically significant.
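The original analysis was run in SPSS; as a rough sketch of the same tests, assuming the pandas, statsmodels, and SciPy libraries and hypothetical data, the two-way ANOVA with interaction and the Wilcoxon signed-rank test could be set up as follows:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from scipy.stats import wilcoxon

# Hypothetical Az values: one row per observer group, monitor, and session.
df = pd.DataFrame({
    "Az":      [0.93, 0.88, 0.82, 0.92, 0.79, 0.68,
                0.94, 0.87, 0.80, 0.91, 0.77, 0.70],
    "monitor": ["mono", "mono", "mono", "color", "color", "color"] * 2,
    "group":   ["A", "B", "C"] * 4,
})

# Two-way ANOVA with interaction between monitor type and observer group.
model = ols("Az ~ C(monitor) * C(group)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Paired comparison of fatigue scores (or reading times) between monitors.
fatigue_mono  = [3, 4, 2, 5, 4, 3]   # hypothetical per-observer scores
fatigue_color = [5, 6, 4, 7, 6, 5]
print(wilcoxon(fatigue_mono, fatigue_color))
```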
To assess the decision agreement (inter- and intra-observer agreement), kappa (κ) values were calculated such that κ≥0.75 was considered excellent agreement, 0.40≤κ<0.75 was graded as fair or good agreement, and κ<0.40 was poor agreement.
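A minimal sketch of this agreement calculation, assuming scikit-learn and hypothetical ratings from two observers, is shown below, with the thresholds above applied to interpret the resulting κ value:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical five-point ratings from two observers on the same sinuses.
obs1 = [1, 2, 5, 4, 3, 5, 1, 2, 4, 5]
obs2 = [1, 3, 5, 4, 2, 4, 1, 2, 5, 5]

kappa = cohen_kappa_score(obs1, obs2)
if kappa >= 0.75:
    grade = "excellent"
elif kappa >= 0.40:
    grade = "fair to good"
else:
    grade = "poor"
print(f"kappa = {kappa:.3f} ({grade})")
```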

Results

Table 2 shows the mean Az values according to monitor and observer, along with the overall diagnostic performance for the two monitors. The overall diagnostic performance of the monochrome LCD was superior to that of the color LCD. The diagnostic performance of experienced observers was higher than that of less experienced observers (Table 3). The Az value decreased serially from group A to group C.
Two-way ANOVA revealed significant differences between the LCD monitors and among the groups, as well as a significant interaction between monitor and group (p<0.05), indicating that the diagnostic performance in soft-copy interpretation of the maxillary sinus was dependent upon the LCD monitor type (Table 4). The color LCD had a detrimental influence on the diagnostic performance of less experienced observers. In the experienced observers (group A), the diagnostic performance was comparable between the two LCD monitors. However, in the less experienced observers (groups B and C), the diagnostic performance with the color LCD was lower than that with the monochrome LCD.
When the reading time was averaged over all observers for the two monitors, there was no significant difference (p>0.05). However, there was considerable individual variation in reading time, ranging from 685.5 to 1,280.0 seconds (Table 5).
Table 6 reveals that the fatigue score ranged from 2 to 8 for the two monitors. The mean fatigue score for the color LCD was significantly higher than that for the monochrome LCD (p<0.05). All observers agreed that they felt more fatigued after using the color LCD.
Tables 7 and 8 show the decision agreement. Inter-observer agreement in group A was fair to good (κ≥0.40), whereas that in the other groups was poor (κ<0.40). The kappa values for inter-observer agreement were higher for the specialists in oral and maxillofacial radiology. Intra-observer agreement was fair to good for all groups, indicating that the observers in this study applied consistent decision criteria, with the exception of one observer (observer 5) when using the color LCD. The kappa values tended to be higher for the monochrome LCD than for the color LCD (Table 8).

Discussion

Inflammatory changes such as thickened mucosa on the maxillary sinus floor are difficult to detect on plain radiographs. Nevertheless, it is essential to distinguish such changes from normal features; for example, dental implants should be inserted into bone free of pathological lesions.
We selected panoramic radiography because it is the most useful radiological examination for imaging the jaws. Lee et al reported that soft tissue lesions were difficult to detect for both specialists and non-specialists, but that specialists were better able to interpret lesions in the maxillary sinus on panoramic radiographs.22 Our study confirmed that experienced observers showed a higher ability to detect lesions on the maxillary sinus floor.
Several studies have examined the effect of monitor luminance on observer performance in soft-copy interpretation.13-16 According to Herron et al,24 hard-copy interpretation of chest radiographs was not affected by luminance above 260 cd/m2, indicating that luminance below a threshold level has a detrimental effect on detection performance. This finding might also apply to soft-copy interpretation. Observer performance with the color LCD was inferior to that with the monochrome LCD, which we attributed to the monitor luminance level. A decrease in luminance inevitably leads to a reduction in detection, similar to the findings of Otto et al.8 The maximum luminance of the color LCD in this study was only one-third that of the monochrome LCD. This might have resulted in decreased discrimination of low-contrast tissue such as mucosal shadows in the maxillary sinus. As the light stimuli from low-contrast mucosa are very faint to the human visual system, the decreased contrast sensitivity seems to cause deterioration in the radiological interpretation process.24 Goo et al7 suggested that observer performance was not affected under low ambient light; consistent with this, our study demonstrated that detection performance might not be affected by ambient light but rather by monitor luminance.
In spite of the adjustments in window width and level, the less experienced observers showed lower diagnostic ability in detecting maxillary sinus lesions when using the color LCD. This might result from the observer's experience and level of radiological practice. Several investigators have found that the level of experience influences observer performance in radiology.9,12,25,26 In the case of experienced observers, performance across various display methods (film, CRT, LCD, etc.) was fairly even, which was in agreement with our results.
The lack of difference in reading time between the two monitors suggests that the variation in reading time reflected each observer's individual propensity in interpreting an image rather than the display used. The reading time had no relation to the observer's experience.
Otto et al8 found that diagnostic performance improved with higher spatial resolution, which was consistent with our results. This indicates that the larger number of pixels and smaller pixel size of the monochrome LCD would be suitable for detecting fine distinctions in the mucosa.
Goo et al7 found that fatigue was higher at a high level of monitor luminance (100 fL) than at low and middle levels (25 and 50 fL); therefore, fatigue with the monochrome LCD was expected to be higher than with the color LCD. However, our results were the opposite. This might be related to the differences in reading time for each session in our study. Making the subjective determinations required to detect low-contrast tissue may have been easier on the monochrome LCD, leaving the observers less fatigued and shortening the reading time in this study.
Inter-observer agreement (kappa values ranging from 0.390 to 0.565) was fairly good only among the experienced faculty in oral and maxillofacial radiology. Intra-observer agreement tended to be better with the monochrome LCD, and the reading time was not significantly different between the monitors. Considering the Az values, fatigue scores, and reading times, the high-resolution monochrome LCD might be more suitable for primary interpretations in a PACS environment. However, because monochrome monitors tend to increase the cost of PACS installation, a common color monitor can be considered as an alternative. Our results indicate that the observer's experience is an essential factor in soft-copy interpretation.
In conclusion, less experienced observers showed lower diagnostic ability in the soft-copy interpretation of the maxillary sinus on panoramic radiographs when using a general color LCD monitor; therefore, a high-resolution monochrome LCD monitor might be more suitable for image interpretation.

Figures and Tables

Table 1
Specifications of the two LCD monitors

Table 2
Az values of the observers

Group A, oral and maxillofacial radiologists with more than 15 years of experience; group B, oral and maxillofacial radiologists with more than three years of experience; group C, non-specialists with less than two years of experience

Table 3
Mean Az of each group at all sessions from ROC analysis

Group A, oral and maxillofacial radiologists with more than 15 years of experience; group B, oral and maxillofacial radiologists with more than three years of experience; group C, non-specialists with less than two years of experience

Table 4
Results of two-way ANOVA with interaction of LCD monitor type and observer experience

*Statistical significance (p<0.05); group A, oral and maxillofacial radiologists with more than 15 years of experience; group B, oral and maxillofacial radiologists with more than three years of experience; group C, non-specialists with less than two years of experience

Table 5
Total reading time (seconds) of the observers at all sessions

Table 6
Fatigue scores of the observers

*Statistical significance (p<0.05)

Table 7
Inter-observer agreement of each group (kappa values)

Group A, oral and maxillofacial radiologists with more than 15 years of experience; group B, oral and maxillofacial radiologists with more than three years of experience; group C, non-specialists with less than two years of experience

Table 8
Intra-observer agreement of the observers (kappa values)

References

1. Arenson RL, Chakraborty DP, Seshadri SB, Kundel HL. The digital imaging workstation. Radiology. 1990. 176:303–315.
2. Gotfredsen E, Wenzel A. Integration of multiple direct digital imaging sources in a picture archiving and communication system (PACS). Dentomaxillofac Radiol. 2003. 32:337–342.
3. Pisano ED, Cole EB, Kistner EO, Muller KE, Hemminger BM, Brown ML, et al. Interpretation of digital mammograms: comparison of speed and accuracy of soft-copy versus printed-film display. Radiology. 2002. 223:483–488.
4. Thaete FL, Fuhrman CR, Oliver JH, Britton CA, Campbell WL, Feist JH, et al. Digital radiography and conventional imaging of the chest: a comparison of observer performance. AJR Am J Roentgenol. 1994. 162:575–581.
5. Molander B, Grondahl HG, Ekestubbe A. Quality of film-based and digital panoramic radiography. Dentomaxillofac Radiol. 2004. 33:32–36.
6. Ishigaki T, Endo T, Ikeda M, Kono M, Yoshida S, Ikezoe J, et al. Subtle pulmonary disease: detection with computed radiography versus conventional chest radiography. Radiology. 1996. 201:51–60.
7. Goo JM, Choi JY, Im JG, Lee HJ, Chung MJ, Han D, et al. Effect of monitor luminance and ambient light on observer performance in soft-copy reading of digital chest radiographs. Radiology. 2004. 232:762–766.
8. Otto D, Bernhardt TM, Rapp-Bernhardt U, Ludwig K, Kastner A, Liehr UB, et al. Subtle pulmonary abnormalities: detection on monitors with varying spatial resolutions and maximum luminance levels compared with detection on storage phosphor radiographic hard copies. Radiology. 1998. 207:237–242.
9. Weatherburn GC, Ridout D, Strickland NH, Robins P, Glastonbury CM, Curati W, et al. A comparison of conventional film, CR hard copy and PACS soft copy images of the chest: analyses of ROC curves and inter-observer agreement. Eur J Radiol. 2003. 47:206–214.
10. Slasky BS, Gur D, Good WF, Costa-Greco MA, Harris KM, Cooperstein LA, et al. Receiver operating characteristic analysis of chest image interpretation with conventional, laser-printed, and high-resolution workstation images. Radiology. 1990. 174:775–780.
11. MacMahon H, Vyborny CJ, Metz CE, Doi K, Sabeti V, Solomon SL. Digital radiography of subtle pulmonary abnormalities: an ROC study of the effect of pixel size on observer performance. Radiology. 1986. 158:21–26.
12. Partan G, Mayrhofer R, Urban M, Wassipaul M, Pichler L, Hruby W. Diagnostic performance of liquid crystal and cathode-ray-tube monitors in brain computed tomography. Eur Radiol. 2003. 13:2397–2401.
13. Heo MS, Han DH, An BM, Huh KH, Yi WJ, Lee SS, et al. Effect of ambient light and bit depth of digital radiograph on observer performance in determination of endodontic file positioning. Oral Surg Oral Med Oral Pathol Oral Radiol Endod. 2008. 105:239–244.
14. Heo MS, Choi DH, Benavides E, Huh KH, Yi WJ, Lee SS, et al. Effect of bit depth and kVp of digital radiography for detection of subtle differences. Oral Surg Oral Med Oral Pathol Oral Radiol Endod. 2009. 108:278–283.
15. Hellen-Halme K, Nilsson M, Petersson A. Effect of monitors on approximal caries detection in digital radiographs - standard versus precalibrated DICOM part 14 displays: an in vitro study. Oral Surg Oral Med Oral Pathol Oral Radiol Endod. 2009. 107:716–720.
16. Wenzel A, Haiter-Neto F, Gotfredsen E. Influence of spatial resolution and bit depth on detection of small caries lesions with digital receptors. Oral Surg Oral Med Oral Pathol Oral Radiol Endod. 2007. 103:418–422.
17. Pjetursson BE, Tan WC, Zwahlen M, Lang NP. A systematic review of the success of sinus floor elevation and survival of implants inserted in combination with sinus floor elevation. J Clin Periodontol. 2008. 35:216–240.
18. Tan WC, Lang NP, Zwahlen M, Pjetursson BE. A systematic review of the success of sinus floor elevation and survival of implants inserted in combination with sinus floor elevation. Part II: transalveolar technique. J Clin Periodontol. 2008. 35:241–254.
19. Koymen R, Gocmen-Mas N, Karacayli U, Ortakoglu K, Ozen T, Yazici AC. Anatomic evaluation of maxillary sinus septa: surgery and radiology. Clin Anat. 2009. 22:563–570.
20. Lee SS, Choi SC. Radiographic examination for successful dental implant. Korean J Oral Maxillofac Radiol. 2005. 35:63–68.
21. Hyun YM, Lee SS, Choi SC. Comparison of Waters' radiography, panoramic radiography, and computed tomography in the diagnosis of antral mucosal thickening. J Korean Acad Oral Maxillofac Radiol. 1998. 28:261–269.
22. Lee ES, Park CS. Usefulness of panoramic radiography in the detection of maxillary sinus pathosis. J Korean Acad Oral Maxillofac Radiol. 1999. 29:223–239.
23. Pavlicek W, Owen JM, Peter MB. Active matrix liquid crystal displays for clinical imaging: comparison with cathode ray tube displays. J Digit Imaging. 2000. 13:2 Suppl 1. 155–161.
24. Herron JM, Bender TM, Campbell WL, Sumkin JH, Rockette HE, Gur D. Effects of luminance and resolution on observer performance with chest radiographs. Radiology. 2000. 215:169–174.
25. Potchen EJ, Cooper TG, Sierra AE, Aben GR, Potchen MJ, Potter MG, et al. Measuring performance in chest radiography. Radiology. 2000. 217:456–459.
26. Quekel LG, Kessels AG, Goei R, van Engelshoven JM. Detection of lung cancer on the chest radiograph: a study on observer performance. Eur J Radiol. 2001. 39:111–116.