Image Retake Analysis in Digital Radiography Using DICOM Header Information


J Digit Imaging. 2009 Aug; 22(4): 393–399.

C. Prieto,1 E. Vano,1,2 J. I. Ten,3 J. M. Fernandez,1 A. I. Iñiguez,3 N. Arevalo,3 A. Litcheva,3 E. Crespo,3 A. Floriano,1 and D. Martinez1

Abstract

A methodology to automatically detect potential retakes in digital imaging, using the Digital Imaging and Communications in Medicine (DICOM) header information, is presented. In our hospital, neither the computed radiography workstations nor the picture archiving and communication system itself is designed to support reject analysis. A system called QCOnline, initially developed to help manage images and patient doses in a digital radiology department, has been used to identify images sharing the same patient identification number, modality, description, projection, date, cassette orientation, and image comments. The pilot experience led to repetition rates of 6.6% for abdomen and 1.9% for chest images. A thorough analysis showed that the real repetition rates were 3.3% for abdomen and 0.9% for chest images, the main cause of the discrepancy being wrong image identification. The presented methodology to automatically detect potential retakes in digital imaging using DICOM header information is feasible and allows the detection of deficiencies in department performance such as wrong identifications, positioning errors, incorrect radiographic technique, poor image processing, equipment malfunctions, and artifacts. In addition, automatically collected retake images can be used for the continuous training of staff.

Key words: Diagnostic image quality, Digital Imaging and Communications in Medicine (DICOM), image analysis

Introduction

Reject analysis is described by the Quality Assurance Working Group of the Diagnostic Methods Committee of the British Institute of Radiology as ‘the critical evaluation of radiographs which are used as part of the imaging service but do not play a useful part in the diagnostic process.’1 In the digital context, if we consider ‘images’ instead of ‘radiographs,’ the term refers to those images rejected as diagnostically unacceptable during imaging procedures. The total reject rate is, therefore, the number of images discarded divided by the number of images taken within a specified period of time. It is also possible to consider the retake rate (or repeat rate), defined as the percentage of images that have been retaken due to errors or poor image quality.1 The retake rate does not include images that are diagnostically unacceptable but, for various reasons, are not retaken. On the other hand, some of the images that are later retaken could have acceptable diagnostic quality2 but are discarded when the decision to repeat or accept an image as diagnostic is made exclusively by an inexperienced or insufficiently trained radiographer. Besides, radiographers in most European countries are trained to recognize anatomy, not pathology. Comprehension of terminology is essential if meaningful and accurate results are to be achieved. Reject analysis in digital radiology is a challenge, and very few papers have been published on the subject. Deleting files (images) on a computer is easier than throwing films into the waste bin, and more difficult to audit. Monitoring the rejection rate manually is already difficult with digital systems, and assessing it automatically is even more so. In this work, we have focused on the feasibility of automatically monitoring retake rates in digital radiology departments.

The European Directive on health protection against the dangers of ionizing radiation in relation to medical exposure3 places great emphasis on the justification of radiographic exposures to ensure that patient radiation dose is minimized, and this requires greater consideration of the diagnostic efficacy of an image. Therefore, it is important to perform only clinically diagnostic examinations. Image retake analysis is a method of identifying image faults, sources of error, and, in general, inappropriate practices. It is a key aspect of any quality assurance program,2,4–9 a basic tool to avoid unnecessary doses to patients in radiology departments,10,11 and a way to save time in services already suffering from a substantial scheduling backlog, improving efficiency in the use of X-ray equipment and saving space in the local picture archiving and communication system (PACS). This could eventually result in better patient care.

To assure a high-quality service, service levels need to be measured within the clinical environment, and a potential measurement tool is image retake analysis. Retake analysis in a modern imaging department can be used to identify areas where improvement of service quality and cost effectiveness is needed.

The International Commission on Radiological Protection, in a recent report entitled “Managing Patient Dose in Digital Radiology,”12 highlights the importance of retakes in the quality assurance program.

Neither the computed radiography (CR) workstations nor the PACS itself is, in general, designed to support reject analysis. Here, a methodology to automatically detect potential retakes in digital imaging using the Digital Imaging and Communications in Medicine (DICOM) header information is presented.

Materials and Methods

A system called QCOnline13 was developed as a pilot experience under the European research programs DIMOND and SENTINEL14,15 to help manage images and patient doses in the digital radiology department of the San Carlos University Hospital in Madrid. Currently, many digital radiography systems allow selective acceptance of images, while unwanted ones are permanently deleted. The Quality Assurance Committee of the hospital agreed that all images (whether or not an image was considered diagnostic) would be sent automatically to the central PACS and, in parallel, to a dedicated workstation located at the Medical Physics Service, where QCOnline runs.

DICOM header information is automatically extracted and transferred to a database. Relevant images can also be archived at the workstation for inspection. The initial criterion for identifying potential image retakes was that two or more images share the same patient identification number (ID), modality, description, projection, and date (see Table 1). This criterion was applied to the 3,742 abdomen and 8,636 chest CR images archived in the PACS over 3 months. A sample of the potential image retakes was evaluated by four independent radiologists to validate the selection criterion. A thorough analysis of the supposedly repeated images showed the need for additional criteria. As a result, an improved set of criteria was applied to the 1,893 abdomen and 4,369 chest images archived in the PACS during 1 month. These images were analyzed by four radiologists, and a probable cause for the retake was assigned to each one. The complete set of DICOM header fields used to discriminate potentially repeated images is listed in Table 1, and an example of the fields of a DICOM header is shown in Figure 1.
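As an illustration, the first-assessment grouping key can be sketched in a few lines of Python. This is a minimal sketch, not the authors' implementation: it assumes the pydicom library and standard DICOM keyword names (PatientID, Modality, SeriesDescription, StudyDate, ViewPosition, StudyDescription), whereas the actual tag numbers listed in Table 1 are vendor specific.

```python
# Minimal sketch of the first-assessment criteria (see Table 1).
# Assumes pydicom and standard DICOM keywords; not the authors' code.
from pydicom import dcmread

# Header fields used for the first assessment.
FIRST_ASSESSMENT_FIELDS = (
    "PatientID", "Modality", "SeriesDescription",
    "StudyDate", "ViewPosition", "StudyDescription",
)

def extract_criteria(path):
    """Read one image header and return the tuple of values used to
    group potential retakes (missing fields are returned as None)."""
    ds = dcmread(path, stop_before_pixels=True)  # header only, skip pixel data
    return tuple(getattr(ds, field, None) for field in FIRST_ASSESSMENT_FIELDS)
```

Two or more images sharing the same tuple would then be flagged for review under the initial criterion.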

Figure 1.

Example of a DICOM header with some of the fields used to automatically filter repeated images.

Table 1.

Set of Criteria Applied to Select Potential Image Retakes

Group ID   Element ID   Description

Fields used in the first assessment
0016       0032         Patient ID
0008       0096         Modality
0008       4158         Series description
0008       0032         Study date
0024       2037         View position
0008       5122         Study description

Additional fields taken into account after the first assessment
0018       1402         Cassette orientation
0020       4000         Image comments
0018       1404         Exposures on plate
0019       1015         lgM (exposure level)

After the first assessment, four additional fields from the DICOM header were added.
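To illustrate how the extended criteria could be combined, the sketch below groups images by the shared header fields and then uses the exposures-on-plate number to discard reprocessed copies, which carry the same number as the original exposure. This is a hedged sketch under assumptions: the keyword names (CassetteOrientation, ImageComments, ExposuresOnPlate) follow the standard DICOM dictionary rather than the vendor-specific tags of Table 1, and the private lgM tag is omitted here.

```python
# Minimal sketch of the refined selection; assumes pydicom and standard
# DICOM keywords (the real tag numbers in Table 1 are vendor specific).
from collections import defaultdict
from pydicom import dcmread

GROUPING_FIELDS = (
    "PatientID", "Modality", "SeriesDescription", "StudyDate",
    "ViewPosition", "StudyDescription",
    "CassetteOrientation", "ImageComments",  # added after the first assessment
)

def find_potential_retakes(paths):
    """Group images sharing every criterion, keeping one image per
    exposures-on-plate number so that reprocessed copies (which repeat
    the exposure number of the original) are not counted as retakes."""
    groups = defaultdict(dict)
    for path in paths:
        ds = dcmread(path, stop_before_pixels=True)
        key = tuple(getattr(ds, f, None) for f in GROUPING_FIELDS)
        exposure_no = getattr(ds, "ExposuresOnPlate", None)
        groups[key].setdefault(exposure_no, path)  # first image per exposure wins
    # Two or more distinct exposures with identical criteria: potential retakes.
    return {k: sorted(v.values()) for k, v in groups.items() if len(v) > 1}
```

The groups returned by find_potential_retakes would then be reviewed image by image, as the radiologists did in this study.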

Results

The initial rejection criteria led to a 15.4% retake rate for abdomen and a 4.5% retake rate for chest examinations. A thorough analysis of the supposedly repeated images showed the need for additional criteria, such as the orientation of the cassette (for computed radiography systems) and the number of exposures on the plate. For instance, both cassette orientations (landscape/portrait) are used in abdominal imaging of large patients, where a single orientation cannot cover the area to be explored, but such images were flagged as repetitions by the original automatic selection criteria. Additionally, some images are reprocessed by radiographers or radiologists to improve visualization or to add markers and are then sent again to be archived in the PACS, so the processed image may be considered a retake by the online system. The solution was to detect these images automatically using the number of exposures on the plate, which is also stored in the DICOM header and is the same for original and reprocessed images. Information additional to the series and study descriptions is stored in the “image comments” field of the DICOM header. This field is particularly useful for abdomen images because, in our case, it indicates whether or not the image was taken with the patient standing (bipedestation).

The new set of criteria was applied to the 1,893 abdomen and 4,369 chest images archived in the PACS during 1 month, leading to a sample of 124 abdomen (6.6%) and 85 chest (1.9%) images that could be retakes. The images finally considered real repetitions by the radiologists were 62 abdomen (3.3%) and 38 chest (0.9%) images. Other causes of repetition, such as unnecessary images taken on different days, images repeated under a different identification, or repetitions due to a wrong plate orientation (landscape/portrait) in the first exposure, are not detected by this automatic filtering of images.

The causes of rejection were grouped into five categories:

  1. Artifacts—jewelry, belts, or other radiopaque objects remaining in the path of the beam, as well as scratches or dirt on the plate or cassette. This group also includes double exposures of the same or different patients on a single plate.

  2. Mispositioning—off-centered images, wrong collimation, or a mispositioned cassette resulting in a cut-off image.

  3. Movement—patient movement or motion of the area being imaged (incorrect breath holding).

  4. Bad technique—inadequate inspiration or exposure (over/under). An inappropriate exposure may be caused by the selection of incorrect exposure parameters or by wrong positioning of the patient relative to the automatic exposure chambers. What would have produced a film that was too dark or too pale in conventional radiography can be partly corrected by the wide dynamic range of CR plates and by post-processing. Nevertheless, in some cases these corrections are not effective, especially if the radiographic technique used is very far from the optimum. An acceptable range of values can be established for the parameter “lgM” (the logarithm of the median of the pixel value histogram) to detect this automatically; therefore, we speak of high/low lgM images instead of too dark or too pale images (a minimal sketch of such a check follows this list).

  5. Processing—images corrupted by malfunctions in the digitization or archiving process (or during network transmission).
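As mentioned in item 4, an lgM-based exposure check could look like the following sketch. Both the private tag location (0019,1015), taken from Table 1, and the acceptance window are illustrative assumptions; the paper does not give numeric limits.

```python
# Minimal sketch of an automatic high/low lgM flag; the tag location
# follows Table 1 and the acceptance window is a hypothetical example.
from pydicom import dcmread

LGM_TAG = (0x0019, 0x1015)   # private tag per Table 1 (vendor specific)
LGM_RANGE = (1.8, 2.4)       # hypothetical acceptance window, not from the paper

def exposure_flag(path):
    """Return 'low', 'high', or 'ok' for the image's lgM value,
    mirroring the high/low lgM wording used in the text."""
    ds = dcmread(path, stop_before_pixels=True)
    elem = ds.get(LGM_TAG)
    if elem is None:
        return "unknown"     # tag absent: cannot assess exposure automatically
    lgm = float(elem.value)
    if lgm < LGM_RANGE[0]:
        return "low"         # under-exposed (a pale film in conventional terms)
    if lgm > LGM_RANGE[1]:
        return "high"        # over-exposed (a dark film in conventional terms)
    return "ok"
```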

Figure 2 shows some examples of retaken images.

Figure 2.

Examples of retaken images: artifacts (zipper and hand on the abdomen), mispositioning, movement, bad technique (low exposure), and a processing error.

About 50% of the chest and 65% of the abdomen retakes were due to mispositioning of the patient (see Fig. 3). Bad technique was the second main cause of retakes. In some cases, more than one cause of rejection was found.

Figure 3.

Causes of rejection of repeated images (3.3% of the abdomen and 0.9% of the chest images in the sample: 1,893 abdomen and 4,369 chest images archived in the PACS during 1 month).

The image-by-image analysis by the radiologists showed that 50% of the abdomen and 45% of the chest images flagged as repetitions by the automatic filtering were not real retakes. The automatic software selected them as repetitions for the following reasons:

  1. Wrong identification. The image was not a repetition but carried a wrong identification (patient ID, study, or projection). This happened in roughly 30% of the chest and 80% of the abdomen images in this group (see Fig. 4).

    Figure 4.

    Reasons for the automatic filtering wrongly identifying a correct image as a retake (3.3% of the abdomen and 1.1% of the chest images in the sample: 1,893 abdomen and 4,369 chest images archived in the PACS during 1 month).

  2. Chest images taken in inspiration and exhalation.

  3. Patients too large to be fully imaged on a single plate, so that two exposures were needed.

  4. Plate not exposed or exposed without the patient.

Corrective actions to avoid these errors will be necessary in the future to improve the accuracy of the automatic retake analysis. Upcoming training programs for technologists in our hospital will emphasize the importance of adequate identification and ways to avoid the most common mispositioning errors.

Discussion

This work shows that a methodology to automatically detect potential retakes in digital imaging using DICOM header information is feasible. This pilot study can easily be extended to other modalities, such as flat-panel detector projection imaging. Besides, other approaches to exploiting the data are possible with this methodology, such as studying the variation in the number of retakes across weekend, evening, and day shifts. Rejection rate analysis could also be done if a tag were filled in on every rejected image, but this implies the thorough collaboration of all radiologists and radiographers (not easy in very busy departments). The ‘natural’ tendency to mask poor-quality practice, or a lack of consistency among staff in following the procedure to mark or classify rejected images, may easily lead to underreporting or misreporting of rejected images. We have found a simpler, independent, and feasible way to obtain retake rates automatically.

The incorporation of effective assessment strategies by which the level of service delivery can be measured is essential to evidence the quality of a service. Only when service levels are measured, analyzed, and remeasured after an intervention can we estimate the quality value of the intervention. One such measurement tool is image retake analysis, which facilitates a comparative measurement of aspects of service quality, providing evidence to direct staff training and education, as well as alterations to work practice patterns that may be appropriate to improve service quality and facilitate cost-effective practice. Automatic algorithms can help in this task. The presented system allows the detection of deficiencies such as wrong identifications, positioning errors, incorrect radiographic technique, poor image processing, equipment malfunctions, and artifacts. In addition, automatically collected retake images can be used for continuous staff training which, as our case shows, should focus on the importance of adequate identification and ways to avoid the most common mispositioning errors.

From our results, it seems that correct identification of images (especially in the abdomen, where lateral projections are very often not properly identified) and, in the case of chest images, a comment indicating whether the image was taken in inspiration or in exhalation could make the automatic filtering of DICOM header information more useful for image retake analysis.

Other studies10,16,17 have reported that most film-screen retakes were due to exposure and processing issues, while most digital retakes were due to mispositioning of the patient, as in our case. The causes of repetition and their relative rates were similar for abdomen and chest images.

Based on the repetition rate, performance targets can be set, such as a certain reduction in the repetition rate or a reduction in the number of images wrongly flagged as repetitions by the automatic software.

The appropriateness of examinations has not been addressed yet, but the collaboration of radiologists in tagging images that they judge “not indicated” could be a first step toward assessing the justification of some radiographic exposures. Vendors could also help in future versions of the PACS by incorporating user-friendly software to automatically store and classify rejected images.

Conclusions

The presented methodology to automatically detect potential retakes in digital imaging using DICOM header information is feasible and allows the detection of deficiencies in department performance such as wrong identifications, positioning errors, incorrect radiographic technique, poor image processing, equipment malfunctions, and artifacts. In addition, automatically collected retake images can be used for the continuous training of staff.

Acknowledgment

This study was partially funded under the European Commission Coordination Action SENTINEL (FI6R-012909) and by the Spanish grant FIS2006-08186 (Ministry of Education and Science).

References

1. Quality Assurance in the Diagnostic X-ray Department. London: British Institute of Radiology; 1998.

2. Dunn MA, Roger AT. X-ray film reject analysis as a quality indicator. Radiography. 1998;4:29–31. doi: 10.1016/S1078-8174(98)80027-8.

3. Council Directive on Health Protection of Individuals Against Dangers of Ionising Radiation in Relation to Medical Exposures, and Replacing Directive 84/466/EURATOM, 97/43 Euratom. Brussels: European Commission; 1997.

4. Boone JM, Cody DD, Fisher JR, et al. Quality Control in Diagnostic Radiology. Diagnostic X-ray Imaging Committee Task Group 12, AAPM Report No. 74. American Association of Physicists in Medicine. Madison, WI: Medical Physics Publishing; 2002.

5. Hardy M, Persaud A. The challenge of governance: achieving quality in diagnostic imaging. Radiography. 2001;7:159–163. doi: 10.1053/radi.2001.0322.

6. Arvanitis TN, Parizel PM, Degryse HR, Schepper AMA. Reject analysis: a pilot programme for image quality management. Eur J Radiol. 1991;12(3):171–176. doi: 10.1016/0720-048X(91)90067-6.

7. Gadeholt G, Geitung JT, Göthlin JH, Asp T. Continuing reject-repeat film analysis program. Eur J Radiol. 1989;9(3):137–141.

8. Pitcher EM, Wells PN. Quality assurance and radiologic audit. Curr Opin Radiol. 1992;4(3):9–14.

9. Freedman M, Steller D, Jafroudi H, Mun SK. Quality control of storage phosphor digital radiography systems. J Digit Imaging. 1995;8(2):67–74. doi: 10.1007/BF03168129.

10. Honea R, Blado ME, Ma Y. Is reject analysis necessary after converting to computed radiography? J Digit Imaging. 2002;15(Suppl 1):41–52. doi: 10.1007/s10278-002-5028-7.

11. Nol J, Isouard G, Mirecki J. Digital repeat analysis: setup and operation. J Digit Imaging. 2006;19(2):159–166. doi: 10.1007/s10278-005-8733-1.

12. International Commission on Radiological Protection. Managing patient dose in digital radiology. ICRP Publication 93. Ann ICRP. 2004;34(1):1–73.

13. Vano E, Fernandez JM, Ten JI, et al. Dosimetry and image quality in digital radiology from online audit of the X-ray system. Radiat Prot Dosim. 2005;117(1–3):199–203.

14. Faulkner K. The DIMOND project and its impact on radiation protection. Radiat Prot Dosim. 2005;117(1–3):3–6.

15. SENTINEL: Safety and Efficacy for New Techniques and Imaging Using New Equipment to Support European Legislation. European Coordination Action (2005–2007). http://www.sentinel.eu.com/Documents/Project+Presentation.pdf. Accessed 7 April 2007.

16. Peer S, Peer R, et al. Comparative reject analysis in conventional film-screen and digital storage phosphor radiography. Eur Radiol. 1999;9(8):1693–1696. doi: 10.1007/s003300050911.

17. Peer S, Peer R, et al. Comparative reject analysis in conventional film-screen and digital storage phosphor radiography. Radiat Prot Dosim. 2001;94(1–2):69–71.

