U.S. Department of Health and Human Services


MQSA 2003 Report to Congress - Full Report

PERFORMANCE EVALUATION OF ACCREDITATION BODIES UNDER THE MAMMOGRAPHY QUALITY STANDARDS ACT OF 1992 as amended by the MAMMOGRAPHY QUALITY STANDARDS REAUTHORIZATION ACT OF 1998

January 1, 2003 through December 31, 2003

A Report to Congress

Purpose

The Mammography Quality Standards Act (MQSA) of 1992 (P.L. 102-539), as amended by the Mammography Quality Standards Reauthorization Act (MQSRA) of 1998 (P.L. 105-248), establishes standards for high quality mammography and requires all facilities to be accredited by a Food and Drug Administration (FDA) approved accreditation body (AB) to demonstrate that they meet these standards. FDA may approve either private nonprofit organizations or state agencies to serve as ABs. The MQSA also requires FDA to submit an annual performance evaluation of the approved ABs to the Senate Committee on Health, Education, Labor, and Pensions and the House Committee on Energy and Commerce under 42 USC 263b(e)(6).

This report covers the performance of ABs under the MQSA from January 1, 2003, through December 31, 2003. During the reporting period, there were five ABs: the American College of Radiology (ACR), a private nonprofit organization, and the state ABs of Arkansas (SAR), California (SCA), Iowa (SIA), and Texas (STX).

Status of Accreditation Body Approvals

FDA approved the ACR, the SAR, the SIA, and the STX as ABs under the MQSRA of 1998 and the final regulations. FDA approved the SCA under the interim regulations. While FDA was reviewing the SCA’s application under the MQSRA and the final regulations, the SCA withdrew its application for accreditation body status (on May 5, 2004). Through this withdrawal, the SCA relinquished its authority and responsibilities under the MQSRA. As a result, SCA-accredited facilities are in the process of transitioning their accreditation to the ACR.

Standards

MQSA requires that each AB develop (or adopt by reference) standards that are substantially the same as the quality standards established by FDA under subsection (f) of the Act to assure the safety and accuracy of mammography. Regarding state laws, nothing in the Act limits the authority of any state to enact and enforce laws about matters covered by the Act that are at least as stringent as the Act or the standards promulgated under the Act.

All ABs have either adopted the final MQSA standards by reference, or have developed standards that are substantially the same as the quality standards established by FDA. Each AB incorporated the standards into its own accreditation processes.

Methodology

FDA evaluates its ABs through: (1) examination of responses to questionnaires developed by FDA addressing performance indicators, (2) analysis of quantitative accreditation and inspection information, (3) review of selected files (including clinical and phantom images), (4) interviews with AB staff and management to answer questions or clarify issues, and (5) onsite visits. FDA uses the following eight performance indicators (as outlined in the final MQSA regulations) to assess performance: administrative resources, data management, reporting and record keeping processes, accreditation review and decision-making processes, AB onsite visits to facilities, random clinical image reviews, additional mammography reviews, and accreditation revocations and suspensions.

FDA staff analyzes unit accreditation pass and fail data along with data that describe the reasons for each AB failure decision. Significant differences in pass and fail rates or reasons for accreditation denial among ABs could, for example, indicate that one AB is interpreting the significance of a particular quality standard more or less strictly than another.

To complement the information submitted by the ABs, FDA analyzes information from its Mammography Program Reporting and Information System (MPRIS) database of annual facility inspections. Accredited facility performance during inspections is measured by average phantom image scores, average radiation dose values, and average processor speeds. Collectively, these measures reflect the overall functioning of all components of the mammography system.

Performance Indicators

(1) Administrative Resources and Funding

AB staffs generally include management, mammography radiologic technologists, MQSA inspectors, health physicists, information technology program application specialists, and administrative assistants. In 2003, all ABs continued to maintain adequate funding for their respective programs.

(2) Data Management (Process/Errors)

All ABs provide the FDA with electronic transmissions of accreditation data in a secure and appropriately maintained manner. The majority of the ABs reduced their percentage of data management errors from those noted in the previous year. Nevertheless, FDA continues to work individually with ABs to (a) further minimize the number of data errors, (b) emphasize the importance of routinely performing quality assurance and quality control practices to correct errors before transmitting the data, and (c) provide reports that outline errors and the frequency with which they occur.

(3) Reporting and Recordkeeping

FDA’s review of the ABs’ reporting and recordkeeping practices includes examining procedures for handling serious consumer complaints and appeals for accreditation decisions.

(a) Serious Consumer Complaints

MQSA requires ABs to develop and administer a consumer complaint mechanism whereby all facilities that an AB accredits must file serious unresolved complaints with their AB. By regulation, each AB must submit to the agency an annual report summarizing all serious complaints received during the previous calendar year, their resolution status, and any actions taken in response to them.

All ABs have an appropriate serious consumer complaint mechanism in place. Each AB submitted its serious consumer complaint report to FDA for the year 2003, indicating that the ABs follow acceptable procedures when resolving these complaints.

(b) Appeals

Each AB must have an appeals process through which facilities can contest an adverse accreditation decision. In CY 2003, ACR was the only AB that received appeals of its accreditation decisions. Of the five appeals the ACR received, four decisions (80 percent) were overturned and one (20 percent) was upheld.

Because the percentage of appeals in which ACR overturned its original decision had continued to increase, FDA asked ACR to review its clinical image review process and its appeals process. After careful review by ACR’s Committee on Mammography Accreditation, ACR revised its appeals process in January 2004. To better standardize the process, ACR revised its policy so that the senior reviewer arbitrating the appeal determines whether the original reviewers followed ACR protocol when scoring the images and reaching reasonable pass/fail conclusions. According to ACR’s legal counsel, this change will make the appeals process more consistent with processes used by government and other entities. FDA will continue to monitor the appeals process to ensure that ACR protocol is followed in making the original decision.

(4) Accreditation Review and Decision-Making Processes

Review of the ABs’ accreditation and decision-making processes includes evaluating procedures for clinical image review, phantom image review, and mammography equipment evaluation and medical physicist annual survey review.


(a) Clinical Image Review

As part of the accreditation process, mammography facilities must submit clinical images to their ABs for review. To evaluate the ABs’ performance in the clinical image review area, FDA’s MQSA qualified interpreting physicians (IPs) annually review clinical images from a sample of facilities that submit cases to the ABs for clinical image review. Two FDA IPs independently conduct clinical image reviews for each facility in the sample for each of the ABs that perform clinical image review, evaluating each examination on the eight attributes listed in the final regulations using a five-point scale.

The SCA and the STX each have a contract with the ACR to conduct their clinical image reviews. The remaining three ABs have their own clinical image reviewers to evaluate their facilities’ clinical images. A summary of the FDA clinical image reviews follows.

American College of Radiology AB

FDA performed its evaluation of ACR’s clinical image review process on October 10, 2003. FDA found that there was good agreement between the FDA IPs and the ACR clinical image reviewers at the attribute evaluation level with generally no more than one point variation identified between reviewers. In reviewing the exams and summary evaluation forms, FDA reviewers agreed with the final overall assessments (pass and fail) in all the cases. FDA determined that this spot review of cases indicates that the quality of clinical image review by the ACR remains high and has not deviated from past performance.

State of Arkansas AB

FDA performed its evaluation of SAR’s clinical image review process on October 21 and 28, 2003. In reviewing the exams, FDA reviewers agreed with the final overall assessments (pass and fail) in all the cases. FDA’s IPs indicated that the quality of clinical image review performed by the SAR remains high and has not deviated from past performance. FDA commended SAR AB for specifically asking its reviewers to state (in cases of failure) whether the images were of diagnostic quality and if an additional mammography review (AMR) should be considered.

State of Iowa AB

On October 2 and 8, 2003, FDA performed its evaluation of SIA’s clinical image review process. The FDA IPs found consistent agreement among the SIA reviewers and agreed with the SIA reviewers’ final overall assessments (pass/fail) in all the cases reviewed. The review indicated that the quality of clinical image review performed by the SIA AB remains high and has not deviated from past performance. FDA recommended, however, that while the AB’s “Radiologist Reviewer Evaluation Form” provides space for reviewers’ comments, the form should specifically ask reviewers to state whether the images are of such poor quality that an AMR for the facility should be considered.

Summary of Audits and Training of Clinical Image Reviewers by ABs

Clinical image review quality control activities that promote consistency among the various clinical image reviewers exist at the ACR (and STX and SCA via ACR contract), the SAR, and the SIA. Each of these ABs conducts training sessions at which clinical image reviewers evaluate clinical images and discuss findings, including the application of AB clinical image review evaluation criteria. To ensure uniformity and to identify potential problems, each of these ABs analyzes agreement and nonagreement rates for all individual clinical image reviewers, providing each reviewer with the data needed to compare his or her results to those of the rest of the review group.

(b) Phantom Image Review

As part of the accreditation process, mammography facilities must submit phantom images to their ABs for review. To evaluate the ABs’ performance in the phantom image review area, FDA’s MQSA expert staff annually reviews phantom images from facilities that submit cases to the ABs for phantom image review. Two FDA staff, working independently, review 10 randomly selected phantom images from each of the ABs that perform phantom image review. FDA evaluates all test objects (fibers, specks, masses) on these images as part of the review. Scores for these test objects should fall within the acceptable limit of plus or minus 0.5.
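The ±0.5 agreement criterion described above can be illustrated with a short sketch. The function name and sample scores below are hypothetical; only the 0.5 tolerance and the characterization of higher AB scores as less stringent come from this report.

```python
def compare_scores(fda_score: float, ab_score: float, tol: float = 0.5) -> str:
    """Classify an AB reviewer's test-object score (fibers, specks, or
    masses) against the FDA reviewers' score. Differences within the
    +/-0.5 tolerance count as agreement; an AB score more than 0.5 above
    the FDA score suggests less stringent scoring, and more than 0.5
    below suggests more stringent scoring."""
    diff = ab_score - fda_score
    if abs(diff) <= tol:
        return "agreement"
    return "less stringent" if diff > 0 else "more stringent"
```

This mirrors how the AB-specific findings are characterized: divergences above the FDA scores were flagged for follow-up, while more stringent scoring was considered acceptable.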

The STX has a contract with the ACR to conduct its phantom image reviews. The remaining four ABs have their own phantom image reviewers to evaluate their facilities’ phantom images. A summary of the phantom image reviews follows.

American College of Radiology AB

FDA reviewed ACR’s phantom images on October 10, 2003. Most of the test object scores of the FDA reviewers were within the generally accepted range of the scores of the ACR reviewers. For three of the phantom images, the ACR reviewers’ fiber scores diverged by more than +0.5 from the FDA reviewers’ scores. Since the ACR reviewers were less stringent in their scoring, FDA recommended that the ACR reviewers revisit the scoring criteria in these three cases.

State of Arkansas AB

FDA reviewed SAR’s phantom images in November 2003. Most of the test object scores of the FDA reviewers were within the generally accepted range of the scores of the SAR reviewers. For three of the phantom images, the mean scores between the FDA reviewers and the SAR reviewers differed by more than +/- 0.5. Since the SAR reviewers were less stringent in their scoring for two of the phantoms, FDA recommended that the SAR reviewers revisit the scoring criteria in these cases.

State of California AB

FDA completed its review of SCA’s phantom images in November 2003. The phantoms submitted to FDA were scored by only one SCA reviewer. Some of the test object scores of the FDA reviewers were within the generally accepted range of the scores of the SCA reviewer. For six of the phantoms, the SCA reviewer’s scores for either the fibers or the speck groups diverged by more than 0.5 from the FDA reviewers’ scores. FDA believes that a second reviewer (with a tie-breaker as needed) would ensure a more balanced phantom image review for the accreditation process.

During FDA’s oversight visit on February 10, 2004, FDA found that the SCA AB had failed to implement, by the due date of April 1, 2003, its FDA-approved policy for the review of phantom images. The policy requires that all phantom images be reviewed by two reviewers, with a third reviewer serving as a tie-breaker when needed. Because SCA failed to implement this policy, the issue was included as an action item in its 2003 Performance Evaluation.

State of Iowa AB

FDA reviewed SIA’s phantom images in November 2003. Most of the test object scores of the FDA reviewers were within the generally accepted range of the scores of the SIA reviewers. There were three phantom images where the SIA reviewers’ scores diverged by more than -0.5 from the FDA reviewers’ scores. FDA believes these scores are acceptable since the SIA reviewers were more stringent in their scoring.

Summary of Audits and Training of Phantom Image Reviewers by ABs

An audit of phantom image reviewers ensures uniformity, identifies any potential problems, and provides each phantom image reviewer with the data needed to compare his or her results to those of the rest of the review group. A summary of this activity and reviewer training for the ABs follows.

Audits

Audit results are used to enhance reviewer training by emphasizing any performance issues. The ACR (and STX via ACR contract), the SAR, and the SIA conducted audits of their phantom image reviewers to collect statistics on reviewer agreement and nonagreement rates. In 2003, SCA was unable to conduct such an audit because, as discussed in the phantom image review section, it used only a single reviewer, contrary to its FDA-approved policy requiring two reviewers with a third as a tie-breaker; this failure was included as an action item in its 2003 Performance Evaluation.

Training

ACR, SAR, and SIA conducted training sessions for their phantom image reviewers in CY 2003. The SCA AB did not conduct training for its phantom image reviewers in 2003. Therefore, FDA included this as an action item in SCA’s 2003 Performance Evaluation.

(c) Mammography Equipment Evaluation (MEE) and Medical Physicist Survey Report Reviews

The final regulations state that ABs shall require every facility applying for accreditation to submit an MEE with its initial accreditation application and, prior to accreditation, to submit a medical physicist survey on each mammography unit at the facility (21 CFR §900.4(e)(i)). All of the ABs have policies and procedures established for the review of both the MEE and the medical physicist survey report.

(5) AB Onsite Visits to Facilities

The final MQSA regulations (21 CFR §900.4(f)(1)(i)) require that each AB annually conduct onsite visits to at least five percent of the facilities the body accredits to monitor and assess facility compliance with the standards established by the body for accreditation. However, a minimum of five facilities shall be visited, and visits to no more than 50 facilities are required. During such visits, the AB is required to evaluate eight core elements: (a) assessment of quality assurance activities; (b) review of mammography reporting procedures; (c) clinical image review; (d) review of the medical audit system; (e) verification of personnel duties; (f) equipment verification; (g) verification of the consumer complaint mechanism; and (h) other identified concerns.

At least 50 percent of the facilities visited shall be selected randomly and the other facilities visited shall be selected based on problems identified through state or FDA inspections, serious complaints received from consumers or others, a previous history of noncompliance, or other information in the possession of the AB, MQSA inspectors, or FDA (i.e., visits for cause).
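The visit quota described above reduces to a simple clamped calculation. The sketch below is a minimal illustration; the function name and example facility counts are hypothetical, and rounding the five percent figure up is an assumption.

```python
def required_onsite_visits(facilities_accredited: int) -> int:
    """Annual onsite-visit quota under the rule summarized above:
    5 percent of accredited facilities, clamped to a minimum of 5
    and a maximum of 50 visits. Rounding up is an assumption."""
    # ceil(n * 0.05) computed in exact integer math as ceil(n / 20)
    five_percent = -(-facilities_accredited // 20)
    return max(5, min(50, five_percent))
```

For example, a hypothetical AB accrediting 480 facilities would owe 24 visits, while very small or very large ABs hit the 5-visit floor or the 50-visit cap.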

American College of Radiology AB

Based on the number of facilities ACR accredits, it is required to conduct onsite visits to 50 facilities. During CY 2003, ACR fulfilled its AB onsite visit obligation by completing 50 onsite visits (43 random, seven for cause).

State of Arkansas AB

SAR conducted five onsite visits (five random, none for cause) in CY 2003, thus meeting the minimum of five onsite visits required by regulation.

State of California AB

Based on the number of facilities SCA accredited, it was required to conduct onsite visits to 24 facilities. In CY 2003, SCA fulfilled its AB onsite visit obligation by completing 24 onsite visits (22 random, two for cause).

State of Iowa AB

SIA conducted 27 onsite visits (27 random, none for cause) in CY 2003, thus exceeding the minimum of seven onsite visits required by regulation.

State of Texas AB

STX conducted 10 onsite visits (six random, four for cause) in CY 2003, thus exceeding the minimum of eight onsite visits required by regulation.

(6) Random Clinical Image Review

The final MQSA regulations (21 CFR §900.4(f)(2)(i)) require that each AB annually conduct random clinical image reviews (RCIRs) of at least three percent of the facilities the body accredits to monitor and assess facility compliance with the standards established by the body for accreditation.

American College of Radiology AB

During CY 2003, ACR conducted 309 RCIRs, thereby exceeding the 250 required by regulation.

State of Arkansas AB

SAR conducted 12 RCIRs in CY 2003, thus exceeding the minimum of three required by regulation.

State of California AB

In CY 2003, SCA conducted 22 RCIRs, thereby exceeding the 14 required by regulation.

State of Iowa AB

The SIA conducted 66 RCIRs in CY 2003, thus exceeding the minimum of five required by regulation.

State of Texas AB

STX conducted six RCIRs in CY 2003, slightly higher than the five RCIRs required by regulation.

(7) Additional Mammography Review

If FDA believes that mammography quality at a facility has been compromised and may present a serious risk to human health, the facility must provide clinical images and other relevant information, as specified by FDA, for review by its AB (21 CFR §900.12(j)). This additional mammography review (AMR) helps the agency determine whether there is a need to notify affected patients, their physicians, or the public that the quality of mammograms may have been compromised. A request for an AMR may also be initiated by an AB or a State Certifying Agency (SAC). When an AB initiates an AMR, FDA encourages it to discuss the case with the agency before implementing the AMR.

The following chart summarizes the number of AMRs conducted by each AB during CY 2003:

AB     Number of AMRs            Number With Deficiency    Number That Completed Corrective
       Conducted or Initiated*   or Serious Risk           Action and/or Notification
ACR    37                        8                         8
SAR    1                         1                         1
SCA    3                         0                         0
SIA    1                         1                         1
STX    1                         1                         1

*Note: The SCA and the STX each have a contract with the ACR to conduct their clinical image reviews during an AMR. The remaining three ABs have their own clinical image reviewers to evaluate their facilities’ clinical images.

(8) Accreditation Revocation and Suspension

The MQSA final regulations (21 CFR §900.3(b)(3)(iii)(I)) require that each AB have policies and procedures for suspending or revoking a facility’s accreditation. If a facility cannot correct deficiencies to ensure compliance with the standards or if a facility is unwilling to take corrective actions, the AB shall immediately notify the FDA, and shall suspend or revoke the facility’s accreditation.

State of Arkansas AB, State of California AB, and State of Iowa AB

None of the SAR, the SCA, or the SIA ABs revoked or suspended any facility’s accreditation in 2003.

American College of Radiology AB
ACR revoked the accreditation of one facility during 2003. It issued the facility a letter of revocation following an adverse outcome in an FDA-requested AMR that found the facility posed a serious risk to human health. After the facility took appropriate corrective action, ACR subsequently reinstated the facility’s accreditation.

State of Texas AB
STX suspended the accreditation of one facility during 2003. It issued the facility a letter of suspension following an onsite visit that uncovered serious deficiencies and MQSA violations in the facility’s quality control program. After the facility took appropriate corrective action, STX lifted the facility’s accreditation suspension.

(9) Quantitative Accreditation and Inspection Information

As additional performance indicators, FDA analyzed quantitative accreditation and inspection information related to (a) unit accreditation pass/fail data, (b) reasons for denial of accreditation, and (c) accredited facility performance during inspections. Note: There are a relatively small number of state-accredited facilities compared to ACR-accredited facilities. Therefore, small variations in state-accredited facility performance may lead to differences across accreditation bodies that do not reflect actual differences in accreditation body performance.

(a) Unit Accreditation Pass/Fail Data Sorted by AB

Number of Units          ACR              SAR          SCA           SIA          STX
Total                    5,207            30           294           91           104
Passed Accreditation     5,201 (99.9%)    30 (100%)    294 (100%)    91 (100%)    104 (100%)
Failed Accreditation*    6 (0.1%)         0            0             0            0

*Units that were still denied accreditation as of December 31, 2003.

At the conclusion of the reporting period, the accreditation pass rate of mammography units across the accreditation bodies ranged from 99.9 to 100 percent. The rates for units that failed accreditation decreased from those in the last reporting period. The unit fail rate usually reflects a facility’s second and third attempts at unit accreditation; most facilities whose units fail on the first attempt take corrective action and subsequently pass.

(b) Reasons for Mammography Unit Denial

Denial of unit accreditation was solely due to clinical image review failure. Most of the facilities that receive a denial in the accreditation process complete rigorous corrective action plans under the ABs’ reinstatement protocols and eventually successfully achieve the levels of quality needed for accreditation.

(c) Facility Performance During Inspections Sorted by AB

In CY 2003, nearly two-thirds (65.5 percent) of the accredited mammography facilities had no MQSA violations, while only 2.0 percent had a violation characterized as “most serious.” FDA actively works with these facilities on corrective measures, or takes regulatory measures if a facility cannot improve its performance.

                               ACR      SAR      SCA      SIA      STX
Average Phantom Image Score*   12.3     12.3     12.6     11.2     12.9
Average Dose (in millirads)    178.1    179.1    174.4    158.0    177.6
Average Processor Speed        107.6    113.2    111.9    102.1    113.5

*The maximum possible phantom image score is 16. Four fibers, three masses, and three speck groups must be visible on the image for a minimum passing score.
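The minimum passing criterion in the footnote can be expressed as a small check. The function name is hypothetical; the visibility thresholds and the 16-point maximum come from this report.

```python
def phantom_image_passes(fibers: int, masses: int, speck_groups: int) -> bool:
    """Minimum passing criterion as stated above: at least four fibers,
    three masses, and three speck groups visible on the phantom image
    (out of a maximum possible score of 16)."""
    return fibers >= 4 and masses >= 3 and speck_groups >= 3
```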

There were no significant differences in average phantom image scores among the facilities accredited by the five ABs. In general, average phantom image scores stayed about the same as those reported in the 2002 Report.

In general, the average doses increased slightly from those reported in the 2002 Report but remain well below the 300-millirad dose limit mandated by the MQSA final regulations. This limit permits flexibility in optimizing the technique factors used during examinations to improve image quality.

The average processing speeds among the facilities of all the ABs remained within the range that produces satisfactory clinical images and increased slightly from the speeds reported in the 2002 Report. Evaluating a mammography facility’s film processing speed is an important quality assurance measure because processing speed directly affects not only the image quality of the mammogram but also the dose administered to the patient. If a facility processes film in accordance with the film manufacturer’s recommendations, the processing speed should be close to 100 (80-120 is considered normal). If the processing speed falls significantly below this range, the clinical image is not completely developed, appears too light, and the quality of the mammographic image can be significantly compromised. Moreover, the facility may not realize its film processor is the source of the problem and may compensate by increasing the dose administered to the patient.
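The speed ranges described above can be sketched as a simple classifier. The function name and labels are hypothetical; the 80-120 normal range and the consequences of slow processing come from this report.

```python
def processor_speed_status(speed: float) -> str:
    """Classify film processor speed per the ranges described above:
    a speed near 100 indicates processing in line with the film
    manufacturer's recommendations, and 80-120 is considered normal.
    Significantly low speeds leave films underdeveloped and too light."""
    if 80 <= speed <= 120:
        return "normal"
    if speed < 80:
        return "slow: films underdeveloped, image quality compromised"
    return "fast: outside normal range"
```

All of the CY 2003 AB averages shown above (102.1 through 113.5) fall within the normal range.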

Status of the Action Items From the 2002 Report to Congress

In almost all instances, the ABs successfully completed their CY 2002 action items. In those few instances where they did not, FDA is actively working with each AB to ensure that it successfully completes the requirements of each action item.

Conclusion

FDA’s AB oversight program promotes collaboration and cooperation. Therefore, each AB, in concert with FDA, is currently addressing all action items cited in this report. FDA and the ABs, working in partnership with the certified mammography facilities in the United States as well as the states participating in inspection and other MQSA activities, are ensuring quality mammography across the nation.