MQSA 2001 Report to Congress - Full Report

PERFORMANCE EVALUATION OF ACCREDITATION BODIES UNDER THE MAMMOGRAPHY QUALITY STANDARDS ACT OF 1992 as amended by the MAMMOGRAPHY QUALITY STANDARDS REAUTHORIZATION ACT OF 1998


January 1, 2001 through December 31, 2001

A Report to Congress

Purpose

The Mammography Quality Standards Act (MQSA) of 1992 (P.L. 102-539), as amended by the Mammography Quality Standards Reauthorization Act (MQSRA) of 1998 (P.L. 105-248), establishes standards for high quality mammography and requires all facilities to be accredited by a Food and Drug Administration (FDA) approved accreditation body (AB) to demonstrate that they meet these standards. FDA may approve either private nonprofit organizations or state agencies to serve as ABs. The MQSA also requires the submission of an annual performance evaluation of the approved ABs to the Senate Committee on Health, Education, Labor, and Pensions and the House Committee on Energy and Commerce under 42 USC 263b(e)(6).

Currently, there are five ABs: the American College of Radiology (ACR), a private nonprofit organization, and the state ABs of Arkansas (SAR), California (SCA), Iowa (SIA), and Texas (STX). This report covers the performance of ABs under the MQSA from January 1, 2001 through December 31, 2001.

Status of Accreditation Body Approvals

FDA approved the ACR, the SAR, the SIA, and the STX as ABs under the MQSRA of 1998 and the final regulations. The SCA applied for renewal; however, its application for approval is pending until the State's mammography standards are signed and in effect. FDA approved the SCA's draft standards, which are moving through the State's legislative process.

Standards

MQSA requires that each AB develop (or adopt by reference) standards that are substantially the same as the quality standards established by FDA under subsection (f) of the Act to assure the safety and accuracy of mammography. Regarding state laws, nothing in the Act limits the authority of any state to enact and enforce laws about matters covered by the Act that are at least as stringent as the Act or the standards promulgated under the Act.

American College of Radiology, State of Arkansas AB, and State of Iowa AB

The ACR, SAR, and the SIA adopted the final MQSA standards by reference, incorporating them into their own standards and accreditation processes.

State of California AB

On October 2, 2001, FDA gave preliminary acceptance to the draft standards for the SCA's AB, allowing the SCA to proceed with its plan to request that the State's Emergency Process be used to publish these standards, a process over which the AB itself has no influence. As of September 1, 2002, the SCA's standards remained unpublished. Once the SCA's standards are published, FDA expects to grant approval to the SCA's renewal application.

State of Texas AB

FDA determined that the STX mammography standards, at the time of its approval as an AB, were substantially equivalent to the MQSA final standards. However, during CY 2001, STX drafted amendments to these previously approved standards for which it sought FDA approval under 21 CFR section 900.4(a)(8). FDA reviewed these draft amendments and identified problems with the revisions and with the existing standards that FDA had not previously noted.

FDA initially had some concerns about the STX AB’s requirements for technologists.
In response, STX officials confirmed that STX enforces the MQSA technologist standards. STX also agreed to prepare and distribute information to its facilities and technologists to clarify that the more stringent MQSA standards would be required and enforced in Texas.

STX did submit a copy of a revised Personnel Checklist reflecting the additional requirements for technologists under MQSA, along with a chart, "Acceptable Documents for Radiologic Technologists in Texas," that appears to outline the necessary MQSA requirements for technologists.

Methodology

FDA evaluates its ABs through: (1) examination of responses to questionnaires developed by the FDA addressing performance indicators, (2) analysis of quantitative accreditation and inspection information, (3) review of selected files (including clinical and phantom images), (4) interviews with AB staff and management to answer questions or clarify issues, and (5) onsite visits. FDA uses the following performance indicators (as outlined in the final MQSA regulations) to assess performance: administrative resources, data management, reporting and record keeping processes, accreditation review and decision-making processes, AB onsite visits to facilities, random clinical image reviews, additional mammography reviews, and accreditation revocations and suspensions.

FDA places heavy emphasis on the ABs' methods of evaluating clinical and phantom images because these image evaluations are critical components of the ABs' responsibilities. FDA's staff of qualified interpreting physicians (IPs) annually reviews the ABs' clinical image review procedures. To compare their own assessments of these mammograms with those of the ABs' clinical image reviewers, the FDA IPs evaluate mammograms from facilities accredited by the ABs. Also, FDA's expert staff annually evaluates phantom images from facilities accredited by the ABs and compares its own assessments of these phantom images with those of the ABs' phantom image reviewers.

FDA staff analyzes unit accreditation pass and fail data and data describing reasons for failure from each AB. These indicators reflect consistency or inconsistency in how ABs apply accreditation standards. Significant differences in pass and fail rates or reasons for accreditation denial among ABs could, for example, indicate that one AB is interpreting the significance of a particular quality control standard more or less strictly than another.

To complement the information submitted by ABs, FDA analyzes information from its Mammography Program Reporting and Information System (MPRIS) database of annual facility inspections. Accredited facility performance during inspections is measured by average phantom image scores, average radiation dose values, and average processor speeds. Collectively, these measures reflect the overall functioning of all components of the mammography system.

Performance Indicators

(1) Administrative Resources and Funding

AB staffs generally include management, mammography radiologic technologists, MQSA inspectors, health physicists, information technology program application specialists, and administrative assistants. All ABs continue to maintain adequate funding for their respective programs.

(2) Data Management (Process/Errors)

All ABs provide the FDA with electronic transmissions of accreditation data in a secure and appropriately maintained manner. The majority of the ABs reduced their percentage of data management errors from those noted in the previous year. Nevertheless, FDA continues to work individually with ABs to (a) further minimize the number of data errors, (b) emphasize the importance of routinely performing quality assurance and quality control practices to correct errors before transmitting their data, and (c) provide reports that outline errors and the frequency with which they occur.

(3) Reporting and Recordkeeping

Review of ABs' reporting and recordkeeping practices includes examining procedures for handling serious consumer complaints and appeals of accreditation decisions.

(a) Serious Consumer Complaints

MQSA requires ABs to develop and administer a consumer complaint mechanism whereby all facilities that an AB accredits must file serious unresolved complaints with their AB. By regulation, each AB must submit to the agency an annual report summarizing all serious complaints received during the previous calendar year, their resolution status, and any actions taken in response to them.

All ABs have a serious consumer complaint mechanism in place and submitted their serious consumer complaint report to FDA for the year 2001. SCA’s CY 2001 AB Performance Evaluation includes an action item to enhance its consumer complaint process.

(b) Appeals

Each AB must have an appeals process through which facilities may contest the AB's adverse accreditation decisions. In CY 2001, only the ACR received appeals of its accreditation decisions.

FDA noted in ACR's 1998-99 AB Performance Evaluation that while the percentage of decisions appealed was relatively small, the ACR overturned almost half of its original decisions during the appeals process. Therefore, FDA recommended that ACR review this area to determine if any underlying processes needed modification in light of the high rate of successful appeals. Given that ACR again overturned almost half of its original decisions on appeal in 2001, FDA asked ACR to analyze the reason(s) behind this percentage and provide its conclusions to FDA.

(4) Accreditation Review and Decision-Making Processes

Review of ABs' accreditation and decision-making processes includes procedures for clinical image review, phantom image review, and mammography equipment evaluation and medical physicist annual survey review.

(a) Clinical Image Review

As part of the accreditation process, mammography facilities must submit clinical images to their ABs for review. To evaluate ABs’ performance in the clinical image review area, FDA’s MQSA qualified interpreting physicians (IPs) annually review clinical images from facilities that had submitted cases to the ABs for clinical image review. Two FDA IPs independently conduct clinical image reviews for each of the ABs that perform clinical image review, evaluating each examination on the eight attributes listed in the final regulations using a five-point scale.

The SCA and the STX each have a contract with the ACR to conduct their clinical image reviews. The remaining three ABs have their own clinical image reviewers to evaluate their facilities’ clinical images. A summary of the clinical image reviews follows.

American College of Radiology

FDA performed its clinical image review of the ACR AB on July 16, 2001. FDA found good agreement between the FDA IPs and the ACR clinical image reviewers at the attribute evaluation level, with generally no more than a one-point variation between reviewers. In reviewing the exams and summary evaluation forms, FDA reviewers agreed with the final overall assessments (pass and fail) in all of the cases. FDA determined that this spot review of cases indicates that the quality of clinical image review by the ACR remains high and has not deviated from past performance.

State of Arkansas AB

FDA performed its clinical image review of the SAR AB on August 24, 2001. FDA reviewers disagreed with SAR reviewers in only one instance.

The SAR incorporated FDA's suggestions for improving its clinical image review from its CY 2000 AB Performance Evaluation and, as a result, FDA reviewers indicated that the quality of the clinical image review performed by the SAR during CY 2001 was high.

State of Iowa AB

On October 24, 2001 and November 2, 2001, FDA performed its clinical image review of the SIA AB. The FDA IPs found consistent agreement among the SIA reviewers and agreed with the SIA reviewers’ final overall assessments (pass/fail) in all the cases reviewed. The review indicated that the SIA continues to maintain high quality standards concerning clinical image review.

Summary of Audits and Training of Clinical Image Reviewers by ABs

Clinical image review quality control activities that promote consistency among the various clinical image reviewers exist at the ACR (and the STX and SCA via the ACR contract), the SAR, and the SIA. Each of these ABs conducts training sessions at which clinical image reviewers evaluate clinical images and discuss findings, including the application of AB clinical image review evaluation criteria. To ensure uniformity and to identify potential problems, each of these ABs analyzes the agreement and nonagreement rates of all individual clinical image reviewers and provides each reviewer with the data needed to compare his or her results to those of the rest of the review group.

(b) Phantom Image Review

As part of the accreditation process, mammography facilities must submit phantom images to their ABs for review. To evaluate ABs’ performance in the phantom image review area, FDA’s MQSA expert staff annually reviews phantom images from facilities that had submitted cases to the ABs for phantom image review. Two FDA staff, working independently, review 10 randomly selected phantom images from each of the ABs that perform phantom image review. FDA evaluates all test objects (fibers, specks, masses) on these images as part of the review. Scores for these test objects should fall within the acceptable limit of +/- 0.5.
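
For illustration only, the following sketch shows one way this agreement criterion could be expressed; the function and variable names are hypothetical and do not correspond to any FDA or AB software.

```python
# Illustrative sketch only: hypothetical names, not FDA or AB software.
# Each phantom image is scored on fibers, specks, and masses; an AB reviewer's
# score for a test object group is treated here as acceptable when it falls
# within +/- 0.5 of the FDA reviewer's score for the same group.

ACCEPTABLE_DIFFERENCE = 0.5

def scores_agree(fda_scores, ab_scores):
    """Return True if every test object group score is within +/- 0.5."""
    return all(
        abs(fda_scores[group] - ab_scores[group]) <= ACCEPTABLE_DIFFERENCE
        for group in fda_scores
    )

# Example comparison for one phantom image (hypothetical values).
fda = {"fibers": 4.5, "specks": 3.5, "masses": 3.5}
ab = {"fibers": 4.0, "specks": 3.5, "masses": 4.0}
print(scores_agree(fda, ab))  # True: no group differs by more than 0.5
```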

The STX has a contract with the ACR to conduct its phantom image reviews. The remaining four ABs have their own phantom image reviewers to evaluate their facilities’ phantom images. A summary of the phantom image reviews follows.

American College of Radiology

FDA reviewed ACR's phantom images in July 2001. FDA determined that this spot review of the phantom images indicates that the quality of phantom image review by the ACR remains high and has not deviated from past performance. All test object scores of the FDA reviewers were within the generally accepted range of the scores of the ACR reviewers, with the exception of one object.

State of Arkansas AB

FDA reviewed SAR's AB phantom images in September 2001. The FDA reviewers judged each of the SAR AB reviewers' scores to be within the generally accepted range. At the beginning of CY 2001, the SAR AB had three phantom image reviewers, all MQSA certified inspectors, in its mammography accreditation program. As of April 1, 2002, it had only two phantom image reviewers, which does not allow for a tiebreaker should the two reviewers disagree on scoring. This issue was included as an action item in SAR's 2001 AB Performance Evaluation, and SAR has already successfully addressed it.

State of California AB

FDA completed its review of SCA's AB phantom images in February 2002. In three of the 10 images reviewed, the AB score differed by 1.0 to 1.5 from the score of the FDA reviewers in one or more of the test object groups. The results of this review reinforced FDA's concern about phantom image reviews for accreditation purposes being conducted by a single reviewer. The general practice among ABs is to have the phantom images for accreditation purposes reviewed independently by two reviewers, with a third independent reviewer used to break a tie. Thus, FDA instructed the SCA AB to change its process. The SCA AB expects to implement a revised process by the end of CY 2002.

State of Iowa AB

FDA reviewed SIA’s AB phantom images in September 2001. The FDA reviewers judged each of the SIA AB reviewers’ scores to be within the generally accepted range.

FDA learned that on several occasions only one reviewer examined the phantom images. When the reviewer failed the phantom images, the AB required the facility to submit another phantom image. FDA included an action item in SIA’s 2001 AB Performance Evaluation to implement a procedure that will always utilize two reviewers. SIA has already successfully addressed this issue.

Summary of Audits and Training of Phantom Image Reviewers by ABs

An audit of phantom image reviewers ensures uniformity, identifies any potential problems, and provides all individual phantom image reviewers with the data needed to compare their results to those of the rest of the review group. A summary of this activity, as well as reviewer training for the ABs, follows.

Audits

Audit results are used to enhance reviewer training by emphasizing any performance issues. The ACR (and STX via ACR contract), the SAR, and the SIA conducted audits of their phantom image reviewers to collect statistics on agreement and nonagreement reviewer rates. SCA was unable to conduct an agreement and nonagreement audit because it has been using only a single reviewer during its phantom image review process as discussed under the section on phantom image review. SCA AB expects to implement a revised phantom image review process by the end of CY 2002 and should begin conducting audits in 2003.

Training

The ACR and the SAR conducted training sessions for their phantom image reviewers in CY 2001. Because SCA and SIA did not conduct phantom image review training in CY 2001, FDA is requesting that these two ABs submit a schedule for proposed training.

(c) Mammography Equipment Evaluation (MEE) and Medical Physicist Survey Report Reviews

The final regulations state that ABs shall require every facility applying for accreditation to submit an MEE with its initial accreditation application and, prior to accreditation, to submit a medical physicist survey on each mammography unit at the facility (§900.4(e)(i)). FDA found that the ABs differ on how they review the MEE and is currently working with the ABs as a group to develop a consistent review process.

(5) AB Onsite Visits to Facilities

The final MQSA regulations (§900.4(f)(1)(i)) require that each AB annually conduct onsite visits to at least five percent of the facilities the body accredits to monitor and assess the facility’s compliance with the standards established by the body for accreditation. However, a minimum of five facilities shall be visited, and visits to no more than 50 facilities are required. During such visits, the AB is required to evaluate eight core elements which are: (a) assessment of quality assurance activities; (b) review of mammography reporting procedures; (c) clinical image review; (d) review of medical audit system; (e) verification of personnel duties; (f) equipment verification; (g) verification of consumer complaint mechanism; and (h) other identified concerns.

At least 50 percent of the facilities visited shall be selected randomly and the other facilities visited shall be selected based on problems identified through state or FDA inspections, serious consumer complaints received from consumers or others, a previous history of noncompliance, or other information in the possession of the AB, MQSA inspectors, or FDA (i.e., visits for cause).
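
For illustration only, the sketch below shows one way these visit requirements could be computed for a given number of accredited facilities; the function names are hypothetical, and the rounding up to whole facilities is an assumption of the sketch rather than a statement of the regulation.

```python
import math

# Illustrative sketch only: hypothetical names, not AB or FDA software.
# Final MQSA regulations (Sec. 900.4(f)(1)(i)): visit at least 5 percent of
# the facilities the AB accredits each year, but never fewer than 5
# facilities, and no more than 50 visits are required. At least 50 percent
# of the facilities visited must be selected randomly; the remainder may be
# visits "for cause."

def required_onsite_visits(accredited_facilities):
    # Rounding up to a whole facility is an assumption of this sketch.
    five_percent = math.ceil(accredited_facilities * 5 / 100)
    return min(max(five_percent, 5), 50)

def minimum_random_visits(visits_performed):
    return math.ceil(visits_performed / 2)

# Example: an AB accrediting 480 facilities must visit at least 24 of them
# (5 percent of 480), and at least 12 of those visits must be random.
visits = required_onsite_visits(480)
print(visits, minimum_random_visits(visits))  # 24 12
```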

American College of Radiology

The 47 visits ACR performed are three fewer than the 50 on-site visits required by the final regulations. Initially, ACR scheduled a trip in December 2001 that included four random on-site visits. Completion of the four visits would have provided the balance needed to comply with the regulations. Because of the difficulty of finding reviewers who were able and willing to travel in the aftermath of the September 11, 2001 events, the reviewers who were available were needed to conduct unexpected reviews "for cause" during December 2001. Therefore, ACR was unable to complete the required number of onsite visits by the end of the calendar year.

State of Arkansas AB

Although the SAR AB conducted almost seven times the required number of on-site visits, it reported that its AB staff randomly selects images for an additional clinical review only when there is concern about clinical image quality during an on-site visit. However, FDA requires that clinical images be reviewed on all on-site visits. FDA included this as an action item in SAR's 2001 AB Performance Evaluation, and SAR has already successfully addressed this issue.

State of California AB and State of Texas AB

The 18 visits SCA performed are six fewer than the 24 on-site visits required by the final regulations. The four visits STX performed are two fewer than the six on-site visits required by the final regulations. Because these two ABs did not provide any explanation for not meeting the required number of AB on-site visits, their 2001 AB Performance Evaluations included additional visits as an action item. FDA is working closely with these ABs to ensure that they fulfill this requirement. FDA is also working with STX to clarify the necessary elements of each on-site visit.

State of Iowa AB

The SIA AB conducted almost eight times the required number of on-site visits, thus fulfilling its AB on-site visit obligation.

(6) Random Clinical Image Review

The final MQSA regulations (§900.4(f)(2)(i)) require that each AB annually conduct random clinical image reviews (RCIRs) of at least three percent of the facilities the body accredits to monitor and assess facility compliance with the standards established by the body for accreditation.

Four of the five ABs met their obligation to conduct a random clinical image review of at least three percent of the facilities they accredit. The State of Arkansas AB conducted one fewer than required.

(7) Additional Mammography Review

If FDA believes that mammography quality at a facility has been compromised and may present a serious risk to human health, the facility must provide clinical images and other relevant information, as specified by FDA, for review by its AB (§900.12(j)). This additional mammography review (AMR) helps the agency determine whether there is a need to notify affected patients, their physicians, or the public that the quality of mammograms may have been compromised. A request for an AMR may also originate from the AB or a State Certifying Agency (SAC). When an AB initiates an AMR, FDA encourages it to discuss the case with the agency prior to implementation.

The following chart summarizes the number of AMRs conducted by each AB during CY 2001:

AB | Number of AMRs Conducted or Initiated* | Number With Deficiency or Serious Risk | Number That Completed Corrective Action and/or Notification
ACR | 10 | 6 | 6
SAR | 0 | 0 | 0
SCA | 2 | 2 | 2
SIA | 0 | 0 | 0
STX | 2 | 1 | Ongoing**

*Note: The SCA and the STX each have a contract with the ACR to conduct their clinical image reviews during an AMR. The remaining three ABs have their own clinical image reviewers to evaluate their facilities’ clinical images.
**One of the STX facilities failed its AMR. As of the writing of STX's 2001 AB Performance Evaluation, the AB's follow-up actions for this facility were ongoing.

(8) Accreditation Revocation and Suspension

The MQSA final regulations (§900.3(b)(3)(iii)(I)) require that each AB have policies and procedures for suspending or revoking a facility’s accreditation. If a facility cannot correct deficiencies to ensure compliance with the standards or if a facility is unwilling to take corrective actions, the AB shall immediately notify the FDA, and shall suspend or revoke the facility’s accreditation.

American College of Radiology, State of Arkansas, State of Iowa, and State of Texas

None of these four ABs (the ACR, the SAR, the SIA, and the STX) revoked or suspended any facility's accreditation in 2001.

State of California AB

According to SCA's interpretation of its own State authority, it currently lacks the authority to suspend or revoke accreditation. The SCA's draft standards, which are currently moving through the State's legislative process, will grant the SCA AB the specific authority to revoke or suspend accreditation. To accomplish the same end result until its draft standards are signed and in effect, SCA uses the State's "cease and desist" authority to force facilities to cease operations. In CY 2001, SCA caused two mammography facilities to cease operations through its cease and desist order authority.

(9) Quantitative Accreditation and Inspection Information

As additional performance indicators, FDA analyzed quantitative accreditation and inspection information related to (a) unit accreditation pass/fail data, (b) reasons for denial of accreditation, and (c) accredited facility performance during inspections. Note: There is a relatively small number of state-accredited facilities compared to ACR-accredited facilities. Therefore, small variations in state-accredited facility performance may lead to differences across accreditation bodies that do not reflect actual differences in accreditation body performance.

(a) Unit Accreditation Pass/Fail Data

Number of Units | ACR | SAR | SCA | SIA | STX
Fully Processed | 6,549 | 61 | 425 | 73 | 91
Passed Accreditation | 4,701 (71.8%) | 51 (83.6%) | 297 (70%) | 57 (78%) | 73 (80%)
Failed Accreditation* | 46 (0.7%) | 0 | 12 (2.8%) | 0 | 0
Did Not Complete, Withdrew, or Expired | 1,802 (27.5%) | 10 (16.4%) | 116 (27.2%) | 16 (22%) | 18 (20%)

*Units that were still denied accreditation as of December 31, 2001.

At the conclusion of the reporting period, the accreditation pass rate of mammography units among the accreditation bodies ranged from 70 - 83.6 percent. In general, the rates for facilities that failed accreditation stayed about the same as those in the last reporting period, while the rates for facilities that did not complete the accreditation process, withdrew from the process, or whose accreditation expired increased.
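
The percentages shown in the table follow directly from the unit counts. As a simple illustration of that arithmetic, the sketch below recomputes the rates for two of the ABs using the figures from the table; the variable names are hypothetical.

```python
# Illustrative sketch only: reproduces the percentage arithmetic behind the
# table above for two of the ABs (unit counts taken from the table).
units = {
    "ACR": {"fully_processed": 6549, "passed": 4701, "failed": 46},
    "SAR": {"fully_processed": 61, "passed": 51, "failed": 0},
}

for ab, counts in units.items():
    total = counts["fully_processed"]
    pass_rate = 100 * counts["passed"] / total
    fail_rate = 100 * counts["failed"] / total
    incomplete_rate = 100 - pass_rate - fail_rate
    # ACR: 71.8% passed, 0.7% failed, 27.5% did not complete, withdrew, or expired.
    # SAR: 83.6% passed, 0.0% failed, 16.4% did not complete, withdrew, or expired.
    print(f"{ab}: {pass_rate:.1f}% passed, {fail_rate:.1f}% failed, "
          f"{incomplete_rate:.1f}% did not complete, withdrew, or expired")
```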

(b) Reasons for Mammography Unit Denial

During CY 2001, the SAR AB failed to report to FDA any of the reasons it denied unit accreditation to its facilities. The ACR reported reasons for only 50 percent of its denials, while the SCA reported reasons for only 80 percent of its denials. The STX AB reported denial reasons 100 percent of the time, while the SIA AB did not have any denials in CY 2001. The 2001 AB Performance Evaluations included action items for those ABs that failed to report reasons for unit denial.

Of the reasons reported, the state ABs denied mammography unit accreditation almost solely because of clinical image failure, while the ACR denied unit accreditation primarily because of clinical image and phantom image failure. The state ABs have interactive relationships with their facilities that enable them to be proactive in resolving potential problems, which presumably accounts for the lower overall denial rate among the state ABs compared to the ACR's denial rate. However, since the last reporting period, the number of units denied accreditation by the ACR decreased by 48 percent, while the number of units denied accreditation by the state ABs remained about the same, except for the SCA, whose number of units denied accreditation decreased by 25 percent.

Most of the facilities that receive a denial in the accreditation process complete rigorous corrective action plans under the ABs’ reinstatement protocols, and eventually successfully achieve the levels of quality needed for accreditation.

(c) Facility Performance During Inspections Sorted by AB

Measure | ACR | SAR | SCA | SIA | STX
Number of Inspections | 8,336 | 69 | 444 | 134 | 123
Average Phantom Image Score* | 12.2 | 12.3 | 12.3 | 11.7 | 12.6
Average Dose (in millirads) | 177.2 | 172.8 | 163.8 | 152.1 | 171.3
Average Processor Speed | 103.9 | 104.3 | 107.5 | 101.8 | 104.6

*The maximum possible phantom image score is 16. Four fibers, three masses, and three speck groups must be visible on the image for a passing score.

There were a total of 9,106 facility inspections in CY 2001. ACR was the AB for 91.54 percent of the facilities inspected; SAR was the AB for 0.77 percent; SCA was the AB for 4.87 percent; SIA was the AB for 1.47 percent; and STX was the AB for 1.35 percent.

There were no significant differences in average phantom image scores among the facilities accredited by the five ABs. Average phantom image scores increased from those reported in the 2000 Report. As phantom images are an indirect measurement of image quality, this rise might suggest that the clinical image quality throughout the mammography facilities improved in 2001.

In general, the average doses increased slightly from those reported in the 2000 report, but still remain well below the dose limit of 300 millirads mandated by the MQSA final regulations. This dose limit has the advantage of permitting flexibility for the optimization of the technique factors used during examinations in order to achieve improved image quality.

Generally, the average processing speeds among the facilities of all the ABs remained about the same as those reported in the 2000 Report and remained within the range needed to produce satisfactory clinical images. The evaluation of a mammography facility's film processing speed is an important quality assurance measure. The quality of film processing directly affects not only the resulting image quality of the mammogram but also the dose administered to the patient. If a mammography facility is processing film in accordance with the film manufacturer's recommendations, the processing speed should be close to 100 (a processing speed of 80 to 120 is considered normal). If the processing speed falls significantly, the clinical image is not completely developed and appears too light, and the quality of the mammographic image can be significantly compromised. Moreover, the facility may not realize its film processor is the source of the problem and may compensate by increasing the dose administered to the patient.
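
As a simple illustration of this quality assurance check, the sketch below flags a processing speed outside the normal 80 to 120 range; the function names are hypothetical and are not part of the MPRIS database or any inspection software.

```python
# Illustrative sketch only: hypothetical names, not FDA inspection software.
# Processing speed should be close to 100 when film is processed according to
# the manufacturer's recommendations; 80 to 120 is considered normal. A speed
# well below that range leaves the film underdeveloped, so images appear too
# light, and the facility may compensate by raising the patient dose.

NORMAL_RANGE = (80, 120)

def check_processing_speed(speed):
    low, high = NORMAL_RANGE
    if speed < low:
        return "below normal: images may be underdeveloped and appear too light"
    if speed > high:
        return "above the normal range"
    return "normal"

# Example using the CY 2001 averages from the table above.
averages = {"ACR": 103.9, "SAR": 104.3, "SCA": 107.5, "SIA": 101.8, "STX": 104.6}
for ab, avg_speed in averages.items():
    print(ab, avg_speed, check_processing_speed(avg_speed))
```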

In CY 2001, over half (59 percent) of the accredited mammography facilities had no MQSA violations while only three percent of the facilities had a violation characterized as “most serious.” FDA actively works with these facilities on corrective measures.

Status of the Action Items From the 2000 Report to Congress

In almost all instances, the ABs successfully completed their CY 2000 action items. In the rare instances where they did not, FDA continues to actively work with each AB to ensure that it successfully completes the requirements of each action item.

Conclusion

Given that FDA’s AB oversight program promotes collaboration and cooperation, each AB, in concert with FDA, is currently addressing all action items cited in this report. FDA and the ABs, working in partnership with the certified mammography facilities in the United States and the states participating in inspection and other MQSA activities, are ensuring quality mammography across the nation.