Are Physician Compare metrics missing the mark on radiologist performance?

When the Affordable Care Act required physician performance information to be made publicly available in a way that allows patients to make comparisons based on quality of care, the CMS Physician Compare initiative was born.

However, according to a recent study published in the Journal of the American College of Radiology, the initiative should include more imaging-specific metrics so it can accurately reflect a radiologist’s overall performance.

Andrew B. Rosenkrantz, MD, of the department of radiology at the New York University Langone Medical Center, and colleagues studied how radiologists scored in six different Physician Compare metrics compared with nonradiologists.

Radiologists significantly outperformed nonradiologists in two of the metrics: PQRS participation (60.5 percent vs. 39.4 percent) and receipt of PQRS Maintenance of Certification incentives (4.7 percent vs. 0.3 percent).

In the other four metrics, however, nonradiologists fared much better than radiologists.

According to the authors, it’s clear why the statistics shook out this way: the two metrics in which radiologists did well are the same two that specifically cater to radiology.

“PQRS, in particular, incorporates specific measures relevant to radiology, such as fluoroscopic exposure time, usage of the 'probably benign' assessment category in mammography screening, and proper reporting of stenosis measurement in carotid imaging reports,” the authors wrote. “Similarly, the MOC program and its associated requirements are unique to each specialty’s board. Radiologists, for example, must meet criteria put forth by the [American Board of Radiology] rather than another specialty’s board.”

Meanwhile, the other four metrics were intentionally designed not to be specialty-specific.

“All remaining metrics, by design, are not tailored to any given specialty,” the authors wrote. “Electronic prescribing, as well as heart and stroke-prevention measures, although worthy efforts for the medical community in general, make little sense when evaluating the quality of diagnostic imaging work performed by a radiologist.”

According to Rosenkrantz et al., it may be more appropriate to judge radiologists on how they perform in those specialty-specific metrics than on the more generic options. Judging radiologists by less relevant metrics does a disservice to the entire specialty, especially given the growing importance of the Physician Compare metrics.

“With patients increasingly turning to online physician ratings websites—which are largely populated by subjective provider information—the need for more-objective data is clear,” the authors wrote. “For this reason, and because they will be used increasingly as a basis for payment, radiologists are encouraged to participate in the various CMS Physician Compare metrics. However, for the metrics to meet their intended goal of providing stakeholders with useful information, those metrics must provide meaningful data.”

Michael Walter, Managing Editor

Michael has more than 18 years of experience as a professional writer and editor. He has written at length about cardiology, radiology, artificial intelligence and other key healthcare topics.
