A critical synopsis of the diagnostic and screening radiology outcomes literature.
Review

Abstract
In summary, the radiology outcomes research literature is both extensive and broad, but its methodologic quality is quite variable. Overall, this quality could be improved by intervention in two areas: the dissemination and the development of sound research methods. The number of researchers investigating radiology-related outcomes is high, and more than 20 journals are currently devoted exclusively to radiology research. Even with a relatively narrow definition of "outcomes," we identified over 200 radiology outcomes studies, most from the past few years. The methodologic quality of most of these articles, however, was relatively low, with important design flaws and biases. Nonetheless, a substantial number of radiology publications do employ state-of-the-art research methods and innovative approaches to methodologic challenges, and the quality of radiology outcomes research overall would benefit tremendously from dissemination of such methods. Instruction in outcomes research methods is accessible to radiologists. For example, several recent articles and series of articles on outcomes research methods have appeared in JAMA, including guidelines for the performance and reporting of cost-effectiveness analyses (38-40) and for developing clinical prediction rules (57). Within radiology, several recent articles have addressed, among other topics, cost-effectiveness analysis (34,59,60), assessment of quality of life (43), screening for disease (53), and definition of the study population (61). The research compendium compiled for the GERRAF (General Electric-Association of University Radiologists Radiology Research Academic Fellowships) program remains a comprehensive methodologic source for many of the issues in radiology outcomes research, and outcomes research methods courses are offered every year at the Society for Health Services Research in Radiology and Society for Medical Decision Making meetings, as well as at the meeting of the Radiological Society of North America.
Even so, awareness of the need for such research techniques remains limited. Dissemination of sound research methods is constrained at least in part by the current incentives in radiology research. At many institutions, promotion and academic success depend on the number of research publications produced rather than on their quality, and more rigorous study designs often require more time and resources. Further, because peer reviewers are often as uninformed about research methods as most authors submitting manuscripts, it may actually be harder to publish articles with more advanced methodologic designs: the standard in radiology is the uncontrolled case series, and deviation from that standard may hinder acceptance for publication. The current peer-review system thus does not effectively promote sound research design. On a more optimistic note, the recent publication of a number of methodology articles suggests that at least some journals are promoting improved research methodology (43,53,59-61). We hope that manuscript reviewers will find the time to learn the strengths and weaknesses of various research approaches; if more rigorous study designs were required for publication, radiology outcomes research would probably improve dramatically.

The other great incentive in research is funding. Clearly, if advanced research design is required for funding, then there is an incentive to improve research quality. Traditionally, funding from the National Cancer Institute, other National Institutes of Health institutes, and the public sector has been predicated on a high level of research sophistication. Undoubtedly, the availability of grants for diagnostic and screening imaging clinical trials and other research will go far toward improving radiology research methods. The other traditional source of research funding is industry.