Session Information
Title: Health Services Research, Quality Measures and Quality of Care - Innovations in Health Care Delivery
Session Type: Abstract Submissions (ACR)
Background/Purpose:
When faced with diagnostic or therapeutic uncertainty, patients often seek additional clinical opinions. Similarly, when in doubt, physicians seek expert opinions to assist with clinical care and minimize errors. The audience response system (ARS) has traditionally been used to test knowledge and improve the educational experience. However, the ARS can also be used to solicit crowd opinion (crowdsourcing) on specific medical questions. In this proof-of-concept pilot study, we examined the ability of professional crowds to correctly answer difficult medical questions using the ARS.
Methods:
This study analyzed answers to clinical-vignette questions administered with the ARS during the 2009, 2010, and 2011 ACR/ARHP scientific meetings. Questions were classified as evidence-based or expert-based, and all were in multiple-choice format. The crowd chose answers independently and anonymously using remote control panels. A question was considered answered correctly if the majority of the crowd chose the correct answer. To minimize the influence of the speaker on the crowd, only questions administered before any medical discussion were included. As a validation group, and to determine whether physicians other than rheumatologists could answer correctly, 10 randomly chosen ACR/ARHP questions were administered using the ARS during medical Grand Rounds at 2 academic institutions. The percentage of correct answers was calculated, and the chi-square test was used to analyze the data.
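The abstract does not detail how the chi-square test was applied. As a rough illustration only, the sketch below (Python, using scipy) shows one way a goodness-of-fit comparison against chance could be run for a single question; the vote counts and the assumption of equal chance probability across answer choices are hypothetical, not taken from the study.

# Illustrative sketch: chi-square goodness-of-fit test for one hypothetical
# ARS question, comparing the observed vote distribution with chance
# (equal probability across answer choices). Counts below are made up.
from scipy.stats import chisquare

observed = [112, 48, 25, 15]                                # votes per choice
expected = [sum(observed) / len(observed)] * len(observed)  # chance: equal split

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.4g}")

# The majority choice (here, the first option) would count as the crowd's
# answer; a small p-value indicates the vote pattern differs from chance.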
Results:
A total of 93 ARS questions were administered during the three consecutive ACR/ARHP scientific meetings. Twenty-one questions were excluded because they either had unknown answers (12 questions) or were administered after medical discussions (9 questions). The remaining 72 questions were included in the study, of which 59 had evidence-based and 13 had expert-based answers. The number of multiple-choice options per question ranged from 2 to 8, with 97% of questions having at least 4 options. Vasculitis was the most common topic (19%), followed by rheumatoid arthritis (16%). Of the 59 evidence-based questions, 41 (70%) were answered correctly (p<0.0001), and 8 of the 13 expert-based questions (62%) were answered in agreement with the expert answer (p<0.0001). The mean percentage of attendees answering correctly was 59% ± 18%, and the mean difference between the percentage choosing the correct answer and the percentage choosing the next most popular answer was 37% ± 26%. Twenty-three of the 72 questions (32%) were answered incorrectly; in the majority of these (66%), the correct answer was the second most popular choice. In the validation groups, 7/10 (70%) and 9/10 (90%) questions were answered correctly at the two institutions, mirroring the results seen at the ACR/ARHP meetings.
Conclusion:
Regardless of their specialty, professional crowds as a whole are able to answer difficult clinical questions and solve clinical vignettes using the ARS, and their answers frequently match evidence-based and expert answers. In the age of crowdsourcing and widespread Internet use, professional crowds may serve as a resource for solving difficult clinical questions.
Disclosure: A. G. Sreih, None; F. Aldaghlawi, None.