ACR Meeting Abstracts
Abstract Number: 1372

An Analysis of the Quality and Readability of Online Osteoarthritis Information with Historical Comparison

Kieran Murray1, Tim Murray2, Anna O'Rourke3, Candice Low4 and Douglas J. Veale4, 1Rheumatology, Saint Vincent's University Hospital, Dublin 4, Ireland, 2Beaumont Hospital, Dublin, Ireland, 3Saint Vincent's University Hospital, Dublin 4, Ireland, 4Centre for Arthritis and Rheumatic Diseases, Dublin Academic Medical Centre, University College Dublin, Dublin, Ireland

Meeting: 2018 ACR/ARHP Annual Meeting

Keywords: Online resources, Osteoarthritis, patient, patient engagement and patient participation

Session Information

Date: Monday, October 22, 2018

Session Title: Osteoarthritis – Clinical Poster II

Session Type: ACR Poster Session B

Session Time: 9:00AM-11:00AM

Background/Purpose:

Osteoarthritis (OA) is the most common cause of disability in people over 65 years of age.

Most patients in the USA use the internet for healthcare information, but the quality and readability of this information are variable. Guidelines state that health information for the general public should be written at a 7th–8th grade reading level.

The Health on the Net Foundation Code of Conduct (HONcode) is a well-known quality label for medical and health websites, while the DISCERN instrument and the JAMA benchmark criteria are standardized, validated tools for assessing the quality of healthcare information. In 2003, online OA information was graded as “poor” by DISCERN. This study reviews the quality and readability of current online OA information.

Methods:

We searched the term “osteoarthritis” on the three most popular (>99% of searches) search engines in the USA (Google, Bing and Yahoo). Research has shown patients are unlikely to search beyond 25 pages of results. Thus, the 25 most-viewed websites on each search engine, excluding paid ads, were included.

Age of content, content producer, author characteristics and HONcode status were noted. Website quality was evaluated using DISCERN and JAMA criteria. Readability was measured using three validated scoring systems: Flesch Reading Ease Score, Flesch-Kincaid Grade Level and Gunning-Fog Index.
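The three readability indices named above are published formulas over average sentence length, syllables per word and the share of "complex" (three-plus syllable) words. A minimal Python sketch of the three formulas, using a crude vowel-run syllable estimator (an assumption for illustration; scores will differ slightly from validated calculators):

```python
import re

def _count_syllables(word: str) -> int:
    """Rough syllable count: vowel-group runs, minus a silent trailing 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def readability(text: str) -> dict:
    """Compute FRES, FKGL and Gunning-Fog from the standard published formulas."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(_count_syllables(w) for w in words)
    complex_words = sum(1 for w in words if _count_syllables(w) >= 3)
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    return {
        "FRES": 206.835 - 1.015 * wps - 84.6 * spw,          # higher = easier
        "FKGL": 0.39 * wps + 11.8 * spw - 15.59,             # US grade level
        "GFI": 0.4 * (wps + 100 * complex_words / len(words)),  # US grade level
    }
```

Short, monosyllabic sentences score as very easy (high FRES, low grade level); the clinic-grade tools used in the study apply the same formulas with more careful syllable dictionaries.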

Mean website age, JAMA benchmark criteria and DISCERN score for each website were reviewed with one-way analysis of variance (ANOVA). Analysis was performed by Prism 7 (GraphPad Software). Significance was set at p <0.05.
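The study ran its ANOVA in Prism; the same one-way test can be sketched from scratch, computing the F statistic as between-group variance over within-group variance. The DISCERN totals below are hypothetical illustrative values, not the study's data:

```python
from statistics import mean

def one_way_anova(*groups):
    """One-way ANOVA: F = mean square between groups / mean square within."""
    grand = mean(x for g in groups for x in g)
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations inside each group
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    return ms_between / ms_within  # compare against F(k-1, n_total-k)

# Hypothetical DISCERN totals by author type (illustrative only)
f = one_way_anova([44, 47, 41, 45, 43, 46, 40, 42],  # doctors
                  [50, 48, 52],                       # other health professionals
                  [33, 36])                           # non-medical authors
```

A large F (well above 1) indicates the group means differ more than within-group noise would explain; the corresponding p-value is then read from the F distribution with (k−1, n−k) degrees of freedom.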

Results:

Of 75 articles identified, 38 met exclusion criteria: 31 were duplicate websites, 3 were non-text pages, 2 were behind paywalls and 2 were inaccessible for geographic reasons. 37 websites were suitable for analysis.

For 23 websites, no author or reviewer was reported. Reported authors/reviewers were doctors (n=8), other health professionals (n=3), non-specified medical staff (n=1) or non-medical authors (n=2). Website characteristics are shown in Table 1.

One website met all four JAMA Criteria. Mean DISCERN quality of information for OA websites was “fair”.

There was a significant difference in quality between author types (ANOVA r²=0.24, p=0.028): non-doctor health professional authors scored the highest and non-medical authors the lowest. HONcode-endorsed websites (n=16) were of statistically significantly higher quality.

Readability varied by assessment tool from 8th to 12th grade level.

Conclusion:

Quality of online health information for OA is “fair”, an improvement from 2003. Readability was at or above (i.e., more difficult than) the recommended reading level. HONcode certification was indicative of higher quality, but not better readability.

Summary of Results (all values are means)

| Producer | Age (years) | HONcode certified | DISCERN: Reliability | DISCERN: Treatment Choices | DISCERN: Quality | DISCERN: Total | JAMA: Authorship | JAMA: Attribution | JAMA: Currency | JAMA: Disclosure | FRES | FKGL | GFI |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| All (n=37) | 1.4 | 43.2% | 23.1 | 16.6 | 2.6 | 42.3 | 29.7% | 24.3% | 59.4% | 24.3% | 51.4 | 7.8 | 9.0 |
| Not-for-profit (governmental and NGOs) (n=14) | 0.9 | 21.4% | 24.1 | 17.2 | 2.6 | 43.9 | 0% | 21.4% | 50% | 21.4% | 50.8 | 7.9 | 9.4 |
| Professional Society (n=4) | 1.3 | 0% | 22.6 | 16.3 | 2.5 | 41 | 50% | 25% | 75% | 25% | 49.4 | 8.2 | 8.8 |
| For-Profit Company (n=15) | 1.5 | 80% | 23 | 16.9 | 2.6 | 42.5 | 53.3% | 26.7% | 73.3% | 33.3% | 53.2 | 7.6 | 8.6 |
| Healthcare Providers (n=4) | 0.4 | 25% | 20.5 | 14 | 2.3 | 36.8 | 0% | 25% | 25% | 0% | 48.5 | 8.2 | 9.2 |

JAMA=Journal of the American Medical Association; HONcode=Health On the Net certification; NGO=Non-governmental organisation; FRES=Flesch Reading Ease Score; FKGL=Flesch-Kincaid Grade Level; GFI=Gunning-Fog Index.


Disclosure: K. Murray, None; T. Murray, None; A. O'Rourke, None; C. Low, None; D. J. Veale, None.

To cite this abstract in AMA style:

Murray K, Murray T, O'Rourke A, Low C, Veale DJ. An Analysis of the Quality and Readability of Online Osteoarthritis Information with Historical Comparison [abstract]. Arthritis Rheumatol. 2018; 70 (suppl 10). https://acrabstracts.org/abstract/an-analysis-of-the-quality-and-readability-of-online-osteoarthritis-information-with-historical-comparison/. Accessed January 17, 2021.
