ACR Meeting Abstracts

Abstract Number: 1372

An Analysis of the Quality and Readability of Online Osteoarthritis Information with Historical Comparison

Kieran Murray1, Tim Murray2, Anna O'Rourke3, Candice Low4 and Douglas J. Veale4, 1Rheumatology, Saint Vincent's University Hospital, Dublin 4, Ireland, 2Beaumont Hospital, Dublin, Ireland, 3Saint Vincent's University Hospital, Dublin 4, Ireland, 4Centre for Arthritis and Rheumatic Diseases, Dublin Academic Medical Centre, University College Dublin, Dublin, Ireland

Meeting: 2018 ACR/ARHP Annual Meeting

Keywords: Online resources, Osteoarthritis, patient, patient engagement and patient participation

Session Information

Date: Monday, October 22, 2018

Title: Osteoarthritis – Clinical Poster II

Session Type: ACR Poster Session B

Session Time: 9:00AM-11:00AM

Background/Purpose:

Osteoarthritis (OA) is the most common cause of disability in people over 65 years of age.

Most patients in the USA use the internet for healthcare information, but the quality and readability of this information is variable. Guidelines state that health information for the general public should be written at a 7th–8th grade reading level.

Health on the Net Foundation Code of Conduct (HONcode) is a well-known quality label for medical and health websites. The DISCERN instrument and JAMA Benchmark criteria are standardized validated tools to assess healthcare information quality. In 2003, online OA information was graded as “poor” by DISCERN. This study reviews the quality and readability of current online OA information.

Methods:

We searched the term “osteoarthritis” on the three most popular (>99%) search engines in the USA (Google, Bing and Yahoo). Research has shown patients are unlikely to search beyond 25 pages. Thus, the 25 most-viewed websites on each search engine, excluding paid ads, were included.

Age of content, content producer, author characteristics and HONcode status were noted. Website quality was evaluated using DISCERN and JAMA criteria. Readability was measured using three validated scoring systems: Flesch Reading Ease Score, Flesch-Kincaid Grade Level and Gunning-Fog Index.
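The three readability scores named above reduce to standard published formulas over word, sentence, and syllable counts. As a rough illustrative sketch (not part of the abstract; real tools also need a syllable counter), they can be computed as:

```python
# Standard readability formulas with their published coefficients.
# Word/sentence/syllable counts are taken as inputs here.

def flesch_reading_ease(words, sentences, syllables):
    # Higher scores mean easier text (60-70 is "standard" difficulty).
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    # Result is an approximate US school grade level.
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def gunning_fog_index(words, sentences, complex_words):
    # complex_words = words of three or more syllables; result is a grade level.
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))

# Example: a hypothetical 100-word passage with 5 sentences,
# 150 syllables, and 10 complex words.
print(round(flesch_reading_ease(100, 5, 150), 1))   # FRES ≈ 59.6
print(round(flesch_kincaid_grade(100, 5, 150), 1))  # FKGL ≈ 9.9
print(round(gunning_fog_index(100, 5, 10), 1))      # GFI ≈ 12.0
```

Note that FKGL and GFI report a grade level directly, while FRES is an inverse ease score, which is why the three tools can disagree on how "readable" the same page is.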

Mean website age, JAMA benchmark criteria and DISCERN scores were compared across websites using one-way analysis of variance (ANOVA). Analysis was performed in Prism 7 (GraphPad Software), with significance set at p < 0.05.
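A one-way ANOVA of this kind tests whether group means differ by comparing between-group to within-group variance. The sketch below shows the F-statistic computation; the group values are purely illustrative, since the study's raw per-website scores were not published.

```python
# Minimal one-way ANOVA F-statistic, as used to compare mean scores
# across producer categories. Hypothetical data only.

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over lists of values."""
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares (weighted by group size)
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    msb = ssb / (k - 1)          # between-group mean square
    msw = ssw / (n_total - k)    # within-group mean square
    return msb / msw

# Illustrative DISCERN totals per producer category (hypothetical data)
f_stat = one_way_anova_f([
    [43, 46, 41, 45],   # not-for-profit
    [41, 40, 42],       # professional society
    [42, 44, 43, 40],   # for-profit
    [37, 36, 38],       # healthcare provider
])
print(f"F = {f_stat:.2f}")
```

The p-value is then read from the F-distribution with (k − 1, N − k) degrees of freedom, which statistical packages such as Prism handle internally.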

Results:

Of 75 articles, 38 met exclusion criteria: 31 were duplicate websites, 3 were non-text pages, 2 were paywall-protected, and 2 were inaccessible for geographic reasons. 37 websites were suitable for analysis.

For 23 websites, no author or reviewer was reported. Reported authors/reviewers were doctors (n=8), other health professionals (n=3), non-specified medical staff (n=1) or non-medical authors (n=2). Website characteristics are shown in Table 1.

One website met all four JAMA Criteria. Mean DISCERN quality of information for OA websites was “fair”.

There was a significant difference in quality between author types (ANOVA R² = 0.24, p = 0.028), with non-doctor health professionals scoring highest and non-medical authors scoring lowest. HONcode-endorsed websites (n=16) were of statistically significantly higher quality.

Readability varied by assessment tool from 8th to 12th grade level.

Conclusion:

The quality of online health information for OA is “fair”, an improvement from 2003. Readability was at or above the recommended grade level. HONcode certification was indicative of higher quality, but not better readability.

Summary of Results (all results are mean values)

| Producer | Age (years) | HONcode certified | DISCERN: Reliability | DISCERN: Treatment Choices | DISCERN: Quality | DISCERN: Total | JAMA: Authorship | JAMA: Attribution | JAMA: Currency | JAMA: Disclosure | FRES | FKGL | GFI |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| All | 1.4 | 43.2% | 23.1 | 16.6 | 2.6 | 42.3 | 29.7% | 24.3% | 59.4% | 24.3% | 51.4 | 7.8 | 9.0 |
| Not-for-profit (governmental and NGOs) (n=14) | 0.9 | 21.4% | 24.1 | 17.2 | 2.6 | 43.9 | 0% | 21.4% | 50% | 21.4% | 50.8 | 7.9 | 9.4 |
| Professional Society (n=4) | 1.3 | 0% | 22.6 | 16.3 | 2.5 | 41 | 50% | 25% | 75% | 25% | 49.4 | 8.2 | 8.8 |
| For-Profit Company (n=15) | 1.5 | 80% | 23 | 16.9 | 2.6 | 42.5 | 53.3% | 26.7% | 73.3% | 33.3% | 53.2 | 7.6 | 8.6 |
| Healthcare Providers (n=4) | 0.4 | 25% | 20.5 | 14 | 2.3 | 36.8 | 0% | 25% | 25% | 0% | 48.5 | 8.2 | 9.2 |

JAMA = Journal of the American Medical Association; HONcode = Health On the Net certification; NGO = non-governmental organisation; FRES = Flesch Reading Ease Score; FKGL = Flesch-Kincaid Grade Level; GFI = Gunning-Fog Index.


Disclosure: K. Murray, None; T. Murray, None; A. O'Rourke, None; C. Low, None; D. J. Veale, None.

To cite this abstract in AMA style:

Murray K, Murray T, O'Rourke A, Low C, Veale DJ. An Analysis of the Quality and Readability of Online Osteoarthritis Information with Historical Comparison [abstract]. Arthritis Rheumatol. 2018; 70 (suppl 9). https://acrabstracts.org/abstract/an-analysis-of-the-quality-and-readability-of-online-osteoarthritis-information-with-historical-comparison/. Accessed .

