Evaluation of the Content, Quality, and Readability of Patient Accessible Online Resources Regarding Cataracts

Cited by: 27
Authors
Patel, Annika J. [1 ]
Kloosterboer, Amy [1 ]
Yannuzzi, Nicolas A. [1 ]
Venkateswaran, Nandini [1 ]
Sridhar, Jayanth [1 ]
Affiliation
[1] Univ Miami, Miller Sch Med, Dept Ophthalmol, Bascom Palmer Eye Inst, 900 NW 17th St, Miami, FL 33136 USA
Keywords
Cataract surgery; patient education; readability; online resources
DOI
10.1080/08820538.2021.1893758
Chinese Library Classification (CLC)
R77 [Ophthalmology]
Subject Classification Code
100212
Abstract
Purpose: To evaluate the content quality, accuracy, and readability of websites commonly visited by patients contemplating cataract surgery.
Setting: Freely available online information.
Design: Cross-sectional study.
Methods: Ten websites were evaluated for content using a grading sheet of 40 questions, each scored individually by three ophthalmologists. JAMA benchmarks were used to assess quality, and an online readability tool, Readable, was used to assess readability.
Results: There was a significant difference in content and accuracy between websites by Kruskal-Wallis test (H = 22.623, P = .007). The average grading-sheet score across all websites was 90.85 out of 160 points, or 57% (SD 29.93, 95% CI +/- 17.69). There was no significant correlation between a website's rank on Google.com and its content quality (r = 0.049, P = .894). No website complied with all 4 JAMA criteria for authorship. There was no significant correlation between the content quality of each website and the number of JAMA requirements met (r = -0.563, P = .09). The average Flesch Reading Ease score across all websites was 52.64 (SD 11.94, 95% CI +/- 7.40), and the average Mean Reading Grade was 10.72 (SD 1.58, 95% CI +/- 0.98). There was a significant difference in Mean Reading Grade between websites (H = 23.703, P = .005). There was no significant correlation between content quality and Mean Reading Grade (r = -0.552, P = .098).
Conclusion: Commonly accessed online resources on cataracts and cataract surgery are insufficient to provide patients with a clear and complete understanding of their condition and of the available medical and surgical treatment options.
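The statistics reported above (a Kruskal-Wallis H test across websites, rank correlations, and Flesch Reading Ease scores) can be reproduced in outline with standard tools. The sketch below is not the authors' code; the data values, the choice of SciPy, and the use of Spearman correlation (the abstract does not name the correlation method) are assumptions for illustration only.

```python
# Minimal sketch of the kind of analysis the abstract describes.
# Hypothetical scores; SciPy's kruskal and spearmanr are assumed stand-ins.
from scipy.stats import kruskal, spearmanr

# Hypothetical per-reviewer content scores (out of 160) for each website.
scores_by_website = {
    "site_a": [95, 102, 88],
    "site_b": [40, 55, 47],
    "site_c": [120, 131, 115],
}

# Kruskal-Wallis H test: do content scores differ across websites?
h_stat, p_value = kruskal(*scores_by_website.values())
print(f"H = {h_stat:.3f}, P = {p_value:.3f}")

# Rank correlation between Google rank (1 = top result) and mean content score.
google_rank = [1, 2, 3]
mean_scores = [sum(v) / len(v) for v in scores_by_website.values()]
rho, p_corr = spearmanr(google_rank, mean_scores)
print(f"r = {rho:.3f}, P = {p_corr:.3f}")

def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Standard Flesch Reading Ease formula (higher = easier to read)."""
    return (206.835
            - 1.015 * (total_words / total_sentences)
            - 84.6 * (total_syllables / total_words))
```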
Pages: 384-391
Page count: 8