4 tips to determine if a study article is accurate

Determining article credibility can be difficult. One OD gives her four tips to determine if a study follow-up article is fact or fiction.

The views expressed here belong to the author. They do not necessarily represent the views of Optometry Times or UBM Medica.

The recent publication of the Dry Eye Assessment and Management (DREAM) study has some dry eye specialists questioning what is being reported in follow-up articles.1 In the DREAM report, the National Institutes of Health (NIH) reported that omega-3 fatty acids from fish oil supplements are no better than placebo for dry eye.2 One headline claims that omega-3 fatty acid supplements are ineffective in treating dry eye disease.3

The DREAM study found 61 percent of participants in the omega-3 group and 54 percent of those in the control group achieved at least a 10-point improvement in their symptom score, but the difference between the groups was not statistically significant.


“We were surprised that the omega-3 supplements had no beneficial effect,” says Vatinee Y. Bunya, MD, DREAM principal investigator for the clinical center at the University of Pennsylvania.

This conclusion was drawn because the difference between the treatment group and the control group was not statistically significant.
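
To see how such a determination is typically made, here is a minimal Python sketch of a generic two-proportion z-test. The group sizes below are hypothetical placeholders chosen only for illustration; they are not the actual DREAM enrollment, and this is not necessarily the exact analysis the investigators used.

from math import sqrt
from scipy.stats import norm

# Hypothetical group sizes, for illustration only; not the actual DREAM enrollment.
n_omega3, n_placebo = 350, 175
responders_omega3 = round(0.61 * n_omega3)    # 61 percent met the 10-point improvement
responders_placebo = round(0.54 * n_placebo)  # 54 percent met the 10-point improvement

p1 = responders_omega3 / n_omega3
p2 = responders_placebo / n_placebo

# Two-proportion z-test: pool the responses under the null hypothesis of no difference.
p_pool = (responders_omega3 + responders_placebo) / (n_omega3 + n_placebo)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_omega3 + 1 / n_placebo))
z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))  # two-sided P value

print(f"difference = {p1 - p2:.3f}, z = {z:.2f}, P = {p_value:.3f}")

If the printed P value comes out above 0.05, the observed difference between the groups is considered consistent with chance, which is the sense in which "not statistically significant" is used here.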

So, how can ODs determine if the information in lay media covering study results is accurate? You need to find the source, read it, and make up your own mind.

Follow these four steps to critique study coverage before you assume the headline is true.

1. Format matters
Study reports are formatted with an abstract (summary), introduction, methods, results, discussion, and references. When you look at an article, determine if the journal is peer-reviewed.

Peer-reviewed journals are considered the best venues for publication because submissions undergo a rigorous quality-control process. The submitted manuscript is read by one or more experts in the field, who typically do not know the identity of the authors. Reviewers evaluate the article for scientific quality and recommend that it be published, revised, or rejected.

The authors may then edit as requested and resubmit their article for an additional review. While peer-reviewed articles are held to a higher standard, they are not perfect.

2. Dig deep into the text
Determine what to believe by reading the article several times and questioning the authors, the study design, and the conclusions. My graduate advisors told me to look for reasons to stop reading the article.

Investigate the authors to determine if they are financially connected to the study. A financial connection may create bias. If the study is funded by a company that manufactures or sells the product being studied, that weakens the strength of the paper in question. Look for independent studies that confirm industry-sponsored conclusions.


The article introduction should discuss three things:
• What the article is about
• Why the topic is significant
• What was done

A literature review often opens the article and should include an unbiased review of the current publications on the topic being studied. The discussion of the literature should direct the reader to why the study is being done. This is often accomplished with a clear statement of the study goals.

Do these study goals make sense, and are they clinically relevant? For example, a study on the effect of daily apple intake on intraocular pressure is not clinically relevant. Guess what? Stop reading.

3. Pay attention to study design
Studies may be experimental, such as randomized controlled trials, in which the researcher tries to change something. Or they may be observational, such as case-control studies, cross-sectional studies, and reviews, in which the researcher studies what happens but does not treat the subjects.

Not all studies are created equal. Experimental trials are considered the strongest for evaluating cause and effect, with randomized controlled trials the gold standard. Controlled trials without randomization are next best.4

When considering the strength of the study design, cohort or case-control studies, preferably from more than one center or research group, are next best, followed by uncontrolled experiments or evidence from a series with or without intervention, descriptive studies, and case reports.

Human and animal research must be approved by an ethics committee. The purpose of ethical oversight is to ensure no humans or animals were harmed or taken advantage of during the study. Universities have their own institutional review boards, while private research may be approved through a third-party entity.

4. Dissecting the results
Results and discussion should include statistical methods used, discussion of results, problems encountered, problems with study design, and suggestions for future research.

Statistical significance means researchers are confident a result is unlikely to be due to random chance alone. Statistical significance is expressed in studies with the P value: the probability of seeing a difference at least as large as the one observed if the treatment truly had no effect. If you see P = 0.05 or under, that means there is a 5 percent or smaller chance of such a result arising from random chance alone.


A P value of 0.05 is the accepted cutoff for results to be considered statistically significant. P values greater than 0.05 do not support the conclusion that what is being studied is responsible for the outcome, although it might still be true.

If you see a study that says, “taking fish oil increases the risk of heart attack by 40 percent (P = 0.9),” the high P value suggests you should not consider this in your clinical decision making.
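
As a gut check on what the 0.05 cutoff actually buys you, the short simulation below (a hypothetical illustration, not anything from the study) draws both groups from the same distribution, so any apparent effect is pure chance. Roughly 5 percent of these no-effect experiments still produce P ≤ 0.05, which is exactly the false-positive rate the cutoff accepts.

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_experiments, n_per_group = 10_000, 50

false_positives = 0
for _ in range(n_experiments):
    # Both groups are drawn from the same distribution, so any "effect" is pure chance.
    group_a = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    group_b = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    if ttest_ind(group_a, group_b).pvalue <= 0.05:
        false_positives += 1

print(f"P <= 0.05 in {false_positives / n_experiments:.1%} of no-effect experiments")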

Do not assume that headlines about study results are accurate. Do your homework and make your own decisions about how or if to change your clinical care based on new research.

Special thanks to Zac Denning, project manager at ScienceBased Health, for his assistance with this article.

References:

1. The Dry Eye Assessment and Management Study Research Group. Omega-3 fatty acid supplementation for treatment of dry eye disease. N Engl J Med. 2018; 378:1681-1690.
2. National Institutes of Health. Omega-3s from fish oil supplements no better than placebo for dry eye. Available at: https://www.nih.gov/news-events/news-releases/omega-3s-fish-oil-supplements-no-better-placebo-dry-eye. Accessed 5/24/18.
3. University of Pennsylvania School of Medicine. New study finds omega-3 fatty acid supplements ineffective in treating dry eye disease. Available at: https://www.eurekalert.org/pub_releases/2018-04/uops-nsf041118.php. Accessed 5/24/18.
4. Georgia State University Library. Literature Reviews: Types of Clinical Study Designs. Available at: http://research.library.gsu.edu/c.php?g=115595&p=755213. Accessed 5/24/18.
