In 1994, George Davies (a founding father of sports physical therapy) authored an article titled “The Need for Critical Thinking in Rehabilitation,”1 in which he described the need for clinicians to apply critical thinking to clinical interventions, using as an example the integration of open and closed kinetic chain exercises in rehabilitation. Dr. Davies observed that clinical practice was guided more by empirically based clinical experience than by quality research.

This article formed the basis for my passion to integrate critical thinking into sports rehabilitation as I graduated from physical therapy school in 1994. Around that time, the concept of “evidence-based medicine”2 was becoming more popular in rehabilitation: apply the “best evidence” while considering the values of the patient and your clinical experience. This new 3-pronged concept of evidence-based practice seemed to serve as an appropriate model for critical thinking.

Over the following 25+ years of practice as a physical therapist and athletic trainer, my colleagues and I witnessed many trends come and go. It seems that every few years, a different treatment becomes popular and widely utilized, only to be replaced by something new (Figure 1).

Figure 1. Archeology of popular physical therapy interventions (circa 1990 to present).

Over those years, I noticed an interesting paradox of research in clinical practice. While we wanted “research” on which to base our clinical decisions (best evidence), we relied on what worked for us and the patient (clinical experience and patient values). Many treatments without research support gained popularity because clinicians and patients saw results (or saw them used in the Olympics); however, when a single research article was published suggesting a treatment wasn’t as effective or useful, clinicians quickly abandoned it for the “next shiny object.” This phenomenon was dubbed “Scott’s Parabola” in a 2001 British Medical Journal article charting the rise and fall of a surgical technique.3 I’ve modified the original Scott’s Parabola (Figure 2) to help explain the rise and fall of the common physical therapy treatments shown in Figure 1 above. Unfortunately, this continuous wave of ups and downs leads to inefficiency in rehabilitation, as Silbernagel et al.4 suggested in 2019: “…the hasty implementation of new tools without solid evidence potentially results in extended time and effort to de-implement ineffective management approaches.” In other words, we waste time “un-doing” the unwanted ripple effects of an ineffective treatment.

Figure 2. Modified Scott’s Parabola applied to physical therapy treatments. Modified from Scott.3

While my clinical experience grew with time, I realized that identifying the “best evidence” is a continuous process. That process was also poorly defined; we often relied on the few professional journals in our field at the time for the best evidence. But today, where do busy clinicians find the time to locate, read, analyze, and integrate the multitude of research articles published each month? Ideally, clinicians would keep up with the literature themselves, but we continue to rely on colleagues, gurus, websites, and (gulp) social media to select, interpret, and apply research for us…sometimes in 280 characters or less.

It seems that today, more than ever, rehabilitation clinicians need to be better-informed consumers of the scientific literature. While most clinicians strive to be ‘evidence-based’ practitioners, there are many barriers to incorporating evidence in practice5: lack of time, lack of access, and lack of knowledge and skills may all hinder clinicians’ efforts to apply the best-available evidence alongside patient values and clinical experience. This is compounded by the sheer volume of new research, which includes poor-quality studies lacking adequate peer review, sometimes published in so-called predatory journals. In addition, misinformation continues to spread through the profession via advertising and social media, likely due to bias, lack of understanding, or profiteering.

Unfortunately, Dr. Davies’ observations about critical thinking in rehabilitation still ring true today. Clinicians still rely on poor-quality studies and “jump on the bandwagon” of today’s “trendy treatments,” while gurus continue to “preach the word about the beneficial effects of certain treatments without any prospective research documentation other than testimonials.”1 This requires today’s clinicians to take responsibility for overcoming these barriers rather than relying on trusted journals and lecturers for the answers.

Educating clinicians on finding and appraising research for the “best evidence,” and on applying it to individual patients, remains paramount in developing critical thinking in rehabilitation. Today’s rehabilitation professionals should maximize their scientific literacy to support critical thinking. This may begin with students in professional programs, where more emphasis could be placed on critical thinking and critical appraisal of the literature, as well as the proper application of research findings in making clinical inferences. Practitioners should devote more time to critical appraisal, analyze original sources rather than relying on secondary sources (ie, “gurus”), stay current by participating in journal clubs, and even take part in clinical research studies.

Although a full discussion is beyond the scope of this editorial, critical appraisal relies on several factors. The main factors in quality assessment are the presence of bias and confounders, as well as adherence to reporting standards. Bias and confounders threaten the internal validity of a study by potentially influencing the outcome and its interpretation. Operationally, bias refers to factors that can be controlled by the researcher through study methodology (recruiting, statistics, etc), while confounders are factors inherent to the subjects (age, race, gender, etc) that may be addressed through design or analysis.

External validity refers to the generalizability of the results; it also depends on the authors reporting enough detail to allow replication of the study. The Equator Network (www.equator-network.org) provides reporting guidelines for a wide range of research designs; however, few journals routinely require adherence to these guidelines (although the IJSPT does). Quite simply, we can’t rely on journals alone as the basis for our critical thinking.

Case in point: in 2019, a meta-analysis was published in an open-access journal, “Effects of training with elastic resistance versus conventional resistance on muscular strength: A systematic review and meta-analysis.”6 I closely examined the article and found many discrepancies in the reporting; so much so that I wrote a letter to the editor, which resulted, over a year later, in a corrigendum addressing each of my concerns7; however, the original article remains available online with the errors.

I’ve developed the “8 Rs” for applying critical thinking to rehabilitation research. As you evaluate a research study, ask the following questions relative to your clinical question (Table 1):

Table 1. The 8 Rs of applying critical thinking to rehabilitation research
RESEARCH DESIGN: Does the design answer the research question, and how strong is the design (level of evidence)?
RELEVANCE: Is the “PICO” (Population, Intervention, Comparison, Outcome) relevant to your patient?
REPORTING: Did the authors use reporting guidelines from Equator-Network.org? Are the data tables and figures consistent with the narrative?
REPEATABILITY: Can the intervention be replicated in your setting?
RESULTS: Is the study conclusion supported by the results, and what is the clinical impact? Does the benefit outweigh the harm?
RELIABILITY & VALIDITY: Has bias been identified (internal and external validity), and does it influence the results or their implementation?
RELATIVITY: Compared to other treatments, is this one better, worse, or the same?
REFERENCE: What is the impact and credibility of the source?

In conclusion, this editorial is not meant to suggest that everything we do must have high levels of evidence supporting its efficacy. But we do need to apply critical thinking skills, using the best available evidence, to ensure that a treatment is safe and effective for each individual patient. Developing critical thinking and appraisal skills takes time; however, if you apply them on a regular basis, your skills will quickly become strong enough to place research on a spectrum from high to low quality. This will allow you to determine the “best evidence” available, then apply the findings of those studies (given adequate reporting) to your individual patients in combination with your clinical experience. Thus, critical thinking in rehabilitation research supports evidence-based practice…and builds another critical skill that’s much needed in our profession: quality peer review.