Evidence-based Practice: Do the Rules Apply to Us?

By John T. Brinkmann, MA, CPO/L, FAAOP(D)

The role that research evidence should play in day-to-day practice has been one of the most formidable issues the O&P profession has addressed over the past few decades. The increased focus on outcome measurement, pressures from reimbursement organizations, increased funding for research in response to military conflicts, and public interest in high-tech rehabilitation have all contributed to greater attention to and improvements in the level of research evidence supporting daily practice. Principles of the evidence-based medicine movement have been applied to O&P since they were first articulated in the early 1990s, with corresponding calls for more and higher-level evidence to support our clinical practices. Despite improvements in research education over that same period, many clinicians struggle with understanding how research knowledge fits into day-to-day clinical care. An emphasis on specific research strategies that require a high level of expertise and methodological rigor can leave clinicians with the impression that research is not relevant to their clinical decision making. Is it possible that the rules of evidence-based practice (EBP) do not apply to us?

By individual clinical expertise we mean the proficiency and judgment that individual clinicians acquire through clinical experience and clinical practice. Increased expertise is reflected in many ways, but especially in more effective and efficient diagnosis and in the more thoughtful identification and compassionate use of individual patients' predicaments, rights, and preferences in making clinical decisions about their care.1

Integrating All Three Elements of EBP

One of the most quoted descriptions of evidence-based medicine was written in 1996 by David Sackett: "Evidence-based medicine is the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients. The practice of evidence-based medicine means integrating individual clinical expertise with the best available external clinical evidence from systematic research."1 Sackett's definition included an emphasis on the "patients' predicaments, rights, and preferences" as an element of clinical expertise.1 (See text above.)


While the integration of formal, peer-reviewed research evidence into practice is the clear emphasis of EBP, the prominent mention of clinical experience and the perspectives of patients in more recent publications highlights an important tension within daily practice.2,3 How, exactly, can practitioners integrate these three aspects of EBP when making decisions, particularly when established practices are not evidence based? More importantly, what role does new evidence play in clinical decision making when high-quality research is often inadequate to support a specific decision or course of treatment? Barriers to implementing EBP in O&P relate to limitations in the availability of, access to, and expertise in evaluating research. These and other issues related to EBP have been addressed by O&P researchers in peer-reviewed journals, and many of these articles are freely available online.4-9


High levels of evidence exist in some areas of clinical practice in O&P. As examples, the value of orthotic management of idiopathic scoliosis and infantile positional cranial deformities has been clearly demonstrated, and the evidence provides valid guidance in specific treatment decisions, such as when orthotic management should begin and end. However, that high level of evidence is not available in many other practice areas. The way research is reviewed and reported presents a barrier to the implementation of EBP because some evidence that may be relevant to clinicians is not considered sufficiently rigorous by EBP purists to form the basis for practice.


How Research Evidence Is Reported

Systematic Reviews

One of the more respected methods for reviewing evidence is to use a formal, structured process known as a systematic review. When performing a systematic review, "evidence is searched for, evaluated, and synthesized in clearly defined steps, following a protocol that has been written before the review begins" using "a hierarchy of research designs to sort stronger evidence from weaker…."10 Unfortunately, the rigorous methodology inherent in this process, together with the restriction of the reviewed literature to only high-level evidence, often means that these reviews have limited value for guiding clinical practice. Clinical practice involves complexities that are not replicated in studies considered to provide high-level evidence (which requires, among other things, that variables be limited as much as possible). When discussing how EBP can be implemented in prosthetic practice, van Twillert et al. point out that "exclusions made by researchers to prevent bias entering the research setting in order to produce methodological sound and generalizable results, do not resolve the complexity of the clinical decision process in prosthetic rehabilitation…."11 Systematic reviews often conclude by recommending more research rather than a specific clinical decision or course of action, which is of little value to a practitioner who must make a decision regarding a specific case. O&P is not a theoretical discipline: practitioners must make decisions and implement treatments even when no options are supported by high-level evidence.


Narrative Reviews

Many topics of interest to practicing O&P clinicians lack sufficient published evidence to form the basis of a structured systematic review. Narrative (also called qualitative or nonsystematic) reviews involve a less rigorous methodology than structured reviews. Narrative reviews "may lack a focused question, rarely develop a methodology that is peer reviewed, seldom use forms for abstracting data or have independent abstraction of evidence by two or more reviewers, and may go well beyond the evidence in the literature in making recommendations."12 However, narrative reviews should not be dismissed, as they often are by proponents of EBP, since "reviews play a number of roles in scientific research and professional practice…. For some of these purposes, systematic reviews are better; for others, a narrative review is more suitable."12


It is important to consider all types of evidence when making clinical decisions, even evidence that is not based on the most rigorous methodology or, conversely, that does not provide immediate, definitive answers to clinical problems. An awareness of research results is an important part of a clinician's professional responsibility, and that knowledge can inform daily practice in less tangible ways than providing an exact blueprint for clinical decisions.10 According to Dijkers, "The real question is not, ‘What is the most rigorous research design?' but ‘At this time, what is the best research design for the research question or practical problem at issue?' ‘Rigorous' and ‘best' are not the same."10 Dijkers also comments that "there may be benefit to a review article in which an experienced clinician offers a conceptual understanding of the problem, makes suggestions for treatment based on analogies with other, better understood problems, and offers guidance for assessment and management."12 Because selecting and synthesizing evidence with a less rigorous methodology carries an increased risk of bias, narrative reviewers should make their values, preferences, and assumptions clear.


Research Science and Clinical Art

In a 2013 article in Prosthetics and Orthotics International, van Twillert et al. suggest a reconsideration of evidence-based practice in prosthetic rehabilitation. The authors set out to "discuss the complexity inherent in establishing evidence-based practice in a prosthetic rehabilitation team" by using three approaches to decision making in the case of an older individual who sustained a transfemoral amputation due to vascular problems.11 First, individual and group interviews were conducted with physiatrists, physical therapists, occupational therapists, and a prosthetist. Second, the Ottobock and Össur websites and the Department of Veterans Affairs/Department of Defense clinical practice guidelines were reviewed to gain perspective on technological advancements and component options. Third, published literature on prosthetic rehabilitation was reviewed.


During the interviews with rehabilitation professionals, "clinicians described the prescription phase…as an art, a sensitivity, or something speculatively. Unfortunately, this so-called tacit knowledge is scarcely made explicit in prosthetic rehabilitation, which does not attribute to transparency in the field of O&P…. It is indeed acknowledged that it is difficult for clinicians in general to put the reasoning behind their decisions and actions into words."11 The authors make the point that "researchers seldom reflect on clinical experience as a form of knowledge in itself" and suggest that "clinical knowledge is understudied and needs more articulation to make it transportable for sharing and thus for improvement."11 In a passage that should reassure any clinician who has felt inadequate when facing the challenges of EBP, these researchers point out that "…clinical decision making in rehabilitation practice requires craftsmanship, creativity, and pragmatic considering and assembling of all multifaceted aspects…. Explicating this craftsmanship, creativity and resourcefulness, which clinicians make use of when dealing with the complexity, is of great importance. This is still a blind spot in rehabilitation research that does not do justice to the hard work of rehabilitation clinicians. The articulation of clinical practice as having a quality and logic of its own is a first step in the quest for more situated strategies for the improvement of rehabilitation practice."11


Research Informing Practice

Understanding that EBP involves not only high grades of formal research evidence but also the expertise of clinicians and the perspectives of patients raises some interesting issues for clinicians. As van Twillert et al. put it: "If we argue that research evidence produced in experimental settings should not override, or take precedence over, clinical experience, clinical embodied skills, patients' needs, values and knowledge, then the relationship between evidence and practice cannot be that of supplying a basis for that very practice…. We suggest that all three knowledge practices, scientific knowledge from researchers, clinical knowledge from clinicians, and practical knowledge from patients are important and should inform and strengthen each other."11 They go on to recommend that the term "based" be replaced with the term "informed," suggesting that evidence-informed practice (EIP) "acknowledges both the crucial role and craftsmanship of the clinician and the researcher in providing the knowledge for the performance of optimal patient care in prosthetic rehabilitation."11 Implementing this type of practice still requires that clinicians have the skills necessary to access and understand all levels of research. Perhaps more importantly, it requires that clinicians recognize the characteristics and implications of different levels of evidence. It also requires that researchers (particularly those conducting reviews) support clinicians by considering the clinical implications of all levels of evidence.


Collaborating for Maximum Impact

Close collaboration between researchers and clinicians makes research evidence more meaningful than when each party operates at opposite ends of the evidence spectrum. There are many ways that researchers and clinicians are working together toward this shared responsibility to deal with "the complexity in O&P and rethink the relation between evidence, technology, and rehabilitation practice."11 Involving clinicians as active participants in research projects adds important perspectives and clinical relevance to research output. Clinicians and employers must recognize that this requires a commitment to learning research skills (a bigger challenge than most clinicians appreciate) as well as a significant investment of time and energy that could otherwise be devoted to more immediate clinical productivity. Combining qualitative methods with quantitative methods (referred to as a mixed-methods approach) allows patients' perspectives to influence research and may produce conclusions that more closely match the complexity of clinical practice.11 Affirming the value of narrative reviews, case reports, consensus-based clinical guidelines, and other less rigorous descriptions of current practice makes it more likely that clinicians will draw on the professional pool of evidence and even contribute these types of observations to it.


Closing Thoughts

It is not an option for clinicians to ignore or dismiss research evidence; properly understood, the rules of EIP apply to us. In 1996, David Sackett wrote that "external clinical evidence can inform, but can never replace, individual clinical expertise, and it is this expertise that decides whether the external evidence applies to the individual patient at all and, if so, how it should be integrated into a clinical decision."1 This perspective affirms the important role that clinical expertise plays in the clinical decision-making process, as well as the responsibility of clinicians to remain informed about research developments. By working toward a deeper understanding of the research process and the nature of evidence, and by contributing to research efforts, clinicians can strengthen the evidence base of our profession, even if their contributions are less methodologically rigorous. There is no perfect evidence. The goal should be to use all evidence at our disposal to make the best decision possible. "The essence of wisdom is the ability to make the right decisions on the basis of inadequate evidence."12


John T. Brinkmann, MA, CPO/L, FAAOP(D), is an assistant professor at Northwestern University Prosthetics-Orthotics Center. He has more than 20 years of experience treating a wide variety of patients.


References

1. Sackett, D. L., W. M. C. Rosenberg, J. A. Muir Gray, R. B. Haynes, and W. S. Richardson. 1996. Evidence based medicine: What it is and what it isn't. BMJ 312:71-2.

2. Sackett, D., et al. 2000. Evidence-based medicine is the integration of best research evidence with clinical expertise and patient values. In Evidence-Based Medicine: How to Practice and Teach EBM, 2nd ed. Edinburgh, Scotland: Churchill Livingstone, 1.

3. Straus, S. E., P. Glasziou, W. S. Richardson, and R. B. Haynes. 2010. Evidence-based medicine requires the integration of best research evidence with our clinical expertise and our patient's unique values and circumstances. In Evidence-Based Medicine: How to Practice and Teach It, 4th ed. New York, NY: Churchill Livingstone.

4. Geil, M. 2008. Assessing the state of clinically applicable research for evidence based practice in prosthetics and orthotics. https://scholarworks.gsu.edu/cgi/viewcontent.cgi?article=1004&context=kin_health_facpub.

5. Ramstrand, N., and T. H. Brodtkorb. 2008. Considerations for developing an evidenced-based practice in orthotics and prosthetics. Prosthetics and Orthotics International 32(1):93-102. http://journals.sagepub.com/doi/pdf/10.1080/03093640701838190.

6. Fatone, S. 2010. Challenges in lower-limb orthotic research. Prosthetics and Orthotics International 34(3):235-7. http://journals.sagepub.com/doi/pdf/10.3109/03093646.2010.515875.

7. Stevens, P. M. 2011. Barriers to the implementation of evidence-based practice in orthotics and prosthetics. JPO: Journal of Prosthetics and Orthotics 23(1):34-9. http://journals.lww.com/jpojournal/Fulltext/2011/01000/Barriers_to_the_Implementation_of_Evidence_Based.8.aspx.

8. Andrysek, J., J. Christensen, and A. Dupuis. 2011. Factors influencing evidence-based practice in prosthetics and orthotics. Prosthetics and Orthotics International 35(1):30-8. http://journals.sagepub.com/doi/pdf/10.1177/0309364610389353.

9. Ramstrand, N. 2013. Translating research into prosthetic and orthotic practice. Prosthetics and Orthotics International 37(2):108-12. http://journals.sagepub.com/doi/pdf/10.1177/0309364612451268.

10. Dijkers, M. P. J. M., for the NCDDR Task Force on Systematic Review and Guidelines. 2009. When the best is the enemy of the good: The nature of research evidence used in systematic reviews and guidelines. Austin, TX: SEDL. http://ktdrr.org/ktlibrary/articles_pubs/ncddrwork/tfsr_best.

11. van Twillert, S., J. Geertzen, T. Hemminga, K. Postema, and A. Lettinga. 2013. Reconsidering evidence-based practice in prosthetic rehabilitation: A shared enterprise. Prosthetics and Orthotics International 37(3):203-11. http://journals.sagepub.com/doi/pdf/10.1177/0309364612459541.

12. Dijkers, M. P. 2009. The value of "traditional" reviews in the era of systematic reviewing. American Journal of Physical Medicine & Rehabilitation 88(5):423-30.