Empiric therapy

Empiric therapy or empirical therapy is therapy based on experience[1] and, more specifically, therapy begun on the basis of an educated clinical guess in the absence of complete or perfect information. It is therefore applied before a definitive diagnosis is confirmed, or in the absence of a complete understanding of mechanism, whether the biological mechanism of pathogenesis or the therapeutic mechanism of action. The name shares its stem with empirical evidence, reflecting the idea of practical experience. Empiric therapy is most often used when antibiotics are given to a person before the specific bacterium causing an infection is known. Fighting an infection sooner rather than later is important for minimizing morbidity, risk, and complications, so there is value in starting treatment with good information rather than waiting for better information. Examples include antibiotics given for pneumonia, urinary tract infections, and suspected bacterial meningitis in infants aged 0 to 6 months. Empiric antibiotic therapy may thus be thought of as taking the initiative against an anticipated and likely cause of infectious disease.

Empiric antibiotic therapy

Empiric antibiotics are typically broad-spectrum, meaning they treat a wide range of bacteria, including both Gram-positive and Gram-negative organisms. When more information becomes available (for example, from a blood culture), treatment may be changed to a narrow-spectrum antibiotic that more specifically targets the bacterium known to be causing the disease.

Prescribing antibiotics empirically is advantageous when a causative pathogen is likely but not yet identified and when diagnostic test results would not change the treatment. In such cases, there may be little if any perceived benefit in ordering tests that may be costly and inconclusive and that would only delay the start of the same antibiotics. The empirical use of broad-spectrum antibiotics does, by selection, increase the prevalence of bacteria resistant to several antibiotics. However, the delay and expense required to perform definitive species identification in every clinical case are not affordable, so some degree of trade-off is accepted on the principle that the benefits outweigh the risks.

Other than antibiotics

Another sense of the term empiric therapy involves quackery; empiric as a noun has been used as a synonym of quack.[2] This sense applies when the clinician's guessing departs so far from science that the standard of care is not upheld. Whereas prescribing a broad-spectrum antibiotic to fight a clinically apparent infection as early as possible is prudent and scientific despite the absence of confirmatory cultures, prescribing magic rituals or pseudoscientific schemes is not.

The fact that "acting on practical experience in the absence of theory or complete knowledge" can take both legitimate and illegitimate forms stretches back to long before science existed. In ancient Greece, when medical science as we now know it did not yet exist, all medicine was unscientific and traditional; theories of etiology, pathogenetic mechanism, and therapeutic mechanism of action were based on religious, mythologic, or cosmologic ideas. For example, humorism could dictate that bloodletting was indicated for a certain disorder because a supposed excess of water could thereby be rebalanced. Because such theories involved a great deal of fanciful notions, however, the safety and efficacy of the resulting treatments could range from negligible to negative. In the example of bloodletting to correct excess water, the fact that fluid balance is a legitimate physiologic concern did not mean that the then-state-of-the-art "understanding" of causation was well founded overall. In this environment, in which mainstream medicine was unscientific, a school of thought arose in which theory would be ignored and only practical results would be considered. This was the original introduction of empiricism into medicine, long before medical science would greatly extend it.

However, by the late 19th and early 20th centuries, as biological and medical science developed, the situation had reversed. Because the state of the art in medicine was now scientific medicine, physicians who ignored all etiologic theory in favor of their own experience alone were increasingly quacks, even though in the era of religion-based or mythology-based medicine (the era of medicine men) they might have been, viewed through today's hindsight, admirably rational and in fact protoscientific. Thus, as science became the norm, unscientific and pseudoscientific approaches qualified as quackery.

In the 21st century, the next phase of differentiation on this topic is underway. Clinical practice based on medical science is already based on empirical evidence to a large degree, but efforts are now underway to ensure that all of the science on any given medical topic is consistently applied in the clinic, with the best portions of it graded and weighted more heavily. This is the latest cycle in which personal experience (even expert opinion with a scientific basis) is not considered good enough by itself. Thus, in evidence-based medicine, the goal is that every clinician will make decisions for every patient with complete mastery and critical analysis of the entire scientific literature at their fingertips. This is a formidably vast goal to implement operationally, because it is not even possible for one person to master all extant biomedical knowledge through individual education,[3] but the development of health information technology such as expert systems and other artificial intelligence in medicine is underway in pursuit of it.[3] In the meantime, until reality begins to approximate that ideal, it is recognized that expert opinion and clinical judgment are still necessary in clinical practice,[4] because banishing them prematurely, before their eventual replacement has been fully prepared, amounts to scientism.

References

  1. Dorland's Illustrated Medical Dictionary. Elsevier.
  2. Merriam-Webster's Collegiate Dictionary. Merriam-Webster.
  3. Khosla, Vinod (2012-12-04). "Technology will replace 80% of what doctors do". Fortune. "Most doctors couldn’t possibly read and digest all of the latest 5,000 research articles on heart disease. And, most of the average doctor’s medical knowledge is from when they were in medical school, while cognitive limitations prevent them from remembering the 10,000+ diseases humans can get."
  4. Koppenheffer, Michael (January 29, 2015). "Guidelines are not gospel".