Sunday, July 24, 2011

ANTHROPIATRY PART I: PSYCHE, THE MIND



What begins with this essay is a medical study of the human being. For in medicine we have pulmonologists, cardiologists, and endocrinologists, but we have no humanologists. Each of these essays will bear the Anthropiatry Symbol, above, to signal that what follows is another in this series of concepts. The goal here is to engage health care professionals in the discovery of what it means to be a human being, and to generate an emphasis on health and life in medicine to balance the current emphasis on disease and death. We begin with an exploration of what the mind is, starting with thought, and within thought, empiricism.

Anthropiatry: Psyche, or The Mind

As we begin our study of the Human Being, we examine the Mind. The Atlas of the Mind includes thought, memory and emotion. We will start with thinking, or Thought, and since we are focused on the Occidental Tradition, we will examine the thinking of the West.

The Three Kinds of Thinking of the West are Empiricism, Rationalism and Mysticism. We will begin with Empiricism.

Thinking Part I: Empiria

In the gleaming white hallways of Academic Medicine, the word "Empirical" has the character of the all-holy. The word is nowadays used as a kind of whip by the Academic Brahmans to slash ideas that are unworthy or dangerous, or not "empirical." That which is empirical, however, is the highest form of thought, and if you want to be properly anointed and get a key to the executive washroom, you had better be an Empiricist.

At the University of Virginia during my residency in medicine and psychiatry, I observed the process of widespread conversion to this form of thinking at the expense of all others. We had been an outpost of The Enlightenment since, after all, we were intellectual heirs of Thomas Jefferson; but in the end even the Rotunda could not resist the empirical tide (1). The psychoanalysts were no longer revered, and suddenly the behaviorists could no longer be bothered to remember your name. Medical reasoning began to lose its value, and in its place arose that which was "evidence based." It reminded me of the Old Norse saga "Thidrandi Whom the Goddesses Slew," in which the wise Thorhall sees in strange events that a New Faith (Christianity) is coming to Iceland. "I am laughing," he says, "because many a hill is opening, and every living creature, great and small, is packing his bags and making this his moving-day (2)." All of the adherents of the old way are shown the door.

When I was a fourth-year resident (3), I rotated onto the Oncology Service. Our Attending Physician was Charlie Hess, one of our most revered professors. Knowing that I was also training to be a psychiatrist, Dr. Hess felt compelled to make this observation: "No offense, Dr. Albanese, but my problem with Psychiatry is that there are no testable hypotheses." In other words, psychiatry was not sufficiently empirical.

Now Medical Schools have many Academics, but they don't have too many scholars (4). How could they? Their training teaches them to be scornful of the humanities, for they provide so little of substance! And yet to say that one is empirical makes direct reference to the philosophies of the ancient Greek Fathers, who were the first to distill its meaning. To me the evidence is that those who wield the empirical war hammer are not fully aware of its significance, its historical origins, or even its proper use. Now that I have thrown down the gauntlet, as it were, we will examine the origins of Empiricism and view this essential approach to knowledge in its appropriate historical context.

Empiricism, and empirical, have as their root a Greek word, empiría, meaning experience. Empirical knowledge, then, is knowledge that is acquired through experience or observation. Although Babylonian and Egyptian astronomers were great observers of natural phenomena, the title of Father of Empiricism is generally accorded to Aristotle, the most famous student of Plato. I suppose he occupies that position because of his stature as a philosopher, so monumental is his work. Moreover, while he may have perfected the classical empirical point of view, he is also the doorway between empiricism and Western Civilization.

Now as I describe empiricism, I will be guilty of oversimplification, and my philosopher friends will be justifiably scornful. But I am a physician by profession and not a philosopher, and in Medicine the goal is to simplify to the extent possible. We have Occam's Razor in Medicine, for example, a principle which states that the simplest explanation that accounts for the findings is usually the correct one. When I'm trying to describe a medical problem to a patient or to a patient's family member, complete accuracy generally comes at the expense of the patient's full understanding.

Aristotle's empiricism may have been a reaction to the Rationalism (we will take this one on later) of his teacher Plato. For Plato, I imagine, was like a Renaissance scholar, withdrawn into a place of study where his thoughts and theories could be analyzed and refined. Aristotle, by contrast, felt that to understand the world one had to experience it, to get out into it and record one's observations. It is no accident that Aristotle's work had a great deal of impact on how we study and understand the natural sciences.

To be an accomplished empiricist, an individual must develop the ability to be observant. This is a very difficult skill to acquire, and even more difficult to teach. William Osler, the patron saint (so to speak) of Internal Medicine, wrote: “The whole art of medicine is in observation…but to educate the eye to see, the ear to hear and the finger to feel takes time….(5)” The process of observation requires sensory attentiveness; in other words, the observer must consciously direct his or her senses to phenomena and maintain a heightened level of awareness. This is not just awareness of what one is observing, but also awareness of how one's observations might be influenced by other factors. This is the great difficulty of being an empiricist: knowing how your observations might be affected by internal and external elements.

Because empiricism involves physical proximity to what is being studied and because its adherents know the world through direct experience, empiricism has the aroma of that most unattainable thing in science: absolute truth. There are two great pitfalls in empiricism, however. The first one is physical, mechanical. We perceive the world through finite senses, each sense having a specific spectrum within which it functions. Vision, for example, is limited to those wavelengths between the infrared and the ultraviolet, and we see poorly in limited light. Similarly there are frequencies of sound waves too high and too low to be perceived by the human ear. No matter how one augments the senses by the use of various technologies, one must always be aware of the fact that observation cannot be absolute.
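
To make the limitation concrete, consider a minimal sketch in Python. The numeric boundaries below are the usual textbook approximations for unaided human senses (roughly 380 to 750 nanometers for vision, 20 hertz to 20 kilohertz for hearing), supplied here by way of illustration rather than taken from the essay itself:

    # A sketch of the "finite window" problem. The ranges are standard
    # textbook approximations for the unaided human senses.

    VISIBLE_NM = (380.0, 750.0)      # visible light: wavelength in nanometers
    AUDIBLE_HZ = (20.0, 20_000.0)    # audible sound: frequency in hertz

    def perceivable(value, window):
        """Return True only if the stimulus falls inside the sensory window."""
        lo, hi = window
        return lo <= value <= hi

    # Phenomena outside the window simply do not exist for the unaided observer.
    for wavelength_nm in (250.0, 550.0, 1000.0):   # ultraviolet, green, infrared
        print(f"{wavelength_nm:6.0f} nm visible? {perceivable(wavelength_nm, VISIBLE_NM)}")
    for frequency_hz in (10.0, 440.0, 40_000.0):   # infrasound, concert A, ultrasound
        print(f"{frequency_hz:6.0f} Hz audible? {perceivable(frequency_hz, AUDIBLE_HZ)}")

An ultraviolet photon or an ultrasonic cry is no less real for being invisible or inaudible; it simply falls outside the window.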

The second great limitation on observation is that of assumptions. If everyone in the world believes the world is flat, and they send a ship to the west and the ship never comes back, they conclude that the ship has fallen off the edge of the world. In reality the ship has gotten stranded on the North American shore. The problem with assumptions is that observers frequently don't know they have them, so erroneous conclusions derived from observations may have long and prosperous lives. An example of such an assumption comes from the field of Medicine, where for many years it was believed that the spleen served no important function. When individuals sustained injury to the spleen, as occasionally happens with blunt abdominal trauma, surgeons simply removed it. The surgeons observed that when they removed the spleens, the patients recovered better from the trauma, so splenectomy established itself as the standard of care. Later on, specialists in infectious diseases observed that patients without spleens were more likely to die of infections with certain kinds of organisms (encapsulated bacteria such as the pneumococcus); now surgeons make a much greater effort to preserve the spleen. The scientific term for these kinds of assumptions is bias.

One interesting aspect of empiricism is that it has two forms, one very old and one rather new. And you can use one form to prove that God exists, and the other to prove that God does not exist. What a conundrum! The two forms are linked in the French language by the word expérience, which means both experience and experiment.

So the older form of empiricism is an empiricism whereby if a person observes or experiences something, then it is real, or true. It could be called classical empiricism or radical empiricism, but I call it existential (6) empiricism. Let us say that a person is trying to determine whether or not God exists. This individual goes on a spiritual journey of sorts that includes fasting, meditation and so on. One day this person has a dramatic experience of insight, inner peace, and emotional and cognitive transformation. That person will now tell you that he or she has experienced God directly, and that to that person the existence of God is as certain as the fact that they themselves are alive. By way of another example, let us say that you are hiking deep into the forests of central Idaho, and suddenly you encounter a Gigantopithecus! It grabs you, shakes you, empties your pockets and knapsack of food items, gathers them up and lopes away. Now you know that Sasquatch exists! Experience, after all, is reality. To the scientific community, however, and to your friends for that matter, you are mad.

The second, newer (but not new) form of empiricism is what I call objectivist empiricism. In her book, Introduction to Objectivist Epistemology, Ayn Rand challenges the existential order by saying that reality is composed of absolutes, and to the extent that there are differences between people over what those absolutes are, it is because there are differences in the quality of observation. In scientific terms, the objectivist empiricist states an assumption called a hypothesis. Then this scientist designs experiments that are meant to test this hypothesis. The experiments are conducted, the results are carefully observed and recorded, and the hypothesis is either proved or disproved. The trick here is to design experiments that, under the same circumstances, will have the same outcomes no matter where they are done and no matter who does them (reproducibility). In addition to reproducibility, the scientist has to do his or her best to make certain that the results of the experiments do in fact answer the question that is being asked. For example, let us say that a scientist states the hypothesis that treating depression improves cancer outcomes. So she pulls together a group of 200 depressed cancer patients. One hundred will be treated with a new antidepressant, Acarcinol, and the rest will not have treatment for their depression (7). In the end it turns out that those who get the new antidepressant not only have less depression, they have better survival with chemotherapy than those who do not. Triumph! The researcher publishes an article in a peer-reviewed journal and she is promoted to professor!

Later on, other investigators try to reproduce her results. Using antidepressants different from the one she used, they find that treating depression, even when the depression is much improved, does not improve survival in the cancer patients. Later it is discovered that the particular antidepressant she used in her trial has specific anticancer activity, and that it was this characteristic, not the antidepressant effect, that led to improved outcomes. So the experiment she designed does not actually answer the question of whether or not treating depression improves cancer outcomes. Her experiment actually answered the question (hypothesis) of whether treating depressed cancer patients with Acarcinol improves cancer outcomes.
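
The confound is easy to see in a toy simulation. The sketch below, in Python, is purely illustrative: the drug name Acarcinol comes from the example above, but the survival rates and effect sizes are invented numbers, not data from any study.

    import random

    random.seed(42)  # fix the seed so the illustration is reproducible

    def run_trial(n_per_arm, drug_has_anticancer_effect):
        """Return (treated survival rate, untreated survival rate).

        In this toy model every antidepressant relieves depression equally
        well, but only the original drug also attacks the tumor directly.
        """
        base_survival = 0.40   # invented odds of surviving on chemotherapy alone
        bonus = 0.20 if drug_has_anticancer_effect else 0.0
        treated = sum(random.random() < base_survival + bonus
                      for _ in range(n_per_arm))
        untreated = sum(random.random() < base_survival
                        for _ in range(n_per_arm))
        return treated / n_per_arm, untreated / n_per_arm

    # Original study: the antidepressant happens to have anticancer activity.
    t, u = run_trial(100, drug_has_anticancer_effect=True)
    print(f"Acarcinol trial:   treated {t:.0%} vs. untreated {u:.0%}")  # apparent triumph

    # Replication with a different antidepressant: the survival advantage
    # vanishes, because the experiment tested Acarcinol, not "treating depression."
    t, u = run_trial(100, drug_has_anticancer_effect=False)
    print(f"Replication trial: treated {t:.0%} vs. untreated {u:.0%}")

The simulation answers only the question the code actually encodes, which is precisely the point: the design, not the intention, determines which hypothesis has been tested.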

Another pitfall for empiricists is that they have to be as neutral as they can be with respect to proving or disproving their hypotheses. We always have to entertain the possibility that an academic with a career at stake, or a pharmaceutical company with a hundred million dollars at stake, may have so much interest in a specific outcome that conscious or unconscious factors enter into the experimentation and thereby influence the results. An associate professor may develop an idea on paper and publish it; if he later demonstrates with empirical evidence that his theory was correct, he has a great deal to gain, for example a full professorship. If his experiments demonstrate that he was wrong, however, it is just more data for the scrapheap and his promotion will have to wait.

Objectivist empiricists tend to be atheists, like Ayn Rand and also like Carl Sagan. In his television series Cosmos, Sagan said "I don't believe in God because I can't see him." More broadly, if you state the hypothesis that God does exist or that he does not exist, you cannot design experiments that answer that question. For the orthodox objectivist empiricist, that is as good as demonstrating the non-existence of God, for anything that exists can be proven to exist.

It was necessary for empiricism to branch into these two (objectivist and existential) forms because humanity has two meanings: the individual human being, for whom reality is rather subjective, and the whole of humanity, for whom reality is best viewed objectively. This branching incorporates the fact that the two different visions of reality, existentialism and objectivism, although opposite to one another, are paradoxically both true.


(1) Because of the historical connection to The Enlightenment (the Age of Reason), this conversion process took place at UVa, I believe, well after it took place at most universities.
(2) Eirik the Red and Other Icelandic Sagas, translated by Gwyn Jones, Oxford University Press 1961.
(3) I trained in a combined residency in Internal Medicine and Psychiatry, which lasts for five years. Despite offering one of the most useful skill sets in medicine, these residencies struggle for survival.
(4) I define a scholar as an individual who holds all knowledge as precious and worthy of acquisition. Since those in Medicine often have a dismissive attitude toward the arts and the humanities, it's hard to number them among scholars using this definition.
(5) The Quotable Osler, American College of Physicians (2007).
(6) We will say more about existentialism later in this collection of essays.
(7) To withhold treatment for depression as part of a scientific study would be considered to be unethical nowadays, thank goodness. Not too long ago it would have been considered acceptable.

Copyright 2011 Robert Albanese
