Knowledge without Reason

Is it possible to gain real and valuable knowledge without using reason? Many would scoff at this notion. If an idea can’t be defended on rational grounds, it is either a personal preference that may not be held by others or it is false and irrational. Even if one acknowledges a role for intuition in human knowledge, how can one trust another person’s intuition if that person does not provide reasons for his or her beliefs?

In order to address this issue, let’s first define “reason.” The Encyclopedia Britannica defines reason as “the faculty or process of drawing logical inferences,” that is, the act of developing conclusions through logic. Britannica adds, “Reason is in opposition to sensation, perception, feeling, desire, as the faculty . . . by which fundamental truths are intuitively apprehended.” The New World Encyclopedia defines reason as “the ability to form and operate upon concepts in abstraction, in accordance with rationality and logic.” Wikipedia states: “Reason is the capacity of consciously making sense of things, applying logic, and adapting or justifying practices, institutions, and beliefs based on new or existing information.”

Fundamental to all these definitions is the idea that knowledge must be based on explicit concepts and statements, in the form of words, symbols, or mathematics. Since human language is often ambiguous, with different definitions for the same word (I could not even find a single, widely-accepted definition of “reason” in standard reference texts), many intellectuals have believed that mathematics, science, and symbolic logic are the primary means of acquiring the most certain knowledge.

However, there are types of knowledge not based on reason. These types of knowledge are difficult or impossible to express in explicit concepts and statements, but we know that they are types of knowledge because they lead to successful outcomes. In these cases, we don’t know exactly how a successful outcome was reached — that remains a black box. But we can judge that the knowledge is worthwhile by the actor’s success in achieving that outcome. There are at least six types of non-rational knowledge:


1. Perceptual knowledge

In a series of essays in the early twentieth century, the American philosopher William James drew a distinction between “percepts” and “concepts.” According to James, originally all human beings, like the lower life forms, gathered information from their environment in the form of perceptions and sensations (“percepts”). It was only later in human evolution that human beings created language and mathematics, which allowed them to form concepts. These concepts categorized and organized the findings from percepts, allowing communication between different humans about their perceptual experiences and facilitating the growth of reason. In James’s words, “Feeling must have been originally self-sufficing; and thought appears as a super-added function, adapting us to a wider environment than that of which brutes take account.” (William James, “Percept and Concept – The Import of Concepts”)

All living creatures have perceptual knowledge. They use their senses and brains, however primitive, to find shelter, find and consume food, evade or fight predators, and find a suitable mate. This perceptual knowledge is partly biologically ingrained and partly learned (habitual), but it is not the conceptual knowledge that reason uses. As James noted, “Conception is a secondary process, not indispensable to life.” (“Percept and Concept – The Abuse of Concepts”)

Over the centuries, concepts became predominant in human thinking, but James argued that both percepts and concepts were needed to fully know reality. What concepts offered humans in breadth, argued James, they lost in depth. It is one thing to know the categorical concepts “desire,” “fear,” “joy,” and “suffering”; it is quite another to actually experience desire, fear, joy, and suffering. Even relatively objective categories such as “water,” “stars,” “trees,” “fire,” and so forth are nearly impossible to adequately describe to someone who has not seen or felt these phenomena. Concepts had to be related to particular percepts in the real world, concluded James, or they were merely empty abstractions.

In fact, most of the other non-rational types of knowledge described below appear to be types of perceptual knowledge, insofar as they involve perceptions and sensations in making judgments. But I have broken them out into separate categories for purposes of clarity and explanation.


2. Emotional knowledge

In a previous post, I discussed the reality of emotional knowledge by pointing to the studies of the neuroscientist Antonio Damasio (see Descartes’ Error: Emotion, Reason, and the Human Brain). Damasio studied a number of human subjects who had lost the part of their brain responsible for emotions, whether due to an accident or a brain tumor. According to Damasio, these subjects experienced a marked decline in their competence and decision-making capability after losing their emotional capacity, even though their IQs remained above normal. They did not lose their intellectual ability, but their emotions. And that made all the difference. They lost their ability to make good decisions, to effectively manage their time, and to navigate relationships with other human beings. Their competence diminished and their productivity at work plummeted.

Why was this? According to Damasio, when these subjects lost their emotional capacity, they also lost their ability to value. And when they lost their ability to value, they lost their capacity to assign different values to the options they faced every day, leading either to paralysis in decision-making or to repeatedly misplaced priorities, focusing on trivial tasks rather than important ones.

Now it’s true that merely having emotions does not guarantee good decisions. We all know of people who make poor decisions because they have anger management problems, suffer from depression, or seem addicted to risk-taking. The trick is to have the right balance or disposition of emotions. Consequently, a number of scientists have attempted to formulate “EQ” tests to measure a person’s emotional intelligence.


3. Common life / culture

People like to imagine that they think for themselves, and this is indeed possible — but only to a limited extent. We are all embedded in a culture, and this culture consists of knowledge and practices that stretch back hundreds or thousands of years. The average English speaker has a vocabulary of tens of thousands of words. So how many of those words has a typical person invented? In most cases, none – every word we use is borrowed from our cultural heritage. Likewise, every concept we employ, every number we add or subtract, every tradition we follow, every moral rule we obey is transmitted to us down through the generations. If we invent a new word that becomes widely adopted, or come up with an idea that is both completely original and worthy, that is a very rare event indeed.

You may argue, “This may well be true. But you know perfectly well that cultures, or the ‘common life’ of peoples, are also filled with superstition, backwardness, and barbarism. Moreover, these cultures can and do change over time. The use of reason, by the most intelligent people in that culture, has overcome many backward and barbarous practices, and has replaced superstition with science.” To which I reply, “Yes, but very few people actually make original and valuable contributions to knowledge, and their contributions are often few and confined to specialized fields. Even these creative geniuses must take for granted most of the culture they live in. No one has the time or intelligence to create a plan for an entirely new society. The common life or culture of a society is a source of wisdom that cannot be done away with entirely.”

This is essentially the insight of the eighteenth-century philosopher David Hume. According to Hume, philosophers are tempted to critique all the common knowledge of society as being unfounded in reason and to begin afresh with pure deductive logic, as Descartes did. But this can only end in total skepticism and nihilism. Rather, argues Hume, “true philosophy” must work within the common life. As Donald W. Livingston, a former professor at Emory University, has explained:

Hume defines ‘true philosophy’ as ‘reflections on common life methodized and corrected.’ . . . The error of philosophy, as traditionally conceived—and especially modern philosophy—is to think that abstract rules or ideals gained from reflection are by themselves sufficient to guide conduct and belief. This is not to say abstract rules and ideals are not needed in critical thinking—they are—but only that they cannot stand on their own. They are abstractions or stylizations from common life; and, as abstractions, are indeterminate unless interpreted by the background prejudices of custom and tradition. Hume follows Cicero in saying that ‘custom is the great guide of life.’ But custom understood as ‘methodized and corrected’ by loyal and skillful participants. (“The First Conservative,” The American Conservative, August 10, 2011)


4. Tacit knowledge / Intuition

Is it possible to write a perfect manual on how to ride a bicycle, one that successfully instructs a child on how to get on a bicycle for the first time and ride it perfectly? What about a perfect cookbook, one that turns a beginner into a master chef upon reading it? Or what about reading all the books in the world about art — will that give someone what they need to create great works of art? The answer to all of these questions is, of course, “no.” One must have actual experience in these activities. Knowing how to do something is definitely a form of knowledge — but it is a form of knowledge that is difficult or impossible to transmit fully through a set of abstract rules and instructions. The knowledge is intuitive and habitual. Your brain and central nervous system make minor adjustments in response to feedback every time you practice an activity, until you master it as well as you can. When you ride a bike, you’re not consciously implementing a set of explicit rules inside your head; you’re carrying out an implicit set of habits learned in childhood. Obviously, talents vary, and practice can only take us so far. Some people have a natural disposition to be great athletes or artists or chefs. They can practice the same amount as other people and yet leap ahead of the rest.

The British philosopher Gilbert Ryle famously drew a distinction between two forms of knowledge: “knowing how” and “knowing that.” “Knowing how” is a form of tacit knowledge and precedes “knowing that,” i.e., knowing an explicit set of abstract propositions. Although we can’t fully express tacit knowledge in language, symbolic logic, or mathematics, we know it exists, because people can and will do better at certain activities by learning and practicing. But they are not simply absorbing abstract propositions — they are immersing themselves in a community, they are working alongside a mentor, and they are practicing with the guidance of the community and mentor. And this method of learning how also applies to learning how to reason in logic and mathematics. Ryle has pointed out that it is possible to teach a student everything there is to know about logical proofs — and that student may be able to fully understand others’ logical proofs. And yet when it comes to doing his or her own logical proofs, that student may completely fail. The student knows that but does not know how.

A recent article on the use of artificial intelligence in interpreting medical scans points out that it is virtually impossible for humans to interpret such scans successfully simply by applying a set of rules. The people who were best at diagnosing medical scans were not applying rules but engaging in pattern recognition, an activity that requires talent and experience but can’t be fully learned from a text. Many times, when expert diagnosticians are asked how they came to a certain conclusion, they have difficulty describing their method in words — they may say a certain scan simply “looks funny.” One study described in the article concluded that pattern recognition uses a part of the brain responsible for naming things:

‘[A] process similar to naming things in everyday life occurs when a physician promptly recognizes a characteristic and previously known lesion,’ the researchers concluded. Identifying a lesion was a process similar to naming the animal. When you recognize a rhinoceros, you’re not considering and eliminating alternative candidates. Nor are you mentally fusing a unicorn, an armadillo, and a small elephant. You recognize a rhinoceros in its totality—as a pattern. The same was true for radiologists. They weren’t cogitating, recollecting, differentiating; they were seeing a commonplace object.

Oddly enough, it appears to be possible to teach computers implicit knowledge of medical scans. A computing strategy known as a “neural network” attempts to mimic the human brain by processing thousands or millions of patterns that are fed into the computer. If the computer’s answer is correct, the connection responsible for that answer is strengthened; if the answer is incorrect, that connection is weakened. Over time, the computer’s ability to arrive at the correct answer increases. But there is no set of rules, simply a correlation built up over thousands and thousands of scans. The computer remains a “black box” in its decisions.
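
To make the strengthen-or-weaken idea concrete, here is a minimal sketch in Python. It uses a classic perceptron-style update rule, not the actual diagnostic systems described in the article, and the toy “scans,” features, and labels are invented purely for illustration:

```python
# A toy sketch of learning by strengthening/weakening connections.
# Each fictional "scan" is reduced to three numeric features;
# label 1 = abnormal, 0 = normal. All data here is invented.

training_data = [
    ([0.9, 0.8, 0.7], 1),
    ([0.1, 0.2, 0.1], 0),
    ([0.8, 0.9, 0.6], 1),
    ([0.2, 0.1, 0.3], 0),
]

weights = [0.0, 0.0, 0.0]   # the "connections"
bias = 0.0
learning_rate = 0.1

def predict(features):
    # Fire (answer "abnormal") if the weighted evidence exceeds zero.
    activation = bias + sum(w * x for w, x in zip(weights, features))
    return 1 if activation > 0 else 0

# Perceptron-style learning: when the answer is wrong, nudge each
# connection toward the correct answer; when it is right, leave it alone.
for epoch in range(20):
    for features, label in training_data:
        error = label - predict(features)   # 0 if correct, +1/-1 if wrong
        for i, x in enumerate(features):
            weights[i] += learning_rate * error * x
        bias += learning_rate * error

print(weights, bias)
print([predict(f) for f, _ in training_data])  # should match the labels
```

After training, nothing stored in the weights amounts to an explicit rule a radiologist could read off; the “knowledge” exists only as a correlation built up from examples, which is exactly the black-box quality described above.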


5. Creative knowledge

It is one thing to absorb knowledge — it is quite another to create new knowledge. One may attend school for 15 or 20 years and diligently apply the knowledge learned throughout his or her career, and yet never invent anything new, never achieve any significant new insight. Yet all knowledge was created by particular persons at some point in the past. How is this done?

As with emotional knowledge, creative knowledge is not necessarily an outcome of high intelligence. While creative people generally have an above-average IQ, the majority of creative people do not have a genius-level IQ (upper one percent of the population). In fact, most geniuses do not make significant creative contributions. The reason for this is that new inventions and discoveries are rarely an outcome of logical deduction but of a “free association” of ideas that often occurs when one is not mentally concentrating at all. Of note, creative people themselves cannot precisely describe how they get their ideas. The playwright Neil Simon once said, “I don’t write consciously . . . I slip into a state that is apart from reality.” According to one researcher, “[C]reative people are better at recognizing relationships, making associations and connections, and seeing things in an original way — seeing things that others cannot see.” Moreover, this “free association” of ideas actually occurs most effectively while a person is at rest mentally: drifting off to sleep, taking a bath or shower, or watching television.

Mathematics is probably the most precise and rigorous of disciplines, but mathematical discovery is so mysterious that mathematicians themselves have compared their insights to mysticism. The great French mathematician Henri Poincaré believed that the human mind worked subliminally on problems, and his work habit was to spend no more than two hours at a time working on mathematics. Poincaré believed that his subconscious would continue working on problems while he conducted other activities, and indeed, many of his great discoveries occurred precisely when he was away from his desk. John von Neumann, one of the best mathematicians of the twentieth century, also believed in the subliminal mind. He would sometimes go to sleep with a mathematical problem on his mind and wake up in the middle of the night with a solution. Reason may be used to confirm or disconfirm mathematical discoveries, but it is not the source of the discoveries.


6. The Moral Imagination

Where do moral rules come from? Are they handed down by God and communicated through the sacred texts — the Torah, the Bible, the Quran, etc.? Or can morals be deduced by using pure reason, or by observing nature and drawing objective conclusions, the same way that scientists come to objective conclusions about physics and chemistry and biology?

Centuries ago, a number of philosophers rejected religious dogma but came to the conclusion that it is a fallacy to suppose that reason is capable of creating and defending moral rules. These philosophers, known as the “sentimentalists,” insisted that human emotions were the root of all morals. David Hume argued that reason in itself had little power to motivate us to help others; rather, sympathy for others was the root of morality. Adam Smith argued that the basis of sympathy was the moral imagination:

As we have no immediate experience of what other men feel, we can form no idea of the manner in which they are affected, but by conceiving what we ourselves should feel in the like situation. Though our brother is upon the rack, as long as we ourselves are at our ease, our senses will never inform us of what he suffers. They never did, and never can, carry us beyond our own person, and it is by the imagination only that we can form any conception of what are his sensations. . . . It is the impressions of our own senses only, not those of his, which our imaginations copy. By the imagination we place ourselves in his situation, we conceive ourselves enduring all the same torments, we enter as it were into his body, and become in some measure the same person with him, and thence form some idea of his sensations, and even feel something which, though weaker in degree, is not altogether unlike them. His agonies, when they are thus brought home to ourselves, when we have thus adopted and made them our own, begin at last to affect us, and we then tremble and shudder at the thought of what he feels. (The Theory of Moral Sentiments, Section I, Chapter I)

Adam Smith recognized that it was not enough to sympathize with others; those who behaved unjustly, immorally, or criminally did not always deserve sympathy. One had to make judgments about who deserved sympathy. So human beings imagined “a judge between ourselves and those we live with,” an “impartial and well-informed spectator” by which one could make moral judgments. These two imaginations — of sympathy and of an impartial judge — are the real roots of morality for Smith.

__________________________


This brings us to our final topic: the role of non-rational forms of knowledge within reason itself.

Aristotle is regarded as the founding father of logic in the West, and his writings on the subject are still influential today. Aristotle demonstrated a variety of ways to deduce correct conclusions from certain premises. Here is one example that is not from Aristotle, but which has been used as an example of Aristotle’s logic:

All men are mortal. (premise)

Socrates is a man. (premise)

Therefore, Socrates is mortal. (conclusion)
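
This kind of deduction can even be checked mechanically. Below is a minimal sketch in the Lean theorem prover (Lean 4 syntax); the names Person, Man, Mortal, and socrates are chosen purely for illustration:

```lean
-- A toy formalization of the syllogism above (illustrative names).
variable (Person : Type)                 -- the domain of discourse
variable (Man Mortal : Person → Prop)    -- predicates "is a man", "is mortal"
variable (socrates : Person)

-- Given the two premises, the conclusion follows by applying the
-- universal premise to Socrates in particular.
example (h1 : ∀ p, Man p → Mortal p)     -- all men are mortal
        (h2 : Man socrates)              -- Socrates is a man
        : Mortal socrates :=             -- therefore, Socrates is mortal
  h1 socrates h2
```

The conclusion follows from the two premises by pure form alone, which is precisely why the truth of the premises themselves becomes the pressing question, as discussed below.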

The logic is sound, and the conclusion follows from the premises. But this simple example was not at all typical of the real-life puzzles that human beings faced. And there was an additional problem.

If one believed that all knowledge had to be demonstrated through logical deduction, that rule had to be applied to the premises of the argument as well, because if the premises were wrong, the whole argument was wrong. And every argument had to begin with at least one premise. Now one could construct another argument proving the premise(s) of the first argument — but then the premises of the new argument would also have to be demonstrated, and so forth, in an infinite regress.

To get out of this infinite regress, some argued that deduced conclusions could support premises in the same way as the premises supported a conclusion, a type of circular support. But Aristotle rejected this argument as incoherent. Instead, Aristotle offered an argument that to this day is regarded as difficult to interpret.

According to Aristotle, there is another cognitive state, known as “nous.” It is difficult to find an English equivalent of this word, and the Greeks themselves seemed to use different meanings, but the word “nous” has been translated as “insight,” “intuition,” or “intelligence.” According to Aristotle, nous makes it possible to know certain things immediately without going through a process of argument or logical deduction. Aristotle compares this power to perception, noting that we have the power to discern different colors with our eyesight even without being taught what colors are. It is an ingrained type of knowledge that does not need to be taught. In other words, nous is a type of non-rational knowledge — tacit, intuitive, and direct, not requiring concepts!

A Defense of the Ancient Greek Pagan Religion

In a previous post on the topic of mythos and logos, I discussed the evolution of ancient Greek thought from its origins in imaginative legends about gods to the development of reason, philosophy, and logic. Today, every educated human being knows about the contributions of Socrates, Plato, Euclid, and Pythagoras. But the ancient Greek religion appears to us as an embarrassment, something to be passed over in silence or laughed at. Indeed, it is difficult to read about the plethora of Greek gods and goddesses and the ludicrous stories about their various activities without wondering how Greek civilization ever managed to accomplish the great things it did while so mired in superstition.

I am not going to defend ancient Greek superstition. But I will say this: Greek religion was much more than mere superstition — it was about devotion to a greater good. According to the German scholar Werner Jaeger, “Areté was the central ideal of all Greek culture.” (Paideia: The Ideal of Greek Culture, Vol. I, p. 15). The word areté means “excellence,” and although in early Greek history it referred primarily to the virtues of the warrior-hero, by the time of Homer areté referred more broadly to all types of excellence. Areté was rooted in the mythos of ancient Greece, in the epic poetry of Hesiod and Homer, with the more philosophical logos emerging later.

This devotion of the Greeks to a greater good was powerful, even fanatical. Religion was so central to Greek life that this ancient pre-industrial civilization spent enormous sums of money on temples, statues, and religious festivals, at a time when long hours of hard physical labor were necessary simply to keep from starving. However, at the same time, Greek religion was remarkably loose and liberal in its set of beliefs — there was no single accepted doctrine, no written set of rules, not even a single sacred text similar to the Torah, Bible, or Quran. The Greeks freely created a plethora of gods and stories about the gods and revised the stories as they wished. But the Greeks did insist upon the fundamental reality of a greater good and complete devotion to it. I will argue that this devotion was responsible for the enormous contributions of ancient Greece, and that a completely secular, rational Greece would not have accomplished nearly as much.

In order to understand my defense of ancient Greek religion, I think it is important to recognize that there are different types of knowledge. There is knowledge of natural causation and knowledge of history; but there is also esthetic knowledge (knowledge of the beautiful); moral knowledge; and knowledge of the proper goals and ends of human life. Greek religion failed in understanding natural causation and history, but often succeeded in these latter forms of knowledge. Greek religion was never merely a set of statements about the origins and history of the universe and the operations of nature. Rather, Greek religion was characterized by a number of other qualities. Greek religion was experiential, symbolic, celebratory, practical, and teleological. Let’s look at each of these features more closely.

Experiential. In order to understand Greek religion — or any religion, actually — one has to do more than simply absorb a set of statements of belief. One has to experience the presence of a greater good.

[Image: statue of Athena in the Nashville Parthenon]

[Image: reconstruction of the statue of Zeus at Olympia]

The first picture above is of a 40-foot-tall statue of the Greek goddess Athena in a full-scale recreation of the ancient Greek Parthenon in Nashville, Tennessee. The second picture is a depiction of the probable appearance of the statue of Zeus at the Temple of Zeus in the sanctuary of Olympia, Greece, the site of the Olympic games.

Contrary to popular belief, Greek statues were not all white, but often painted in vivid colors, and sometimes adorned with gold, ivory, and precious stones. The size and beauty of the temple statues were meant to convey grandeur, and that is precisely the effect that they had. The statue of Zeus at Olympia was listed among the Seven Wonders of the Ancient World. A Roman general who once saw the statue of Zeus declared that he “was moved to his soul, as if he had seen the god in person.” The Greek orator and philosopher Dio Chrysostom declared that a single glimpse of the statue of Zeus would make a man forget all his earthly troubles.

Symbolic. When the Greeks created sculptures of their gods, they were not really aiming for an accurate depiction of what their gods “really” looked like. The gods were spirits or powers; the gods were responsible for creating forms, and could appear in any form they wished, but in themselves gods had no human form. Indeed, in one myth, Zeus was asked by a mortal to reveal his true form; but Zeus’s true form was a thunderbolt, so when Zeus appeared as a thunderbolt, he incinerated the unfortunate person. Rather than depict the gods “realistically,” Greek sculptors sought to depict the gods symbolically, as the most beautiful human forms imaginable, male or female. These are metaphorical or analogical depictions, using personification to represent the gods.

I am not going to argue that all Greek religion was metaphorical — clearly, most Greeks believed in the gods as real, actual personalities. But there was a strong metaphorical aspect to Greek religious thought, and it is often difficult even for scholars to tell what parts of Greek religion were metaphorical and what parts were literal. For example, we know that the Greeks actually worshiped certain virtues and desired goods, such as “Peace,” “Victory,” “Love,” “Democracy,” “Health,” “Order,” and “Wealth.” The Greeks used personal forms to represent these virtues, and created statues, temples, and altars dedicated to them, but they did not see the virtues as literal personalities. Some of this symbolic representation of virtues survives to this day: the blindfolded Lady Justice, the statue of Freedom on top of the U.S. Capitol building, and the Statue of Liberty are several personifications widely recognized in modern America. Some scholars have suggested that the main Greek gods began as personifications (i.e., “Zeus” was the personification of the sky) but that over time the gods came to be seen as full-fledged personalities. However, the lack of written records from the early periods of Greek history makes it impossible to confirm or refute this claim.

Celebratory. Religion is often seen as a strict and solemn affair, and although Greek religion had some of these aspects, there was a strong celebratory aspect to Greek religion. The Greeks not only wanted to thank the gods for life and food and drink and love, they wanted to demonstrate their thanks and celebrate through feasts, festivals, and holidays. Indeed, it is probably the case that the only time most Greeks ate meat was after a ritual sacrifice of cattle or other livestock at the altar of a god. (Greek Religion, ed. Daniel Ogden, p. 402) In ancient Athens, about half of the days in the calendar were devoted to religious festivals and each god or goddess often had more than one festival.  The most famous religious festival was the festival devoted to Zeus, held every four years at the sanctuary of Olympia. The Greeks visited the temple of Zeus and prayed to their god — but also held games, celebrated the victors, and enjoyed feasts. The Greeks also held festivals devoted to the god Dionysus, the god of wine and ecstasy. Drink, music, theater, and dancing played a central role in Dionysian festivals.

Practical. When I was doing research on Greek religion, I came across a fascinating discussion on how the Greeks performed animal sacrifice. Allegedly, when the animals were slaughtered, the Greeks were obligated to share a portion of the animal with the gods by burning it on the altar. However, when the Greeks butchered the animal, they reserved all the meat for themselves and sacrificed only the bones, covered with a deceptive layer of fat, for the gods. It’s hard not to be somewhat amused by this. Why would the powerful, all-knowing gods be satisfied with the useless, inedible portions of an animal, while the Greeks kept the best parts for themselves? The Greeks even had a myth to justify this practice: allegedly Prometheus fooled Zeus into accepting the bones and fat, and from that original act, all future sacrifices were similarly justified. As devoted to the gods as the Greeks were, they were also practical; in a primitive society, meat was a rare and expensive commodity for most. Sacrifice was a symbolic act of devotion to the gods, but the Greeks were not prepared to go hungry by sacrificing half of their precious meat.

And what of prayer to the gods? Clearly, the Greeks prayed to the gods and asked favors of them. But prayer never stopped or even slowed Greek achievements in art, architecture, athletics, philosophy, and mathematics. No Greek ever entered the Olympic games fat and out of shape, hoping that copious prayers and sacrifices to Zeus would help him win the games. No Greek ever believed that one did not have to train hard for war, that prayers to their deity would suffice to save their city from destruction at the hands of an enemy. Nor did the Greeks expect that incompetent agriculture or engineering would be saved by prayer. The Greeks sought inspiration, strength, and assistance from the gods, but they did not believe that prayer would substitute for their personal shortcomings and neglect.

Teleological (goal-oriented). In a previous essay, I discussed the role of teleology — explanation in terms of goals or purpose — in accounting for causation. Although modern science has largely dismissed teleological causation in favor of efficient causation, I argued that teleological, or goal-oriented, causation could have a significant role in understanding (1) the long-term development of the universe and (2) the behavior of life forms. In a teleological perspective, human beings are not merely the end result of chemical or atomic mechanisms — humans are able to partially transcend the parts they are made of and work toward certain goals or ends that they choose.

We misunderstand Greek religion when we think of it as being merely a collection of primitive beliefs about natural causation that has been superseded by science. The gods were not merely causal agents of thunderstorms, earthquakes, and plagues. They were representations of areté, idealized forms of human perfection that inspired and guided the Greeks. In the pantheon of major Greek gods, only one (Poseidon) is associated solely with natural causation, being responsible for the seas and for earthquakes. Eight of the gods were associated primarily with human qualities, activities, and institutions — love, beauty, music, healing, war, hunting, wisdom, marriage, childbirth, travel, language, and the home. Three gods were associated with both natural causation and human qualities, Zeus being responsible for thunder and lightning, as well as law and justice. The Greeks also honored and worshiped mortal heroes, extraordinary persons who founded a city, overthrew a tyrant, or won a war. Inventors, poets, and athletes were worshiped as well, not because they had the powers of the gods, but because they were worthy of emulation and were sources of inspiration. (“Heroes and Hero Cults,” Greek Religion, ed. Daniel Ogden, pp. 100-14)

At this point, you may well ask, can’t we devote ourselves to the goal of excellence by using reason? There is no need to read about myths and appeal to invisible superbeings that do not exist in order to pursue excellence. This argument is partly true, but it must be pointed out that reason in itself is an insufficient guide to what goods we should be devoted to. Esthetics, imagination, and faith provide us with goals that reason by itself can’t provide. Reason is a superb tool for thinking, but it is not an all-purpose tool.

You can see the limitations of pure reason in modern, secular societies. People don’t really spend much time thinking about the greater goods they should pursue, so they fall into the trap of materialism. Religion is considered a private affair, so it is not taught in public schools, and philosophy is considered a waste of time. So people tend to borrow their life goals from their surrounding culture and peer groups; from advertisers on television and the Internet; and from movie stars and famous musicians. People end up worshiping money, technology, and celebrities; they know those things are “real” because they are material, tangible, and because their culture tells them these things are important. But this worship without religion is only a different form of irrationality and superstition. As “real” as material goods are, they only provide temporary satisfaction, and there is never an amount of money or a house big enough or a car fancy enough or a celebrity admirable enough to bring us lasting happiness.

What the early Greeks understood is that reason in itself is a weak tool for directing the passions — only passions, rightly ordered, can rule other passions. The Greeks also knew that excellence and beauty were real, even if the symbolic forms used to represent these realities were imprecise and imperfect. Finally, the Greeks understood that faith had causal potency — not in the sense that prayers could prevent an earthquake or a plague, but in the sense that attaining the heights of human achievement was possible only by total and unwavering commitment to a greater good, reinforced by ritual and habit. For the Greeks, reality was a work in progress: it didn’t consist merely of static “things” but of human possibilities and potential, the ability to be more than ourselves, to be greater than ourselves. However we want to symbolize it, devotion to a greater good is the first step to realizing that good. When we skip the first step, devotion, we shouldn’t be surprised when we fail to attain the good.

What Are “Mythos” and “Logos”?

The terms “mythos” and “logos” are used to describe the transition in ancient Greek thought from the stories of gods, goddesses, and heroes (mythos) to the gradual development of rational philosophy and logic (logos). The former is represented by the earliest Greek thinkers, such as Hesiod and Homer; the latter is represented by later thinkers called the “pre-Socratic philosophers” and then Socrates, Plato, and Aristotle. (See the book: From Myth to Reason? Studies in the Development of Greek Thought).

In the earliest, “mythos” stage of development, the Greeks saw events of the world as being caused by a multitude of clashing personalities — the “gods.” There were gods for natural phenomena such as the sun, the sea, and thunder and lightning, and gods for human activities such as winemaking, war, and love. The primary mode of explanation of reality consisted of highly imaginative stories about these personalities. However, as time went on, Greek thinkers became critical of the old myths and proposed alternative explanations of natural phenomena based on observation and logical deduction. Under “logos,” the highly personalized worldview of the Greeks was transformed into one in which natural phenomena were explained not by invisible superhuman persons, but by impersonal natural causes.

However, many scholars argue that there was not such a sharp distinction between mythos and logos historically, that logos grew out of mythos, and elements of mythos remain with us today.

For example, ancient myths provided the first basic concepts used subsequently to develop theories of the origins of the universe. We take for granted the words that we use every day, but the vast majority of human beings never invent a single word or original concept in their lives — they learn these things from their culture, which is the end-product of thousands of years of speaking and writing by millions of people long dead. The very first concepts of “cosmos,” “beginning,” “nothingness,” and differentiation from a single substance — these were not present in human culture for all time, but originated in ancient myths. Subsequent philosophers borrowed these concepts from the myths, while discarding the overly personalistic interpretations of the origins of the universe. In that sense, mythos provided the scaffolding for the growth of philosophy and modern science. (See Walter Burkert, “The Logic of Cosmogony” in From Myth to Reason? Studies in the Development of Greek Thought.)

An additional issue is that not all myths are wholly false. Many myths are stories that communicate truths even if the characters and events in the story are fictional. Socrates and Plato denounced many of the early myths of the Greeks, but they also illustrated philosophical points with stories that were meant to serve as analogies or metaphors. Plato’s allegory of the cave, for example, is meant to illustrate the ability of the educated human to perceive the true reality behind surface impressions. Could Plato have made the same philosophical point in literal language, without using any stories or analogies? Possibly, but the impact would be weaker, and the point might not be effectively communicated at all.

Some of the truths that myths communicate are about human values, and these values can be true even if the stories in which the values are embedded are false. Ancient Greek religion contained many preposterous stories, and the notion of personal divine beings directing natural phenomena and intervening in human affairs was false. But when the Greeks built temples and offered sacrifices, they were not just worshiping personalities — they were worshiping the values that the gods represented. Apollo was the god of light, knowledge, and healing; Hera was the goddess of marriage and family; Aphrodite was the goddess of love; Athena was the goddess of wisdom; and Zeus, the king of the gods, upheld order and justice. There’s no evidence at all that these personalities existed or that sacrifices to these personalities would advance the values they represented. But a basic respect for and worshipful disposition toward the values the gods represented was part of the foundation of ancient Greek civilization. I don’t think it was a coincidence that the city of Athens, whose patron goddess was Athena, went on to produce some of the greatest philosophers the world has seen — love of wisdom is the prerequisite for knowledge, and that love of wisdom grew out of the culture of Athens. (The ancient Greek word philosophia literally means “love of wisdom.”)

It is also worth pointing out that worship of the gods, for all of its superstitious aspects, was not incompatible even with the growth of scientific knowledge. Modern western medicine originated in the healing temples devoted to Asclepius, the son of Apollo and the god of medicine. Both of the great ancient physicians Hippocrates and Galen are reported to have begun their careers as physicians in the temples of Asclepius, the first hospitals. Hippocrates is widely regarded as the father of western medicine, and Galen is considered the most accomplished medical researcher of the ancient world. As love of wisdom was the prerequisite for philosophy, reverence for healing was the prerequisite for the development of medicine.

Karen Armstrong has written that ancient myths were never meant to be taken literally, but were “metaphorical attempts to describe a reality that was too complex and elusive to express in any other way.” (A History of God) I am not sure that’s completely accurate. I think it most likely that the mass of humanity believed in the literal truth of the myths, while educated human beings understood the gods to be metaphorical representations of the good that existed in nature and humanity. Some would argue that this use of metaphors to describe reality is deceptive and unnecessary. But a literal understanding of reality is not always possible, and metaphors are widely used even by scientists.

Theodore L. Brown, a professor emeritus of chemistry at the University of Illinois at Urbana-Champaign, has provided numerous examples of scientific metaphors in his book, Making Truth: Metaphor in Science. According to Brown, the history of the human understanding of the atom, which cannot be directly seen, began with a simple metaphor of atoms as billiard balls; later, scientists compared atoms to plum pudding; then they compared the atom to our solar system, with electrons “orbiting” around a nucleus. There has been a gradual improvement in our models of the atom over time, but ultimately, there is no single, correct literal representation of the atom. Each model illustrates an aspect or aspects of atomic behavior — no one model can capture all aspects accurately. Even the notion of atoms as particles is not fully accurate, because atoms can behave like waves, without a precise position in space as we normally think of particles as having. The same principle applies to models of the molecule as well. (Brown, chapters 4-6) A number of scientists have compared the imaginative construction of scientific models to map-making — there is no single, fully accurate way to map the earth (using a flat surface to depict a sphere), so we are forced to use a variety of maps at different scales and projections, depending on our needs.

Sometimes the visual models that scientists create are quite unrealistic. The model of the “energy landscape” was created by biologists in order to understand the process of protein folding — the basic idea was to imagine a ball rolling on a surface pitted with holes and valleys of varying depth. As the ball would tend to seek out the low points on the landscape (due to gravity), proteins would tend to seek the lowest possible free energy state. All biologists know the energy landscape model is a metaphor — in reality, proteins don’t actually go rolling down hills! But the model is useful for understanding a process that is highly complex and cannot be directly seen.
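
For readers who want to see the metaphor in motion, here is a minimal sketch of the ball-rolling picture in Python; the one-dimensional “landscape” below is an invented toy function, not a real free-energy surface:

```python
# A toy one-dimensional "energy landscape" with valleys of varying depth.
# The energy function is invented for illustration only.

def energy(x):
    # A bumpy quartic curve with two valleys of different depths.
    return x**4 - 3 * x**2 + 0.5 * x

def energy_slope(x, h=1e-6):
    # Numerical derivative: tells us which direction is downhill.
    return (energy(x + h) - energy(x - h)) / (2 * h)

# "Roll the ball": repeatedly take a small step in the downhill direction,
# the way the metaphor pictures a protein descending toward low free energy.
x = 2.0            # starting position on the landscape
step_size = 0.01
for _ in range(1000):
    x -= step_size * energy_slope(x)

print(f"settled at x = {x:.3f}, energy = {energy(x):.3f}")
```

Because the ball settles into whichever valley lies downhill from its starting point, the sketch also captures a real feature of the model: like the ball, a protein can become trapped in a local minimum rather than reaching the lowest point on the landscape.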

What is particularly interesting is that some of the metaphorical models of science are frankly anthropomorphic — they are based on qualities or phenomena found in persons or personal institutions. Scientists envision cells as “factories” that accept inputs and produce goods. The genetic structure of DNA is described as having a “code” or “language.” The term “chaperone proteins” was invented to describe proteins that have the job of assisting other proteins to fold correctly; proteins that don’t fold correctly are either treated or dismantled so that they do not cause damage to the larger organism — a process that has been given a medical metaphor: “protein triage.” (Brown, chapters 7-8) Even referring to the “laws of physics” is to use a metaphorical comparison to human law. So even as logos has triumphed over the mythos conception that divine personalities rule natural phenomena, qualities associated with personal beings have continued to sneak into modern scientific models.

The transition from a mythos-dominated worldview to a logos-dominated worldview was a stupendous achievement of the ancient Greeks, and modern philosophy, science, and civilization would not be possible without it. But the transition did not involve a complete replacement of one worldview with another; rather, it involved the building of additional useful structures on top of a simple foundation. Logos grew out of its origins in mythos, and retains elements of mythos to this day. The compatibilities and conflicts between these two modes of thought are the thematic basis of this website.

Related: A Defense of the Ancient Greek Pagan Religion