The Value of Myth in Depicting the Conflict Between Good and Evil

In October 2013, three young friends living in the Washington DC area — a male-female couple and a male friend — went out to a number of local bars to celebrate a birthday. The friends drank copiously, then returned to a small studio apartment at 2 a.m. An hour later, one of the men stabbed his friend to death. When police arrived, they found the surviving man covered in blood, as were the floor and walls. “I caught my buddy and my girl cheating,” said the man. “I killed my buddy.” He was subsequently found guilty of murder and sentenced to life in prison.

How did this happen? Was murder inevitable? It seems unlikely. The killing was not pre-planned. No one in the group had a prior record of violence or criminal activity. All three friends were well-educated and successful, with bright futures ahead of them. It’s true that all were extremely drunk, but drunkenness very rarely leads to murder.

This case is noteworthy, not because murders are unusual — murders happen all the time — but because this particular murder seems to have been completely unpredictable. It’s the normality of the persons and circumstances that disturbs the conscience. Under slightly different circumstances, the murder would not have happened at all, and all three would conceivably have lived long, happy lives.

Most of us are law-abiding citizens. We believe we are good, and we despise thieves, rapists, and murderers. But what happens when the normal conditions under which we live change, when we are humiliated or outraged, when there is no security for our lives or property, when our opportunities for happiness are snatched from us for no good reason? How far will we go to avenge ourselves, and what violence will we justify in order to restore our own sense of justice?

The conflict between good and evil tendencies within human beings is a frequent theme in both philosophy and religion. However, philosophers have tended to attribute the evil within humanity to a deficiency of reason. In the view of many philosophers, reason alone should be able to establish that human rights are universal, and that impulses to violence, conquest, and enslavement are irrational. Furthermore, they argue that when reason establishes its dominance over the passions within human beings, societies become freer and more peaceful. (Notably, the great philosophers David Hume and Adam Smith rejected this argument.)

The religious interpretation of the conflict between good and evil, on the other hand, is based more upon myth and faith. And while the myths of religion are not literally accurate in terms of history or science, these myths often have insights into the inner turmoil of human beings that are lost in straightforward descriptions of fact and an emphasis on rationality.

The Christian scholar Paul Elmer More argued in his book, The Religion of Plato, that the dualism between good and evil within the human soul was very effectively described by the Greek philosopher Plato, but that this description relied heavily on the picturesque elements of myth, as found in the Republic, Laws, Timaeus, and other works. In Plato’s view, there was a struggle within all human beings between a higher nature and a lower nature, the higher nature being drawn to a vision of ideal forms and the lower nature being dominated by the flux of human passions and desires. According to More,

It is not that pleasure or pain, or the desires and emotions connected with them, are totally depraved in themselves . . . but they contain the principle of evil in so far as they are radically unlimited, belonging by nature to what in itself is without measure and tends by inertia to endless expansion. Hence, left to themselves, they run to evil, whereas under control they may become good, and the art of life lies in the governing of pleasure and pain by a law exterior to them, in a man’s becoming master of himself, or better than himself. (pp. 225-6)

What are some of the myths Plato discusses? In The Republic, Plato tells the story of Gyges, a lowly shepherd who discovers a magic ring that bestows the power of invisibility. With this invisibility, Gyges is able to go wherever he wants undetected, and to do what he wants without anyone stopping him. Eventually, Gyges kills the king of his country and obtains absolute power for himself. In discussing this story, Glaucon, a student of Socrates, argues that with the awesome power of invisibility, no man would be able to remain just, in light of the benefits one could obtain. However, Socrates responds that being a slave to one’s desires actually does not bring long-term happiness, and that the happy man is one who is able to control his desires.

In the Phaedrus, Plato relates the dialogue between Socrates and his pupil Phaedrus on whether friendship is preferable to love. Socrates discusses a number of myths throughout the dialogue, but appears to use these myths as metaphorical illustrations of the internal struggle within human beings between their higher and lower natures. It is the nature of human beings, Socrates notes, to pursue the good and the beautiful, and this pursuit can be noble or ignoble depending on whether reason drives one toward enlightenment or desire drives one to excessive pleasure-seeking. Indeed, Socrates describes love as a type of “madness” — but he argues that this madness is a source of inspiration that can result in either good or evil depending on how one directs the passions. Socrates proceeds to employ a figurative picture of a charioteer driving two horses, one noble and the other ignoble. The noble horse pulls the charioteer toward heaven, while the ignoble horse pulls the charioteer downward, toward the earth and potential disaster. Moreover, the human being in love is influenced by the god he or she follows: the followers of Ares, the god of war, are inclined to violence if they feel wronged by their lover; the followers of Zeus, on the other hand, use love to seek philosophical wisdom.

The nature and purpose of love is also discussed in the Symposium. In this dialogue, the comic playwright Aristophanes relates a fantastical myth in which human beings were originally created with two bodies attached at the back, a single head with two faces, four arms, and four legs. These beings apparently threatened the gods, so Zeus cut them in two; henceforth, humans spent their lives trying to find their other halves. Love inspires wisdom and courage, according to the dialogue, but only when it encourages companionship and the exchange of knowledge, and is not merely the pursuit of sexual gratification.

In the Timaeus, Plato discusses the creation of the universe and the role of human beings in this universe. Everything proceeds from the Good, argued Plato. However, the Good is not some lifeless abstraction, but a power with a dynamic element. According to More, Plato gave the name of God to this dynamic element. God fashions the universe according to an ideal pattern, but the end result is always less than perfect because of the resistance of the materials and the tendency of material things to always fall short of their perfect ends.

Plato argues that there are powers of good and powers of evil in the universe — and within human beings — and he personifies these powers as gods or daemons. There is a struggle between good and evil in which all humans participate, and all are subject to judgment at the end of their lives. (Plato believed in reincarnation and posited that deeds in one’s most recent life determined one’s station in the next life.) Here, we see myth and faith enter again into Plato’s philosophy, and More defends the use of these stories and symbols as a means of illustrating the dramas of moral conflict:

In this last stage the essential truth of philosophy as a concern of the individual soul, is rendered vivid and convincing by clothing it in the imaginative garb of fiction — fiction which may yet be a veil, more or less transparent, through which we behold the actual events of the spirit world; and this aid of the imagination is needed just because the dualism of the human consciousness cannot be grasped by the reason, demands indeed a certain abatement of that rationalizing tendency of the mind which, if left to itself, inevitably seeks its satisfaction in one or the other form of monism. (p. 199)

What’s fascinating about Plato’s use of myths in philosophy is that while he recognizes that many of the myths are literally dubious or false, they seem to point to truths that are difficult or impossible to express in literal language. Love really does seem to be a desire to unite with one’s missing half, and falling in love really is akin to madness, a madness that can lead to disaster if one is not careful. Humankind does seem to be afflicted by an internal struggle between a higher, noble nature and a lower nature, with the lower nature inclined to self-centeredness and grasping for ever more wealth, power, and pleasure.

Plato had enormous influence on Western civilization, but More argues that the successors to Plato erred by abandoning Plato’s use of myth to illustrate the duality of human nature. Over the years, Greek philosophy became increasingly rationalistic and prone to a monism that was unable to cope with the reality of human dualism. (For an example of this extreme monism, see the works of Plotinus, who argued for an abstract “One” as the ultimate source of all things.) Hence, argued More, Christianity was in fact the true heir of Platonism, and not the Greek philosophers that came after Plato.

Myth is “the drama of religion,” according to More, not a literally accurate description of a sequence of events. Reason and philosophy can analyze and discuss good and evil, but to fully understand the conflict between good and evil, within and between human beings, requires a dramatic depiction of our swirling, churning passions. In More’s words, “A myth is false and reprehensible in so far as it misses or distorts the primary truth of philosophy and the secondary truth of theology; it becomes more probable and more and more indispensable to the full religious life as it lends insistence and reality to those truths and answers to the daily needs of the soul.” (p. 165) The role of Christian myths in illustrating the dramatic conflict between good and evil will be discussed in the next essay.

Knowledge without Reason

Is it possible to gain real and valuable knowledge without using reason? Many would scoff at this notion. If an idea can’t be defended on rational grounds, it is either a personal preference that may not be held by others or it is false and irrational. Even if one acknowledges a role for intuition in human knowledge, how can one trust another person’s intuition if that person does not provide reasons for his or her beliefs?

In order to address this issue, let’s first define “reason.” The Encyclopedia Britannica defines reason as “the faculty or process of drawing logical inferences,” that is, the act of developing conclusions through logic. Britannica adds, “Reason is in opposition to sensation, perception, feeling, desire, as the faculty . . . by which fundamental truths are intuitively apprehended.” The New World Encyclopedia defines reason as “the ability to form and operate upon concepts in abstraction, in accordance with rationality and logic.” Wikipedia states: “Reason is the capacity of consciously making sense of things, applying logic, and adapting or justifying practices, institutions, and beliefs based on new or existing information.”

Fundamental to all these definitions is the idea that knowledge must be based on explicit concepts and statements, in the form of words, symbols, or mathematics. Since human language is often ambiguous, with different definitions for the same word (I could not even find a single, widely accepted definition of “reason” in standard reference texts), many intellectuals have believed that mathematics, science, and symbolic logic are the primary means of acquiring the most certain knowledge.

However, there are types of knowledge not based on reason. These types of knowledge are difficult or impossible to express in explicit concepts and statements, but we know that they are types of knowledge because they lead to successful outcomes. In these cases, we don’t know how exactly a successful outcome was reached — that remains a black box. But we can judge that the knowledge is worthwhile by the actor’s success in achieving that outcome. There are at least six types of non-rational knowledge:


1. Perceptual knowledge

In a series of essays in the early twentieth century, the American philosopher William James drew a distinction between “percepts” and “concepts.” According to James, originally all human beings, like the lower life forms, gathered information from their environment in the form of perceptions and sensations (“percepts”). It was only later in human evolution that human beings created language and mathematics, which allowed them to form concepts. These concepts categorized and organized the findings from percepts, allowing communication between different humans about their perceptual experiences and facilitating the growth of reason. In James’s words, “Feeling must have been originally self-sufficing; and thought appears as a super-added function, adapting us to a wider environment than that of which brutes take account.” (William James, “Percept and Concept – The Import of Concepts”).

All living creatures have perceptual knowledge. They use their senses and brains, however primitive, to find shelter, find and consume food, evade or fight predators, and find a suitable mate. This perceptual knowledge is partly biologically ingrained and partly learned (habitual), but it is not the conceptual knowledge that reason uses. As James noted, “Conception is a secondary process, not indispensable to life.” (“Percept and Concept – The Abuse of Concepts”)

Over the centuries, concepts became predominant in human thinking, but James argued that both percepts and concepts were needed to fully know reality. What concepts offered humans in breadth, argued James, they lost in depth. It is one thing to know the categorical concepts “desire,” “fear,” “joy,” and “suffering”; it is quite another to actually experience desire, fear, joy, and suffering. Even relatively objective categories such as “water,” “stars,” “trees,” “fire,” and so forth are nearly impossible to adequately describe to someone who has not seen or felt these phenomena. Concepts had to be related to particular percepts in the real world, concluded James, or they were merely empty abstractions.

In fact, most of the other non-rational types of knowledge described below appear to be types of perceptual knowledge, insofar as they involve perceptions and sensations in making judgments. But I have broken them out into separate categories for purposes of clarity and explanation.


2. Emotional knowledge

In a previous post, I discussed the reality of emotional knowledge by pointing to the studies of the neuroscientist Antonio Damasio (see Descartes’ Error: Emotion, Reason, and the Human Brain). Damasio studied a number of human subjects who had lost the part of their brain responsible for emotions, whether due to an accident or a brain tumor. According to Damasio, these subjects experienced a marked decline in their competence and decision-making capability after losing their emotional capacity, even though their IQs remained above normal. They did not lose their intellectual ability; they lost their emotions. And that made all the difference. They lost their ability to make good decisions, to manage their time effectively, and to navigate relationships with other human beings. Their competence diminished and their productivity at work plummeted.

Why was this? According to Damasio, when these subjects lost their emotional capacity, they also lost their ability to value. And when they lost their ability to value, they lost the capacity to assign different values to the options they faced every day, leading either to paralysis in decision-making or to repeatedly misplaced priorities, a focus on trivial tasks rather than important ones.

Now it’s true that merely having emotions does not guarantee good decisions. We all know of people who make poor decisions because they have anger management problems, suffer from depression, or seem addicted to risk-taking. The trick is to have the right balance or disposition of emotions. Consequently, a number of scientists have attempted to formulate “EQ” tests to measure a person’s emotional intelligence.


3. Common life / culture

People like to imagine that they think for themselves, and this is indeed possible — but only to a limited extent. We are all embedded in a culture, and this culture consists of knowledge and practices that stretch back hundreds or thousands of years. The average English speaker has a vocabulary of tens of thousands of words. How many of those words has a typical person invented? In most cases, none: every word we use is borrowed from our cultural heritage. Likewise, every concept we employ, every number we add or subtract, every tradition we follow, every moral rule we obey is transmitted to us down through the generations. Inventing a new word that becomes widely adopted, or coming up with an idea that is both completely original and worthwhile, is a very rare event indeed.

You may argue, “This may well be true. But you know perfectly well that cultures, or the ‘common life’ of peoples, are also filled with superstition, backwardness, and barbarism. Moreover, these cultures can and do change over time. The use of reason, by the most intelligent people in a culture, has overcome many backward and barbarous practices, and has replaced superstition with science.” To which I reply, “Yes, but very few people actually make original and valuable contributions to knowledge, and those contributions are often few in number and confined to specialized fields. Even creative geniuses must take for granted most of the culture they live in. No one has the time or intelligence to create a plan for an entirely new society. The common life or culture of a society is a source of wisdom that cannot be done away with entirely.”

This is essentially the insight of the eighteenth-century philosopher David Hume. According to Hume, philosophers are tempted to critique all the common knowledge of society as being unfounded in reason and to begin afresh with pure deductive logic, as did Descartes. But this can only end in total skepticism and nihilism. Rather, argues Hume, “true philosophy” must work within the common life. As Donald W. Livingstone, a former professor at Emory University, has explained:

Hume defines ‘true philosophy’ as ‘reflections on common life methodized and corrected.’ . . . The error of philosophy, as traditionally conceived—and especially modern philosophy—is to think that abstract rules or ideals gained from reflection are by themselves sufficient to guide conduct and belief. This is not to say abstract rules and ideals are not needed in critical thinking—they are—but only that they cannot stand on their own. They are abstractions or stylizations from common life; and, as abstractions, are indeterminate unless interpreted by the background prejudices of custom and tradition. Hume follows Cicero in saying that ‘custom is the great guide of life.’ But custom understood as ‘methodized and corrected’ by loyal and skillful participants. (“The First Conservative,” The American Conservative, August 10, 2011)


4. Tacit knowledge / Intuition

Is it possible to write a perfect manual on how to ride a bicycle, one that successfully instructs a child on how to get on a bicycle for the first time and ride it perfectly? What about a perfect cookbook, one that turns a beginner into a master chef upon reading it? Or what about reading all the books in the world about art — will that give someone what they need to create great works of art? The answer to all of these questions is, of course, “no.” One must have actual experience in these activities. Knowing how to do something is definitely a form of knowledge — but it is a form of knowledge that is difficult or impossible to transmit fully through a set of abstract rules and instructions. The knowledge is intuitive and habitual. Your brain and central nervous system make minor adjustments in response to feedback every time you practice an activity, until you master it as well as you can. When you ride a bike, you’re not consciously implementing a set of explicit rules inside your head; you’re carrying out an implicit set of habits learned in childhood. Obviously, talents vary, and practice can only take us so far. Some people have a natural disposition to be great athletes or artists or chefs; they can practice the same amount as other people and yet leap ahead of the rest.

The British philosopher Gilbert Ryle famously drew a distinction between two forms of knowledge: “knowing how” and “knowing that.” “Knowing how” is a form of tacit knowledge and precedes “knowing that,” i.e., knowing an explicit set of abstract propositions. Although we can’t fully express tacit knowledge in language, symbolic logic, or mathematics, we know it exists, because people can and do get better at certain activities through learning and practice. But they are not simply absorbing abstract propositions — they are immersing themselves in a community, working alongside a mentor, and practicing under the guidance of that community and mentor. This method of learning how also applies to learning how to reason in logic and mathematics. Ryle pointed out that it is possible to teach a student everything there is to know about logical proofs, such that the student can fully understand the proofs of others — and yet, when it comes to constructing his or her own proofs, that student may completely fail. The student knows that but does not know how.

A recent article on the use of artificial intelligence in interpreting medical scans points out that it is virtually impossible for humans to interpret such scans successfully simply by applying a set of rules. The people who were best at diagnosing scans were not applying rules but engaging in pattern recognition, an activity that requires talent and experience and can’t be fully learned from a text. Often, when expert diagnosticians are asked how they came to a certain conclusion, they have difficulty describing their method in words — they may say a certain scan simply “looks funny.” One study described in the article concluded that pattern recognition uses a part of the brain responsible for naming things:

‘[A] process similar to naming things in everyday life occurs when a physician promptly recognizes a characteristic and previously known lesion,’ the researchers concluded. Identifying a lesion was a process similar to naming the animal. When you recognize a rhinoceros, you’re not considering and eliminating alternative candidates. Nor are you mentally fusing a unicorn, an armadillo, and a small elephant. You recognize a rhinoceros in its totality—as a pattern. The same was true for radiologists. They weren’t cogitating, recollecting, differentiating; they were seeing a commonplace object.

Oddly enough, it appears to be possible to teach computers implicit knowledge of medical scans. A computing strategy known as a “neural network” attempts to mimic the human brain by processing thousands or millions of patterns that are fed into the computer. If the computer’s answer is correct, the connection responsible for that answer is strengthened; if the answer is incorrect, that connection is weakened. Over time, the computer’s ability to arrive at the correct answer increases. But there is no set of rules, simply a correlation built up over thousands and thousands of scans. The computer remains a “black box” in its decisions.
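
As a concrete illustration of this strengthening-and-weakening process, here is a minimal sketch in Python of a single artificial neuron (a perceptron) learning from invented toy data. The feature values and labels are hypothetical, and real scan-reading systems use far larger networks, but the principle of nudging connection weights after each correct or incorrect answer is the same.

```python
import random

def train(examples, epochs=100, learning_rate=0.1):
    """Learn connection weights from (features, label) pairs; labels are 0 or 1."""
    num_features = len(examples[0][0])
    weights = [random.uniform(-0.5, 0.5) for _ in range(num_features)]
    bias = 0.0
    for _ in range(epochs):
        for features, label in examples:
            # The neuron "fires" if the weighted sum of its inputs is positive.
            activation = sum(w * x for w, x in zip(weights, features)) + bias
            prediction = 1 if activation > 0 else 0
            error = label - prediction  # 0 if correct, +1 or -1 if wrong
            # Strengthen or weaken each connection in proportion to its input.
            weights = [w + learning_rate * error * x
                       for w, x in zip(weights, features)]
            bias += learning_rate * error
    return weights, bias

# Hypothetical toy "scans": two numeric features each; label 1 = abnormal.
scans = [([0.9, 0.8], 1), ([0.8, 0.7], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]
weights, bias = train(scans)
print(weights, bias)
```

Note that what training produces is just a list of numbers; as with the human experts described above, there is no rule book that could be extracted and handed to a novice.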


5. Creative knowledge

It is one thing to absorb knowledge — it is quite another to create new knowledge. One may attend school for 15 or 20 years and diligently apply the knowledge learned throughout his or her career, yet never invent anything new, never achieve any significant new insight. And yet all knowledge was created by someone at some point in the past. How is this done?

As with emotional knowledge, creative knowledge is not necessarily an outcome of high intelligence. While creative people generally have an above-average IQ, the majority do not have a genius-level IQ (the upper one percent of the population). In fact, most geniuses do not make significant creative contributions. The reason for this is that new inventions and discoveries are rarely an outcome of logical deduction but of a “free association” of ideas that often occurs when one is not mentally concentrating at all. Of note, creative people themselves cannot precisely describe how they get their ideas. The playwright Neil Simon once said, “I don’t write consciously . . . I slip into a state that is apart from reality.” According to one researcher, “[C]reative people are better at recognizing relationships, making associations and connections, and seeing things in an original way — seeing things that others cannot see.” Moreover, this “free association” of ideas occurs most effectively while a person is at rest mentally: drifting off to sleep, taking a bath or shower, or watching television.

Mathematics is probably the most precise and rigorous of disciplines, yet mathematical discovery is so mysterious that mathematicians themselves have compared their insights to mysticism. The great French mathematician Henri Poincaré believed that the human mind worked subliminally on problems, and his work habit was to spend no more than two hours at a time on mathematics. Poincaré believed that his subconscious would continue working on problems while he conducted other activities, and indeed, many of his great discoveries occurred precisely when he was away from his desk. John von Neumann, one of the best mathematicians of the twentieth century, also believed in the subliminal mind; he would sometimes go to sleep with a mathematical problem on his mind and wake up in the middle of the night with a solution. Reason may be used to confirm or disconfirm mathematical discoveries, but it is not the source of the discoveries.


6. The Moral Imagination

Where do moral rules come from? Are they handed down by God and communicated through the sacred texts — the Torah, the Bible, the Koran, etc.? Or can morals be deduced by using pure reason, or by observing nature and drawing objective conclusions, the same way that scientists come to objective conclusions about physics and chemistry and biology?

Centuries ago, a number of philosophers rejected religious dogma but came to the conclusion that it is a fallacy to suppose that reason is capable of creating and defending moral rules. These philosophers, known as the “sentimentalists,” insisted that human emotions were the root of all morals. David Hume argued that reason in itself had little power to motivate us to help others; rather, sympathy for others was the root of morality. Adam Smith argued that the basis of sympathy was the moral imagination:

As we have no immediate experience of what other men feel, we can form no idea of the manner in which they are affected, but by conceiving what we ourselves should feel in the like situation. Though our brother is upon the rack, as long as we ourselves are at our ease, our senses will never inform us of what he suffers. They never did, and never can, carry us beyond our own person, and it is by the imagination only that we can form any conception of what are his sensations. . . . It is the impressions of our own senses only, not those of his, which our imaginations copy. By the imagination we place ourselves in his situation, we conceive ourselves enduring all the same torments, we enter as it were into his body, and become in some measure the same person with him, and thence form some idea of his sensations, and even feel something which, though weaker in degree, is not altogether unlike them. His agonies, when they are thus brought home to ourselves, when we have thus adopted and made them our own, begin at last to affect us, and we then tremble and shudder at the thought of what he feels. (The Theory of Moral Sentiments, Section I, Chapter I)

Adam Smith recognized that it was not enough to sympathize with others; those who behaved unjustly, immorally, or criminally did not always deserve sympathy. One had to make judgments about who deserved sympathy. So human beings imagined “a judge between ourselves and those we live with,” an “impartial and well-informed spectator” by which one could make moral judgments. These two imaginations — of sympathy and of an impartial judge — are the real roots of morality for Smith.

__________________________


This brings us to our final topic: the role of non-rational forms of knowledge within reason itself.

Aristotle is regarded as the founding father of logic in the West, and his writings on the subject are still influential today. Aristotle demonstrated a variety of ways to deduce correct conclusions from certain premises. Here is one example, not from Aristotle himself, but often used to illustrate his logic:

All men are mortal. (premise)

Socrates is a man. (premise)

Therefore, Socrates is mortal. (conclusion)
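
As an aside, this kind of inference can be made fully explicit and even checked by a machine. Here is a minimal sketch in the Lean proof assistant; the names Person, Man, Mortal, and socrates are my own illustrative choices, not Aristotle’s.

```lean
-- The classic syllogism, stated as a machine-checkable inference.
-- All names here are illustrative placeholders.
example (Person : Type) (Man Mortal : Person → Prop)
    (allMenMortal : ∀ p : Person, Man p → Mortal p) -- All men are mortal.
    (socrates : Person)
    (socratesIsMan : Man socrates)                  -- Socrates is a man.
    : Mortal socrates :=                            -- Therefore, Socrates is mortal.
  allMenMortal socrates socratesIsMan
```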

The logic is sound, and the conclusion follows from the premises. But this simple example was not at all typical of most real-life puzzles that human beings faced. And there was an additional problem.

If one believed that all knowledge had to be demonstrated through logical deduction, that rule had to be applied to the premises of the argument as well, because if the premises were wrong, the whole argument was wrong. And every argument had to begin with at least one premise. Now one could construct another argument proving the premise(s) of the first argument — but then the premises of the new argument would also have to be demonstrated, and so forth, in an infinite regress.

To get out of this infinite regress, some argued that deduced conclusions could support premises in the same way as the premises supported a conclusion, a type of circular support. But Aristotle rejected this argument as incoherent. Instead, Aristotle offered an argument that to this day is regarded as difficult to interpret.

According to Aristotle, there is another cognitive state, known as “nous.” It is difficult to find an English equivalent for this word, and the Greeks themselves used it in different senses, but “nous” has been translated as “insight,” “intuition,” or “intelligence.” According to Aristotle, nous makes it possible to know certain things immediately, without going through a process of argument or logical deduction. Aristotle compares this power to perception, noting that we can discern different colors with our eyesight even without being taught what colors are. It is an ingrained type of knowledge that does not need to be taught. In other words, nous is a type of non-rational knowledge — tacit, intuitive, and direct, not requiring concepts!

Omnipotence of God

Is [God] willing to prevent evil, but not able? then is he impotent. Is he able, but not willing? then is he malevolent. Is he both able and willing? whence then is evil? — David Hume

The passage above from the Scottish philosopher David Hume succinctly summarizes the reasoning behind the decisions of many to adopt atheism, whether they are aware of Hume or not. In fact, the challenge posed by this short argument has rarely been answered to anyone’s complete satisfaction. “Theodicy” is the term used to denote philosophies that attempt to reconcile the existence of God with the existence of evil.

In the Judeo-Christian tradition, God is conceived as having the attribute of omnipotence, which is usually defined as unlimited power, the ability to do whatever one wants. However, this conception of God was not held by many ancient Greek and Roman philosophers, who saw God as very powerful but not all-powerful. The Roman physician and philosopher Galen argued that God was limited by necessity and matter — God could not do whatever he wanted. Galen explicitly contrasted this conception with that of the Jews and Christians:

This is where our opinion, and that of Plato and all others among the Greeks who correctly deal with the rationality of Nature, differs from that of Moses. For Moses, it is sufficient to say merely that God “willed” to order the universe in a certain way, and it was done. For he [Moses] thinks that everything is possible for God, even if he wanted to make a horse or a bull out of ashes. But we know that is not the case. We say, on the contrary, that certain things are impossible by nature. God does not even attempt those things, but from what is possible, he chooses the best to come about. (On the Usefulness of the Parts of the Body, quoted in Dale B. Martin, Inventing Superstition)

In this view, the existence of evil does not pose a problem for the existence of God, because God is not all-powerful to begin with — God is simply very powerful.

In the contemporary, popular view, the conception of God as less than all-powerful is regarded as blasphemous, ridiculous, or self-contradictory. However, the notion of limitations on God’s power is found in a number of prominent Christian theologians, including Paul Tillich, Reinhold Niebuhr, Edgar S. Brightman, and Martin Luther King, Jr. In fact, there is an entire school of thought known as “Process Theology” that conceives of God as limited in power and acting gradually on the world over time. This alternative conception of God is not an entirely satisfactory answer to the problem of evil, but I would argue that it presents far fewer difficulties than the popular conception of a God who can do whatever He wants but chooses not to. It is also superior to an atheism that sees the universe as composed merely of a set of physical laws and random events, with no underlying, unifying, intelligent order.