In modernity, the question “how can we know?” has often had an anxious and restrictive ring, as if we are in search of the one true method of knowledge acquisition, excluding all others (Descartes [1637]). But the suggestion in our conference title is that knowing is a social project in which we can work together to make knowledge even in perilous times. Perhaps, then, we do not need to be so anxious. It may be that modernity, and anxious modern thinkers like Descartes, do not have the last word on how we make knowledge. But if modernity does not have the last word, then we have evidently entered a landscape that deserves the label “postmodern.”

What does it mean to be postmodern? In philosophy, as well as in other sectors of Western culture, one can identify two prominent ways of getting beyond modernity, which I will call left‐wing and right‐wing postmodernism. Both are ways of responding to what I call “the postmodern insight.” This is a self‐critical insight, which modern thinkers apply to themselves. Postmodernism thus arises within modernity when the postmodern insight leads modern thinkers in the West to tell a story about themselves that undermines the self‐image of modernity. Since I advocate a version of right‐wing postmodernism, I have a story to tell: a narrative of modernity implying that modernity is not what it thinks it is. The story has important implications for how we conceive the relation between science and religion.

AFTER THE POSTMODERN INSIGHT

Modernity in the life of thought originates from Western intellectual traditions, such as secular liberalism and the modern sciences, which are averse to any robust notion of the formative power of traditions. This means that modern traditions of thought come to a kind of crisis when they recognize that they too are traditions. This recognition is the postmodern insight. It begins with the broader recognition that intellectual life always arises within, is formed by, and cannot be understood apart from particular social and historical contexts—contexts for which the most illuminating epistemological label is, Alasdair MacIntyre suggests, “traditions of enquiry” or, as I will put it, intellectual traditions. The crisis arises when modern thinkers apply this recognition not just to ancient religions or primitive tribes but to themselves. It means recognizing that their own thinking is not the outgrowth of pure reason freeing itself from the traditional prejudices of the past but is itself traditional, belonging to one competing tradition among others, involving its own distinctive set of historically conditioned prejudices, prejudgments, or taken‐for‐granted preconceptions. Postmodernism is thus modernity coming to self‐knowledge in a way that makes it hard to remain modern.

Modern thought is antitraditional inasmuch as it regards traditions and their prejudices as essentially irrational, unreflective loyalties that stand in the way of scientific progress, technological advancement, social justice, and political liberation. When a playwright has a traditional character tell the audience that no one knows why there is a fiddler on the roof—it's tradition!—we are encountering a modernist portrayal of a particular tradition. For the modernist writer, tradition means things do not change and no one knows why. Traditional thinking embraces ignorance. It means social life resists becoming rational and free: daughters must not decide for themselves whom to marry but let matchmakers make them a match, and must certainly keep from marrying anyone with the wrong religion. Tradition, in short, means oppression as well as irrationality.

The recognition that traditions are inevitable—that our thinking is always shaped, in more ways than we know, by our history and social context—poses a challenge to the self‐image of modernity, once modern thinkers start applying it to themselves. After this postmodern insight, modernity cannot remain fully, unselfconsciously modern, as if “modern” always meant “better.” For if modernity is not the outgrowth of the progress of pure reason, then it does not occupy a superior position from which it may safely pass judgment on the irrationality and oppressiveness of the traditional thinking of others.

Postmodernism comes in two kinds because there are two ways of responding to the postmodern insight, depending on whether you retain the modern aversion to traditions. A left‐wing postmodernist retains the modern notion that traditional thinking is inherently irrational, which means that the inevitability of tradition implies the inevitability of irrationality. If there is no such thing as pure reason, then there is no escaping irrationality, no neutral ground to find that is free of traditional prejudices. The modern quest for rational progress then comes to seem quixotic, if not oppressive, driven more by prejudices such as those of Western imperialism than by the pursuit of truth. In popular culture left‐wing postmodernism typically takes the form of relativism, rejecting the very idea that we can all know the same truths. In philosophy, two prominent postmodernists are Michel Foucault, who taught us to be conscious of the power games that are at play in the concept of truth (Foucault 1980), and Jacques Derrida, the founding figure of deconstructionism, who began his career by deconstructing Edmund Husserl's modernist project of providing a rigorous philosophical foundation for the sciences (Derrida [1967]; Derrida [1962]; Caputo 1987).

A right‐wing postmodernist, by contrast, rejects the modern prejudice against tradition, convinced instead that tradition can be the home of rationality. To be precise, because there are many diverse intellectual traditions, there are many diverse forms of rationality. Human rationality, like human language and culture, is not one single thing but a multitude of ways to speak, think, learn and reason. There is no common ground or neutral territory between them all, in which rival traditions and cultures can meet on equal terms. What one can hope for instead is (to use a favorite postmodern metaphor) a kind of hospitality, where members of one tradition welcome others onto their home turf, making them feel at home in alien territory where they do not have power but will still be heard. In the practice of hospitality, members of rival traditions can learn one another's thinking the way one learns to become fluent in a foreign language. This is always possible, for the simple and inescapable reason that traditions, like languages, are learned. There is no a priori reason why someone cannot learn to participate in a new tradition the way they learn to speak a new language. When that happens, a member of one tradition comes to understand another tradition like a native speaker, and it becomes possible to resolve conflicts between rival traditions in a way that does not inevitably come down to a contest of power. This means irrationality is not inevitable, for even when incommensurable traditions come into conflict, it is possible to learn the truth by honest reasoning.

Two influential right‐wing postmodernists are Hans‐Georg Gadamer, who gave us a critique of the modern “prejudice against prejudice” as well as an account of the “fusion of horizons” that can mediate between traditions (Gadamer [1960], 239–53, 267–74, 333–41), and Alasdair MacIntyre, who presents an elaborate defense of the rationality of traditions and an account of how conflict between traditions can be rationally resolved despite the lack of common ground (MacIntyre 1988, 349–88). In what follows I shall be telling a broadly MacIntyrean story about the relation of science and religion after the postmodern insight.

TRADITION AND AUTHORITY

To understand a tradition from within, one must heed its authorities. As used within Western intellectual traditions, the term “authority” originally applied not to rulers but to teachers. In the Middle Ages, rulers had power or command (potestas or imperium), but teachers had authority (auctoritas). We still use the term in this sense when we say a teacher is “an authority on her subject.” In a similar usage, a “master” was originally a teacher, one who had mastered a subject and therefore could teach it; hence we still give master's degrees. A schoolmaster is a master in the original sense; a slavemaster is not. In ancient Rome the name for the latter was not “master” (magister) but “lord” (dominus). It is telling that in modernity words for teaching have become words for domination and oppression. Modernity aims to liberate itself from obedience to its premodern teachers and their authority (Stout 1981).

Before modernity, authority had an honored and explicit role to play in the life of reason. Most influentially, Augustine (354–430 AD), the preeminent theologian of the Latin Christian tradition, argued that “In the order of nature, when we learn anything, authority precedes reasoning” (Augustine [387], sec. 3). In any intellectual discipline, we begin our education by believing what we are told by our teachers, who (if they are good teachers) are authorities deserving our trust. But that is only the beginning. In the end we want to arrive at our own understanding—to see the truth for ourselves rather than simply hear about it at second hand. (Think of the difference between believing that a mathematical formula is true when you copy it down during a lecture, and learning enough so that you actually understand why it is true.) So coming to intellectual maturity means moving from believing something an authority tells you to understanding it by the use of your own reason. Hence, to combine two key pairs of terms from Augustine, faith seeks understanding, because learning is a process that begins with authority but proceeds by using reason (Augustine [391], sec. 20–35). Hence every academic discipline in the Middle Ages was characterized by its preeminent authorities, such as Aristotle in philosophy, Cicero in rhetoric, Galen in medicine, and Augustine himself in theology.

To begin with authority is to belong to a tradition. Tradition (from Latin traditio, a handing down or handing over) means a transmission of skills and knowledge from one generation to another, which begins with students heeding the authorities who are acknowledged masters of the tradition. As long as we are not done with the process of learning (and when are we ever done?) the beginning is still with us as part of the history that has shaped and is still shaping us. But a living tradition is not just a relation to a past, like preserving an inheritance; it always involves progress and learning new things, investing the resources of the inheritance in new projects, which of course is not without risk. In this sense, both science and religion consist of traditions. This is something they have in common, which modernity, that antitraditional tradition, tends to obscure.

ENLIGHTENMENT AND MODERNITY

In the story of modernity a central role belongs to the Enlightenment, the great antitraditional and secularizing movement of eighteenth‐century Europe. Near the end of the century, the German philosopher Immanuel Kant contributed memorably to the movement's self‐understanding by defining “enlightenment” as a kind of coming of age, when people are no longer willing to “remain in lifelong immaturity” but develop the courage needed “to use one's understanding without guidance from another” (Kant [1784], 41). Enlightenment, in other words, means living by reason, not authority. As Kant proceeded to show in his famous little essay answering the question “What is Enlightenment?,” what he had in mind was the situation of intellectuals in eighteenth‐century Prussia, who needed to be emancipated from political censorship and ecclesiastical control in order to pursue their scholarship in an atmosphere of academic freedom. In effect, it was time for scholars to be grown‐ups, not acting like children under the tutelage of church and state. It was a declaration of independence for the rising professoriate, which came to fruition in the German research universities of the nineteenth century.

Kant's notion of enlightenment undermined any intellectual authority other than science or scholarship (the German term Wissenschaft includes both). It did not simply eliminate authority, however, but vested real intellectual authority in the university, rather than in the established church with its claim to possess divine revelation. The new research universities of Germany pursued an ideal of Wissenschaft that replaced the theological commitments of earlier “confessional” universities, whose identity was determined by official allegiance to documents such as the Augsburg Confession of the Lutheran church (Howard 2009). This new social context for academic disciplines, explicitly freed from traditional religious commitments and ecclesiastical supervision, tended to obscure the extent to which the modern sciences themselves are traditions with their own authorities, involving the transmission of inherited skills and knowledge from one generation of researchers to another, along with specific problems and inquiries and also specific ways of disciplining practitioners who step out of bounds. One of the reasons Thomas Kuhn's work had such a controversial impact is that it reminded scientists of the messy history of their own disciplines, which was never a simple story of the advance of pure reason, freed from authority and inherited prejudices (Kuhn [1962], 1–9, 136–43). Kuhn made science look more like “traditional knowledge” than many scientists or philosophers were comfortable with.

SCIENCE AND RELIGION

Kant's notion of enlightenment can be taken as the intellectual heart of modernity. As a general proposition, it means that the intellectual life of civilization needs science and scholarship but can do without “traditional” authorities. Hence it is a bit of a shock when the context of science itself, like every other form of scholarship, begins to look like a tradition determined by its own sociohistorical context, prejudices, and authorities. There are various ways of describing the sociohistorical context of science: Kuhn calls it a paradigm, Imre Lakatos calls it a research program, and Alasdair MacIntyre, generalizing from these two philosophers of science to give a broader epistemological account, calls it a tradition (MacIntyre [1977]). If MacIntyre is right, the location of scientific work within ongoing intellectual traditions is something that it has in common with the great religious traditions. Indeed it is something that all forms of modern intellectual life, including not only the sciences but liberal political theory, have in common with traditional authority in general (MacIntyre 1988, 326–48).

The fact that both sciences and religions are traditions does not mean there is no difference between them. It does mean, however, that the difference is not between rationality and irrationality, or reason and faith. By MacIntyre's reckoning, both scientific theory and religious faith generate forms of self‐critical rationality, which inevitably has its home in some tradition or other. There are irrational traditions, of course, but tradition as such is not a form of irrationality—of blind faith, unthinking prejudice, or unquestioned authority. On the contrary, a healthy intellectual tradition is a social context in which individuals begin by learning from the authority of their teachers but are quite capable of coming to think for themselves, often by thinking critically about what they have learned. Such traditions involve inquiry, argument, and conflict—which means that the modernist picture of traditional life as homogeneous, monolithic, unchanging, and unreflective is naive, a modernist caricature like that in Fiddler on the Roof. To be clear, what makes the latter modern is not that it portrays traditional characters as unreflective—it does not—but rather that it portrays reflectiveness as leading always in a modernizing direction, making the characters less traditional and more like us.

If both religion and science can take the form of traditions harboring rationality, how should we distinguish them? Both involve inquiry, argument, and criticism, including self‐criticism and the revision of previous views. Both learn new things. But science, we could say, is all about learning new things, about making new discoveries, gathering new observations and coming up with new theories to explain them. Religion is different because, although every religious tradition has to learn new things in order to survive and be intellectually healthy, learning new things is not the fundamental purpose of religion.

Consider the three great Western monotheisms. All of them have an essential loyalty to their own past, which is not so essential in scientific work. They are based on what Judaism calls divine “instruction” (torah), what Christianity calls “the word of God,” and what Islam calls a history of divine “revelations.” Each of them sees itself as the recipient of a gift of truth that must be guarded like a deposit that has been entrusted to it; it is a truth to obey and live by, which means in practice that it is a gift whose meaning must be continually interpreted. Hence, these traditions are shaped by the ongoing intellectual work that we call theology.

I suppose the Eastern traditions are rather different, in part because I am one of those scholars who thinks the Western label “religion” does not quite fit them (Cavanaugh 2009, 57–122). Confucianism and Daoism, for example, are both more like an ethos than a religion. “Hinduism” refers to a wide variety of devotional practices, philosophical inquiries, cultural mores, and social obligations, which only artificially (and under Western pressure) have been made into a single system of religion. And many forms of Buddhism look to Westerners more like a form of meditation than a religion. Rather than seeking something they all have in common that fits the label “religion,” I would look within the cultural histories of the East for ongoing traditions of inquiry, with common texts and shared practices to interpret. There is surely more than one such tradition within the huge cultural phenomenon that Westerners have labeled “Hinduism,” and likewise within “Buddhism.” Even so, it is not clear to me that we can identify in them the kind of argumentative doctrinal and legal tradition that is characteristic of Judaism, Christianity, and Islam. So from here on, I will use the term “religion” to refer only to these three monotheist traditions. They provide a useful contrast with modern science, both because they have played a formative role in Western culture and because they make truth claims that can be compared and perhaps contrasted with scientific claims. They therefore raise questions about the relation of science and religion that I am not sure really arise within the Eastern traditions.

The three great Western monotheisms make ambitious truth claims, argue for and against them, and sometimes change their minds about them. Each harbors rich traditions of distinctive forms of reasoning, which we can see on display in Talmudic debates, in Christian theological arguments, and in Islamic jurisprudence. Like the scientific traditions, each one keeps having new challenges to face, new conflicts to resolve, and new questions to answer. They cannot simply invent new ways to defend old doctrines, like Ptolemaic astronomers inventing new epicycles. Like the sciences, they must keep learning new things.

But learning new things is not their raison d'être, as it is for the sciences. The religious traditions have a gift of truth to cherish, a deposit of faith to guard, and it is precisely in order to guard it that they need to keep learning new things, facing new challenges, being self‐critical, engaging in their distinctive forms of reasoning. There is a kind of conservatism here, a devotion to their own past that one does not see in the sciences. Yet I think the label “conservatism” does not get us to the heart of the matter.

The heart of the matter is that the religious traditions never outgrow the authority of their particular gift of truth. Enlightenment, in the sense of learning to think for oneself, remains a goal but it cannot be the whole goal. The reason why, as Søren Kierkegaard saw, is that in this case God is the teacher, the one who speaks with authority (Kierkegaard [1844], 9–36). The importance of this is not simply that God knows more than any other teacher, but that there is nothing more important to know than the teacher himself. Here, teacher and truth are one. To outgrow the authority of this teacher could only mean to give up the pursuit of truth.

In the three great monotheist religions, what we ultimately want to know is a person. And when it comes to knowing persons, there is no getting around their authority to speak for themselves. This is not a hindrance or a limitation, but the very nature of relations between persons. Knowing others as persons means being drawn into their self‐knowledge, so it requires heeding the account they give of themselves (Cary 1996). In this sense there is always a kind of second‐hand structure to knowing other persons. You cannot know them without their say‐so, which means knowing other persons is not a matter of seeing for yourself but of trusting in the authority of testimony: the testimony of the other concerning himself. (It is different if you are “seeing through” a liar, as we put it, but this should not be the paradigm for knowing other persons.) Hence, the gift of truth at the basis of the three great Western religious traditions derives from the conception that what we want to know is a person who gives himself to be known by speaking for himself. That is the root of the authority these traditions claim for themselves. And that is why their own past has a kind of authority for them: it is where they find the primal gift of truth in which the person they obey and love has spoken and given himself to be known.

THE DIFFERENCE SOCRATES MAKES

The differences between science and religion—or, to speak more precisely, the sciences and the religions—should not be allowed to obscure what they have in common: all of them are forms of rationality embedded in traditions that learn from their own authorities, the great teachers of the tradition who give them a distinctive set of prejudices, which in turn give them a distinctive set of problems and tasks for inquiry. Their intellectual projects have a trajectory into the future that cannot be understood without telling the story of their progress in the past (MacIntyre [1977], 21–23). Yet they are never simply imprisoned in their past, because they are forms of self‐critical rationality. As they solve new problems, they also learn to question what they had earlier taken for granted, thus sparking arguments within the tradition about what actually belongs to the tradition. Intellectual traditions argue about themselves; they are characterized by conflict about where the boundaries of the tradition ought to be (MacIntyre 1988, 12). Thus, theologians argue about what constitutes orthodox teaching rather than heresy, or scientists argue about whether a particular theory counts as real science. The presence of such conflict in all intellectual traditions signals the fact that, whether they like it or not, they are open to falsification and subject to revision. I want to give a label to this self‐critical aspect of intellectual traditions: call it the “Socratic” element.

Every healthy intellectual tradition has people in it who do what Socrates did: ask questions that teach people to turn what had once been taken for granted (what Gadamer calls the tradition's “prejudices” and MacIntyre its “fundamental agreements”) into a matter for critical inquiry. The portrait of Socrates in the writings of Plato is the most memorable Western representation of this kind of questioning. The figure of Socrates is particularly important for my purposes because his legacy is an element in the history of both the Western sciences and the Western religions. The legacy of Socrates is taken up by Plato's Academy and continues with Plato's most Socratic student, Aristotle, who was a master of the art of identifying and questioning what is taken for granted (Nussbaum 1986, 240–63), and after Aristotle it gradually becomes essential to the whole of Western intellectual life. With Socrates, something irreversibly self‐critical gets into the Western bloodstream.

Socrates, notoriously, spent his life asking questions. What was new about his questions was that they were not the rhetorical questions of political speech, court cases, or prophecy, designed to challenge, push, or accuse an opponent; their aim was rather to initiate a shared inquiry. Socrates' questions, as taken up by Plato, opened out onto a systematic investigation into the nature of things. In place of cosmic speculations that were best embodied in poetry (Parmenides, Empedocles) or gnomic sayings (Heraclitus), Plato's writings made philosophy look like a drama, full of comic ironies and reversals but persistent in its ultimate aim of uncovering the truth. Plato's brilliant dialogues, in which Socrates was typically the main character, did not merely present the results of inquiry but dramatized the process of inquiring together, turning it into a shared social project. His key label for this process was “dialectic” (Greek dialektike), which originally meant dialogue or conversation, but after Plato came to mean specifically the kind of critical give‐and‐take that Socrates initiated: question and answer, objection and reply, refutation and revision. The very notion of “critical” thought, as the West uses the phrase, has its roots here: in the need to make a judgment (krisis in Greek, from which we get our word “critical” as well as “crisis”) in response to the kind of question Socrates asked, which is to say a judgment about what is really true.

The students of Plato learned to ask and answer Socratic questions, and in the process invented the new type of inquiry that came to be called science, a social project of investigating the nature of things whose discipline spanned the generations. We can see the project emerging as Aristotle outlines a whole series of dialectical exercises in his Topics, the largest of his treatises on logic: it is not hard to imagine him using these exercises as Plato's teaching assistant in the Academy or later in his own school, the Lyceum. At any rate, at some point he began to write up an account of the formal structure of dialectical give‐and‐take and thus to formulate the discipline of logic, analyzing the nature of logical argumentation in his account of syllogisms (in Prior Analytics) and the use of syllogisms to demonstrate scientific conclusions (in Posterior Analytics). In other words, the practice of Socratic dialectic led to the invention of logic, which in turn led to the invention of the Western notion of science. The Socratic legacy handed down through Plato developed in fact into a whole set of intellectual traditions, as Aristotle invented new disciplines such as physics and zoology, by combining empirical investigation with dialectical inquiry and syllogistic proofs.

The legacy of Socrates in Athens turned the pursuit of wisdom into a science, which meant that the pursuit of wisdom itself became a new thing. As a result, there emerged a distinctively Western form of philosophy (from Greek philosophia, meaning the love of wisdom) that combined two features, one universal and the other a particular outgrowth of Socratic dialectic.

First, the universal feature is the pursuit of wisdom. Every culture has some kind of wisdom tradition, a legacy of understanding about how to live well, how to speak well, how to grow up, how to raise children, how to judge disputes between contending parties. One looks for this wisdom in the elders of the tribe, in its sages, judges, and teachers. It is what one expects in the culture's fathers and mothers and hopes for in its rulers. To understand a culture is indeed in large part to understand its wisdom, its way of teaching people how to live well. To deny that a culture has a wisdom tradition is probably to deny it is a culture at all, or else to denigrate it as a savage and degenerate culture. To fail to recognize the pursuit of wisdom in a culture other than one's own is the mark of ethnocentrism at best and racism at worst.

Second, the distinctive feature is the formal discipline of logic. In the legacy of Socrates, the Greek wisdom tradition met logic, as formulated for the first time by Aristotle. People could think logically before Aristotle, of course, just as they could speak grammatically before grammar books. But putting the way people speak and think—when they do so correctly and cogently—into books that then guide actual practice introduces a new kind of discipline into their thought and speech. When Plato subjected the Greek wisdom tradition to the dialectical back‐and‐forth of Socratic inquiry, and then Aristotle used his new account of logic to put the results of this shared inquiry into scientific form, the upshot was philosophy as a new kind of social project, a distinctively Western wisdom tradition. The legacy of Socrates turned into an intellectual tradition that could dialectically engage any culture, beginning with its own, making critical judgments that turned the pursuit of wisdom into a systematic, logically disciplined project of philosophy that became central to Western intellectual life. The tradition spread well beyond Athens, as Hellenistic culture invaded the Eastern Mediterranean world along with Alexander the Great and was furthered by the expansion of the Roman empire, which took up the Socratic legacy of Greek philosophy and science and made it its own.

THE SOCRATIC ELEMENT IN WESTERN RELIGIONS

Fatefully—or for those of us within the Christian tradition, the better term may be “providentially” (John Paul II 1998, 92)—the expansion of the Hellenistic empire found its most difficult and articulate opponents in the little area at the eastern end of the Mediterranean that was then called Judaea, the land of the Judaeans (Ioudaioi in Greek, which our translations of the New Testament typically render, rather anachronistically, as “the Jews”). They were a tough nut for the empire to crack, because they already had their own written wisdom tradition, unlike most other nations around the Mediterranean (the great exception being Egypt). The Judaeans' literary tradition, unlike Egypt's, included powerful stories of liberation, such as the book of Exodus, as well as a written law that demanded exclusive worship of a jealous deity. Attempts by Alexander's successors to replace the practice of that law and worship with something more Greek precipitated the Judaean revolt led by the family of the Maccabees. The Roman governors of Jesus' day tried to prevent a similar revolt, with no more success in the long run.

The Judaeans' political resistance to the empires that tried to assimilate them, both Greek and Roman, was supported by cultural resistance to Hellenism and its classical culture. But the cultural resistance was far from total. By the time of Jesus there was already a rich body of Judaean or Jewish literature written in Greek. For example, Philo of Alexandria was a first‐century Judaean intellectual who drew extensively on classical philosophy as he wrote dozens of commentaries, in Greek, on the Judaean sacred Scriptures (what Christians later called “the Old Testament”). Thus, the legacy of Socrates already had a place in Judaean thought by the time Jesus was hanging on a Roman cross bearing the inscription “King of the Judaeans” (Matthew 27:37), and it played a significant role in the earliest Christian writings, those of another Hellenized Judaean whom we know as St. Paul, which of course became part of the collection of Greek documents called the New Testament.

What this means is that the Christian tradition originates with the legacy of Socrates already in its bloodstream, and goes on from there. Beginning in the next century, Christian thought was developed mainly by Gentile “church fathers,” intellectuals such as the second‐century Platonist philosopher Justin Martyr and the fifth‐century bishop Augustine, whose key task was to interpret a set of Judaean Scriptures, which we now call the Bible, by means of classical culture, philosophy, and rhetoric, incorporating yet more of the legacy of Socrates. It is not so surprising, then—and certainly not alien to the spirit of the Christian tradition—when a thousand years later Thomas Aquinas (1225–1274) puts Roman Catholic theology into the logical framework of an Aristotelian science, demonstrated by means of syllogisms and sandwiched between dialectical objections and replies, in his Summa Theologica as well as his many treatises on disputed questions. By this time—in the High Middle Ages—the legacy of Socrates is deeply embedded in the Christian tradition.

But it was not just Christians. While the Gentile church fathers were wrestling with their Judaean books, the original Judaean teachers, still working in Hebrew and Aramaic, developed dialectical inquiries of their own, which were later incorporated into the text we know as the Talmud. Though theirs was a very different intellectual and rhetorical style from the classical Hellenism that was native to the church fathers, they too had clearly taken up something of the legacy of Socrates. These teachers told some memorable stories but they made no prophecies; mostly they argued about what properly belonged to the law given to Moses in the oral and written torah, and thus formed the distinctive tradition of self‐critical rationality known as rabbinic Judaism.

The legacy of Socratic dialectic was also taken up by the Islamic tradition in its own distinctive way, beginning with the massive translation project in eighth‐century Baghdad, which turned Greek texts into Syriac and Arabic and thus made classical philosophy readily available to Muslim thinkers (Adamson and Taylor 2005, 1–71). From very early on, Islam was shaped by a robust self‐critical rationality that included arguments about metaphysics and science as well as exegesis and jurisprudence, which later played an important role in the intellectual reinvigoration of the Western Middle Ages and in Christian writers like Aquinas.

THE SOCRATIC LEGACY IN THE MODERN WEST

A tradition that welcomes the legacy of Socrates includes a rationality that can question itself. Above all, it can learn to look critically at its own prejudices, its taken‐for‐granted preconceptions, and revise them. This is what Socrates does in many of Plato's dialogues. He asks questions like “What is virtue?” “What is piety?” “What is justice?” and “What is knowledge?” (the central topics of Plato's Meno, Euthyphro, Republic, and Theaetetus, respectively) and, asking still more questions, lures his conversation partners into a critical examination that ends up refuting one possible answer after another, with the result that they come to recognize that they do not really know what they thought they knew. After Socrates shows up, cultural values that had once appeared obvious come to seem difficult to define. Words that every Athenian had been using since childhood (“virtue,” “justice,” “knowledge,” and so on) appear opaque, having a meaning that demands further inquiry. To encounter Socrates is to find that you still have much to learn about the shape of your own life, which you used to be able to take for granted.

Socratic questioning can put a person off balance. This is a regular experience of young people going to college in our day. It has undermined many a naive belief, but it also leads to a strengthening of healthy intellectual traditions. To pass through the process of Socratic dialectical inquiry is to know, at a much deeper level than before, why you think the way you think and live the way you live. Socrates is unsettling, but he is good for us. Without him we would not have Western science, logic, and philosophy. And the great monotheist traditions, too, would be unimaginable without him. Without Socrates, I think, there would be no Aquinas or Augustine or even St. Paul; no rabbinic dialectic, no Talmud, and no Maimonides; no Avicenna or Al‐Farabi or richly argued tradition of Islamic jurisprudence.

As a result of their shared Socratic legacy, Western science and religion have not, on the whole, encountered each other in the way the Kantian picture of enlightenment expects: as pure reason pitted against a tradition of authority. Rather, when they conflict, I think we can see rival traditions of rationality in critical dialogue with one another. Hence there is much in the modern notion of the relation of science and religion that the religious traditions, if they are to be intellectually healthy, should not accept. From my right‐wing postmodernist perspective, that notion is vitiated by modernity's failure to understand itself, a very un‐Socratic tendency not to recognize one's own prejudices as belonging to one tradition among others, which results in a failure to recognize the rationality of traditions whose roots are other than modern.

I think members of the Christian tradition are in a particularly good position to recognize this failure, because so much of modernity grows out of the Western Christian tradition. In particular, modern Western secularism is predominantly secularized Christendom, still containing the residue—often a quite massive residue—of Christian beliefs, habits, and values. Hence, critical discussion between science and religion is a commonplace within Western culture, as illustrated by the existence of organizations such as IRAS and journals such as Zygon: Journal of Religion and Science, both of which also illustrate the important point that many individuals are members of multiple traditions, scientific as well as religious. By the same token, however, those who see no value in traditions that are other than modern have tended to be particularly hostile to Christianity, which represents for them the past from which they want to be freed. An example is the spate of books ten years ago by “new atheists,” whose atheism was in fact not new, but rather outdated precisely because it is so modern, more at home in the eighteenth‐century Enlightenment attack on orthodox Christianity than in the postmodern present (Dawkins 2006; Dennett 2006; Hitchens 2007; for a right‐wing postmodernist reply, see Hart 2009). It seems to me important not to let this kind of naive and outdated modernism define the terms of the discussion between science and religion, as if it were a debate between representatives of reason on the one hand and faith on the other. It is in fact a conversation between members of different traditions of rationality with a long history of learning from one another, which continues today.

Notes

  1. A version of this article was presented at the 62nd Annual Summer Conference of the Institute for Religion in an Age of Science (IRAS) entitled “How Can We Know? Co‐Creating Knowledge in Perilous Times” held on Star Island, New Hampshire, from June 25 to July 2, 2016.

References

Adamson, Peter, and Richard C. Taylor, eds. 2005. The Cambridge Companion to Arabic Philosophy. Cambridge, UK: Cambridge University Press.

Augustine of Hippo. [391]1953. The Usefulness of Belief. In Augustine: Earlier Writings, edited and translated by J. H. S. Burleigh, 291–323. London, UK: SCM Press.

Augustine of Hippo. [387]1983. On the Morals of the Catholic Church, translated by Richard Stothert. In Nicene and Post‐Nicene Fathers, First Series, Vol. 4, edited by Philip Schaff, 37–63. Grand Rapids, MI: William B. Eerdmans.

Caputo, John D. 1987. Radical Hermeneutics: Repetition, Deconstruction, and the Hermeneutic Project. Bloomington: Indiana University Press.

Cary, Phillip. 1996. “Believing the Word: A Proposal about Knowing Other Persons.” Faith and Philosophy 13:78–90.

Cavanaugh, William T. 2009. The Myth of Religious Violence. Oxford, UK: Oxford University Press.

Dawkins, Richard. 2006. The God Delusion. Boston, MA: Houghton Mifflin.

Dennett, Daniel. 2006. Breaking the Spell: Religion as a Natural Phenomenon. New York, NY: Penguin.

Derrida, Jacques. [1967]1973. Speech and Phenomena: And Other Essays on Husserl's Theory of Signs. Evanston, IL: Northwestern University Press.

Derrida, Jacques. [1962]1989. Edmund Husserl's Origin of Geometry: An Introduction. Lincoln: University of Nebraska Press.

Descartes, René. [1637]1988. Discourse on the Method of Rightly Conducting One's Reason and Seeking the Truth in the Sciences, translated by John Cottingham, Robert Stoothoff, and Dugald Murdoch, 20–56. Cambridge, UK: Cambridge University Press.

Foucault, Michel. 1980. “Truth and Power,” translated by Colin Gordon. In Power/Knowledge: Selected Interviews and Other Writings, edited by Colin Gordon, 109–33. New York, NY: Pantheon.

Gadamer, Hans‐Georg. [1960]1985. Truth and Method. New York, NY: Crossroad.

Hart, David Bentley. 2009. Atheist Delusions: The Christian Revolution and Its Fashionable Enemies. New Haven, CT: Yale University Press.

Hitchens, Christopher. 2007. God Is Not Great: How Religion Poisons Everything. New York, NY: Twelve.

Howard, Thomas Albert. 2009. Protestant Theology and the Making of the Modern German University. Oxford, UK: Oxford University Press.

John Paul II. 1998. Fides et Ratio: On the Relationship between Faith and Reason. Boston, MA: Pauline Books and Media.

Kant, Immanuel. [1784]1983. “An Answer to the Question: What Is Enlightenment?” In Perpetual Peace and Other Essays, translated by Ted Humphrey, 41–48. Indianapolis, IN: Hackett.

Kierkegaard, Søren. [1844]1985. Philosophical Fragments. Princeton, NJ: Princeton University Press.

Kuhn, Thomas S. [1962]1970. The Structure of Scientific Revolutions. Chicago, IL: The University of Chicago Press.

MacIntyre, Alasdair. 1988. Whose Justice? Which Rationality? Notre Dame, IN: University of Notre Dame Press.

MacIntyre, Alasdair. [1977]2007. “Epistemological Crises, Dramatic Narrative, and the Philosophy of Science.” In The Tasks of Philosophy: Selected Essays, Vol. 1, 3–23. Cambridge, UK: Cambridge University Press.

Nussbaum, Martha C. 1986. The Fragility of Goodness. Cambridge, UK: Cambridge University Press.

Stout, Jeffrey. 1981. Flight from Authority: Religion, Morality, and the Quest for Autonomy. Notre Dame, IN: University of Notre Dame Press.