by Dr. Martin J. Verhoeven
Religion East and West, Issue 1, June 2001, pp. 77-97

Western interest in Eastern religions, especially Buddhism, historically coincided with the rise of modern science and the corresponding perceived decline of religious orthodoxy in the West. Put simply: Modern science initiated a deep spiritual crisis that led to an unfortunate split between faith and reason—a split yet to be reconciled. Buddhism was seen as an “alternative altar,” a bridge that could reunite the estranged worlds of matter and spirit. Thus, to a large extent Buddhism’s flowering in the West during the last century came about to satisfy post-Darwinian needs to have religious beliefs grounded in new scientific truth.

As science still constitutes something of a “religion” in the West, the near-absolute arbiter of truth, considerable cachet still attends the linking of Buddhism to science. Such comparison and assimilation is inevitable and in some ways, healthy. At the same time, we need to examine more closely to what extent the scientific paradigm actually conveys the meaning of Dharma. Perhaps the resonance between Buddhism and Western science is not as significant as we think. Ironically, adapting new and unfamiliar Buddhist conceptions to more ingrained Western thought-ways, like science, renders Buddhism more popular and less exotic; it also threatens to dilute its impact and distort its content.

Historians since the end of World War II have suggested that the encounter between East and West represents the most significant event of the modern era. Bertrand Russell pointed to this shift at the end of World War II when he wrote, “If we are to feel at home in the world, we will have to admit Asia to equality in our thoughts, not only politically, but culturally. What changes this will bring, I do not know. But I am convinced they will be profound and of the greatest importance.”

More recently, the historian Arthur Versluis, in American Transcendentalism and Asian Religions (1993), pieced together five or six major historical views on this subject and offered this by way of conclusion:

However much people today realize it, the encounter of Oriental and Occidental religious and philosophical traditions, of Buddhist and Christian and Hindu and Islamic perspectives, must be regarded as one of the most extraordinary meetings of our age. . . . Arnold Toynbee once wrote that of all the historical changes in the West, the most important—and the one whose effects have been least understood—is the meeting of Buddhism in the Occident. . . . And when and if our era is considered in light of larger societal patterns and movements, there can be no doubt that the meeting of East and West, the mingling of the most ancient traditions in the modern world, will form a much larger part of history than we today with our political-economic emphases, may think.

These are not isolated opinions. Many writers, scholars, intellectuals, scientists, and theologians have proclaimed the importance of the meeting of East and West. Occidental interest in the Orient predates the modern era. There is evidence of significant contact between East and West well before the Christian era. Even in the New World, curiosity and interchange existed right from the beginning, as early as the 1700s. One can find allusions to Asian religions in Cotton Mather, Benjamin Franklin, Walt Whitman, and of course, more developed expressions in Henry David Thoreau, and Ralph Waldo Emerson.

By the mid-twentieth century this growing fascination with Asian thought led Arnold Toynbee to envision a new world civilization emerging from a convergence of East and West. He anticipated that the spiritual philosophies of Asia would touch profoundly on the three basic dimensions of human existence: our relationships with each other (social), with ourselves (psychological), and with the physical world (natural). What is the shape and significance of this encounter? What does Buddhism contribute to the deeper currents of Western thought; and more specifically, to our struggle to reconcile faith with reason, religion with science?

Science was already the ascendant intellectual sovereign when Buddhism made its first serious entry on the American scene in the latter decades of the 19th century. A World’s Parliament of Religions, held in conjunction with the 1893 Columbian Exposition in Chicago, brought to America for the first time a large number of Asian representatives of the Buddhist faith. These missionaries actively and impressively participated in an open forum with Western theologians, scientists, ministers, scholars, educators, and reformers. This unprecedented ecumenical event in the American heartland came at a most opportune time. America was ready and eager for a new source of inspiration, ex oriente lux, the ‘light of Asia.’

By the 1890s America was caught in the throes of a spiritual crisis affecting Christendom worldwide. Modern scientific discoveries had so undermined a literal interpretation of sacred scripture that for many educated and thoughtful people, it was no longer certain that God was in his heaven and that all was right with the world. These rapid changes and transformations in almost every aspect of traditional faith had such irreversible corrosive effects on religious orthodoxy that they were dubbed “acids of modernity.” They ate away at received convictions and ushered in an unprecedented erosion of belief. People like my grandparents, brought up with rock-solid belief in the infallible word of God, found their faith shaken to its very foundations. It was as if overnight they awoke to a new world governed not by theological authority but by scientists. New disclosures from the respected disciplines of geology, biology, and astronomy challenged and shattered Biblical accounts of the origins of the natural world and our place and purpose in it. Sigmund Freud captured the spirit of the age well when he said “the self-love of mankind has been three times wounded by science.” The Copernican Revolution, continued by Galileo, took our little planet out of the center position in the universe: the Earth, held to be the physical and metaphysical center of the universe, was reduced to a tiny speck revolving around the sun. Then Darwin all but eliminated the divide between animal and man, and with it the “special creation” status enjoyed by humans. Darwin, moreover, diminished God. The impersonal forces of natural selection kept things going; no divine power was necessary. Nor, from what any competent scientist could demonstrate with any factual certainty, was any Divinity even evident—either at the elusive “creation,” or in the empirical present. Karl Marx portrayed people as economic animals grouped into competing classes driven by material self-interest.
Finally, Freud himself characterized religious faith as an evasion of truth, a comforting illusion sustained by impulses and desires beyond the reach of the rational intellect. Nietzsche’s famous declaration that “God is Dead” may have seemed extreme, but few would have denied that God was ailing. And certainly the childhood version of a personal, all-powerful God that created the world and ruled over it with justice and omniscience was for many a comforting vision lost forever.

One of the lingering side effects of this loss has been the unfortunate disjunction of matter and spirit that afflicts the modern age. It can assume many forms: a split between matter and spirit, a divorce between faith and reason, a dichotomy between facts and values. At a more personal level, it manifests as a mind-body dualism. An unwelcome spiritual and psychological legacy from the late 19th and early 20th centuries, it is still very much with us today, something that haunts our psyches.

Much of today’s near-obsession with therapy in the West, and even the shift toward psychologizing religion (including the “New Age” phenomenon), could be seen as attempts to heal this deep sense of alienation. The pragmatic philosopher John Dewey wrote: “The pathological segregation of facts and value, matter and spirit, or the bifurcation of nature, this integration [i.e., the problem of integrating this] poses the deepest problem of modern life.” This problem both inspires and confounds contemporary philosophy and religion. Wholeness eludes us while the split endures; and yet, almost tragically, the very means we have available to heal it insure its continuation. For all of our philosophies, academic disciplines, therapies, and even religious traditions are informed by and rooted in aspects of this dualism. Perhaps the most visible expression of this pathological segregation is the gap between science and religion.

Thus, when the eminent philosopher and mathematician Alfred North Whitehead scanned the broad outlines of our time, he wrote: “The future course of history would center on this generation’s resolving the issue of the proper relationship between science and religion, so fundamental are the religious symbols through which people give meaning to their lives and so powerful the scientific knowledge through which we shape and control our lives.” And it is in regard to this troubling issue, I think, that Eastern religions, particularly Buddhism, are seen to hold out the promise of achieving some resolution. The idea dates back over a hundred years.

After the 1893 Chicago Parliament of World Religions, one Paul Carus, a Chicago-based editor of the Open Court Press, invited some of the influential Japanese Buddhist delegates to a week-long discussion at the home of Carus’s father-in-law, Edward Hegeler. Both deeply felt the spiritual crisis of the times. Both were trying to reform Christianity to bring it in line with current thought; in short, to make religion scientific. It occurred to them that Buddhism was already compatible with science, and could be used to nudge Christianity in the same direction. Toward this end, Carus wanted to support a Buddhist missionary movement to the United States from Asia. His thinking was to create something of a level playing field. Carus had witnessed the most ambitious missionary undertaking in modern history, which sent thousands of Protestant missionaries abroad to convert the people ‘sitting in darkness.’ He wished to conduct a Darwinian experiment of ‘survival of the fittest.’ His goal: to bring Buddhist missionaries to America where they could engage in healthy competition with their Christian counterparts, and thus determine the “fittest” to survive.

With funding from his wealthy father-in-law, Carus sponsored a number of Eastern missionaries to the United States: Anagarika Dharmapala, from what was then Ceylon, now Sri Lanka; Swami Vivekananda, from India, representing the Ramakrishna Vedanta movement; and Soyen Shaku, a Japanese Buddhist monk, and Shaku’s young disciple D.T. Suzuki. During his stay in the United States in the late 1890s and early 1900s, Suzuki lived in the small town of LaSalle/Peru, Illinois. He was in his twenties then, and for about eleven years he worked closely with Paul Carus translating Buddhist texts into English and putting out inexpensive paperback editions of the Asian classics. Suzuki later became the leading exponent of Zen in the West, when he returned in the 1950s on a Rockefeller grant to lecture extensively at East Coast colleges. He influenced writers and thinkers like Carl Jung, Karen Horney, Erich Fromm, Martin Heidegger, Thomas Merton, Alan Watts, and the “beat Buddhists”—Jack Kerouac, Allen Ginsberg, and Gary Snyder. Suzuki died in 1966 in Tokyo. His influence in the West was profound—making Zen an English word, translating Asian texts into English, stimulating a scholarly interest in the Orient among American intellectuals, and deepening American respect and enthusiasm for Buddhism. The historian Lynn White Jr. praised Suzuki as someone who broke through the “shell of the Occident” and made the West’s thinking global. His introduction to the West came about through the hands of Paul Carus.

These early missionaries of Buddhism to the West, including Carus himself, all shared the same modern, reformist outlook. They translated Buddhism into a medium and a message compatible and resonant with the scientific and progressive spirit of the Age. They selected passages of text that favored that slant, and carefully presented the Buddhist teachings in such a way as to appeal to modern sensibilities—empirical, rational, and liberal. Americans wanted religion to “make sense,” to accord with conventional wisdom. Then, as now, our primary mode of making sense of things was positivist—reliable knowledge based on natural phenomena as verified by empirical sciences. So firmly entrenched is the scientific outlook that it has for all practical purposes taken on a near-religious authority. Few, then or now, critically question our faith in science; we presume its validity and give it an almost unquestioned place as the arbiter of truth.

Thus, the early missionaries of Buddhism to America purposely stripped Buddhism of any elements that might appear superstitious, mythological, even mystical. Dharmapala, Suzuki, and Vivekananda clearly ascertained that Americans measured truth in science, and science posed little theological threat to a Buddhist or Hindu worldview. After all, Buddhism had unique advantages for those who had rejected their Christian faith because of its authoritarianism and unscientific outlook:

1) Buddhism did not assert or depend upon the existence of a God

2) Buddhism was a superstition-free moral ideal; it conformed to the scientific view of an ordered universe ruled by law (Dharma)—a system both moral and physical where everything seemed to work itself out inexorably over vast periods of time without divine intervention (karma)

3) Buddhism posited no belief in gods who could alter the workings of this natural law

4) Buddhism was a religion of self-help with all depending on the individual working out his/her own salvation

5) “Original” Buddhism was seen as the “Protestantism of Asia,” and Buddha as another Luther who swept away the superstitions and rituals of an older, corrupted form and took religion back to its pure and simple origins

6) Buddhism presented an attractive personal founder who led a life of great self-sacrifice; parallels were drawn between Jesus and the Buddha, and the inspiration of a personal figure exerted strong appeal to seekers who had given up on theology and metaphysics.

Thus, Buddhism was packaged and presented in its most favorable light vis-à-vis the current spiritual crisis in the West; and, not surprisingly, Buddhism seemed immensely reasonable and appealing to Americans. Darwinism might be undermining Biblical Christianity, but it only enhanced Buddhism’s standing.

In fact, Darwin’s theory of evolution, which struck the most severe blow to the Judaeo-Christian edifice, was taken up as the leading banner for Buddhist propagation. With Darwin the concept of evolution became enshrined in the popular mind. Everything was evolutionary—species, races, nations, economies, religions, the universe—from the micro to the macro. Social Darwinists even saw evolution operating behind the vicissitudes of free-market capitalism. As the constant interaction of stimulus and response in nature, evolution seemed to match nicely with the notion of karma—the cyclical unfolding of events governed by the law of cause and effect. So Anagarika Dharmapala could announce in Chicago to his largely Judaeo-Christian audience that “the theory of evolution was one of the ancient teachings of the Buddha.” As it was in nature (at least in the new natural world of Darwin), so it was in the Buddhist universe.

Most people drawn to Eastern religions did not examine very closely the supposed identity of Darwin’s evolution and the Buddhist concept of karma. They were content, even predisposed, to imagine them the same. Buddhists ardent to convert Americans to Buddhism, as well as Christians eager to find some correspondence between modern science and their beleaguered faith, were happy to say, “Yes, the similarities are close enough; look, how the ancient Eastern religions anticipated our modern science!” Vivekananda, the charismatic and eloquent Ramakrishna delegate from India, met only hurrahs of affirmation when he proclaimed to a Chicago audience that the latest discoveries of science seemed “like the echoes from the high spiritual flights of Vedantic philosophy.”

This facile view that Buddhism and science were cut of the same cloth accorded nicely with the longing to reconnect the sacred and the secular. It held out hope that religion could once again assume its rightful place alongside (if no longer in the lead of) the emerging disciplines of biology, geology, and physics. It also fit neatly with the presumed “unity of truth” that Victorians held to so dearly—there could only be one truth, not two. The very nature of reality demanded that the truths of science and religion be one and the same. Carus called his new system of thought “the Religion of Science,” and Max Müller called his new theology “the Science of Religion.”

This trend linking Buddhism to science continued, even accelerated, into the 20th century. Einstein’s work and further developments in the new cutting-edge physics seemed to provide even further evidence that science and Buddhism were merely different rivers leading to the same sea. Where the old theologies crumbled under the juggernaut of science, Buddhism seemed to hold its own, even thrive. The early (and even contemporary) exponents of Buddhism pushed this idea. It remains an area of great promise and interest; but it is not one without difficulties.

One of the first to question this marriage, interestingly, was also one of its earliest proponents, D.T. Suzuki. When Suzuki came to the United States to collaborate with Paul Carus, both were outspoken advocates of the link between Buddhism and science. Suzuki’s early writings make virtually no distinction between Buddhism and science. For Suzuki, Buddhism was eminently modern and progressive, compatible with the latest discoveries in Western psychology and philosophy. It was, in a word, scientifically sound.

By the time Suzuki returned to the United States in the 1950s, however, he had experienced a change of heart. An older, perhaps wiser Suzuki came to doubt the sufficiency of a religion based on science, and even saw the need for religion to critique science. In 1959, he wrote that his early modernist agreement with Hegeler and Carus that “religion must stand on scientific grounds…Christianity was based too much on mythology” was ill-founded. “If it were possible for me to talk with them now,” he reflected, “I would tell them that my ideas have changed from theirs somewhat. I now think that a religion based solely on science is not enough. There are certain ‘mythological’ elements in every one of us, which cannot be altogether lost in favor of science. This is a conviction I have come to.”

What had changed? First of all, two world wars. As the contemporary writer Kurt Vonnegut has wryly observed, “We took scientific truth and dropped it on the people of Hiroshima.” Suzuki was, of course, Japanese; he felt directly the negative weight of modern science. Having survived the brutal experience of a war initiated, carried out, and ended with weapons of mass destruction born of modern science, he was left less sanguine about the idyllic marriage of religion and science that he had heralded at the turn of the century. Suzuki was enjoying the wisdom of hindsight; but in fairness to Suzuki, so were many other people.

Since Suzuki’s turnabout in 1959, there have been even further, more fundamental challenges to the presumed closeness of Buddhism and science. Questions have arisen in two areas. First, as a society we have come to reassess the blessings and the promise of modern science in terms of its socio-psychological impact. A bittersweet realization lingers in the contemporary psyche: people are mesmerized by science and dream about all the wonderful things it is going to do for them; at the same time they are haunted by unsettling specters of the dreadful things it could do to them. This troubling ambivalence seems to grow, not diminish, with each scientific advance.

At the popular level, movies and television play on variations of the Frankenstein, Godzilla, and X-Files motifs, reflecting anxieties over science gone wrong. These “monsters” give form (albeit imaginary) to some of humanity’s deepest fears. They reflect not only the apprehension of Pandora’s box unearthed, but more significantly, the hubris of human pride and lust for power unrestrained. Nowhere is this more evident than in the new field of biotechnology—the actual manipulation of life at the subtle genetic source. Scientists now talk of the end of evolution, the end of nature, in the sense that humans will soon replace nature to direct the course of creation themselves. Doctor Panayiotis Zavos, who is now actively engaged in producing the first human clone, announced proudly, “Now that we have crossed into the third millennium, we have the technology to break the rules of nature.”

Thus, the development and unleashing of “advanced” weapons of mass destruction through two World Wars, the Cold War, and now almost daily in “hot spots” throughout the world; the unenlightened tampering with nature that has brought about widespread environmental pollution; the almost cavalier experiments with human reproduction, cloning, genetically engineered life, chemical-biological warfare—all threaten to make reality more frightening than fiction.

The second area of doubt regarding modern science arises from within the scientific community itself. The last decades of the 20th century have seen an internal reexamination take place within almost every scientific discipline, as each has been forced to question its own foundations and exclusive claims to truth. We are in the midst of a major paradigm shift, the outcome of which still remains unclear. It revolves around a loss of the positivistic certainty that science once enjoyed and now finds slipping away. Ironically, the scientific “establishment” finds itself confronting a challenge to its exclusive authority that in many ways mirrors the spiritual crisis that religious orthodoxy faced with the triumph of modern science.

Sigmund Freud exemplifies this ironic shift. Perhaps more than any modern thinker, he contributed to the undermining of religious certainty. He stated quite unequivocally that “an illusion would be to suppose that what science would not give us, we can get elsewhere.” Elsewhere, of course, refers to religion, as he made clear in his pessimistic indictment of religion in The Future of an Illusion. And yet his own psychoanalytic theory has become a matter of intense debate, and has come under the critical scrutiny of the very scientific system he felt would validate his ideas. But it is in areas other than psychology, most notably in physics, and increasingly in the life sciences, that a growing body of new knowledge is beginning to strain existing models of explanation and understanding.

With the ground-breaking work of Niels Bohr, Werner Heisenberg, and Sir Arthur Eddington, the rock-solid presuppositions central to classical scientific thought began to crumble. With the “new science” that emerged in the first half of the twentieth century, the observer and the observed could no longer be presumed separate and distinct. Gone too was the neat subject/object distinction that had come to define classical science. This shift away from the study of the “outside” objective world of nature to the “inner” subjective world of the observer is a hallmark of the new science. As Heisenberg observed, “Even in science, the object of research is no longer nature itself, but man’s investigation of nature.”

For example, Heisenberg pointed out that the very act of measurement interfered with what one was attempting to measure. You cannot separate the subject from the object of the experiment. So, if the scientist changes the very nature of the “reality” he or she investigates, then what is truth? What is purely objective fact? Where does the boundary lie (indeed, if there is one) between the mind and the external world? Consequently, the quantum theory of the new physics no longer claims to be describing “reality.” It describes probable realities. The new physics looks for possible realities and finds them so elusive that no one model can exhaustively account for everything. The indeterminacy of models has replaced earlier certainties.

Some, like Thomas Kuhn, even questioned the notion of science as an objective progression towards truth. In The Structure of Scientific Revolutions (1962), Kuhn observed that science, like religion, becomes heavily encumbered with its own baggage of non-rational procedures. Science accumulates its peculiar set of presuppositions, doctrines, and even heresies. Kuhn essentially demolished the logical empiricist and purist view that science personified the impartial progression towards a universal truth. Instead, he saw it as a series of shifting “paradigms”—global ways of seeing things that are relatively immune from disconfirmation by experience. One paradigm would hold sway for a while, only to be displaced in a “revolution” by another conceptual worldview. These paradigms, self-contained and self-perpetuating, tended to conserve their own ideas, just as religion tends to conserve its own beliefs.

For example, Galileo declared in the early 1600s that Copernicus was correct: The earth moves, and the sun, not the earth, is the center of our planetary system. The Church denounced these views as heresies and dangerous to the Faith. They forced Galileo to recant during a trial under the Inquisition. Although he was publicly compelled to affirm the existing “scientific” paradigm, Galileo still defied the authorities. After getting up from his knees, he is said to have mumbled “E pur si muove” (nevertheless it still moves). Placed under house arrest, Galileo lived out the rest of his life in seclusion.

The world, of course, shifted paradigms to accept the Copernican worldview. The Church, however, lagged behind, and only in 1992 did the Vatican formally acknowledge its error in condemning Galileo. Einstein, whose theory of relativity was at first met with skepticism and doubt, later became an icon of scientific genius. And yet, even Einstein found himself resisting the new theories of the quantum physicists towards the end of his life—once again adding credibility to Kuhn’s thesis.

Whether Kuhn is correct or not is beside the point. His critique illustrates a larger trend: the suspicion that science does not have absolute answers, nor even ultimate authority. Thus, modern science presents less of a unified front, less of a final bastion of truth. Certainly many people still see themselves as living in a black and white world. But, in general, many scientists are coming to define their discipline in a more humble and tentative way. Science, for people at the turn of the twentieth century, stood for absolute, fixed truths and principles that held good forever; it embraced and explained an unchanging reality, or at least a reality that was changing according to constant and predictable laws. Today we are more modest, less presumptuous. A better working definition of science now might be “a form of inquiry into natural phenomena; a consensus of information held at any one time, all of which may be modified by new discoveries and new interpretations at any moment.” In contemporary science, uncertainty seems to be the rule.

Thus, it grows increasingly difficult to believe in an external world governed by mechanisms that science discloses once and for all. Thoughtful people find themselves hesitant, unmoored, with an up-in-the-air kind of feeling regarding the most basic facts of life. It is said that “we live in an age when anything is possible and nothing is certain.” This post-modern dilemma highlights the felt need to reconcile facts and values, morals and machines, science with spirituality. And while traditional Judaeo-Christian theologies struggle to address this particularly contemporary malaise, Buddhism negotiates this tricky terrain with apparent ease and finds itself sought after with renewed interest and popularity.

Moreover, some observers have puzzled over this anomaly: Asia accelerates in its secular and material modernization (read “Westernization”), while the West shows signs of a spiritual revitalization drawing on largely Asian sources—especially Buddhism. Buddhism is being ‘Westernized’ into a teaching that can both mesh with the good life and mitigate the stress of the faith/reason divide. Part of Buddhism’s immense appeal lies in its analysis of the mind, the subject/self—exactly the area where modern science now senses the next breakthroughs are to be made.

The Buddha, well before Aquinas or Heisenberg, stressed the primacy of the mind in the perception and even “creation” of reality. A central concept of Buddhism is the idea that “everything is made from the mind.” Any distinction between subject and object is false, imagined, at best an expedient nod to demands of conventional language. In the Avatamsaka Sutra, the Buddha uses metaphor to elucidate: “The mind is like an artist/It can paint an entire world. . . If a person knows the workings of the mind/As it universally creates the world/This person then sees the Buddha/And understands the Buddha’s true and actual nature.” (Chap. 20) We think we are observing nature, but what we are observing is our own mind at work. We are the subject and object of our own methodology. Moreover, this mind encompasses the entirety of the universe; there is nothing outside of it, nothing it does not contain, according to the Buddha.

Such insights intrigued Western thinkers early on, as Buddhism hinted at new avenues of travel through the mind/matter maze. It led scientists like Albert Einstein to declare:

The religion of the future will be cosmic religion. It should transcend a personal God and avoid dogmas and theology. Covering both the natural and the spiritual, it should be based on a religious sense arising from the experience of all things, natural and spiritual and a meaningful unity. Buddhism answers this description. . . If there is any religion that would cope with modern scientific needs, it would be Buddhism.

The Nobel Prize winner was not alone in his positive assessment of Buddhism’s potential for going beyond the boundaries of Western thought. The British mathematician and philosopher Alfred North Whitehead declared, “Buddhism is the most colossal example in the history of applied metaphysics.” His contemporary Bertrand Russell, another Nobel Prize winner, found in Buddhism the greatest religion in history because “it has had the smallest element of persecution.” But beyond the freedom of inquiry he attributed to the Buddha’s teaching, Russell discovered a superior scientific method—one that reconciled the speculative and the rational while investigating the ultimate questions of life:

Buddhism is a combination of both speculative and scientific philosophy. It advocates the scientific method and pursues that to a finality that may be called Rationalistic. In it are to be found answers to such questions of interest as: ‘What is mind and matter? Of them, which is of greater importance? Is the universe moving towards a goal? What is man’s position? Is there living that is noble?’ It takes up where science cannot lead because of the limitations of the latter’s instruments. Its conquests are those of the mind.

As early as the 1940s, the pioneering physicist Niels Bohr sensed this congruence between modern science and what he called “Eastern mysticism.” As he investigated atomic physics and searched for a unified field of reality, he often invoked the Buddha and Lao Tzu in his classroom discussions of physics. He made up his own coat of arms with the yin/yang symbol on it. The American physicist J. Robert Oppenheimer also saw in Buddhism a scientific parallel to the puzzling riddles of modern physics; his cutting-edge discoveries seemed to echo the enigmatic wisdom of the ancient sage. Wrote Oppenheimer:

If we ask, for instance, whether the position of the electron remains the same, we must say ‘no;’ if we ask whether the electron’s position changes with time, we must say ‘no;’ if we ask whether the electron is at rest, we must say ‘no;’ if we ask whether it is in motion, we must say ‘no.’ The Buddha has given such answers when interrogated as to the conditions of man’s self after his death; but they are not familiar answers for the tradition of seventeenth and eighteenth-century science.

In the 1970s, in The Tao of Physics: An Exploration of the Parallels Between Modern Physics and Eastern Mysticism, Fritjof Capra expanded on some of Bohr’s and Oppenheimer’s tentative impressions. He argued that modern science and Eastern mysticism offer parallel insights into the ultimate nature of reality. But, beyond this, Capra suggested that the profound harmony between these concepts as expressed in systems language and the corresponding ideas of Eastern mysticism was impressive evidence for a remarkable claim: that mystical philosophy offers the most consistent background to our modern scientific theories.

In the 1970s this notion came as something of a bombshell. Suddenly religion and science were reunited, though in an unexpected pairing: Eastern religion and Western science. This echoed the excitement that Carus and other late Victorians had sensed in Buddhism’s potential a hundred years earlier. Then, however, the emphasis was on how Buddhism could help establish religion on a more scientific basis; now it seems the other way around—science seeks out Buddhism to stake its spiritual or metaphysical claims.

Regardless, those familiar with Buddhist texts immediately saw (or thought they saw) the correctness of Capra’s revelation. Certain Buddhist scriptures in fact seemed most solidly to confirm the linking of science and Dharma. The most oft-quoted is the famous teaching called the Kalama Sutta.

In this short discourse, we find the Buddha in his wanderings coming upon the village of the Kalamas. Religious seekers themselves, the Kalamas were bewildered by the plethora of divergent philosophies and teachers vying for their attention. They proceeded to ask the Buddha a series of questions. Here is the relevant portion of the text:

The Buddha once visited a small town called Kesaputta in the kingdom of Kosala. The inhabitants of this town were known by the common name Kalama. When they heard that the Buddha was in their town, the Kalamas paid him a visit, and told him:

“Sir, there are some recluses and brahmanas who visit Kesaputta. They explain and illumine only their own doctrines, and despise, condemn and spurn others’ doctrines. Then come other recluses and brahmanas, and they, too, in their turn, explain and illumine only their own doctrines, and despise, condemn and spurn others’ doctrines. But, for us, Sir, we have always doubt and perplexity as to who among these venerable recluses and brahmanas spoke the truth, and who spoke falsehood.”

“Yes, Kalamas, it is proper that you have doubt, that you have perplexity, for a doubt has arisen in a matter which is doubtful. Now, look you Kalamas, do not be led by reports, or tradition, or hearsay. Be not led by the authority of religious texts, not by mere logic or inference, nor by considering appearances, nor by the delight in speculative opinions, nor by seeming possibilities, nor by the idea: ‘this is our teacher’. But O Kalamas, when you know for yourselves that certain things are unwholesome (akusala), and wrong, and bad, then give them up…And when you know for yourselves that certain things are wholesome (kusala) and good, then accept them and follow them.”

The Kalamas voiced their doubts, their perplexity in determining truth or falsehood, as a result of having been exposed to all the competing teachers and doctrines of India at the time, a situation not unlike our modern world. Each teacher, each school, expounded different and often conflicting notions of the truth. The Buddha’s response was to set down a methodology that was in many ways ahead of its time in anticipating the skeptical empiricism of the modern scientific method.

He said, “Do not be led by reports, or tradition, or hearsay. Don’t be led by the authority even of religious texts, nor by mere logic or inference, nor by considering appearances”—all of which eliminate exclusive reliance on cultural convention, received tradition, and deductive speculation, as well as mere sense impressions. Also rejected were opinions and “seeming possibilities”—the stuff of preconceived bias and subjective imagination and fancy. (Some might argue that being “led by appearances” would include a narrow scientific method, at least as it has come to be popularly understood—i.e. an exaggerated reliance on natural phenomena as the only basis of what is true or real. It would also dismiss the equally exaggerated claim that scientific knowledge is the only valid kind of knowledge.) The Buddha even discounts blind faith in one’s teacher.

So what’s left? Here the Buddha lays out a subtle and quite unique epistemology: “Oh Kalamas, when you know for yourselves that certain things are unwholesome and wrong and bad, then give them up. And when you know that certain things are wholesome and good, then accept them and follow them.” But how to interpret this key passage?

Many scholars and believers, both recently and at the turn of the century, jumped at this passage as confirmation that ancient Buddhist wisdom validates modern science. Early popularizers of Eastern religions in America like Anagarika Dharmapala, D. T. Suzuki, Paul Carus, and even Vedantists like Vivekananda, generally waxed enthusiastic about the compatibility of Eastern spirituality and Western science. They saw in passages like the Kalama Sutta proof positive that the Buddha prefigured the modern scientific outlook. Buddhism seemed eminently scientific: detached skeptical investigation of empirically testable phenomena; no faith, no dogma, no revelation. Experiments carried out by and confirmed by individuals regardless of time or place suggested “intersubjective testability”—one of the hallmarks of the scientific method. I do it, you do it; anyone can do it and obtain the same results. That Buddhism and science should be so nearly identical was understandably immensely appealing; it is also misleading.

While American thinkers and newly converted Western Buddhists thought they saw a natural fit between Buddhism and science, Buddhist teachers more steeped in the traditional discipline were less apologetic and often more critical of such facile comparisons. Two notable contemporary examples come to mind: Master Hsuan Hua, from the Mahayana tradition, and Walpola Rahula, a Theravada scholar-monk. Both threw cold water on this notion.

The Venerable Hsuan Hua, a Ch’an and Tripitaka master from China, arrived in America in the early 1960s to propagate the Dharma in the West. As he observed and studied the trends and currents of contemporary thought, he showed little enthusiasm for what seemed to him the exaggerated claims of modern science—theoretical or applied. He said, “Within the limited world of the relative, that is where science is. It’s not an absolute Dharma. Science absolutely cannot bring true and ultimate happiness to people, neither spiritually nor materially.” This is strong criticism that portrays science as a discipline limited to relative truths, and as an unsatisfactory way of life. In another essay, he wrote:

Look at modern science. Military weapons are modernized every day and are more and more novel every month. Although we call this progress, it’s nothing more than progressive cruelty. Science takes human life as an experiment, as child’s play, as it fulfills its desires through force and oppression.

In 1989, Venerable Walpola Rahula, a Theravadin monk from Sri Lanka, also warned that daily life is being permeated by science. He cautioned, “We have almost become slaves of science and technology; soon we shall be worshipping it.” His comments come well into the final decades of the twentieth century, when many people had in effect turned science into a religious surrogate. The Venerable monk observed, “Early symptoms are that they tend to seek support from science to prove the validity of our religions.” Walpola Rahula elaborated on this point:

We justify them [i.e. religions] and make them modern, up-to-date, respectable, and accessible. Although this is somewhat well intentioned, it is ill-advised. While there are some similarities and parallel truths, such as the nature of the atom, the relativity of time and space, or the quantum view of the interdependent, interrelated whole, all these things were developed by insight and purified by meditation.

Rahula’s critique goes to the heart of the matter: the capitulation of religion to scientific positivism; the yielding of almost all competing schemes of values to the scientific juggernaut. Huston Smith, the eminent scholar of the world’s religions, recently said that the weakness of modern religions in the West stems from their successful accommodation to culture. The contribution that Buddhism and other religions can make to the spiritual crisis facing modern society, therefore, may not lie in their compatibility with science, but in their ability to offer something that science cannot.

More importantly, as Rahula argues, the Dharma, or abiding spiritual truth, was discovered without the help of any external instrument. Rahula concluded, “It is fruitless, meaningless to seek support from science to prove religious truth. It is incongruous and preposterous to depend on changing scientific concepts to prove and support perennial religious truths.” Moreover, he echoes the deeper moral concerns expressed by Master Hua regarding the unexamined aims and consequences of the scientific endeavor:

Science is interested in the precise analysis and study of the material world, and it has no heart. It knows nothing about love or compassion or righteousness or purity of mind. It doesn’t know the inner world of humankind. It only knows the external, material world that surrounds us.

Rahula then suggests that the value of Buddhism redoubles not as it is made to seem more scientific, but as it reaffirms a different sensibility, an overarching and unyielding vision of humanity’s higher potential. He concludes emphatically:

On the contrary, religion, particularly Buddhism, aims at the discovery and the study of humankind’s inner world: ethical, spiritual, psychological, and intellectual. Buddhism is a spiritual and psychological discipline that deals with humanity in total. It is a way of life. It is a path to follow and practice. It teaches man how to develop his moral and ethical character, which in Sanskrit is sila, and to cultivate his mind, samadhi, and to realize the ultimate truth, prajna wisdom, Nirvana.

Both of these eminent monks pre-date and, in many ways, stand outside the popularization and “Westernization” of Buddhism. Unlike the Western-leaning translators of Buddhism (Carus, Suzuki, Dharmapala, et al.), they emerged from a monastic discipline grounded in a more traditional understanding, one less enamored of modern science and more critical of Western philosophy. They would not so readily concur with Sir Edwin Arnold, who wrote in his best-selling Light of Asia (1879) that “between Buddhism and modern science there exists a close intellectual bond.”

With this in mind, it would do well to take another look at the passage quoted above from the Kalama Sutta:

But O Kalamas, when you know for yourselves that certain things are unwholesome (akusala), and wrong, and bad, then give them up…And when you know for yourselves that certain things are wholesome (kusala) and good, then accept them and follow them.

These lines, I believe, hold the key to understanding the difference between Buddhism and modern science. The passage needs to be understood not simply as a nod to Western empiricism, but within a specific context of moral inquiry. This “knowing for yourself” locates knowledge (‘scientia’) firmly within the moral sphere, both in its aims and its outcomes. It employs a meditative form of insight to penetrate the ultimate nature of reality. It implies a concept quite foreign to modern science: that the knower and what is known, the subject and object, fact and value, are not merely non-dual, but that knowledge itself is inescapably influenced by our moral and ethical being. Perhaps this is exactly what Suzuki intuited was lacking in modern science when he wrote in 1959, “I now think that a religion based solely on science is not enough. There are certain ‘mythological’ elements in every one of us, which cannot be altogether lost in favor of science.”

Regardless, none of this critical reassessment should come as a surprise to thoughtful Buddhists. The Shurangama Sutra clearly notes, “when the seed planted is crooked, the fruit will be distorted.” The close link between intention and result, cause and effect, is central to all Buddhist philosophy. It should be obvious and expected that the very fabric of modern science, lacking as it does a firm grounding in the moral sphere, would result in deleterious discoveries and incomplete uses. Tragic examples abound attesting to the ill-fated marriage of scientific technology and human ignorance.

Nor, from a Buddhist perspective, can these examples be seen as unintended consequences or accidents—they are, rather, unavoidable and logical outcomes of a partial though powerful system of thought. There is nothing in science per se that would lead one to equate its advancement with increased social benefits and enhanced human values. And certainly the absence of ethical imperatives should alert any knowledgeable Buddhist to a fundamental flaw in equating the Eightfold Way with the practice of science. In fact, a close reading of the Buddhist sources, it seems, would lead one to question: Is science in itself sufficient for describing reality? Is it capable of meeting human needs?

Thus, the aforementioned Kalamas passage, depending on one’s frame of reference, could be seen more as a critique of than a correspondence with modern science. The key to understanding this difference lies in a correct Buddhist interpretation of “know for yourselves,” “wholesome,” and “unwholesome.” As Walpola Rahula indicates, these concepts are part of a specific and disciplined form or methodology of self-cultivation which, when diligently practiced, leads to true knowledge and wisdom. This method is referred to in Buddhism as the “three non-outflow science” (san wu lou xue), and consists of morality, concentration, and wisdom (Sanskrit: sila, samadhi, prajna).

The ethical component cannot be overemphasized, as “seeing things as they really are” entails an indispensable preliminary: “purification of the mind.” This clarity of mind and concentrated awareness in turn begins with and must be sustained by moral conduct. The Visuddhimagga (Path of Purification), an early Buddhist manual compiled in the 5th century by Buddhaghosa, lists the Buddha’s “science” of inquiry as an interrelated three-step exercise of virtue, meditation, and insight. This is quite a different approach to knowledge from what a modern-day scientist would presume or pursue. It is interesting that these ancient wisdom traditions considered moral purity as the absolute prerequisite of true knowledge, and that we today regard it as immaterial, if not downright irrelevant. Thus, fundamental and qualitatively different views of what constitutes knowledge and the acquisition of knowledge separate Buddhism and science.

Aspects of the above epistemological formula appear throughout the Asian religious traditions. For example, Taoism speaks of cultivating the mind (hsin), regarding it as the repository of perceptions and knowledge—it rules the body, it is spiritual and like a divinity that will abide “only where all is clean.” Thus the Kuan Tzu (4th to 3rd century B.C.) cautions that “All people desire to know, but they do not inquire into that whereby one knows.” It specifies:

What all people desire to know is that (i.e., the external world),

But their means of knowing is this (i.e. oneself);

How can we know that?

Only by the perfection of this. [1]

Are we studying ourselves when we think we are studying nature? Will the “new science” eventually come to Kuan Tzu’s conclusion that only “by perfecting this” can we truly know that? These ancient writings raise an interesting question: How accurate and objective can the observation be if the observer is flawed and imperfect? Is the relationship between “consciousness” and matter as distinct as we are inclined to believe?

The “perfection” mentioned above refers to the cultivation of moral qualities and, in Buddhist terminology, the elimination of “afflictions” (klesa) such as greed, anger, ignorance, pride, selfishness, and emotional extremes. It seems less an alteration of consciousness than a purification and quieting of the mind. Mencius talks of obtaining an “unmoving mind” at age forty, again referring to the cultivation of an equanimity resulting from the exercise of moral sense. He distinguished between knowledge acquired from mental activity and knowledge gained from intuitive insight. This latter knowledge he considered superior, as it gives noumenal as well as phenomenal understanding. Advaita Vedanta, the philosophical teaching of Hinduism, likewise emphasizes that jnana (knowledge) requires a solid basis in ethics (Dharma). Chuang Tzu spoke of acquiring knowledge of “the ten thousand things” (i.e., of all nature) through virtuous living and practicing stillness: “to a mind that is ‘still’ the whole universe surrenders.” [2] Even Confucius’s famous passage concerning the highest learning (da xue) connects utmost knowledge of the universe to the cultivation of one’s person and the rectification of one’s mind. [3]

The challenge from these eminent Buddhist teachers to the nearly ex cathedra authority generally accorded to science should give pause to anyone attempting a facile identification of Buddhism with science. Their aims and methods, though tantalizingly parallel, diverge upon closer analysis. Correspondences do exist, but fundamental differences inhere as well. To gloss over them not only encourages sloppy thinking, but approaches hubris. So we must ask: to what extent is our conception of science as the arbiter of knowledge culture-bound, even myopic? Could our near-total faith in science blind us to an inherent bias in such a stance? We presume that the logic, norms, and procedures of the scientific method are universally applicable and that their findings are universally valid. Science may not only have limited relevance for interpreting Buddhism, but may distort our very understanding of its meaning.

Thus, in a quest to reach an easy and elegant reconciliation of faith and reason, we may unwittingly fall prey to “selective perception”—noticing and embracing only those elements of Buddhism that seem consonant with our way of thinking and giving short shrift to the rest. Overplaying the similarities between science and Buddhism can lead into a similar trap, where our dominant Western thought-way (science) handicaps rather than helps us to understand another worldview. In Buddhism, this is called “the impediment of what is known.”

It may prove more salutary to allow Buddhism to “rub us the wrong way” — to challenge our preconceptions and habitual ways, to remain strange and different from anything to which we have been accustomed. To borrow a metaphor from Henry Clarke Warren, we might enjoy a “walking in Fairyland” in shoes that do not quite fit:

A large part of the pleasure that I have experienced in the study of Buddhism has arisen from what I may call the strangeness of the intellectual landscape. All the ideas, the modes of argument, even the postulates assumed and not argued about, have always seemed so strange, so different from anything to which I have been accustomed, that I felt all the time as though walking in Fairyland. Much of the charm that the Oriental thoughts and ideas have for me appears to be because they so seldom fit into Western categories. [4]

1 Arthur Waley, The Way and Its Power: A Study of the Tao Te Ching and Its Place in Chinese Thought (New York: Grove Press, 1958), 47.

2 Ibid., 58.

3 James Legge, trans., Confucius: Confucian Analects, The Great Learning, and The Doctrine of the Mean (New York: Dover, 1971; first published 1893), 4-7.

4 Henry Clarke Warren, Buddhism In Translations (Cambridge: Harvard University Press, 1896), 283-84.

THE STONE November 17, 2012, 3:24 PM
By CHRISTY WAMPOLE

If irony is the ethos of our age — and it is — then the hipster is our archetype of ironic living.

The hipster haunts every city street and university town. Manifesting a nostalgia for times he never lived himself, this contemporary urban harlequin appropriates outmoded fashions (the mustache, the tiny shorts), mechanisms (fixed-gear bicycles, portable record players) and hobbies (home brewing, playing trombone). He harvests awkwardness and self-consciousness. Before he makes any choice, he has proceeded through several stages of self-scrutiny. The hipster is a scholar of social forms, a student of cool. He studies relentlessly, foraging for what has yet to be found by the mainstream. He is a walking citation; his clothes refer to much more than themselves. He tries to negotiate the age-old problem of individuality, not with concepts, but with material things.

He is an easy target for mockery. However, scoffing at the hipster is only a diluted form of his own affliction. He is merely a symptom and the most extreme manifestation of ironic living. For many Americans born in the 1980s and 1990s — members of Generation Y, or Millennials — particularly middle-class Caucasians, irony is the primary mode with which daily life is dealt. One need only dwell in public space, virtual or concrete, to see how pervasive this phenomenon has become. Advertising, politics, fashion, television: almost every category of contemporary reality exhibits this will to irony.

Take, for example, an ad that calls itself an ad, makes fun of its own format, and attempts to lure its target market to laugh at and with it. It pre-emptively acknowledges its own failure to accomplish anything meaningful. No attack can be set against it, as it has already conquered itself. The ironic frame functions as a shield against criticism. The same goes for ironic living. Irony is the most self-defensive mode, as it allows a person to dodge responsibility for his or her choices, aesthetic and otherwise. To live ironically is to hide in public. It is flagrantly indirect, a form of subterfuge, which means etymologically to “secretly flee” (subter + fuge). Somehow, directness has become unbearable to us.

How did this happen? It stems in part from the belief that this generation has little to offer in terms of culture, that everything has already been done, or that serious commitment to any belief will eventually be subsumed by an opposing belief, rendering the first laughable at best and contemptible at worst. This kind of defensive living works as a pre-emptive surrender and takes the form of reaction rather than action.

Life in the Internet age has undoubtedly helped a certain ironic sensibility to flourish. An ethos can be disseminated quickly and widely through this medium. Our incapacity to deal with the things at hand is evident in our use of, and increasing reliance on, digital technology. Prioritizing what is remote over what is immediate, the virtual over the actual, we are absorbed in the public and private sphere by the little devices that take us elsewhere.

Furthermore, the nostalgia cycles have become so short that we even try to inject the present moment with sentimentality, for example, by using certain digital filters to “pre-wash” photos with an aura of historicity. Nostalgia needs time. One cannot accelerate meaningful remembrance.

While we have gained some skill sets (multitasking, technological savvy), other skills have suffered: the art of conversation, the art of looking at people, the art of being seen, the art of being present. Our conduct is no longer governed by subtlety, finesse, grace and attention, all qualities more esteemed in earlier decades. Inwardness and narcissism now hold sway.

Born in 1977, at the tail end of Generation X, I came of age in the 1990s, a decade that, bracketed neatly by two architectural crumblings — of the Berlin Wall in 1989 and the Twin Towers in 2001 — now seems relatively irony-free. The grunge movement was serious in its aesthetics and its attitude, with a combative stance against authority, which the punk movement had also embraced. In my perhaps over-nostalgic memory, feminism reached an unprecedented peak, environmentalist concerns gained widespread attention, questions of race were more openly addressed: all of these stirrings contained within them the same electricity and euphoria touching generations that witness a centennial or millennial changeover.

But Y2K came and went without disaster. We were hopeful throughout the ’90s, but hope is such a vulnerable emotion; we needed a self-defense mechanism, for every generation has one. For Gen Xers, it was a kind of diligent apathy. We actively did not care. Our archetype was the slacker who slouched through life in plaid flannel, alone in his room, misunderstood. And when we were bored with not caring, we were vaguely angry and melancholic, eating anti-depressants like they were candy.

FROM this vantage, the ironic clique appears simply too comfortable, too brainlessly compliant. Ironic living is a first-world problem. For the relatively well educated and financially secure, irony functions as a kind of credit card you never have to pay back. In other words, the hipster can frivolously invest in sham social capital without ever paying back one sincere dime. He doesn’t own anything he possesses.

Obviously, hipsters (male or female) produce a distinct irritation in me, one that until recently I could not explain. They provoke me, I realized, because they are, despite the distance from which I observe them, an amplified version of me.

I, too, exhibit ironic tendencies. For example, I find it difficult to give sincere gifts. Instead, I often give what in the past would have been accepted only at a White Elephant gift exchange: a kitschy painting from a thrift store, a coffee mug with flashy images of “Texas, the Lone Star State,” plastic Mexican wrestler figures. Good for a chuckle in the moment, but worth little in the long term. Something about the responsibility of choosing a personal, meaningful gift for a friend feels too intimate, too momentous. I somehow cannot bear the thought of a friend disliking a gift I’d chosen with sincerity. The simple act of noticing my self-defensive behavior has made me think deeply about how potentially toxic ironic posturing could be.

First, it signals a deep aversion to risk. As a function of fear and pre-emptive shame, ironic living bespeaks cultural numbness, resignation and defeat. If life has become merely a clutter of kitsch objects, an endless series of sarcastic jokes and pop references, a competition to see who can care the least (or, at minimum, a performance of such a competition), it seems we’ve made a collective misstep. Could this be the cause of our emptiness and existential malaise? Or a symptom?

Throughout history, irony has served useful purposes, like providing a rhetorical outlet for unspoken societal tensions. But our contemporary ironic mode is somehow deeper; it has leaked from the realm of rhetoric into life itself. This ironic ethos can lead to a vacuity and vapidity of the individual and collective psyche. Historically, vacuums eventually have been filled by something — more often than not, a hazardous something. Fundamentalists are never ironists; dictators are never ironists; people who move things in the political landscape, regardless of the sides they choose, are never ironists.

Where can we find other examples of nonironic living? What does it look like? Nonironic models include very young children, elderly people, deeply religious people, people with severe mental or physical disabilities, people who have suffered, and those from economically or politically challenged places where seriousness is the governing state of mind. My friend Robert Pogue Harrison put it this way in a recent conversation: “Wherever the real imposes itself, it tends to dissipate the fogs of irony.”

Observe a 4-year-old child going through her daily life. You will not find the slightest bit of irony in her behavior. She has not, so to speak, taken on the veil of irony. She likes what she likes and declares it without dissimulation. She is not particularly conscious of the scrutiny of others. She does not hide behind indirect language. The most pure nonironic models in life, however, are to be found in nature: animals and plants are exempt from irony, which exists only where the human dwells.

What would it take to overcome the cultural pull of irony? Moving away from the ironic involves saying what you mean, meaning what you say and considering seriousness and forthrightness as expressive possibilities, despite the inherent risks. It means undertaking the cultivation of sincerity, humility and self-effacement, and demoting the frivolous and the kitschy on our collective scale of values. It might also consist of an honest self-inventory.

Here is a start: Look around your living space. Do you surround yourself with things you really like or things you like only because they are absurd? Listen to your own speech. Ask yourself: Do I communicate primarily through inside jokes and pop culture references? What percentage of my speech is meaningful? How much hyperbolic language do I use? Do I feign indifference? Look at your clothes. What parts of your wardrobe could be described as costume-like, derivative or reminiscent of some specific style archetype (the secretary, the hobo, the flapper, yourself as a child)? In other words, do your clothes refer to something else or only to themselves? Do you attempt to look intentionally nerdy, awkward or ugly? In other words, is your style an anti-style? The most important question: How would it feel to change yourself quietly, offline, without public display, from within?

Attempts to banish irony have come and gone in past decades. The loosely defined New Sincerity movements in the arts that have sprouted since the 1980s positioned themselves as responses to postmodern cynicism, detachment and meta-referentiality. (New Sincerity has recently been associated with the writing of David Foster Wallace, the films of Wes Anderson and the music of Cat Power.) But these attempts failed to stick, as evidenced by the new age of Deep Irony.

What will future generations make of this rampant sarcasm and unapologetic cultivation of silliness? Will we be satisfied to leave an archive filled with video clips of people doing stupid things? Is an ironic legacy even a legacy at all?

The ironic life is certainly a provisional answer to the problems of too much comfort, too much history and too many choices, but it is my firm conviction that this mode of living is not viable and conceals within it many social and political risks. For such a large segment of the population to forfeit its civic voice through the pattern of negation I’ve described is to siphon energy from the cultural reserves of the community at large. People may choose to continue hiding behind the ironic mantle, but this choice equals a surrender to commercial and political entities more than happy to act as parents for a self-infantilizing citizenry. So rather than scoffing at the hipster — a favorite hobby, especially of hipsters — determine whether the ashes of irony have settled on you as well. It takes little effort to dust them away.

Christy Wampole is an assistant professor of French at Princeton University. Her research focuses primarily on 20th- and 21st-century French and Italian literature and thought.

Stanford Ovshinsky may not be a household name, but his inventions have the power to change the world

Nov 30th 2006 | from the print edition

“THE ages of mankind have been classified by the materials they use—the Bronze Age, the Iron Age, the Age of Silicon. We are at the dawn of the Hydrogen Age.” So proclaims Stanford Ovshinsky, co-founder of Energy Conversion Devices (ECD), a company based near Detroit, Michigan. “What is more,” he says, “the hydrogen economy is happening already.”

There have been plenty of grandiose but unsubstantiated claims made over the past five years about the potential for hydrogen to replace fossil fuels as an energy carrier, so some scepticism is certainly in order. In particular, President George Bush and the big carmakers have been trumpeting hydrogen fuel cells—electrochemical devices that turn hydrogen into electricity and water vapour—as the replacement for the internal-combustion engine. But the date of commercialisation seems forever slipping just beyond the horizon.

That has prompted a backlash from advocates of rival technologies (such as ethanol-based engines and novel batteries) and from greens, who argue that hydrogen is just a cynical long-term diversion used by Mr Bush and Detroit to avoid short-term action on fuel-economy standards, plug-in hybrids and other here-and-now options. And yet here is Mr Ovshinsky, still trumpeting hydrogen’s virtues despite bitter opposition.

Three things set Mr Ovshinsky apart from the hydrogen hypesters. First of all, he is no newcomer. He first outlined his vision for what he calls a “hydrogen loop” some five decades ago as an alternative to fossil fuels. (The loop goes from water to stored hydrogen via solar-powered electrolysis, and from hydrogen back to water, generating electricity in the process, via a fuel cell.) Unlike others, he can hardly be accused of opportunistically seizing upon this obscure techno-fix for political reasons.

The second difference is that Mr Ovshinsky’s green credentials are impeccable. He and his wife Iris, who died recently, founded ECD in 1960 with the explicitly stated goal of “using creative science to solve societal problems”. Astonishingly, they had the foresight to predict—long before the oil shocks of the 1970s—that the world’s addiction to oil would have unacceptable side effects, from resource wars to climate change. Spend time with Mr Ovshinsky and his employees, and it becomes plain that his social values permeate his organisation.

But what lifts Mr Ovshinsky into the league of genius inventors is something rather less common: success. He is the inventor of the nickel-metal hydride (NiMH) battery, which is used to power everything from portable electronics to hybrid cars; around 1 billion such batteries are sold every year. He has also made advances in information technology (he calls information “encoded energy”) and holds critical patents relating to thin-film solar cells, rewriteable optical discs, a new form of non-volatile memory and flat-panel displays. These technologies are being commercialised through deals with Intel, Samsung, STMicroelectronics, General Electric, Chevron, United Solar Ovonic, and others.

Innovation from disorder

What all these apparently disparate inventions have in common is that they rely on Mr Ovshinsky’s path-breaking discoveries in the field of disordered or “amorphous” materials, since named “ovonics” in his honour. Such materials can be used for energy generation (in fuel cells and solar cells), for energy storage (in batteries), for computing (to store data on discs or in chips) and to create custom materials with novel properties.

Mr Ovshinsky has spent the past five decades devising actual working products, based on amorphous materials, that fill every niche in his hydrogen loop, from thin-film solar panels to solid-hydrogen storage tanks to “regenerative” fuel cells that can store energy captured while a car is braking. ECD has even “hacked” a Toyota Prius hybrid car so that it runs on pure hydrogen rather than petrol, which he says proves that “we don’t have to wait for fuel cells to move into the hydrogen economy.”

All this makes it tempting to compare ECD’s co-founder with Thomas Edison, the great inventor from another age who founded General Electric. Both established themselves early on not only as brilliant innovators, but also as inventors with their feet firmly planted on the ground. Both arose from humble roots: Edison was not born to privilege, while Mr Ovshinsky’s father collected scrap by buggy. Mr Ovshinsky did not even go to college, and credits his vast knowledge of science to the public libraries of his native Ohio. He likes to say, “invention comes to the prepared mind.” And Edison, like Mr Ovshinsky, straddled the fields of energy and information technology: he originally made his name with the invention of the quadruplex, a device that increased the capacity of telegraph lines, before moving on to electrification.

Another similarity between the two inventors is that both thought of their inventions as entire systems. They had the verve to envisage a radically different world, but were good at inventing the practical things needed to get there. In Edison’s case, his vision was that of mass electrification. He was not the first to make a light bulb, but he vastly improved it and, more importantly, created the generation and distribution technologies needed to make it work, from power stations to electricity meters. His company, now called GE, helped to light up America and then the world.

Despite his lack of formal training, the charming, soft-spoken Mr Ovshinsky is not at all threatened by scientists with fancy degrees: he hires many of them, and has hosted lively debates around a round table at ECD with such prominent scientists as Hellmut Fritzsche and Morrel Cohen of the University of Chicago, David Adler of MIT and Sir Nevill Mott of Cambridge University (who went on to win a Nobel prize for work on amorphous materials). Ask him whether he expects his own Nobel, and he responds matter-of-factly: “Oh, never. I’ve been nominated before, and Mott gave me credit when he won his, but I’ll never get one.” Without a hint of bitterness he adds softly, “I’m not a part of their world.”

Mr Ovshinsky’s vision for a hydrogen loop was just a blackboard exercise five decades ago. But since then he has produced the inventions needed to make it work. “Stan starts with a vision, and then goes out to invent what we need to get from here to there,” says Joachim Doehler, a senior scientist at ECD. Doing this requires more than scientific theory: it requires a practical engineer’s mind too. “Stan is a very good toolmaker,” says Robert Stempel, ECD’s chairman (and a former boss of General Motors, a big carmaker). Mr Ovshinsky’s collaborators say that he has an astonishing ability to juggle the permutations of eight or ten novel materials in his head, which gives him an intuitive grasp of which scientific leads to follow. That said, his colleagues joke, he still sometimes cannot remember names correctly.

The best evidence of Mr Ovshinsky’s systems approach at work is his shiny new solar factory in Michigan. Several decades ago, he argued that solar panels ought to be made not as brittle crystalline panels in costly batch processes—which is how everyone else still makes them—but in a continuous process, “by the mile”. He was ridiculed. But he refused to yield, and asked his team to devise processes for producing miles of thin-film solar material. Dr Doehler, a veteran of AT&T’s legendary Bell Labs research centre, recalls telling his boss it was impossible. The boss proved him wrong, personally designing much of the solar factory from scratch. Crucially, his approach does not require the expensive silicon used in conventional solar panels.

A sunny future

Mr Ovshinsky points to the happy result on the shop floor: a flexible, self-adhesive strip of solar material that makes power even on cloudy days and is virtually indestructible. The factory, which Mr Bush visited in February, has an order backlog of six months and profit margins approaching 30%, he says. He has another factory in the works nearby, and plans for more: “I see ECD’s future as a factory for factories. That’s how you build entirely new industries for the future.” So does he see ECD as the GE of the 21st century? “Oh, ECD will be much more than that,” says Mr Ovshinsky merrily. “Energy and information are the twin pillars of the global economy, after all.”

How justified is this boast? Few question his intellect, but some do challenge his record as a corporate boss. An article in Forbes magazine asked in 2003 why investors “keep giving money to Stan Ovshinsky, the inventor who can create anything but profits.” ECD has lost money for most of the 40-plus years that it has been a public company. As even one of Mr Ovshinsky’s loyal lieutenants confesses, “This company would have gone bust six times already if it were not for the personal loyalty people felt for Stan and Iris; we went the extra mile for them because this place is unique.”

Inspired by the family’s links to the peace and civil-rights movements, the Ovshinsky motto is “with the oppressed, against the oppressor”, and ECD retains the feel of a family firm with those values. What is more, ECD is visibly committed to clean energy—and Mr Ovshinsky is clearly not motivated by money. The New York Times recently analysed executive pay in America and found that bosses typically get 500 times the salary of the average worker at their firms; the ratio at ECD is five to one. He even points out that he is “probably the only chief executive that is a union member”.

The loss of his wife, collaborator and co-founder has clearly devastated Mr Ovshinsky, but do not expect to see him retire anytime soon. He may be 84, but he evidently has plenty of unfinished business to attend to. He still rises early, dresses in natty suits, and moves with the agility and energy of a young man. His intellectual curiosity appears entirely undiminished by a life of learning: his desk at ECD is buried under neat stacks of annotated scientific papers, business plans and other reading material. And he remains as audaciously inventive as ever.

He has worked out how his next generation of solar films will be produced not at 2.5 feet per minute, he says, but 100 times faster. He is convinced he can radically improve the efficiency of fuel-cell electrodes. He thinks he will be able to scale up his firm’s hydrogen-storage system to megawatt scale, thus enabling grid storage of renewable power. And so on. As your correspondent departed at the end of a day-long visit, Mr Ovshinsky still had a dinner interview with a television crew, and then planned to work on a cosmology paper at home. As I.I. Rabi, a Nobel prize-winning physicist, is reported to have said when asked if his friend was another Edison: “He’s an Ovshinsky, and he’s brilliant.”

from the print edition | Technology Quarterly

By WOLFGANG SAXON
Published: September 12, 1992

Dr. George Crile Jr., the Cleveland surgeon who angered the medical establishment by insisting that some radical procedures for breast cancer and other diseases met the surgeon’s needs rather than the patient’s, died yesterday at the Cleveland Clinic Foundation. He was 84 years old and lived in Cleveland Heights.

He died of lung cancer, the clinic said.

Dr. Crile’s battle against unnecessary surgery affected the lives of uncounted people in this country, but particularly women stricken with breast cancers.

The controversy of two decades ago swirled around Dr. Crile’s campaign against radical mastectomy — removal of the entire breast and of surrounding lymph nodes and major chest muscle — which was routinely performed on breast cancer patients for a century. Instead, he preferred to combat the cancer with a simple mastectomy or, in early stages, with a lumpectomy, in which the tumor and a minimal amount of surrounding tissue are removed through a local incision.

Simpler and Safer

This conservative approach was far more common in Europe at the time, and many doctors were concerned when Dr. Crile championed it here. He aroused anger with intimations that some surgeons performed heroics of the scalpel for professional glory, reveling in their skill, or even for the large fees they could command.

Earlier, Dr. Crile had pursued his policy of keeping intrusive surgery to a minimum while specializing in diseases of the thyroid. Among the alternatives he advanced were treatments with new radioactive iodines able to control certain types of thyroid cancer.

His research made surgery simpler and safer. And he brought a simmering medical debate out into the open by encouraging patients to demand information so they might make informed decisions rather than be treated like children who would not understand.

Dr. Bernadine Healy, director of the National Institutes of Health, said yesterday that Dr. Crile was an “unsung hero” who had been the object of “ridicule and scorn” by his peers and had touched millions of American women in an “extraordinarily positive way.”

Now, an Accepted Wisdom

“Now lumpectomy is a mainstream and humane treatment for women,” she said in a statement from Bethesda, Md.

Dr. Crile, known as Barney to distinguish him from his illustrious father, who helped found the Cleveland Clinic, spent decades searching for nonsurgical solutions to medical problems. His aversion to routinely performed radical mastectomy is now shared by most doctors.

“I came home from World War II convinced that operations in many fields of surgery were either too radical, or not even necessary,” he once said. “Universal acceptance of a procedure does not necessarily make it right.”

Dr. Crile, who was associated with the Cleveland Clinic for over half a century, retired as head of the department of general surgery in 1968 but continued as senior consultant and, since 1972, as emeritus consultant. In addition to working in his office, he remained a writer, compulsive diarist, world traveler, diver and film maker.

George Washington Crile Jr. was born in Cleveland on Nov. 3, 1907. His father was a distinguished surgeon of the respiratory system who contributed to the study of surgical shock. He also developed nerve-block anesthesia and was an early user of blood transfusion. A founding partner, he was known at the Cleveland Clinic Foundation as “the Chief.”

Following in his father’s footsteps, the son graduated from Yale University and earned an M.D. summa cum laude at Harvard Medical School in 1929. After his residency at the Cleveland Clinic, he joined the surgical staff in 1937.

He served in the United States Navy in World War II with a team of enlistees from the clinic. Wartime research on ruptured appendixes showed them to be less life-threatening than commonly believed. That suggested that unsupervised emergency appendectomies aboard submarines, while courageous, could do more harm than good.

From that experience grew his impulse to take up the cudgels against orthodoxy. His work with thyroid cancers convinced him that less intrusive alternatives often could take the place of surgery, and his approach succeeded in reducing the need for it.

“With fewer thyroid operations to do,” he recalled, “I looked about for other fruitful fields.” He focused on breast cancer treatments that disfigured thousands of women every year.

Sparking a Patients’ Revolt

Originally a firm believer in radical mastectomy, he was influenced by Dr. Reginald Murley, a Scottish physician who combined partial mastectomy with radiation treatment. Dr. Robert S. Dinsmore, who then headed the surgical staff in Cleveland, was similarly persuaded. Dr. Crile performed his last radical mastectomy in 1954.

He published his first paper on the subject in 1961 to demonstrate that survival rates for lumpectomy or simple mastectomy were comparable to those for radical mastectomy. Doubtful colleagues asserted that this was only because he limited his treatments to women in the early stages of the disease.

But within years, more and more women revolted against the way their surgeons treated them as “cases” in the doctor-knows-best tradition. And a growing number of surgeons began to agree with them and Dr. Crile.

Dr. Crile’s books included “What Women Should Know About the Breast Cancer Controversy” (Macmillan, 1973) and “Surgery, Your Choices, Your Alternatives” (Delacorte, 1978).

Four months ago, the Cleveland Clinic named a new building in honor of the father and the son.

Dr. Crile lost his first wife, the former Jane Halle, to cancer in 1963. He is survived by his second wife, the former Helga Sandburg, daughter of the poet Carl Sandburg; three daughters, Ann Crile Esselstyn of Cleveland, Joan Foster of Atlanta, Ga., and Susan Crile of Manhattan; a son, George Crile 3d of Manhattan, a CBS News producer for “60 Minutes”; a sister, Margaret Garretson of Cleveland; 12 grandchildren and one great-grandchild.

……

To Ima and Sheila, and each of you — in the darkest hours of your lives, you may have felt utterly alone, and it seemed like nobody cared. And the important thing for us to understand is there are millions around the world who are feeling that same way at this very moment.

Right now, there is a man on a boat, casting the net with his bleeding hands, knowing he deserves a better life, a life of dignity, but doesn’t know if anybody is paying attention. Right now, there’s a woman, hunched over a sewing machine, glancing beyond the bars on the window, knowing if just given the chance, she might some day sell her own wares, but she doesn’t think anybody is paying attention. Right now, there’s a young boy, in a brick factory, covered in dust, hauling his heavy load under a blazing sun, thinking if he could just go to school, he might know a different future, but he doesn’t think anybody is paying attention. Right now, there is a girl, somewhere trapped in a brothel, crying herself to sleep again, and maybe daring to imagine that some day, just maybe, she might be treated not like a piece of property, but as a human being.

And so our message today, to them, is — to the millions around the world — we see you. We hear you. We insist on your dignity. And we share your belief that if just given the chance, you will forge a life equal to your talents and worthy of your dreams. (Applause.)

Our fight against human trafficking is one of the great human rights causes of our time, and the United States will continue to lead it — in partnership with you. The change we seek will not come easy, but we can draw strength from the movements of the past. For we know that every life saved — in the words of that great Proclamation — is “an act of justice,” worthy of “the considerate judgment of mankind, and the gracious favor of Almighty God.”

That’s what we believe. That’s what we’re fighting for. And I’m so proud to be in partnership with CGI to make this happen.

…………………….

For complete speech, see:
http://www.whitehouse.gov/the-press-office/2012/09/25/remarks-president-clinton-global-initiative

September 25, 2012
Following is a text of President Obama’s speech to the United Nations General Assembly on Tuesday, as released by the White House:

Mr. President, Mr. Secretary General, fellow delegates, ladies and gentlemen: I would like to begin today by telling you about an American named Chris Stevens.

Chris was born in a town called Grass Valley, California, the son of a lawyer and a musician. As a young man, Chris joined the Peace Corps, and taught English in Morocco. And he came to love and respect the people of North Africa and the Middle East. He would carry that commitment throughout his life. As a diplomat, he worked from Egypt to Syria, from Saudi Arabia to Libya. He was known for walking the streets of the cities where he worked — tasting the local food, meeting as many people as he could, speaking Arabic, listening with a broad smile.

Chris went to Benghazi in the early days of the Libyan revolution, arriving on a cargo ship. As America’s representative, he helped the Libyan people as they coped with violent conflict, cared for the wounded, and crafted a vision for the future in which the rights of all Libyans would be respected. And after the revolution, he supported the birth of a new democracy, as Libyans held elections, and built new institutions, and began to move forward after decades of dictatorship.

Chris Stevens loved his work. He took pride in the country he served, and he saw dignity in the people that he met. And two weeks ago, he traveled to Benghazi to review plans to establish a new cultural center and modernize a hospital. That’s when America’s compound came under attack. Along with three of his colleagues, Chris was killed in the city that he helped to save. He was 52 years old.

I tell you this story because Chris Stevens embodied the best of America. Like his fellow Foreign Service officers, he built bridges across oceans and cultures, and was deeply invested in the international cooperation that the United Nations represents. He acted with humility, but he also stood up for a set of principles — a belief that individuals should be free to determine their own destiny, and live with liberty, dignity, justice, and opportunity.

The attacks on the civilians in Benghazi were attacks on America. We are grateful for the assistance we received from the Libyan government and from the Libyan people. There should be no doubt that we will be relentless in tracking down the killers and bringing them to justice. And I also appreciate that in recent days, the leaders of other countries in the region — including Egypt, Tunisia and Yemen — have taken steps to secure our diplomatic facilities, and called for calm. And so have religious authorities around the globe.

But understand, the attacks of the last two weeks are not simply an assault on America. They are also an assault on the very ideals upon which the United Nations was founded — the notion that people can resolve their differences peacefully; that diplomacy can take the place of war; that in an interdependent world, all of us have a stake in working towards greater opportunity and security for our citizens.

If we are serious about upholding these ideals, it will not be enough to put more guards in front of an embassy, or to put out statements of regret and wait for the outrage to pass. If we are serious about these ideals, we must speak honestly about the deeper causes of the crisis — because we face a choice between the forces that would drive us apart and the hopes that we hold in common.

Today, we must reaffirm that our future will be determined by people like Chris Stevens — and not by his killers. Today, we must declare that this violence and intolerance has no place among our United Nations.

It has been less than two years since a vendor in Tunisia set himself on fire to protest the oppressive corruption in his country, and sparked what became known as the Arab Spring. And since then, the world has been captivated by the transformation that’s taken place, and the United States has supported the forces of change.

We were inspired by the Tunisian protests that toppled a dictator, because we recognized our own beliefs in the aspiration of men and women who took to the streets.

We insisted on change in Egypt, because our support for democracy ultimately put us on the side of the people.

We supported a transition of leadership in Yemen, because the interests of the people were no longer being served by a corrupt status quo.

We intervened in Libya alongside a broad coalition, and with the mandate of the United Nations Security Council, because we had the ability to stop the slaughter of innocents, and because we believed that the aspirations of the people were more powerful than a tyrant.

And as we meet here, we again declare that the regime of Bashar al-Assad must come to an end so that the suffering of the Syrian people can stop and a new dawn can begin.

We have taken these positions because we believe that freedom and self-determination are not unique to one culture. These are not simply American values or Western values — they are universal values. And even as there will be huge challenges to come with a transition to democracy, I am convinced that ultimately government of the people, by the people, and for the people is more likely to bring about the stability, prosperity, and individual opportunity that serve as a basis for peace in our world.

So let us remember that this is a season of progress. For the first time in decades, Tunisians, Egyptians and Libyans voted for new leaders in elections that were credible, competitive, and fair. This democratic spirit has not been restricted to the Arab world. Over the past year, we’ve seen peaceful transitions of power in Malawi and Senegal, and a new President in Somalia. In Burma, a President has freed political prisoners and opened a closed society, a courageous dissident has been elected to parliament, and people look forward to further reform. Around the globe, people are making their voices heard, insisting on their innate dignity, and the right to determine their future.

And yet the turmoil of recent weeks reminds us that the path to democracy does not end with the casting of a ballot. Nelson Mandela once said: “To be free is not merely to cast off one’s chains, but to live in a way that respects and enhances the freedom of others.”

True democracy demands that citizens cannot be thrown in jail because of what they believe, and that businesses can be opened without paying a bribe. It depends on the freedom of citizens to speak their minds and assemble without fear, and on the rule of law and due process that guarantees the rights of all people.

In other words, true democracy — real freedom — is hard work. Those in power have to resist the temptation to crack down on dissidents. In hard economic times, countries must be tempted — may be tempted to rally the people around perceived enemies, at home and abroad, rather than focusing on the painstaking work of reform.

Moreover, there will always be those that reject human progress — dictators who cling to power, corrupt interests that depend on the status quo, and extremists who fan the flames of hate and division. From Northern Ireland to South Asia, from Africa to the Americas, from the Balkans to the Pacific Rim, we’ve witnessed convulsions that can accompany transitions to a new political order.

At times, the conflicts arise along the fault lines of race or tribe. And often they arise from the difficulties of reconciling tradition and faith with the diversity and interdependence of the modern world. In every country, there are those who find different religious beliefs threatening; in every culture, those who love freedom for themselves must ask themselves how much they’re willing to tolerate freedom for others.

That is what we saw play out in the last two weeks, as a crude and disgusting video sparked outrage throughout the Muslim world. Now, I have made it clear that the United States government had nothing to do with this video, and I believe its message must be rejected by all who respect our common humanity.

It is an insult not only to Muslims, but to America as well — for as the city outside these walls makes clear, we are a country that has welcomed people of every race and every faith. We are home to Muslims who worship across our country. We not only respect the freedom of religion, we have laws that protect individuals from being harmed because of how they look or what they believe. We understand why people take offense to this video because millions of our citizens are among them.

I know there are some who ask why we don’t just ban such a video. And the answer is enshrined in our laws: Our Constitution protects the right to practice free speech.

Here in the United States, countless publications provoke offense. Like me, the majority of Americans are Christian, and yet we do not ban blasphemy against our most sacred beliefs. As President of our country and Commander-in-Chief of our military, I accept that people are going to call me awful things every day — (laughter) — and I will always defend their right to do so.

Americans have fought and died around the globe to protect the right of all people to express their views, even views that we profoundly disagree with. We do not do so because we support hateful speech, but because our founders understood that without such protections, the capacity of each individual to express their own views and practice their own faith may be threatened. We do so because in a diverse society, efforts to restrict speech can quickly become a tool to silence critics and oppress minorities.

We do so because given the power of faith in our lives, and the passion that religious differences can inflame, the strongest weapon against hateful speech is not repression; it is more speech — the voices of tolerance that rally against bigotry and blasphemy, and lift up the values of understanding and mutual respect.

Now, I know that not all countries in this body share this particular understanding of the protection of free speech. We recognize that. But in 2012, at a time when anyone with a cell phone can spread offensive views around the world with the click of a button, the notion that we can control the flow of information is obsolete. The question, then, is how do we respond?

And on this we must agree: There is no speech that justifies mindless violence. There are no words that excuse the killing of innocents. There’s no video that justifies an attack on an embassy. There’s no slander that provides an excuse for people to burn a restaurant in Lebanon, or destroy a school in Tunis, or cause death and destruction in Pakistan.

In this modern world with modern technologies, for us to respond in that way to hateful speech empowers any individual who engages in such speech to create chaos around the world. We empower the worst of us if that’s how we respond.

More broadly, the events of the last two weeks also speak to the need for all of us to honestly address the tensions between the West and the Arab world that is moving towards democracy.

Now, let me be clear: Just as we cannot solve every problem in the world, the United States has not and will not seek to dictate the outcome of democratic transitions abroad. We do not expect other nations to agree with us on every issue, nor do we assume that the violence of the past weeks or the hateful speech by some individuals represent the views of the overwhelming majority of Muslims, any more than the views of the people who produced this video represents those of Americans. However, I do believe that it is the obligation of all leaders in all countries to speak out forcefully against violence and extremism.

It is time to marginalize those who — even when not directly resorting to violence — use hatred of America, or the West, or Israel, as the central organizing principle of politics. For that only gives cover, and sometimes makes an excuse, for those who do resort to violence.

That brand of politics — one that pits East against West, and South against North, Muslims against Christians and Hindus and Jews — can’t deliver on the promise of freedom. To the youth, it offers only false hope. Burning an American flag does nothing to provide a child an education. Smashing apart a restaurant does not fill an empty stomach. Attacking an embassy won’t create a single job. That brand of politics only makes it harder to achieve what we must do together: educating our children, and creating the opportunities that they deserve; protecting human rights, and extending democracy’s promise.

Understand America will never retreat from the world. We will bring justice to those who harm our citizens and our friends, and we will stand with our allies. We are willing to partner with countries around the world to deepen ties of trade and investment, and science and technology, energy and development — all efforts that can spark economic growth for all our people and stabilize democratic change.

But such efforts depend on a spirit of mutual interest and mutual respect. No government or company, no school or NGO will be confident working in a country where its people are endangered. For partnerships to be effective, our citizens must be secure and our efforts must be welcomed.

A politics based only on anger — one based on dividing the world between “us” and “them” — not only sets back international cooperation, it ultimately undermines those who tolerate it. All of us have an interest in standing up to these forces.

Let us remember that Muslims have suffered the most at the hands of extremism. On the same day our civilians were killed in Benghazi, a Turkish police officer was murdered in Istanbul only days before his wedding; more than ten Yemenis were killed in a car bomb in Sana’a; several Afghan children were mourned by their parents just days after they were killed by a suicide bomber in Kabul.

The impulse towards intolerance and violence may initially be focused on the West, but over time it cannot be contained. The same impulses toward extremism are used to justify war between Sunni and Shia, between tribes and clans. It leads not to strength and prosperity but to chaos. In less than two years, we have seen largely peaceful protests bring more change to Muslim-majority countries than a decade of violence. And extremists understand this. Because they have nothing to offer to improve the lives of people, violence is their only way to stay relevant. They don’t build; they only destroy.

It is time to leave the call of violence and the politics of division behind. On so many issues, we face a choice between the promise of the future, or the prisons of the past. And we cannot afford to get it wrong. We must seize this moment. And America stands ready to work with all who are willing to embrace a better future.

The future must not belong to those who target Coptic Christians in Egypt — it must be claimed by those in Tahrir Square who chanted, “Muslims, Christians, we are one.” The future must not belong to those who bully women — it must be shaped by girls who go to school, and those who stand for a world where our daughters can live their dreams just like our sons.

The future must not belong to those corrupt few who steal a country’s resources — it must be won by the students and entrepreneurs, the workers and business owners who seek a broader prosperity for all people. Those are the women and men that America stands with; theirs is the vision we will support.

The future must not belong to those who slander the prophet of Islam. But to be credible, those who condemn that slander must also condemn the hate we see in the images of Jesus Christ that are desecrated, or churches that are destroyed, or the Holocaust that is denied.

Let us condemn incitement against Sufi Muslims and Shiite pilgrims. It’s time to heed the words of Gandhi: “Intolerance is itself a form of violence and an obstacle to the growth of a true democratic spirit.” Together, we must work towards a world where we are strengthened by our differences, and not defined by them. That is what America embodies, that’s the vision we will support.

Among Israelis and Palestinians, the future must not belong to those who turn their backs on a prospect of peace. Let us leave behind those who thrive on conflict, those who reject the right of Israel to exist. The road is hard, but the destination is clear — a secure, Jewish state of Israel and an independent, prosperous Palestine. Understanding that such a peace must come through a just agreement between the parties, America will walk alongside all who are prepared to make that journey.

In Syria, the future must not belong to a dictator who massacres his people. If there is a cause that cries out for protest in the world today, peaceful protest, it is a regime that tortures children and shoots rockets at apartment buildings. And we must remain engaged to assure that what began with citizens demanding their rights does not end in a cycle of sectarian violence.

Together, we must stand with those Syrians who believe in a different vision — a Syria that is united and inclusive, where children don’t need to fear their own government, and all Syrians have a say in how they are governed — Sunnis and Alawites, Kurds and Christians. That’s what America stands for. That is the outcome that we will work for — with sanctions and consequences for those who persecute, and assistance and support for those who work for this common good. Because we believe that the Syrians who embrace this vision will have the strength and the legitimacy to lead.

In Iran, we see where the path of a violent and unaccountable ideology leads. The Iranian people have a remarkable and ancient history, and many Iranians wish to enjoy peace and prosperity alongside their neighbors. But just as it restricts the rights of its own people, the Iranian government continues to prop up a dictator in Damascus and supports terrorist groups abroad. Time and again, it has failed to take the opportunity to demonstrate that its nuclear program is peaceful, and to meet its obligations to the United Nations.

So let me be clear. America wants to resolve this issue through diplomacy, and we believe that there is still time and space to do so. But that time is not unlimited. We respect the right of nations to access peaceful nuclear power, but one of the purposes of the United Nations is to see that we harness that power for peace. And make no mistake, a nuclear-armed Iran is not a challenge that can be contained. It would threaten the elimination of Israel, the security of Gulf nations, and the stability of the global economy. It risks triggering a nuclear-arms race in the region, and the unraveling of the non-proliferation treaty. That’s why a coalition of countries is holding the Iranian government accountable. And that’s why the United States will do what we must to prevent Iran from obtaining a nuclear weapon.

We know from painful experience that the path to security and prosperity does not lie outside the boundaries of international law and respect for human rights. That’s why this institution was established from the rubble of conflict. That is why liberty triumphed over tyranny in the Cold War. And that is the lesson of the last two decades as well.

History shows that peace and progress come to those who make the right choices. Nations in every part of the world have traveled this difficult path. Europe, the bloodiest battlefield of the 20th century, is united, free and at peace. From Brazil to South Africa, from Turkey to South Korea, from India to Indonesia, people of different races, religions, and traditions have lifted millions out of poverty, while respecting the rights of their citizens and meeting their responsibilities as nations.

And it is because of the progress that I’ve witnessed in my own lifetime, the progress that I’ve witnessed after nearly four years as President, that I remain ever hopeful about the world that we live in. The war in Iraq is over. American troops have come home. We’ve begun a transition in Afghanistan, and America and our allies will end our war on schedule in 2014. Al Qaeda has been weakened, and Osama bin Laden is no more. Nations have come together to lock down nuclear materials, and America and Russia are reducing our arsenals. We have seen hard choices made — from Naypyidaw to Cairo to Abidjan — to put more power in the hands of citizens.

At a time of economic challenge, the world has come together to broaden prosperity. Through the G20, we have partnered with emerging countries to keep the world on the path of recovery. America has pursued a development agenda that fuels growth and breaks dependency, and worked with African leaders to help them feed their nations. New partnerships have been forged to combat corruption and promote government that is open and transparent, and new commitments have been made through the Equal Futures Partnership to ensure that women and girls can fully participate in politics and pursue opportunity. And later today, I will discuss our efforts to combat the scourge of human trafficking.

All these things give me hope. But what gives me the most hope is not the actions of us, not the actions of leaders — it is the people that I’ve seen. The American troops who have risked their lives and sacrificed their limbs for strangers half a world away; the students in Jakarta or Seoul who are eager to use their knowledge to benefit mankind; the faces in a square in Prague or a parliament in Ghana who see democracy giving voice to their aspirations; the young people in the favelas of Rio and the schools of Mumbai whose eyes shine with promise. These men, women, and children of every race and every faith remind me that for every angry mob that gets shown on television, there are billions around the world who share similar hopes and dreams. They tell us that there is a common heartbeat to humanity.

So much attention in our world turns to what divides us. That’s what we see on the news. That’s what consumes our political debates. But when you strip it all away, people everywhere long for the freedom to determine their destiny; the dignity that comes with work; the comfort that comes with faith; and the justice that exists when governments serve their people — and not the other way around.

The United States of America will always stand up for these aspirations, for our own people and for people all across the world. That was our founding purpose. That is what our history shows. That is what Chris Stevens worked for throughout his life.

And I promise you this: Long after the killers are brought to justice, Chris Stevens’s legacy will live on in the lives that he touched — in the tens of thousands who marched against violence through the streets of Benghazi; in the Libyans who changed their Facebook photo to one of Chris; in the signs that read, simply, “Chris Stevens was a friend to all Libyans.”

They should give us hope. They should remind us that so long as we work for it, justice will be done, that history is on our side, and that a rising tide of liberty will never be reversed.

Thank you very much.