
New Humanist Literary Criticism Essay

We see it across the world: a growing assumption that knowledge should not be pursued for its own sake, but instead for its market value. Just look at the UK, where funding for the arts and humanities in universities has been slashed, while funding for science and technology has remained steady. Increasingly, education is framed primarily as an economic advantage, rather than as a way to enrich democracy with educated and engaged citizens. In his new book "Knowledge for Sale: The Neoliberal Takeover of Higher Education", Lawrence Busch, professor of sociology at Michigan State University, challenges this market-driven approach to education and the pursuit of knowledge. Here, he explains some of his arguments.

You describe how the first neoliberals wanted to promote markets to prevent totalitarianism. Could you expand?

Neoliberalism was born in the years after World War II. Its proponents were well aware of the horrors of totalitarianism in Nazi Germany, Fascist Italy and the Communist Soviet Union. Moreover, they saw the growth of the welfare state in Britain, France and the United States as part of a slippery slope towards totalitarianism. In contrast, they saw the market as an alternative form of governance that would maximise freedom. Each person would be able to choose among the alternatives to be found in the marketplace. But, unlike classical liberals who adopted a laissez-faire approach, neoliberals believed that the state had to actively promote markets. Where markets were not possible, competitions would do. In both instances, according to the neoliberals, freedom of choice would be expanded and the need for a strong central government would be diminished.

How does this neoliberal thinking first find its way into the world of universities, and how does it manifest here?

There has not been any single path or any single event that marked the rise of neoliberalism in universities around the world. We could point to the rise of Margaret Thatcher in Britain or Ronald Reagan in the US as inflection points, but even before their arrival on the scene, neoliberal ideas influenced governments strapped for cash. It was relatively easy to gradually replace permanent faculty with part-time and temporary faculty who were poorly paid and easily dismissed. Such persons would compete for just-in-time jobs while keeping costs down. Not surprisingly, their very precarious status minimised their institutional loyalty.

Similarly, audits of all kinds could be introduced on grounds of increasing efficiency. After all, it is hard to reject audits; taxpayers should know if their investments are successful ones. But few people asked whether the audits were measuring the right things. Did student evaluations of faculty improve teaching and learning? Did counting publications and citations enhance research? Even now, rarely is an audit of the auditors demanded.

In addition, it was easier to shift part of the costs of higher education to students than to raise taxes. And, as students began to pay more, a market emerged for private, for-profit higher education. No one asked how the goals of higher education might change if it were no longer seen as a public good but as a private good of value only to individual students. No one asked what the consequences of student debt might be for the larger society. What about students who could not find employment after graduation that allowed them to pay off their debts? What about for-profit institutions that provided little in the way of real education?

How does this vary in higher education institutions around the world?

Neoliberals are not all of one mind. Nor are political institutions uniform around the world. Hence, different nations have adopted different aspects of neoliberal perspectives. In the US, where education is largely the responsibility of each state, there is considerable variation across states. However, the burden on students has risen markedly in all states. All have introduced greater auditing burdens on faculty. All rely much more heavily on poorly paid temporary faculty than they did in the past.

England has introduced fees of up to £9000 per annum at its universities. In both the US and the UK, league tables of universities abound, all based on the erroneous assumption that all universities should serve the same kinds of students and prepare them for the same kinds of careers. The UK has developed an elaborate – some would say cumbersome – procedure currently known as the Research Excellence Framework by which to judge university performance; nothing comparable exists in the US.

France and Germany have to date avoided introducing huge fees, but they have also reorganised their universities along neoliberal lines. France has created a national auditing agency, AERES (Evaluation Agency for Research and Higher Education). Among its many tasks, it produces lists of scholarly journals in which scholars from particular disciplines are expected to publish.

Several nations now provide monetary bonuses to faculty who publish papers in Science or Nature. As a result, these journals are inundated with manuscripts. Others have attempted to import well-established foreign scientists so as to kick-start their universities.

Despite the differences, nearly all nations have redefined higher education as preparation for jobs, rather than as preparation of informed citizens. Nearly all of these varied approaches introduce competitions of various sorts into university life. Competitions require audits; someone must set the rules, see that they are enforced and rank students, faculty members, departments and institutions. Hence, competitions create new bureaucracies. And, in an ironic twist, the rankings of institutions rarely change significantly.

You highlight the fact that different countries that share little politically – the US, China, Pakistan, France – have embraced aspects of this. Why?

As I have been telling my students for many years, when the Soviet Union collapsed in 1991, it took the future with it. Of course, the Soviet Union was a failed experiment, but its very existence implied that there was more than one way to organise the world. Once it ceased to exist, it appeared that the slogan bandied about by Prime Minister Margaret Thatcher might be true: There is no alternative. Hence, nations as different as the United States and Pakistan, China and Chile, France and India, embraced some aspects of neoliberalism. Cuba, North Korea and Venezuela hardly represent alternatives. When China’s rulers realised that its adherence to Mao’s version of communism was condemning that nation to endless poverty, the solution was to adopt neoliberal policies, albeit ‘with Chinese characteristics.’ A Chinese university now publishes the Shanghai Ranking (the Academic Ranking of World Universities). When Pinochet took power in Chile, he adopted neoliberal policies. Among other things, he privatised the public universities and eliminated government support. Modi wishes to introduce more neoliberal policies to India, although to date he has been met with considerable resistance. In short, neoliberalism offers a menu of options among which elites can pick and choose; no clear alternative currently exists.

If scholars see themselves primarily as economic actors, how does it affect the pursuit of knowledge?

The various transformations of universities under neoliberal regimes have brought with them numerous unintended consequences. Perhaps of greatest concern is the apparent rise in fraudulent behaviour as a result of the pressure to publish an ever-increasing number of papers. At worst, this involves the creation and publication of fraudulent research results. Also of concern is the growing use of ghost and honorary authors. In the former case, a paper is written by someone other than the person listed as its author; the ghost author may receive a fee for writing the paper. In the latter case, a well-known scholar not connected to a particular paper is invited to attach his or her name to it so as to enhance its prestige and likelihood of favourable review. In addition, some scholars have published the same or similar results in multiple venues. Others have broken down their research into numerous publications so as to increase their output. Most importantly, long-term and high-risk research is actively discouraged by the constant auditing. Undertaking a project that might take ten years before publication or undertaking a project with a high risk of failure can be quite costly to a scholar.

In addition, especially in the sciences and engineering, more and more research is funded jointly with industry. This has several perverse effects. Most obviously, it pushes academic research toward the development of marketable products and away from the more fundamental research previously typical of universities. In addition, within the academic community research has always been seen as something to make public. We academics talk of "contributions to the literature". In contrast, the private sector tends to keep research results private, even as trade secrets. Therefore, when the private sector invests in public universities and research institutes, there are almost always strings attached. Even if there are not, that link between public and private undermines the objectivity of public sector scientists.

Often, discussions about academia are dismissed as a niche concern for the elites – but clearly, this has wider significance. What is that significance?

There is certainly some validity to that argument. After all, in most nations most people do not go to universities. Most university faculty do not engage in research. They are engaged in teaching the next generation of students. Only a small number of institutions are research universities. Their faculties are engaged in research on subjects such as particle physics, archaeology, literary criticism, economic history, bioethics and number theory.

This said, the neoliberal "reforms" of higher education have real consequences. Students burdened with debt do not purchase homes. Students trained narrowly to meet job forecasts often discover that those very forecasts were wrong; the jobs they were being trained for are being phased out. Faculty who are being asked to turn out a steady stream of publications in a narrow set of academic journals are not likely to have time to ask big questions such as those surrounding energy use, climate change, the consequences of automation or protection against emerging disease vectors. They are more likely focused on what is marketable at this particular point in time. Perhaps the key concern is that the neoliberal turn in higher education undermines the ability of scholars to engage in sustained criticism of the current social, economic and political order. These issues don’t get addressed if faculty members are focused on short-term, perhaps even trivial, pursuits.

Finally, what can we do about it? Do you see any positive changes happening?

As is often the case, it is the young among us who have been most active in overturning neoliberal policies to date. Students in several countries have successfully challenged the prevailing neoliberal approach to higher education. In Chile and Quebec students were key participants in overturning neoliberal governments.

However, the neoliberal obsession with competitions, markets and audits is hardly limited to universities. What is happening in higher education is also happening in social work, medical care, policing, elementary and secondary education – indeed throughout the entire labour force. We need to explore those connections and build new solidarities. We need to question the neoliberal market mentality. We need to audit the auditors. We need to show each other the consequences of building a society consisting solely of markets and competitions. Where do cooperation, care and compassion fit in? We need to ask each other whether the neoliberal obsession with markets, competitions and audits really makes sense. Need all universities be in competition with each other? Need all nations be in competition with each other? For what? There is much work to be done.

In 1992, in an essay entitled "The Emerging Third Culture," I put forward the following argument:

In the past few years, the playing field of American intellectual life has shifted, and the traditional intellectual has become increasingly marginalized. A 1950s education in Freud, Marx, and modernism is not a sufficient qualification for a thinking person today. Indeed, the traditional American intellectuals are, in a sense, increasingly reactionary, and quite often proudly (and perversely) ignorant of many of the truly significant intellectual accomplishments of our time. Their culture, which dismisses science, is often nonempirical. It uses its own jargon and washes its own laundry. It is chiefly characterized by comment on comments, the swelling spiral of commentary eventually reaching the point where the real world gets lost.

Ten years later, that fossil culture is in decline, replaced by the emergent “third culture” of the essay’s title, a reference to C. P. Snow’s celebrated division of the thinking world into two cultures—that of the literary intellectual and that of the scientist. This new culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, have taken the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are.

A Great Intellectual Hunger 

Advances in science are being debated and propagated by the scientists of the third culture, who share their work and ideas not just with each other but with a newly educated public through their books. Staying with the basics, focusing on the real world, they have led us into one of the most dazzling periods of intellectual activity in human history, one in which their achievements are affecting the lives of everyone on the planet. The emergence of this activity is evidence of a great intellectual hunger, a desire for the new and important ideas that drive our times. Educated people are willing to make the effort to learn about these new ideas. Book review editors, television news executives, professionals, university administrators are discovering the empirical world on their own. They are reading and learning about revolutionary developments in molecular biology, genetic engineering, nanotechnology, artificial intelligence, artificial life, chaos theory, massive parallelism, neural nets, the inflationary universe, fractals, complex adaptive systems, linguistics, superstrings, biodiversity, the human genome, expert systems, punctuated equilibrium, cellular automata, fuzzy logic, virtual reality, cyberspace, and teraflop machines. Among others.

One Intellectual Whole

Around the fifteenth century, the word "humanism" was tied in with the idea of one intellectual whole. A Florentine nobleman knew that to read Dante but ignore science was ridiculous. Leonardo was a great artist, a great scientist, a great technologist. Michelangelo was an even greater artist and engineer. These men were intellectually holistic giants. To them the idea of embracing humanism while remaining ignorant of the latest scientific and technological achievements would have been incomprehensible. The time has come to reestablish that holistic definition.

In the twentieth century, a period of great scientific advancement, instead of having science and technology at the center of the intellectual world—of having a unity in which scholarship includes science and technology just as it includes literature and art—the official culture kicked them out. The traditional humanities scholar looked at science and technology as some sort of technical special product—the fine print. The elite universities nudged science out of the liberal arts undergraduate curriculum, and out of the minds of many young people, who abandoned true humanistic inquiry in their early twenties and turned themselves into the authoritarian voice of the establishment. 

Thus, as we enter the most exciting and turbulent intellectual times in the past five hundred years, the traditional humanities academicians—by dismissing and ignoring science instead of learning it—have so marginalized themselves that they are no longer within shouting distance of the action. One can only marvel at, for example, art critics who know nothing about visual perception; "social constructionist" literary critics uninterested in the human universals documented by anthropologists; opponents of genetically modified foods, additives, and pesticide residues who are ignorant of evolutionary biology and too lazy to look up the statistics on risk. 

And one is amazed that for others still mired in the old establishment culture, intellectual debate continues to center on such matters as who was or was not a Stalinist in 1937, or what the sleeping arrangements were for guests at a Bloomsbury weekend in the early part of the twentieth century. This is not to suggest that studying history is a waste of time. History illuminates our origins and keeps us from reinventing the wheel. But the question arises: history of what? Do we want the center of culture to be based on a closed system, a process of text in/text out, and no empirical contact with the world in between?

A fundamental distinction exists between the literature of science and those disciplines in which the writing is most often concerned with exegesis of some earlier writer. In too many university courses, most of the examination questions are about what one or another earlier authority thought. The subjects are self-referential. Yes, there is a history of science, but it is a field in its own right, quite separate from science itself. An examination in science is a set of questions on the real stuff, as it were, rather than what our predecessors thought. Unlike those disciplines in which there is no expectation of systematic progress and in which one reflects on and recycles the ideas of earlier thinkers, science moves on; it is a wide-open system. Meanwhile, the traditional humanities establishment continues its exhaustive insular hermeneutics, indulging itself in cultural pessimism, clinging to its fashionably glum outlook on world events.

Cultural Pessimism 

"We live in an era in which pessimism has become the norm," writes Arthur Herman, in The Idea of Decline in Western History. Herman, who coordinates the Western Civilization Program at the Smithsonian, argues that the decline of the West, with its view of our "sick society," has become the dominant theme in intellectual discourse, to the point where the very idea of civilization has changed. He writes:

This new order might take the shape of the Unabomber's radical environmental utopia. It might also be Nietzsche's Overman, or Hitler's Aryan National Socialism, or Marcuse's utopian union of technology and Eros, or Frantz Fanon's revolutionary fellahin. Its carriers might be the ecologist's "friends of the earth," or the multiculturalist's "persons of color," or the radical feminist's New Amazons, or Robert Bly's New Men. The particular shape of the new order will vary according to taste; however, its most important virtue will be its totally non-, or even anti-Western character. In the end, what matters to the cultural pessimist is less what is going to be created than what is going to be destroyed—namely, our "sick" modern society. 

…the sowing of despair and self-doubt has become so pervasive that we accept it as a normal intellectual stance—even when it is directly contradicted by our own reality.

Key to this cultural pessimism is a belief in the myth of the noble savage—that before we had science and technology, people lived in ecological harmony and bliss. Quite the opposite is the case. 

In Cultural Pessimism: Narratives of Decline in the Postmodern World, Oliver Bennett, the director of the Centre for Cultural Policy Studies at the University of Warwick, pushes matters a step further when he writes that "the intellectual judgments on which cultural pessimism rests are inflected by that same complex of biological, psychological and sociological factors that are linked to the incidence of some forms of depression and anxiety." He wonders whether the intellectuals of the postmodern world would benefit from antidepressants ("Schopenhauer on Prozac would perhaps have produced a different philosophical system").

That the greatest change continues to be the rate of change must be hard to deal with, if you're still looking at the world through the eyes of Spengler and Nietzsche. In their almost religious devotion to a pessimistic worldview, the academic humanists cannot acknowledge that thoughtful people can have positive ideas. Within their own circles, they have, until recently, gotten away with it. The romantic emoting of a culturally pessimistic worldview has been intellectually approved. The world of the professional pessimists is a closed system, a culture of previous "isms" that turn on themselves and endlessly cycle. How many times have you seen the name of an academic humanist icon in a newspaper or magazine article and immediately stopped reading? You know what's coming. Why waste the time?

The Double Optimism of Science

As a counternarrative to this cultural pessimism, consider the double optimism of science. 

The first optimism of the science-based thinkers is conceptual: the more science they do, the more there is to do. Scientists are constantly acquiring and processing new information. This is the reality of Moore's Law—just as there has been a doubling of computer processing power every eighteen months for the past twenty years, so too do scientists acquire information exponentially. They can't help but be optimistic.
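
To put a rough number on that analogy (an illustrative back-of-envelope calculation, not a figure from the essay): a quantity that doubles every eighteen months for twenty years goes through roughly thirteen doublings,

\[
2^{20/1.5} = 2^{13.3} \approx 10{,}000,
\]

that is, something on the order of a ten-thousand-fold increase over the period.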

The second level of optimism concerns the content of science. Much of the news is either good news or news that can be made good, thanks to ever deepening knowledge and ever more efficient and powerful tools and techniques. Because the findings of science are not mere matters of opinion, they sweep past systems of thought based only on opinion. Science, on its frontiers, poses more and better questions, better put. They are questions phrased to elicit answers; the scientists find the answers, and move on.

Scientists debate continually, and reality is the check. They may have egos as large as those possessed by the iconic figures of the academic humanities, but they handle their hubris in a very different way. They can be moved by arguments, because they work in an empirical world of facts, a world based on reality. There are no fixed, unalterable positions.

Unlike the humanities academicians, who talk about each other, scientists talk about the universe. Moreover, conceptually there's not much difference between the style of thinking of a cosmologist trying to understand the physical world by studying the origins of atoms, stars, and galaxies and an evolutionary biologist trying to understand the emergence of complex systems from simple beginnings or trying to see patterns in nature. As exercises, these entail the same mixture of observation, theoretical modeling, computer simulation, and so on, as in most other scientific fields. The worlds of science are convergent. The frame of reference is shared across their disciplines.

Scientists As Both Creators and Critics

A significant aspect of the third culture is that scientists are both the creators and the critics of the scientific enterprise. Ideas come from scientists, who also criticize each other's ideas. Through this process of creativity, criticism, and debate, scientists decide which ideas get weeded out and which become part of the consensus that leads to the next stage. All scientists are involved in coming up with new ideas and engaged in the critique of existing ideas, whereas in literature and the other arts the creators and the critics are, with few exceptions, two distinct sets of people.

Creativity in both the humanities and the sciences involves the same thought processes, but science understands that work becomes part of a common body of knowledge. It doesn't matter who had the ideas in the first place. Most scientific developments emerge when the time is right—a new experiment, a new discovery, a new paradox. Science is a combination of creative insights and robust criticism. This process gets rid of the failures and refines and improves the surviving ideas. Science figures out how things work and thus can make them work better. As an activity, as a state of mind, it is fundamentally optimistic. 

The Horizon Grows

Science is still near the beginning. As the frontiers advance, the horizon gets wider and comes into focus. And these advances have changed the way we see our place in nature. The idea that we are an integral part of this universe—a universe governed by physical and mathematical laws that our brains are attuned to understand—causes us to see our place in the unfolding of natural history differently. We have come to realize, through developments in astronomy and cosmology, that we are still quite near the beginning. The history of creation has been enormously expanded—from six thousand years back to the twelve or thirteen billion years of big bang cosmology. But the future has expanded even more—perhaps to infinity. In the seventeenth century, people not only believed in that constricted past but thought that history was near its end: the apocalypse was coming. 

A realization that time may well be endless leads us to a new view of the human species—as not being in any sense the culmination but perhaps a fairly early stage of the process of evolution. We arrive at this concept through detailed observation and analysis, through science-based thinking; it allows us to see life playing an ever greater role in the future of the universe.

Scientia

Many people, even many scientists, have a narrow view of science as controlled, replicated experiments performed in the laboratory—and as consisting quintessentially of physics, chemistry, and molecular biology. The essence of science is conveyed by its Latin etymology: scientia, meaning knowledge. The scientific method is simply that body of practices best suited for obtaining reliable knowledge. The practices vary among fields: the controlled laboratory experiment is possible in molecular biology, physics, and chemistry, but it is either impossible, immoral, or illegal in many other fields customarily considered sciences, including all of the historical sciences: astronomy, epidemiology, evolutionary biology, most of the earth sciences, and paleontology. If the scientific method can be defined as those practices best suited for obtaining knowledge in a particular field, then science itself is simply the body of knowledge obtained by those practices.

Just as science—that is, reliable methods for obtaining knowledge—has encroached on areas (such as psychology) formerly considered to belong to the humanities, science is also encroaching on the social sciences, especially economics, geography, history, and political science. Not just the broad observation-based and statistical methods of the historical sciences but also detailed techniques of the conventional sciences (such as genetics and molecular biology and animal behavior) are proving essential for tackling problems in the social sciences. Science is nothing more nor less than the most reliable way of gaining knowledge about anything, whether it be the human spirit, the role of great men in history, or the structure of DNA. Humanities scholars and historians who spurn it condemn themselves to second-rate status and produce unreliable results.

But this doesn't have to be the case. There are encouraging signs that the third culture now includes scholars in the humanities who think the way scientists do. Like their colleagues in the sciences, they believe that there is a real world and that their job is to understand it and explain it. They test their ideas in terms of logical coherence, explanatory power, conformity with empirical facts. They do not defer to intellectual authorities: Anyone's ideas can be challenged, and understanding progresses and knowledge accumulates through such challenges. They are not reducing the humanities to biological and physical principles, but they do believe that art, literature, history, politics—a whole panoply of humanist concerns—need to take the sciences into account.

Connections do exist: our arts, our philosophies, our literature are the product of human minds interacting with one another, and the human mind is a product of the human brain, which is organized in part by the human genome and evolved by the physical processes of evolution. Like scientists, the science-based humanities scholars are intellectually eclectic, seeking ideas from a variety of sources and adopting the ones that prove their worth, rather than working within "systems" or "schools." As such they are not Marxist scholars, or Freudian scholars, or Catholic scholars. They think like scientists, know science, and easily communicate with scientists; their principal difference from scientists is in the subject matter they write about, not their intellectual style. Science and science-based thinking among enlightened humanities scholars are now part of public culture.

One Culture, the Third Culture

Something radically new is in the air: new ways of understanding physical systems, new ways of thinking about thinking that call into question many of our basic assumptions. A realistic biology of the mind, advances in physics, electricity, genetics, neurobiology, engineering, the chemistry of materials—all are challenging basic assumptions of who and what we are, of what it means to be human. The arts and the sciences are again joining together as one culture, the third culture. Those involved in this effort—scientists, science-based humanities scholars, writers—are at the center of today's intellectual action.

They are the new humanists.
