Scientists are not skeptics; they are theologians.

As Thomas Kuhn famously remarked in The Structure of Scientific Revolutions, “Perhaps the most striking feature of [normal research problems] is how little they aim to produce major novelties, conceptual or phenomenal.” (Kuhn, 35) In fact, an extensive study by Bernard Barber reveals that scientists are intolerant of new discoveries.[1] Priestley never accepted Lavoisier’s oxygen theory. Kelvin never accepted Maxwell’s electromagnetic theory. It took more than a century to convert scientists to Copernicus’ heliocentrism. Newton was not accepted on the Continent for half a century. There were good reasons for such resistance. Lavoisier’s theory could not cope with the proliferation of new gases. For a while, the phlogiston theory could legitimately claim that it solved many older problems better. For example, it explained why bodies burned and why metals had so many more properties in common than their ores did, which Lavoisier’s theory could not do. Copernicus’ theory was not more accurate than Ptolemy’s, and it did not lead directly to any improvement in the calendar. It took time for the wave theory of light to become more successful than the corpuscular theory in resolving the polarization effects that were the principal cause of the optical crisis. Throughout the 18th century, scientists failed to derive the motion of the Moon from Newton’s laws of motion.

From reading the paragraph above, it is easy to assume that scientists were hesitant due to reasonable doubt. That is certainly a part of it. But Kuhn’s book reveals a deeper truth about science: that it cannot function with deliberate skepticism. Normal science is an activity of puzzle-solving. After accepting a paradigm, normal science tries to articulate the theory through observation and research. A paradigm “forces scientists to investigate some part of nature in a detail and depth that would otherwise be unimaginable.” (Kuhn, 25) Paradigms in their early stages are always insufficient, as is evident from the examples above. It requires faith from the first followers of a new paradigm to carry the torch until a generation of scientists verifies, through research and evidence, that the new paradigm is better than the old one. However, it is crucial to note that until that moment of total conversion, there is no way to resolve the conflicts between the two paradigms. They are both legitimate ways of making sense of the world, and, for a while, it is often the case that the older paradigm corresponds to the facts better than the new one. The new paradigm might explain the anomalies that drove the older paradigm into crisis, but it is not necessarily equipped to explain many of the phenomena the older paradigm spent centuries researching.

In other words, verification or determination of theory by evidence is not the doctrine that governs science. Evidence only gets you so far. Imagine a person who doubted their world view every time their sense data (evidence) contradicted it. That person would have rejected Newtonian mechanics altogether. In the 19th century, astronomers found that the observed orbit of Uranus could not be derived from Newton’s laws of motion. The evidence falsified the Newtonian world view, but these astronomers did not reject their theory. Instead, they hypothesized that perhaps there existed another planet that caused the mismatch between observation and calculation. In other words, they postulated an as-yet-unobserved entity to fit their calculations — an ad hoc hypothesis. Eventually, their telescopes detected the hypothesized planet, and that is how Neptune was discovered in 1846. Evidence is not the only factor, and this makes convincing a scientist a difficult task.

Max Planck, surveying his own career in his Scientific Autobiography, sadly remarked that:

“a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

A similar sentiment is echoed by Darwin at the end of On the Origin of Species:

“Although I am fully convinced of the truth of the views given in this volume…, I by no means expect to convince experienced naturalists whose minds are stocked with a multitude of facts all viewed, during a long course of years, from a point of view directly opposite to mine…[B]ut I look with confidence to the future, to young and rising naturalists, who will be able to view both sides of the question with impartiality.”

Why is this the case? Kuhn explains it by pointing out that paradigms essentially determine the world view of an individual. Before Einstein’s theory of relativity, space was an absolute, immovable aspect of nature and of the world scientists lived in. In order to accept Einstein’s theory, one must begin to live in a world with curved space. Priestley regarded the gas as dephlogisticated air, whereas Lavoisier called it oxygen. It is only in hindsight that we square these two incompatible world views. Can Newtonian dynamics really be derived from relativistic dynamics? Under certain restrictions, relativistic dynamics can be made to resemble Newton’s, but we interpret that resemblance this way only because we already know Einstein’s theory. Before Einstein, Newton’s theory was never interpreted that way. In a Newtonian world, mass is conserved; in an Einsteinian one, mass is convertible with energy: “Only at low relative velocities may the two be measured in the same way, and even then they must not be conceived to be the same.” (Kuhn, 102) It is not an easy task to convince a person who lives in a completely different world. This is why both Planck and Darwin believe that one can only convince the youth en masse, since they can be indoctrinated into looking at the world differently.

So far, we have seen that faith in a theory and its paradigm is a large factor in the development of science. There are also other factors at play that might surprise some readers. For example, Copernicus’ work was influenced by social pressures like the demand for calendar reform. Medieval criticism of Aristotle, the rise of Renaissance Neoplatonism, and other significant historical elements certainly played a part in Copernicus’ work as well. (Kuhn, 69) Many historians and philosophers argue that aesthetics played a role in Einstein’s formulation of his theory of relativity.[2] It is naïve, then, to conclude that science is a discipline full of skeptics constantly challenging their most fundamental assumptions in the face of contradictory evidence. Science’s efficiency is based on its narrow scope of research and inquiry. In this way, science is structurally closer to theology than to other disciplines.

Unlike practitioners of other disciplines, such as art or psychology, research scientists are not concerned with the opinions of the public. Because preserving the paradigm is so important, scientists also approach their education ahistorically. Students are not taught to read primary sources like Newton’s Principia and its critics through a historical lens. Doing so would allow the student to doubt the paradigm and to experience the world from the viewpoint of a different paradigm. Science cannot exist or function without a paradigm; therefore, it is more important to indoctrinate students into accepting the paradigm first. This is why science textbooks are treated like doctrine, whereas in philosophy, textbooks are of secondary importance. In science, key figures and their texts are interpreted through an ahistorical lens; in other words, from the viewpoint of the current paradigm, via cherry-picked excerpts in textbooks.

“Many science curricula do not ask even graduate students to read in works not written specially for students. The few that do assign supplementary reading in research papers and monographs restrict such assignments to the most advanced courses and to materials that take up more or less where the available texts leave off. Until the very last stages in the education of a scientist, textbooks are systematically substituted for the creative scientific literature that made them possible. Given the confidence in their paradigms, which makes this educational technique possible, few scientists would wish to change it.” (Kuhn, 164)

Theologians treat the Bible similarly, as their primary source of indoctrination. Like scientists, theologians operate within a paradigm as they try to understand divinity. Despite these structural similarities, there exist stark differences between theology and science: their methods of research, equipment, and fundamental assumptions are all drastically different. But it is interesting to note the structural similarity, given the recent attacks on religion from New Atheists and popular scientists.

A question might dawn on the perceptive reader: why do science popularizers perpetuate the lie that scientists are skeptics, or that science is an accumulation of theories and evidence? I believe the reason is political. New Atheists like Richard Dawkins, Steven Pinker, and Sam Harris are trying to spread a secular version of neoliberalism. Their outdated Enlightenment thinking naturally leads to accepting a neoliberal, imperialist, and colonial narrative that deems feminism, Islam, and activism for social justice “anti-science.” They disguise their patriarchal, capitalist, and imperialist politics in the name of Reason and Science. This is precisely what happened in the 19th century. Another reason is that science needs to regard itself as cumulative. It cannot anticipate an upcoming revolution; that would be unhealthy for the narrow and efficient practices of normal science. In order for puzzle-solving to proliferate, scientists must approach their discipline in an ahistorical manner: as if they had always been operating under the current paradigm. God cannot be questioned, even if the definition of God has changed. It is essential that you never reject God. Every conflict, therefore, is regarded as happening under the same umbrella.


Thomas Kuhn, The Structure of Scientific Revolutions

[1] Bernard Barber, “Resistance by Scientists to Scientific Discovery,” Science, CXXXIV (1961), 597-602.

[2] Engler, Gideon. “Einstein, His Theories, and His Aesthetic Considerations.” International Studies in the Philosophy of Science, vol. 19, no. 1, 2005, pp. 21–30, doi:10.1080/02698590500051068.


After Virtue

We are in a moral crisis. The unending arguments over abortion, health care, and gun control are, according to the philosopher Alasdair MacIntyre, symptoms of the flaws of modernity. Modernity is a fragmented version of Aristotelian ethics; it demands an individualist worldview, despite the fact that our culture and vocabulary reflect a communitarian heritage. What this means is that our endless arguments stem from the fact that we lack a shared conception of the good; in other words, our moral systems are incommensurable. For instance, the central conflict between Kantianism and Utilitarianism is not that either is logically inconsistent; rather, it stems from the fact that each theory has a different conception of the good.

Both theories, in their best versions, follow logically from their stated premises; the problem is that these premises are merely stated. After recognizing the heterogeneity of pleasures, the great utilitarian Henry Sidgwick concluded that moral beliefs could not be argued for and must simply be accepted: just trust your intuitions! Immanuel Kant argued that a rational agent is logically committed to the rules of morality in virtue of their rationality; in order to practice reason, one must possess the freedom and well-being necessary for rational agency. This led Kant to the conclusion that one is entitled to such freedom and well-being. But although it is logically necessary to possess such freedom and well-being in order to practice reason, it does not follow that one is entitled to them. This is merely asserted by Kant. According to MacIntyre, the fundamental premises of Utilitarianism and Kantianism are merely stated in this way. This is why the arguments between the two are endless: they are fundamentally incommensurable.

After Virtue is primarily a diagnosis of this moral crisis. It analyzes various aspects of our culture, language, and society to demonstrate that modernity is indeed a fragmented version of Aristotelian ethics; furthermore, it argues that the inventions of modernity, such as emotivism and individualism, are the root causes of our moral predicament. One of the most striking features of the book is its analysis of modern social roles. The defining character of modernity is the bureaucratic manager. A manager pretends to be effective and morally neutral; a manager adjusts means to ends in the most economically efficient manner. Plus, managerial expertise requires a set of law-like generalizations to justify the manager’s authority. Unsurprisingly, one can easily spot the manager: Liberalism pretends to be effective and morally neutral, it privileges economic methodologies and conceptions, and the Enlightenment fetishizes law-like generalizations. The most interesting aspect of the manager, in my opinion, is MacIntyre’s discussion of the fetishization of law-like generalizations.

This fetish is particularly apparent in the social sciences; they present themselves as providing law-like generalizations, despite the lack of evidence and predictive success characteristic of those fields. Unlike most scientists, who follow the Enlightenment on this point, MacIntyre contends that the worth of a scientific discipline is not determined by its predictive power. He believes that this is the wrong criterion by which to judge the success of the social sciences, because their subject matter is vastly more complicated and unpredictable: language, groups of persons, entire nations, and the global market. They cannot make predictions and generalizations nearly as strong as those of physics or biology; even the strongest generalizations have counter-examples.

For example, two of the most famous studies in sociology do not follow the Popperian model of falsification. First, James C. Davies’s famous 1962 thesis generalizes Tocqueville’s observation that the French Revolution occurred when a period of rising and, to some degree, gratified expectations was followed by a period of setback in which expectations continued to rise but were sharply disappointed. Second, Rosalind and Ivo Feierabend (1966) generalized that the most and least modernized societies are the most stable and least violent, whereas those at the midpoint in the approach to modernity are most liable to instability and political violence. There exist many counter-examples to both: the Russian and Chinese Revolutions to Davies’s thesis, and political violence in Latin America to the Feierabends’. Nonetheless, such counter-examples do not undermine their status as salient generalizations in sociology. In the social sciences, there exist no counter-examples that ultimately refute a generalization.

This is not a fact that cheapens these disciplines. It merely reflects just how complicated human beings are. We are intentional beings who can choose one act over another, and choice creates unpredictability. All of this is further complicated by the fact that we are social and linguistic beings. We have to figure out how complex beings interact with each other, unpredictably, within complex structures like the market, the state, and language. It would be unreasonable to expect from social scientists the law-like generalizations one encounters in the natural sciences. This is why MacIntyre argues that it is wrong to expect law-like generalizations about sociology, politics, and so on.

Why is it, then, that we fetishize such law-like generalizations? As I briefly mentioned above, it is tied to the philosophical framework assumed by modernity; that is, modernity’s attachment to the ideal of the bureaucratic manager. Under the bureaucracy of modernity, moral beliefs are treated as inconvenient features of persons that function far better when they are managed by an “efficient” and “economically practical” bureaucracy. The manager justifies their position by insisting that they possess law-like generalizations regarding human nature and social institutions; furthermore, they boast that they can govern a pluralist society efficiently without privileging one good over another. However, modernity does, in fact, assume a set of goods that are disguised as morally neutral; the philosopher Michael Sandel lays out a number of such goods in his great book, What Money Can’t Buy.

Modernity cannot make law-like generalizations, yet we obey its tenets without much argument; we argue within the confines of modernity, where the arguments are designed to be endless. MacIntyre suggests that we look to the past for answers to our problems. Across several chapters, he sketches the moral frameworks of past societies ranging from the Greeks to medieval Christians. What they all had in common was a shared conception of the good. Such goods were achieved through virtues like prudence, justice, and courage: human qualities acquired through practice. Furthermore, these societies recognized that a person is embedded in a social context: I am a son, a citizen, and a musician. Personal identity is a narrative that unifies one’s life from past to present, and my narrative is embedded in other narratives, such as those of family, school, and friendship.

The problem with modernity is not only that it brings about endless arguments, but also that it is incompatible with our ordinary intuitions. We come from a past in which our obligations and personal identity were constituted by the social context to which we belonged. My community shapes most of my actions and thoughts; I act and think as a student and a family member, rather than as a rational agent with his or her individual interests. When we judge a person’s character, we judge them, more or less, by a table of virtues, rather than by whether they follow the categorical imperative or pass the utilitarian calculus. The project of modernity is doomed to fail because, with its endless arguments and managerial fetish, it cannot dissolve our communitarian past.

After Virtue turned out to be far more damning in its criticism than I anticipated. For the numerous Kantians and Utilitarians out there, this book will read as a series of pointed criticisms; one will find not only attacks on Kant and Sidgwick, but also criticisms of Rawls and Nozick. Indeed, the book is highly political; it analyzes Marx and Weber within the framework of its communitarian argument. For my Marxist friends, this is not an easy read. Despite MacIntyre’s admiration of Marx, he believes that Marx and his followers ultimately fall under the same moral framework that he takes down in this book. This is very much true in my own experience. Marxists have great criticisms of the managerial and bureaucratic aspects of Capitalism and Liberalism; yet their solutions to Capitalism always end up Kantian or Utilitarian. It is either to follow an abstract principle of universality, or “to achieve communism by any means necessary.” What this suggests is that the faults of Capitalism and Liberalism are not merely economic and political: the error is modernity itself. Any project that rejects the Aristotelian system of virtues is destined to fail, no matter what social or economic structure it adopts. Undoubtedly, this is MacIntyre’s deadliest gesture, because it indicates a deep pessimism about the project of modernity as a whole. As he discusses Trotsky’s later writings and his pessimism towards a communist future, MacIntyre asks us not to fall into pessimism: it does not logically follow that we have no way out! But the reader comes away with a dreadful feeling that we might never resolve the moral crisis; Aristotle has been dead for thousands of years.


Nietzsche

In this episode, Teague and I discuss the philosophy of Friedrich Nietzsche. We cover a ton of issues: from morality and epistemology to language, consciousness, and shyness. Hopefully, we clarified many common misrepresentations and helped uncover the depth and greatness of his philosophy.

Marxism

In this episode of the Veil of Ignorance, Teague and I discuss Marxism with Cale Holmes and Kevin Salvatore. Honestly, I believe this topic requires more than one episode, since it’s impossible to cover everything in one. Nonetheless, we tried, and it was most definitely a lot of fun.

John Cage and the philosophy of music

The aesthetician Susan Sontag claims that Cage’s attempt to erase authorial intent, in one sense, allowed him to erase meaning altogether. This is also reflected in Cage’s writings:

“New music: new listening. Not an attempt to understand something that is being said, for if something were being said, the sounds would be given the shapes of words. Just an attention to the activity of sounds.”

— Cage, Experimental Music: Doctrine, p. 10

But I don’t believe this to be the case. The cultural and historical context of Cage’s works shows that they are trying to dissolve the distinction between art and sound. Such a context allows sounds to refer to themselves, thereby giving them meaning and intentionality.

Plus, one could consider Heidegger’s argument that we do not hear pure sounds:

“What we first hear is never noises or complexes of sounds, but the creaking wagon, the motor-cycle. We hear the column on the march, the north wind, the woodpecker tapping, the fire crackling. It requires a very artificial and complicated frame of mind to hear a pure noise…Likewise, when we are explicitly hearing the discourse of another, we proximally understand what is said, or — to put it more exactly — we are already with him, in advance, alongside the entity which the discourse is about… Even in cases where the speech is indistinct or in a foreign language, what we proximally hear is unintelligible words, and not a multiplicity of tone-data.”

— Being and Time, 163

This means that even the sounds we encounter in ordinary life are never meaningless, pure sounds.

All in all, I admire Cage’s attempt to bring ordinary sounds to the forefront of Western music. I think he exemplifies Heidegger’s claim that art makes the conflict between World and Earth conspicuous. For Heidegger, World is the human environment in which we lead our lives. It includes our tools, houses, values, and so on; in other words, it is the habitat of Dasein. Earth, on the other hand, is the natural setting of World: the ground on which it stands and the source of raw materials for our artefacts. Through his presentation of ordinary sounds, Cage makes apparent the rift between World and Earth: musical sounds vs. pure noise. Interestingly, Cage’s take on this conflict uncovers a naked truth, namely, that the attempt to erase intentionality paves the way for a deeper unconcealment of intentionality and Dasein.


Being and Time, Martin Heidegger

The Origin of the Work of Art, Martin Heidegger

Cage and Philosophy, Noël Carroll

Donald Trump is not a liar

Donald Trump is not a liar; he’s a bullshitter. There is a fine difference. The philosopher Harry G. Frankfurt held that the difference between lies and bullshit is that lies are necessarily false, whereas bullshit may happen to be true or false. In essence, a lie is a conscious act of deception, whereas bullshit entails an indifference to how things really are. In order to lie, one has to implicitly acknowledge the existence of the truth and then deceive another into disbelieving it. A bullshitter, however, does not care whether there is a truth or a falsity at all. For example, I can bullshit a test by writing a bullshit answer. It doesn’t matter to me whether the answer is true or false; I just need to write some bullshit. If it happens to be true, I get a good grade. If it happens to be false, it doesn’t matter, because I never bothered studying for the test to begin with.

Nietzsche believed that forgoing objective truth would be life-affirming. J. L. Mackie argued that disbelief in objective truth regarding morality would not be catastrophic. Some, however, worried that it could be very dangerous: if there is no objective morality, then why be moral? If there is no objective truth, then how do I make sense of things? In general, I think it is not so dangerous for people to disbelieve in objective truth regarding morality or the external world. We are hard-wired to care about certain values and facts, and I doubt that we would stop caring about them even if it turned out that they were not objective. I think this is especially true of morality. Foucault and Nietzsche, for instance, still argued for certain virtues despite their skepticism about objectivity.

The problem isn’t anti-realism. The problem is bullshit: indifference towards truth. Trump is, in this sense, a bullshitter. He doesn’t care whether what he says is true or false. He spouts a ton of lies, but they are not calculated; they are not conscious acts of deception. It doesn’t matter to him whether his statements are true or false. A liar would try to show how his lie is the truth; Trump doesn’t even bother to provide evidence. As we have seen, such bullshit has been extremely pernicious. Lies require effort and responsibility; one has to support them. Bullshit does not, and it can easily destroy a society when wielded by the powerful. Bullshit is worse than the nightmare, once feared by many, of a postmodern world ruled by Nietzsche and Foucault. Anti-realism is not the problem; bullshit is.

Cultural Appropriation: Motoko Kusanagi is White?

In a recent article in the New Republic, Ryu Spaeth makes the case that it is fine to cast Scarlett Johansson, a white actress, as Motoko Kusanagi, a cyborg detective in a future Tokyo.

Cultural appropriation, as described by the cultural and racial theorist George Lipsitz, can exist for both the majority and the minority. It does not necessarily yield negative results, contrary to what we often assume. Rather, it is a concept that reminds us to be cautious when the majority appropriates the minority’s culture, because this can often happen in a way that enforces negative stereotypes against minorities — e.g., that they are violent or servile, or that they do not have a voice to represent their own culture — and entrenches existing power relations.

However, I think our usage of the term has evolved to mean only such negative instances — partly because theorists have primarily focused on such usages. If you talk to someone about cultural appropriation, they are not going to mention Japanese animation, which often appropriates American culture; they will most likely mention blackface or the casting of a white actor to play Mulan. This is why I define cultural appropriation as an instance in which a dominant culture appropriates a minority culture, and why I use cultural cross-pollination to describe instances in which a culture benefits from using elements of another culture. In other words, cultural appropriation describes power relations; cultural cross-pollination describes fruitful interactions between cultures. I think such a demarcation will clear up our conceptual space and prevent unnecessary confusion.

One may object that the distinction between cultural cross-pollination and cultural appropriation is often blurry — samba and bossa nova are played by those who trace their ancestral roots back to European colonialists, Native Americans, and African slaves. So if such a person is a few percentage points more European and plays samba, are they a cultural appropriator? There is no clear way to tell whether this is an instance of cultural appropriation. But we can tell that it is an instance of cultural cross-pollination. The purpose of the demarcation isn’t to fully describe every exchange between cultures; rather, it is to clear up the conceptual space and prevent unnecessary confusion. The demarcation allows us to make sense of a cultural product that is both cultural cross-pollination and cultural appropriation — for example, the Rolling Stones. We celebrate their music, yet recognize the fact that they appropriated black artists like Muddy Waters. This is quite an intuitive answer, but the debate we often see on cultural appropriation prevents us from validating such intuitions. It insists, instead, that a work must be either cultural appropriation or cultural cross-pollination. Why not both? Clarifying such intuitions and preventing such unnecessary conflicts is the purpose of the demarcation — and I believe it succeeds.