Through exploring the distinction between what we ordinarily think of as an experience and what Nishida Kitarō, the founder of the Kyoto School of philosophy, calls “pure” or “direct” experience, I will claim that things don’t exist as those things. My claim is that the existence of natural kinds conflicts with pure experience.
What I mean by experience is all that exists for you here and now. I mean experience as we use it in sentences such as: “I’m experiencing X.” Experience, therefore, includes arising feelings, thoughts, sounds, sights, etc. It includes everything that is perceived. So, I don’t mean the word experience as we use it in sentences such as: “I have a lot of experience in X.” Pure experience is a specific kind of experience in the former sense.
Pure experience is experience prior to conceptualization (Nishida, 1992). It is unitary. It has no distinctions of self/other, subject/object, sound/sight, etc. All of these come after we apply our concepts to the experience we have here and now. What we usually call experience is already clothed in concepts and corroded with words.
Colour’s five hues from th’ eyes their sight will take;
Music’s five notes the ears as deaf can make;
The flavours five deprive the mouth of taste;
The chariot course, and the wild hunting waste
Make mad the mind; and objects rare and strange,
Sought for, men’s conduct will to evil change.
Laozi, Tao Te Ching, 12
Our habit of categorizing colors, sounds, and so on impairs our ability to perceive their uniqueness. Our habit of categorizing experiences impairs our ability to experience directly and without artificial divisions. Convention decides which signifiers correspond to which signifieds and sets the boundaries of those signifieds (Saussure, 1916/1959). One learns early on how “culture has tacitly agreed to divide things from each other, to mark out the boundaries within our daily experience” (Watts, 1999, pp. 20-21).
The moment of seeing a color or hearing a sound, for example, is prior not only to the thought that the color or sound is the activity of an external object or that one is sensing it, but also to the judgment of what the color or sound might be. (Nishida, 1992, p. 3)
If we recognize the distinction between ordinary and pure experience, we see that things don’t exist as we usually think. They exist in our experience only arbitrarily: they are distinct from everything else because of the concepts we apply to understand them. It is possible to drop conceptualization even for a brief moment, and that’s what often happens during meditation sessions. Our belief in grammar and our habits of thought shape our metaphysical beliefs (Nietzsche, 2003, p. 21).
Consider the question of the existence of an ocean wave or a constellation of stars. Both are perceived as distinct objects only because we choose to regard them as such. There are no “natural” boundaries between signifieds and no “natural” definitions for them. Things don’t exist as those things: we label them because of the instrumental value of doing so, without implying any metaphysical truth.
References
Laozi. (1970). Tao Te Ching. Penguin Books.
Nietzsche, F. (2003). Nietzsche: Writings from the Late Notebooks. Cambridge University Press.
Nishida, K. (1992). An Inquiry into the Good. Yale University Press.
Saussure, F. de. (1959). Course in General Linguistics. Philosophical Library. (Original work published 1916)
Watts, A. (1999). The Way of Zen. Knopf Doubleday Publishing Group.
‘This tomb, friend Pyrrho, leaves no doubt about it.’
– Julianus Ægyptius, Prefect of Egypt, translated by John Sampson (Quarrie, 2020, p. 166)
Introduction
Julianus Ægyptius’ epigram is not a refutation of Pyrrhonian skepticism, but it expresses one of its pitfalls: it is just too counter-intuitive. Pyrrhonian skepticism is a school of philosophical thought that originated in ancient Greece with the philosopher Pyrrho of Elis (Bett, 2022; Vogt, 2022). Most of what we know about it comes not from Pyrrho but from Sextus Empiricus’ Outlines of Pyrrhonism.
According to Pyrrhonian skepticism, the suspension of judgment is the only justified attitude concerning any proposition (including this one). While this approach to philosophy has had a significant influence on Western thought, it is not without its critics.
In this essay, I will first examine the claim that Pyrrhonian skepticism is circular or self-refuting. Then, I will discuss the idea that the suspension of judgment is itself a judgment and hence genuine skepticism is impossible. I will finally discuss the tension between genuine doubt and living while suspending judgment on all propositions. I will conclude that it is nearly impossible for Pyrrhonian skeptics to be consistent in their beliefs and actions.
According to some, Pyrrhonian skepticism is flawed due to its reliance on circular reasoning. The skeptic posits that knowledge of the true nature of reality is unattainable and subsequently uses this assumption to suspend judgment on all matters of fact. However, the validity of this assumption cannot be established, as it is based on the premise that certainty is impossible. As a result, the skeptic assumes the truth of their position.
The Pyrrhonian skeptic is essentially saying that we cannot be certain of anything, including the claim that we cannot be certain of anything. This does not prove the self-defeating nature of Pyrrhonian skepticism, but it undermines its persuasiveness.
The problem with this argument is that it presents Pyrrhonian skepticism as an argument. Skeptics aren’t committed to the truth of the premises and conclusions of Pyrrhonian skepticism. The aim of Pyrrhonian skeptics is not and shouldn’t be to convince anyone of its truthfulness. So if we (non-Pyrrhonians) think that the claims of Pyrrhonian skepticism aren’t sound, we should investigate why.
Suspension is Judgment
Another common criticism of Pyrrhonian skepticism relies on the idea that there is no such thing as neutrality. More specifically, claiming that you’re suspending judgment about X is a judgment about X. For example, claiming that both sides are wrong on a given political issue commits one to a specific political point of view.
The important point is that when someone acts like a skeptic, they do so in a non-neutral way. Does this make skepticism self-refuting? No. All it does, as Ronald Dworkin pointed out, is refute certain arguments for skepticism.
Doubt and Behavior
Pyrrhonian skepticism is incompatible with everyday experience. This is because the skeptic is effectively saying that we cannot be certain of anything, including the most basic and mundane aspects of our lives. For example, we cannot be certain that the sun will rise tomorrow, that the floor will hold us up when we stand on it, or that the food we eat will nourish us. However, these are all things that we rely on and take for granted in our everyday lives, and to suggest that we cannot be certain of them is to reject the very foundations of our experience.
This does not prove the contradictory nature of skepticism or do anything to undermine its truth value, but it does render it problematic from a pragmatic perspective. According to Charles Sanders Peirce (1992, pp. 109-123), Descartes made a similar error when he advocated for the doubt of all things and the reliance on first principles. Merely expressing a proposition in the form of a question does not demonstrate the presence of genuine doubt.
True doubt is always accompanied by a feeling. It motivates us to investigate until we minimize or eradicate it. Our beliefs guide our actions. So if skepticism is to have any practical significance beyond being a philosophy one espouses to appear original at social events, it must have practical consequences. What would those consequences be? How would a genuine Pyrrhonian skeptic behave?
The answer to these questions is unclear. If such people exist, I’ve never seen them. It is difficult even to imagine them without engaging in fantasy. It might be that a genuine Pyrrhonian skeptic is a philosophical chimera that does not and cannot exist, similar to a strict nihilist, moral relativist, or Übermensch.
The main problem with Pyrrhonian skepticism is not that it is contradictory or logically flawed, nor that it is too permissive or tolerant of opposing views. In my view, skepticism commits the opposite error: it demands too much of truth without acknowledging less demanding alternatives. I agree with Pyrrho and Sextus Empiricus that complete certainty is not possible, but it does not have to be. Suspension of judgment on all matters is too extreme a conclusion to draw from this.
The demand for certainty, which Nietzsche (The Gay Science, §347) characterized as arising from the instinct of weakness, is not a necessary component of all belief systems outside of skepticism. One can reject the concept of certainty while still holding beliefs about particular propositions. I believe that most people who consider themselves Pyrrhonian skeptics engage in this type of belief formation, despite their claims to the contrary.
Conclusion
Pyrrhonian skepticism is not self-refuting, circular, logically contradictory, or anything of the sort. People who claim to be skeptics can and do exist. It is a valid philosophical position that individuals can adopt. The point I want to convey is that given my pragmatist way of thinking, theory of truth, and definitions of what constitutes a “genuine X,” I see no clear way a Pyrrhonian skeptic could consistently act and think like one.
Let no thought pass incognito, and keep your notebook as strictly as the authorities keep their register of aliens. (Benjamin, 1928/2016, p. 46)
For as long as I can remember, I’ve been obsessed with systems that can help me think. Even my desire to write is downstream of my desire to understand. I’m afraid of forgetting, so I keep switching and inventing new systems for taking notes. My long-term memory is pretty good, but I still need an external system that collects my thoughts and lets me communicate with them. So does almost everyone else: “I started the index card file for the simple reason that I have a poor memory” (Luhmann, 1987, p. 149). Taking notes is a way to transfer the burden of remembrance from your brain to something else. Writing something down gives a sense of security—Lethe is no longer threatening. You have no reason to trust your memory. Your working memory could be the best in the world, yet that still wouldn’t be good enough. Any effective system for taking notes should allow you to forget about forgetting. So here I want to lay out mine. I want to specify why I chose this system and why you may wish to apply it.
The practice of taking notes has a long history. Many prominent writers, philosophers, and scientists had their own methods: Gottfried Wilhelm Leibniz, Georg Philipp Harsdörffer, Niklas Luhmann, Claude Lévi-Strauss, Roland Barthes, Vladimir Nabokov, Georg Wilhelm Friedrich Hegel, Walter Benjamin, Reinhart Koselleck, Paul Valéry, Heinrich Heine, Arno Schmidt, Jean Paul, and Ernst Jünger, to name a few. Would their accomplishments have been the same without their collections of notes? If not, how come none of them realized they were wasting their time? Luhmann, for example, managed to produce over 50 books and 550 articles during his lifetime because of his sophisticated system for taking notes (Schmidt, 2016, p. 289). The systems differ. No system is perfect for everyone at all times. I know this well, yet the thought that my system could be just a bit better never abandons me.
I often buy notebooks and soon stop using them for the most superficial reasons. I obsess over the formatting of each and every note. I dismiss entire notebooks because of a small mistake. I download new note-taking apps and try them out, only to stop using them after a month or two. I’m never satisfied with my tools. I’m never really calm. I’m terrified of becoming so obsessed with productivity and finding the ‘perfect system’ that this inhibits rather than helps me think. I see all this as a problem I need to fix. If any of this applies to you, I hope this essay serves you well.
I will do the following: First, I will discuss why you might want to write notes. Then I will briefly outline the history of note-taking. Specifically, I will consider hypomnemata, commonplace books, journals, Paul Valéry’s Cahiers, and Niklas Luhmann’s Zettelkasten. I will then outline how I intend to take notes, at least for the foreseeable future. I hope to encourage you to start taking notes more seriously. I also want to force myself to stick with one of the methods I discuss here. Essentially, this essay is a form of “working with the garage door up”: I will write about my problems and try to work through them openly.
In his famous essay, Kommunikation mit Zettelkästen, Niklas Luhmann posits that it is impossible to think in a sophisticated or networked way without writing (Luhmann, 1992, pp. 53-61). Is he right? Could, for example, Grigori Perelman have proved the Poincaré conjecture all in his head? Could James Joyce have written Ulysses linearly in one go? Could Shakespeare have written his 154 sonnets without any editing or modifications? I think not. What Luhmann means by “sophisticated thinking” is the type of thinking required for the complex synthesis of previously unrelated material. Synthesizing diverse material through writing notes is a form of intellectual labor.
If you want to retain the intellectual labor you’ve already done and increase its quality, you must write. You must write to protect yourself from forgetfulness. You must write to think. You must write now so that you may write more later. Writing accretes. Your collection of notes becomes your communication partner. If you want to figure something out and can’t succeed through thinking alone, you should speak about it with someone (von Kleist, 1805/2001, p. 319). That ‘someone’ may be the previous you: the previous you who made all those notes so many years ago. You may have forgotten, but the ink is still there. When one writes, one reads what one has written. The letters act upon their author the moment they are born, just as they act upon anyone who reads them later. In time, notes become sovereign machines capable of influencing their readers.
Any practice or idea that has lasted millennia has a good chance of being useful. Longevity is evidence of usefulness: concepts go through a process of natural selection, and time obliterates what is useless. We should, therefore, expect note-taking to have precedents stretching back thousands of years. This is exactly what we find. Even Leibniz, Harsdörffer, Heine, and Hegel are relatively recent examples. Countermeasures against forgetfulness have their roots in ancient Greece. Plato’s criticisms of writing in the Phaedrus rely on the idea that writing discourages the use of memory: that instead of an elixir of memory, writing is an elixir of reminding (Plato, Phaedrus, 275a).
There also existed a practice of writing what the Greeks called hypomnemata (ὑπομνήματα). These books, notebooks, or notes were not substitutes for memory. They were not narratives of oneself or intimate journals. They intended to capture what was already said (Foucault, 2005, p. 500). To collect what one heard, read, or came up with (Foucault, 1997, pp. 210-211). Hypomnemata made possible the establishment of a “relationship of oneself with oneself” by making one’s recollection of the fragmentary λόγος readily available.
Seneca stressed the importance of keeping a treasure store of reading. According to Seneca, going from book to book and not writing anything down puts one in danger of retaining nothing and forgetting oneself (Seneca the Younger, Epistles, p. 7). The writing process counteracts stultitia, i.e., mental agitation, distraction, and constant wavering between opinions. Reading for nourishing and refreshing the intellect and writing for fixing ideas are both essential (Seneca the Younger, Epistles, p. 277). Hypomnemata allow the psyche to detach itself from concerns about the future and direct itself toward the past.
The issue with hypomnemata or notebooks is that they are collections of heterogeneous fragments. The writer must establish links, similarities, and differences between them. Most notebooks are unfinished, underused, chaotic, and fragmented. For these notebooks to be helpful, it is necessary to organize and index them.
Let us take down one of those old notebooks which we have all, at one time or another, had a passion for beginning. Most of the pages are blank, it is true; but at the beginning we shall find a certain number very beautifully covered with a strikingly legible hand-writing. Here we have written down the names of great writers in their order of merit; here we have copied out fine passages from the classics; here are lists of books to be read; and here, most interesting of all, lists of books that have actually been read, as the reader testifies with some youthful vanity by a dash of red ink. (Woolf, 1958, p. 25)
Seneca had a solution. It was to run everything we have absorbed through our lens. To digest all thoughts gathered from reading. To put in all hypomnemata a unifying element: us. Only in this way will the material enter our reasoning rather than exclusively our memory. Other writers, for example, Umberto Eco, have stressed a similar point (Eco, 1977/2015, pp. 164-167). It is vital to write in your own words. To paraphrase rather than quote. Direct quotations don’t demonstrate how you understand what you’re quoting. Paraphrasing, by contrast, always involves interpretation. It transforms the original material into “tissue and blood” (Seneca the Younger, Epistles, p. 281). Interpretation is a transformative and creative act. The more effort one puts into thinking an idea through, the sharper it seems. I write ‘seems’ because whether or not it sharpens is a matter of debate. The important part is that you become more confident in your writing.
There are, of course, exceptions to this rule. Sometimes you want to analyze a poem. Paraphrasing it is almost always a bad idea. Sometimes you want to dissect how the author uses language in a given passage. Or the language is so powerful that you do not wish to change anything. Sometimes your interpretation is not as important because the meaning is unambiguous. Consider, for example, Shakespeare’s 7th sonnet (1609/1923):
Lo, in the orient when the gracious light
Lifts up his burning head, each under eye
Doth homage to his new-appearing sight,
Serving with looks his sacred majesty;
And having climb’d the steep-up heavenly hill,
Resembling strong youth in his middle age,
Yet mortal looks adore his beauty still,
Attending on his golden pilgrimage;
But when from highmost pitch, with weary car,
Like feeble age, he reeleth from the day,
The eyes, ’fore duteous, now converted are
From his low tract and look another way:
So thou, thyself out-going in thy noon,
Unlook’d on diest, unless thou get a son.
If I had the choice of paraphrasing or directly quoting the passage above, I think it’s clear why I would choose the latter. The language gives it more value than any interpretation could. I could paraphrase just the meaning, but so much would get lost. The reader wouldn’t see that although Shakespeare uses many metaphors to describe the sun, the word itself never comes up. And the sonnet, when you’re not expecting it, ends with the word son. This and other such ingenuities would all get lost. Analyzing the way Shakespeare uses language in this sonnet would require a lot of work and would be a more worthy endeavor.
In producing art of any kind, be it writing, painting, or architecture, I find that the more effort I put into something, the prouder I am. If I don’t spend much time on, for example, an architectural project, I will inevitably think it’s not good enough. When people say that a lot of work went into creating an art piece, they usually mean it as a compliment, as if the work that went into creating something gives it more value (Zumthor, 1998/2010, p. 12). Whether or not this is true, it certainly seems like it is. It is, therefore, worth it for me to spend more time with each note. Hence my preference for writing by hand before moving to a computer: it gives my notes more value.
I won’t exhaust the comparison of digital vs. analog tools for taking notes. You probably know most of the pros and cons of each already. I could cite research on how writing things down by hand makes it more likely for you to remember them or how reading on paper rather than on a screen has the same effect. I could discuss how digital notes have the advantage of being easily searchable. I could claim that you might not want to rely on a specific app or company to store all your writings. That discussion deserves a separate essay, but there isn’t much I can add that hasn’t already been said a thousand times. I think it ultimately boils down to personal preference. I use both.
So how do we organize our hypomnemata? With commonplace books, like those of John Locke, John Milton, Francis Bacon, Ralph Waldo Emerson, and Henry David Thoreau? With notebooks, like those of Friedrich Nietzsche, Martin Heidegger, Virginia Woolf, Walter Benjamin, and Paul Valéry? With slip boxes, like those of Gottfried Wilhelm Leibniz, Vladimir Nabokov, Roland Barthes, Claude Lévi-Strauss, and Niklas Luhmann? The way I currently organize my notes is a combination of these, so I will explain what each means in turn.
Let’s begin with commonplace books. Commonplace books are notebooks used for collecting notes about books, articles, journals, and other media. “In their purest form, commonplace books were written in Latin by schoolboys who organized sententiae under topic headings” (Burke, 2013). They differ from diaries or writer’s journals because they are generally not introspective or chronological (Basbanes, 2006, p. 82). My issue with relying exclusively on commonplace books is that my ideas would get mixed with reading notes, and I think those two deserve their own spaces. If a commonplace book is the only thing you use, then to find an idea you must remember what you were reading when it came to you. And I don’t want that. I want to be able to find any note based on its address.
Reflective notebooks, like Nietzsche’s (Nietzsche, 2003; 2009), differ from commonplace books because you don’t fill them with notes about the books you’ve read. You fill them with the ideas you develop. Paul Valéry, for example, devoted the early hours of each morning to writing in his Cahiers (Lawler, 1976, p. 346). I prefer combining this with a commonplace book. I split my notebooks in half: I use the first half for my ideas and the second half for reading notes. These two correspond to different types of slip cards in Niklas Luhmann’s system.
Now I will explain what the Zettelkasten is. A Zettelkasten (slip box) or card file consists of small paper slips arranged in an organic network and linked to one another. Many writers use slip cards to help them write, but I won’t explain each method separately. I believe Niklas Luhmann brought everything useful about the Zettelkasten to near perfection, so I will focus only on his method.
When working on his dissertation, Luhmann realized that the notes he took would be useful not just for one book but for a lifetime (Luhmann, 1987, p. 149). Filling the zettelkasten took more of his time than the writing process itself (Luhmann, 1987, p. 142). Yet the results speak for themselves: it was worth it. The card index helped rather than held back Luhmann from writing so much.
Although it is surprising for such a systematic thinker, Luhmann didn’t pedantically adhere to specific types of ink or paper. He used A6-size paper, but the back side had sometimes already been written on. He took some notes with a blue pen and some with a red one. Some are on thick paper, some on thin, and so on (Schmidt, 2016, p. 291). I wish I could do this too, but it bothers me too much. Maybe after some inconsistencies, one gets used to them. I hope I will too. Luhmann used these slips of paper to record all kinds of information: his theses and concepts, reading notes, questions, and bibliographical information.
He had three types of cards. We may call them permanent, bibliographic, and index cards. Permanent notes contained distilled ideas. Bibliographic cards contained brief reading notes. Index cards contained lists of keywords with several addresses per keyword. What are addresses? Since Luhmann was using pen and paper, he needed a way to locate any slip card whenever he wanted. So he gave each slip an alphanumeric address. Card 9 would go between cards 8 and 10. If Luhmann wanted to put a new card between cards 9 and 10, he would give it the address 9a. If he wanted to put a card between 9a and 10, he would give it the address 9a1, and so on. I prefer using a physical notebook for ephemeral notes and bibliographic cards, and I use my computer for permanent notes. I find physical index cards limiting, and I prefer carrying around a notebook to carrying around a bunch of A6 papers.
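This addressing scheme defines a total order over the cards: an address alternates numeric and alphabetic segments, and a card files immediately after its prefix (9a sits between 9 and 10, 9a1 between 9a and 9b). As a hypothetical illustration, not Luhmann’s own procedure, here is a short Python sketch of that ordering; the assumption that a long alphabetic run like ‘aa’ would sort after ‘z’ is mine, not something Luhmann specified:

```python
import re

def luhmann_key(address):
    """Sort key for a Zettelkasten-style address such as '9a1'.

    The address is split into alternating numeric and alphabetic
    segments ('9a1' -> 9, 'a', 1). Numeric segments compare as
    integers; alphabetic segments compare by length first, then
    alphabetically, so 'z' sorts before 'aa' (an assumption about
    how a long run of sibling cards would be labeled).
    """
    segments = re.findall(r"\d+|[a-z]+", address)
    return [int(s) if s.isdigit() else (len(s), s) for s in segments]

# A card 9a files between 9 and 10; 9a1 files between 9a and 9b.
cards = ["10", "9a1", "9", "9b", "9a", "9a2"]
print(sorted(cards, key=luhmann_key))
# -> ['9', '9a', '9a1', '9a2', '9b', '10']
```

Because Python compares lists element by element and treats a prefix as smaller, a card always sorts directly after the card it branches from, which is exactly the behavior the paper filing order relies on.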
So my system consists of two things: a notebook for reading and ephemeral notes and a folder on my computer for my permanent notes. The reasons I keep a physical notebook are as follows: (1) writing things down makes you remember them, (2) a notebook is more portable than a laptop, (3) I don’t have to worry about formatting, searchability, etc., (4) writing by hand forces you to make your notes more concise, (5) I don’t get distracted by a computer while reading, and (6) aesthetics.
The reasons I prefer to have my permanent notes on my computer are as follows: (1) it is easier to lose pieces of paper than files if you use external drives, (2) I can copy the already written text directly, (3) I can copy citations which I have already generated, (4) I can edit the permanent notes if I have to, (5) I can search the notes, so there is no need for physical index cards, (6) there is no length limit to any single note, and (7) digital notes are more portable. The software doesn’t matter much. I use Obsidian, but there are many other options.
We began with a discussion of the history of fighting forgetfulness. Note-taking has a surprisingly long history: even the ancient Romans have much to say about it. We saw different methodologies for taking notes and arrived at my system. I tried to give reasons for choosing this system, but I admit that this subject gives me a lot of trouble. I’m not firm in my convictions. I hope this was useful for you too.
References
Basbanes, N. A. (2006). Every Book Its Reader: The Power of the Written Word to Stir the World. Perennial.
Benjamin, W. (2016). One Way Street. Harvard University Press. (Original work published 1928)
Eco, U. (2015). How to Write a Thesis. The MIT Press. (Original work published 1977)
Foucault, M. (1997). Ethics: Subjectivity and Truth. New Press.
Foucault, M. (2005). The Hermeneutics of the Subject: Lectures at the Collège de France 1981–1982. Pan Macmillan.
Krajewski, M. (2011). Paper Machines: About Cards & Catalogs, 1548-1929. The MIT Press.
Krajewski, M. (2016). Note-Keeping: History, Theory, Practice of a Counter-Measurement Against Forgetting. In Cevolini, A. Forgetting Machines: Knowledge Management Evolution in Early Modern Europe. BRILL.
Luhmann, N. (1987). Biographie, Attitüden, Zettelkasten. In Luhmann, N. Archimedes und wir. Interviews. Merve Verlag. pp. 125–155.
Luhmann, N. (1992). Universität als Milieu: Kleine Schriften. Haux.
Nietzsche, F. W. (2003). Writings from the Late Notebooks. Cambridge University Press.
Nietzsche, F. W. (2009). Writings from the Early Notebooks. Cambridge University Press.
Plato. Phaedrus.
Schmidt, J. F. K. (2016). Niklas Luhmann’s Card Index: Thinking Tool, Communication Partner, Publication Machine. In Cevolini, A. Forgetting Machines: Knowledge Management Evolution in Early Modern Europe. BRILL.
What’s your most radical and controversial view about ethics? My answer: I don’t believe in moral responsibility. Believing in moral responsibility requires that you ascribe agency to human beings, and I don’t: if we’re all just cogs in the machine, blame is always inappropriate. I can’t blame you for something you couldn’t have helped doing. We’re all familiar with the feeling of making choices, and categories like responsibility, blame, guilt, revenge, and consent presuppose human agency. But this doesn’t change the fact that “there is in the mind no volition” (Spinoza, 1677/1910, p. 75). You are a force that changes the universe, yes, but you don’t decide how you will do so. In this essay, I want to reexamine concepts like blame and moral responsibility in light of my firm conviction that human beings don’t have agency. I will discuss whether these concepts are harmful and propose arguments for why they are fictional.
People often cite the repercussions determinism would have for the justice system as a practical argument for free will. Yet this has nothing to do with its actual existence: free will, on this view, is a Platonic noble lie (Plato, Republic, Book III, 414c). But even the benefits of this illusion are questionable. Yes, we must radically reassess moral responsibility, blame, praise, guilt, revenge, and similar concepts in light of determinism, but is there a convincing consequentialist case against doing so? The justice system doesn’t have to rest on the assumption that human beings have moral agency. Punishment with the aim of prevention is still fully compatible with strong determinism. All that needs to change is our view of the relationship between blame and causation.
Let’s take a famous example from 18th-century history: The execution of King Louis XVI at the hands of Charles-Henri Sanson. Now, we would all say that Sanson caused Louis XVI’s death. But the disagreement between determinists and non-determinists is about blame and responsibility, not causation. So a determinist would say that even though Sanson caused the king’s death, he is not to blame, i.e., he is not responsible. All this sounds too radical, but here’s my reasoning: If the executioner is just as much a machine as the guillotine, then blaming him is as absurd as blaming the guillotine. The guillotine is just as much a cause of Louis XVI’s death as Sanson is, but it’s not responsible because it has no agency. I can’t blame the killer or the bullet. And if I can’t blame killers, I can’t punish them for punishment’s sake.
When I was in high school, I remember having a discussion with one of my teachers about the death penalty. We were analyzing a specific scenario: imagine that there is a prisoner so dangerous that the only options we have are life imprisonment and execution. If the prisoner prefers execution, should we administer it? My teacher’s answer was no. Her answer was not necessarily problematic, but her reasoning was: She said that we shouldn’t administer the death penalty because that wouldn’t be severe enough. All this sounds so evil to me because I don’t recognize punishment for punishment’s sake as anything other than madness. In a deterministic world, such ‘justice’ makes no sense. The conclusion that punishment of this kind is unjust follows from the premise that free will doesn’t exist.
If humans have no agency, a murderer doesn’t choose to kill. An amalgamation of forces moves the murderer toward that action. No one can claim to know all of those in any specific case, but that’s not necessary for the soundness of my argument. As for punishment, it doesn’t matter whether the murderer had the choice to kill or not. What matters is that we can reasonably expect similar behavior from the murderer in the future, so we must do what we can to prevent it. I don’t want to defend utilitarianism, but it is also compatible with strong determinism. If a utilitarian claims that the pleasure gained by the offended, for example, outweighs the pain inflicted upon the offender and, therefore, punishment is justified, then determinism has nothing to say for or against doing so. So we see that even if we remove blame from the equation, we can still use the concept of justice.
As a case study of how a deterministic moral framework could function, we may consult the Odyssey. In Homeric Greece, the problems of voluntary and involuntary action and the riddles concerning free will do not arise. There was no concept of will, so the moral vocabulary looked quite different. Whether you could have acted courageously or the circumstances made it impossible is irrelevant: if you did not act courageously, you would not be considered courageous. Odysseus can blame the suitors for having false beliefs (Homer, Odyssey, Book XXII) even though we would today think of having a false belief as an involuntary error. As MacIntyre explains, “it is not that Homer thinks that beliefs are voluntary; he is engaged in an assessment to which what the agent could or could not have done otherwise is irrelevant” (MacIntyre, 1966/2017, p. 7).
Almost any tragedy from Ancient Greece, most famously Oedipus Rex, implicitly confronts us with the problem of free will. We see Oedipus try to avoid his destiny and achieve the exact opposite. While Sophocles had no concept of free will, he would probably have agreed that we usually don’t see the true causes behind our actions, even when we think we’re acting freely. Even when we are the cause, we don’t choose to be.
After considering Homer and Sophocles, we can also look at Aristotle. He expounds a theory of voluntary and involuntary actions that renders the concept of free will superfluous. He divides actions into voluntary and involuntary ones based on their causes. Actions caused by chance or necessity are involuntary, while those caused by one’s own wants and habits are voluntary. Note that each kind of action still has a cause that we don’t “control”:
Now, all human actions are either the result of man’s efforts or not. Of the latter some are due to chance, others to necessity. Of those due to necessity, some are to be attributed to compulsion, others to nature, so that the things which men do not do of themselves are all the result of chance, nature, or compulsion. As for those which they do of themselves and of which they are the cause, some are the result of habit, others of longing, and of the latter some are due to rational, others to irrational longing. Now wish is a rational longing for good, for no one wishes for anything unless he thinks it is good; irrational longings are anger and desire. Thus all the actions of men must necessarily be referred to seven causes: chance, nature, compulsion, habit, reason, anger, and desire. (Aristotle, Rhetoric, Book I, Chapter 10, Sections 7-8)
We cannot control our appetites, anger, reasoning, and habits, because we cannot control who we are and what we think. To understand this, consider the following question: how do your thoughts arise? Not through any conscious decision on your part. Could it even be possible to decide what and when to think? No. The succession of thoughts is, and can only be, unplanned.
Imagine the opposite: a thought, X, arises because of a preceding decision. I decide to think about X. But for this to be voluntary, I would have to decide to decide what to think. I would have to decide: I am going to decide whether to think X or something else. Even if this happened, I would be caught in an infinite regress. At some point, the cause of my thoughts will necessarily lie outside my control. Furthermore, at every point in this chain, the decision is itself determined.
In the form in which it comes, a thought is a sign with many meanings, requiring interpretation or, more precisely, an arbitrary narrowing and restriction before it finally becomes clear. It arises in me – where from? How? I don’t know. It comes, independently of my will, usually circled about and clouded by a crowd of feelings, desires, aversions, and by other thoughts, often enough scarcely distinguishable from a ‘willing’ or ‘feeling’. It is drawn out of this crowd, cleaned, set on its feet, watched as it stands there, moves about, all this at an amazing speed yet without any sense of haste. Who does all this I don’t know, and I am certainly more observer than author of the process. Then its case is tried, the question posed: ‘What does it mean? What is it allowed to mean? Is it right or wrong?’ – the help of other thoughts is called on, it is compared. In this way thinking proves to be almost a kind of exercise and act of justice, where there is a judge, an opposing party, even an examination of the witnesses which I am permitted to observe for a while – only a while, to be sure: most of the process, it seems, escapes me. – That every thought first arrives many-meaninged and floating, really only as the occasion for attempts to interpret or for arbitrarily fixing it, that a multitude of persons seem to participate in all thinking – this is not particularly easy to observe: fundamentally, we are trained the opposite way, not to think about thinking as we think. The origin of the thought remains hidden; in all probability it is only the symptom of a much more comprehensive state; the fact that it, and not another, is the one to come, that it comes with precisely this greater or lesser luminosity, sometimes sure and imperious, sometimes weak and in need of support, as a whole always exciting, questioning – because every thought acts as a stimulus to consciousness – in all of this, something of our total state expresses itself in sign form. (Nietzsche, 2003, pp. 34-35)
So the ‘I’ that thinks is itself a construction of thinking. The construction ‘I think’ is a grammatical convention that might not correspond to reality. Nietzsche’s claim that “something can be a condition of life and nevertheless be false” (Nietzsche, 2003, p. 21) applies equally well to the thinking ‘I’ and to free will. You are not the thinking ‘I’. The ‘I’ feels, observes, and experiences. It doesn’t decide. If all this follows from determinism, then it is understandable why people often consider determinism a nihilistic worldview. Despite my belief that true nihilists don’t exist, this is a fair objection and merits a response.
I think the objection arises from conflating being a cog in the machine with being coerced. I do not claim that we are merely acted upon by outside forces, which would make all action coercive. And, of course, no one likes being coerced all the time. But the nonexistence of free will doesn’t imply that all we do, we do reluctantly or against our will. “The absolute necessity of everything that happens contains no element of compulsion” (Nietzsche, 2003, p. 62). I claim that we also partially comprise those forces. You can influence someone’s life in good or bad ways. You can be the cause of someone’s happiness or pain. The fact that your behavior is determined doesn’t change this. You can’t be blamed or credited for who you are and what you do, but you can still be honest or dishonest, courageous or cowardly, proud or humble, and so on. There is no element of unpleasant compulsion in any of this. No one compels you to act against your will. It’s just the way the machine operates.
The offence taken at the doctrine ‘of the unfreedom of the will’ is that it seems to assert ‘you do what you do not voluntarily but unwillingly, i.e., under coercion’. Now, everyone knows how it feels to do something unwillingly, that is, reluctantly, ‘against your will’ – and one doesn’t concede that, because one does many things, in particular many ‘moral’ things, gladly. One thus understands ‘unfree will’ as meaning ‘a will coerced by an alien will’, as if the assertion were: ‘Everything you do, you do under coercion by somebody else’s will’. Obedience to one’s own will is not called coercion, for there is pleasure in it. That you command yourself, that is ‘freedom of will’. (Nietzsche, 2003, p. 57)
Free will may not exist, but the distinction between free men and slaves, like the distinction between voluntary and involuntary actions in Aristotle’s theory, remains. You may still know which actions are good for you and perform them, and in that sense be free. Alternatively, ignorance of your own good coerces you into acting contrary to your actual will.
We shall easily see what is the difference between a man who is led by opinion or emotion and one who is led by reason. The former, whether he will or not, performs things of which he is entirely ignorant; the latter is subordinate to no one, and only does those things which he knows to be of primary importance in his life, and which on that account he desires the most; and therefore I call the former a slave, but the latter free. (Spinoza, 1677/1910, pp. 186-187)
Augustine needed the concept of free will to answer the problem of evil: God is all-good and all-powerful, so why do atrocities happen? Because God preferred to give us free will rather than make us slaves to destiny. This belief has persisted even among atheists through inertia and its intuitive appeal. So what is the practical significance of refuting responsibility and blame? Why go against everyday experience and common sense?
My issue with the concept of free will, when it comes to politics, is the notion of just desert that comes with it. If there is no free will, no one is inherently more deserving than anyone else. And the only reasons for permitting inequalities would be practical ones (Studebaker, 2013). So, for example, in a determinist world, the only reason to pay a neurosurgeon more than one pays an architect would be that being a neurosurgeon requires more years of training and is a more difficult job. Consequently, we need to give them a stronger incentive so that the profession does not die out. We would never pay doctors more because they are inherently more deserving or meritorious.
The other reason for problematizing moral responsibility, blame, and guilt is their usefulness to oppressors. The concept of punishment for the sake of punishment that I wrote about at the beginning would not exist if not for the invention of free will:
Today we have no sympathy anymore for the concept of “free will”: we know only too well what it is—the most disreputable of all the theologians’ tricks, designed to make humanity “responsible” in the theologians’ sense, that is, to make it dependent on them . . . Here I am simply offering the psychology of all making-responsible.—Wherever responsibilities are sought, what tends to be doing the seeking is the instinct of wanting to punish and rule. One has stripped becoming of its innocence when some state of being-such-and-such is traced back to will, to intentions, to acts: the doctrine of the will was essentially invented for purposes of punishment, that is, for purposes of wanting to find people guilty. (Nietzsche, 1889/1997, p. 35)
Blame is, therefore, a machine for oppression, thought up by the powerful to satisfy the desire to punish and suppress. Blaming the people is like blaming the bullets. What we can do and what we cause are not up to us. Like impersonal forces that have no agency, we make things happen. Whether we like it or not, what we want and when we want it is not for us or anyone else to decide. It’s just how we are and always will be. As Luther would say: “Hier stehe ich, ich kann nicht anders” (“Here I stand, I can do no other”). I know this vision doesn’t sound all that attractive. You may agree with Nietzsche and say that “truth is terrible” (Nietzsche, 1908/2018, p. 84). I might not be able to change your mind. It is not my intention to convince as many people as possible; I’m no missionary. In my experience, people seem very keen on denying these claims, so I wouldn’t expect an essay to succeed. But here, I have at least tried to make you consider these questions, even if they seem contrary to what you currently believe.
MacIntyre, A. (2017). A Short History of Ethics: A History of Moral Philosophy from the Homeric Age to the Twentieth Century (2nd ed.). University of Notre Dame Press. (Original work published 1966)
Nietzsche, F. (1997). Twilight of the Idols, Or, How to Philosophize with the Hammer. Hackett Pub. (Original work published 1889)
Nietzsche, F. (2003). Nietzsche: Writings from the Late Notebooks. Cambridge University Press.
Nietzsche, F. (2018). Ecce Homo: How One Becomes What One Is. Immortal Books. (Original work published 1908)
There’s a picture of Rosalind E. Krauss that always makes me want to write. Some phrases, such as “stand by your text” and “to know, one must burn,” (Calasso, 2014) make me want to argue. Some paintings, such as The Minotaur by G. F. Watts, make me want to endure suffering. Thinking about certain people makes me want to read. Looking at pictures of Luhmann’s Zettelkasten makes me want to do research. What I want to show with this is how important aesthetics are for our decisions.
So what exactly do I mean by aesthetics? I could go through the etymology and the definition of aesthetics, but this won’t capture what I mean. The examples above should suffice since aesthetics is a family resemblance concept: aesthetics might not have a fixed essence. There may instead be a “network of similarities overlapping and criss-crossing” in the sense Nietzsche introduced and Wittgenstein developed (Wittgenstein, 1953/2010, p. 36). There are no exact standards for what counts as a sufficiently strong resemblance for the term ‘aesthetics’ to be applicable.
Our entire lives are, in a sense, governed by aesthetic categories. I admit that my view of what’s “good for me” is based ultimately on aesthetics (and probably, so is yours). I would follow Nietzsche in claiming that the good life is not ‘useful,’ ‘moral,’ or ‘authentic,’ but admirable. To use Plato’s terminology, this is a very silver (Plato, Republic) way of thinking. And admiration is an aesthetic category (Geuss, 2009, p. 95).
Both admiration and disgust can be seen as powerful forces that move human beings to action. For Nietzsche, some of the most important characteristics of a person are his or her aesthetic preferences, specifically those concerning feelings of admiration and disgust (Geuss, 1999, p. 187).
What is instinctively repugnant to us, aesthetically, is what the very longest experience has demonstrated to be harmful, dangerous, suspect to man: the aesthetic instinct which suddenly raises its voice (e.g., when we feel disgust) contains a judgement. To this extent, the beautiful belongs within the general category of the biological values of the useful, beneficent, life-intensifying: but in such a way that many stimuli which very distantly remind us of and are associated with useful things and states arouse in us the feeling of the beautiful, i.e., of growth in the feeling of power. (Nietzsche, 2003, pp. 201-202)
Yet different people experience what Nietzsche calls the “value feeling of the beautiful” through different things. The experience of beauty and ugliness cannot in any way be objective: “To experience a thing as beautiful necessarily means experiencing it wrongly” (Nietzsche, 2003, p. 203). But there is one thing aesthetic categories have that ethical ones lack: they are almost always resistant to our will.
There are no ‘objectively’ admirable or disgusting properties, but it is also true that we can’t simply decide to feel admiration or disgust. These feelings aren’t subject to arbitrary decisions. I can’t ‘decide’ to no longer find a girl beautiful. You can’t ‘decide’ to find Scarpa’s Tomba Brion disgusting. How much you can control what arouses these feelings in you will depend partially on your personality.
To freely have or not have your affects, your pros and cons, to condescend to them for a few hours; to seat yourself on them like you would on a horse or often like you would on an ass: – since you need to know how to use your stupidity as well as you know how to use your fire. (Nietzsche, 1886/2002, p. 171)
REFERENCES
Calasso, R. (2014). Ardor. Penguin UK.
Wittgenstein, L. (2010). Philosophical Investigations. John Wiley & Sons. (Original work published 1953)
Plato. Republic.
Geuss, R. (2009). Outside Ethics. Princeton University Press.
Geuss, R. (1999). Morality, Culture, and History: Essays on German Philosophy. Cambridge University Press.
Nietzsche, F. (2003). Nietzsche: Writings from the Late Notebooks. Cambridge University Press.
Nietzsche, F. (2002). Nietzsche: Beyond Good and Evil: Prelude to a Philosophy of the Future. Cambridge University Press. (Original work published 1886)
The non-existence of free will is something I am almost certain of. When people hear this, they usually tell me that my position is “nihilistic”. While I don’t believe that “nihilists” really exist, I understand the aversion people feel towards the idea of being a cog in the machine. Free will is an illusion, but so what? Nothing important needs to change in someone’s behavior when they stop believing in it.
The argument looks something like this:
1. When we make a conscious decision to act a certain way, we make that choice because of the way we are, mentally speaking.
2. Free will requires you to be able to self-determine the way you are.
3. But to have self-determined the way you are, you would have had to make conscious decisions to be a certain way.
4. Now we are caught in an infinite regress: for self-determination to be possible, you would have to be able to self-determine the process by which you self-determine, which would require that you self-determine the process by which you self-determine the process by which you self-determine, and so on.
5. Therefore, free will, understood in this sense, is impossible.
Imagine the following scenario: at some point in your life, you decide to change the way you are. The way you try to change the way you are will be a function of the way you already are. And just to be clear, “the way we are” includes the thoughts that pop into our heads without a prior conscious decision on our part.
The often misunderstood argument does NOT claim that the universe must be wholly deterministic, or that there is no “randomness” in it. Whether we are the way we are because of determinate laws of nature, randomness, heredity, environment, or some combination of these doesn’t matter for the argument at hand. It is nonetheless true that what you do and can do is determined by your nature and nurture, and you are the author of neither. Not being a causal determinist is fully compatible with not believing in free will, and vice versa.
It is important to note that this line of thought is in no way original. Some version of the above argument has been formulated in different ways by many philosophers: John Locke, Jonathan Edwards, Arthur Schopenhauer, Friedrich Nietzsche, Bertrand Russell, and Galen Strawson just to name a few.