
Living through the Barbarism

Perhaps this is an odd title—Living through the Barbarism—but it seems that ours is an age of unthinking strife. As a Lyceum Member asks: What is work and what is its purpose? This is something I have been thinking about a lot recently, but also as a follow-up to our conversation on Private Property [discussed on 11 October 2023]. It seems like most people do not see any purpose in the work that they do. This, I believe, is a broader societal problem about the value we hold toward our own lives and the lives of others. We no longer really seek the Good but instead seek what is most expedient and lucrative. We work, it seems, so that we can make a company bigger and bigger, whether it be in market share, notability, number of employees, etc. Whether these companies themselves seek any good is never really considered, however.

What Makes Something Work?

“Do what you love and you’ll never work a day in your life”—this statement contains multitudes, and most of them, I would submit, are lies. The first is the literal sense of the conjunction: as though operations undertaken for the sake of a beloved object entail no labor, no toil, no struggle, no difficulty—not only with the accomplishment of the task but with one’s own motivation to carry it out.

The second and implicit lie is that work is something per se hateful or unfortunate. In other words: do we regard work as a necessary evil, only? Is work itself something we do simply because we must? Is there no good to working itself, and only a good to the product of work?

The more fundamental question we must ask, therefore, is what makes something to be “work” in the first place. What is “work”?

What is the End of Work?

Closely related to this question: why do we work? As just mentioned, there exists an obvious answer: we work to produce something, be it a car, a toy, a report, or, in an extended sense, money—so that we can buy food, and clothes, and shelter, and provide for a family, so that we can… what? Continue going to work? Teach our children to work? Buy better and better luxury items with or through which we seek pleasures? Retire in comfort and enjoy our “Golden Years”?

Can there be a life without work? In a sense, yes. There are quite a few whose lives entail no servility: that is, demands of labor for ends not one’s own, in exchange for which one receives some supposedly proportional material benefit. Often these persons—anecdotally, from my own experience and from the accounts of literature and the like—appear not only spoiled and out-of-touch with the realities of the world but, even more tellingly, deeply dissatisfied with their own lives. Might it be that work is not merely a necessary evil… but something that ought to be integral to living well?

I would argue so. But I believe the modern structure of work has made this rather difficult to realize. Perhaps recapturing some distinctions about different ways in which work may be performed can be helpful.

How Can We Make Work Better?

A Pew Survey conducted earlier this year—with the caveat that such surveys may be misrepresentative in many ways* (consider the skewing by age)—reported that only a very slight majority (51%) of Americans find their jobs “highly satisfying”. I suspect that both the word “highly” and the percentage are inaccurate. I also suspect that many who do report satisfaction with their job (and note how much higher it is among those who are paid well!) are satisfied with its outcomes (like being paid)… and not with the work itself.

So how, let us ask, can we make work better? Join us this Wednesday (6 December 2023) to discuss! Links below:

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.

*E.g., while the methodology of randomization is fair, the self-selective nature of those responding to the survey cannot be controlled by those conducting the poll.

On the Death of the Artist

A Lyceum Member proposes, as a topic for our 29 November 2023 Happy Hour: “How much does the artist’s intention factor into the meaning of his art? How can semiotic Thomism help us to answer this question? Can there be a more fitting interpretation of the art he makes than the one he intended? Is the more fitting interpretation, the ‘correct’ interpretation, even if it is not the one intended by the artist?  What is fittingness?”

Questions about the nature of art—a perennial inquiry renewed time and again—have resurfaced in recent years and months as intelligence-simulating pattern-recognition and reconstitution algorithms (commonly misnamed “Artificial Intelligence”) dramatically improve their abilities to produce graphical (and soon other) representations of human artistic creation. This is a rather complex way of asking: does AI produce art? To fold these kinds of questions back into those written above: is there art without intention? Who is the artist when someone plugs a prompt into ChatGPT? How does the output of an intelligence-simulator correspond to artistic causality?

AI artists?

The above image was generated in less than 60 seconds with a relatively simple prompt. One can dissect it to discern the influences of various artists, famous, infamous, and virtually unknown alike; one might even be able to reconstitute the prompt from such analysis. So: who created the image? And what interpretations may be made of it?

Exemplar Causality and Intention

At the center of every work of art stands a formal cause: that is, the principle by which are arranged all the material parts making it to be what it is. When drawing a portrait, one seeks to capture the visage of the human person. There are countless material variations through which this might be achieved. A portrait may exhibit technical proficiency but fail inasmuch as the person does not truly appear within it. In this, we would say that it falls short formally. But the intrinsic formal constitution of the artistic work both relies upon and relates to an extrinsic formal cause as well, namely, the idea or plan in the mind of the artist.

This extrinsic formal cause may be termed the exemplar (later Scholastic philosophers called it the idea). As John Deely writes:

The first and obvious way in which a formal cause can be extrinsic to an effect, and the way which was principally considered in the history of the discussion of these questions, is again in the case of artifacts: the architect constructs a building out of materials and according to a plan which he has drawn up, and [1] this plan is then embodied in the building, so that it becomes a “Mies van der Rohe building”, for example, an instance and illustration of a definite architectural style and school; the artist [2] creates a painting as an expression of something within the artist, or models a work, such as a statue or a portrait, on something external which the artist nonetheless wishes to represent. Even [3] when the work is called “non-representative” and so strives to be a mere object with no significant power, as an expression it fails, in spite of itself, to be without significance. Extrinsic formal causality in this first sense came to be called ideal or exemplar causality among the Latins.

Deely 1994: New Beginnings, 160.

There are many points compressed within this paragraph worthy of extended consideration, but we will limit ourselves to the three annotated: first, [1] the embodiment of a plan in the work; second, [2] the internally-expressive rendering of something externally-extant; and third, [3] the invariable signification of productive expression.

Concerning the first [1], this point proves important to a fundamental understanding of art. The work of art terminates the act of the artist. It receives the expressive form in an embodied manner. Even performance art—the playing of music (or even 4’33”), a dance, juggling—requires an embodiment. But it is not just any embodiment that renders something “art”. There must be a definite plan: the exemplar cause. If I slip and happen to make the motions of a most stunning and beautiful pirouette, I fail to perform the ballerino’s art. There was no intent behind my performance, no plan, no exemplar cause.

Now, had there been—had I myself seen the artful spins of professional dancers and wished to emulate them—then, second [2], I would be expressing from my own conception an observation of something external. So too, if I draw a portrait of Audrey Hepburn, I must draw upon my impressions of her in memory, from movies, in photographs, etc. Even the most seemingly-innovative artistic creation relies upon the grasp of a form outside the self which is creatively transformed through the exemplar expression. We are imitative creatures.

To this point, third [3], we cannot create any forms that do not themselves, as existing in the embodied artistic product, further signify to others. Even the most wildly re-constituted expressive form still draws from the extrinsic causality of those things we have first grasped ourselves. Thus, “abstract” art yet falls under the auspices of interpretation.

Interpretation and Specifying Causality

Or, to put this otherwise: interpretation is always a part of artistic creation. This raises a difficult question, however: what do we mean by “interpretation”? It seems a word the meaning of which we often take for granted. Is it the drift of associations? The insight into “what is”? The relation of appearance to context? Anything at all?

Perhaps the easiest answer: interpretation is the working-out of an object’s meaning. “Meaning”, too, of course, presents a challenge. If, in the context of art, we presuppose meaning to reside principally in the intrinsic formal cause of a work, we simplify the conversation. For the form of the work—the embodiment of the artist’s intention—invariably specifies the audience. In other words, the audience can interpret the work according to its own complex of determinations and indeterminations. If I am somehow conditioned to hate all impressionist art, I will interpret all impressionist art hatefully. It cannot specify me otherwise. If, however, I am not so determined, but remain open to its specification, I may interpret it in other ways.

Sometimes, for instance, when we learn an artist’s intention behind his creation, it appears in a different light. Sometimes better, sometimes worse. Other times, we may discover another work of art—or a philosophical premise—which allows us deeper insight into the work. This insight may be entirely outside the author’s intent and yet ring true. But how do we justify these interpretations?

Life of Art?

To many, it may seem that death stands imminent for the artist. Intelligence-simulation threatens artistic life. It will discern and reconstitute patterns faster than we can even conceive them.

Or will it?

Can there be art without exemplar causality? Can machines interpret? Produce expressions? Does “artificial intelligence” produce art or… something else? Is it a medium for a new, emerging kind of artist? Come discuss these and other fascinating questions concerning the nature of art with us today, 29 November 2023! Links below:

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.

On Modern Science and Sacred Traditions

“Religion is anti-science.”  Jerry Coyne, Professor Emeritus of Ecology and Evolution at the University of Chicago, once wrote the following:

I’ll construe “science” as the set of tools we use to find truth about the universe, with the understanding that these truths are provisional rather than absolute.  These tools include observing nature, framing and testing hypotheses, trying your hardest to prove that your hypothesis is wrong to test your confidence that it’s right, doing experiments and above all replicating your and others’ results to increase confidence in your inference.

And I’ll define religion as does philosopher Daniel Dennett: “Social systems whose participants avow belief in a supernatural agent or agents whose approval is to be sought.” Of course many religions don’t fit that definition, but the ones whose compatibility with science is touted most often – the Abrahamic faiths of Judaism, Christianity and Islam – fill the bill.

“Yes, there is a war between science and religion”, The Conversation, 21 December 2018

These two definitions, as Coyne puts it, construe incompatible ways of viewing the world.  Arguably, however, these are very bad ways of defining both religion and science.  Neither gets after something essential, but aims, instead, at a kind of generalized amalgamation.  Coyne goes on from these dubious definitions to argue that religion provides no good reasons or evidence for its claims, but requires unreasoning faith, whereas science employs an empirical method of inquiry that can result in “confident inferences”.

1. Reconciling Sources

Debating Coyne’s unserious and weak assertions (and understanding) is not our purpose here, however.  His—and generally other “new atheist” objections (which smack of intellectual insecurity; what else could such philosophically-bereft minds feel when facing philosophically-dependent questions?)—instead serve to raise a point: how should we understand science, and, with that, its compatibility with religion and sacred traditions?

The hermeneutic question of interpreting different sources for truth—the books of nature and of revelation—has long been asked by none other than religiously-minded figures themselves.  On its own, asking this hermeneutic question is itself a kind of scientific inquiry.  For we must recognize that what often goes by the name “science” today—or “modern science”—is but one dependent branch on the tree of human understanding.  To this end, Jeremy Bentham (of all people!) once felicitously proposed the terms “idioscopic” and “cenoscopic” to distinguish between the methods used in “modern science” and the philosophically-geared methods of discovery.  Fr. Scott Randall Paine has an extensive and wonderful essay on the distinction available here.  In short, the idioscopic specializes its vision to discern things indiscernible otherwise; while the cenoscopic utilizes the common reasoning capacities of the human being to resolve discoveries into a coherent whole.  Regarding idioscopy as alone the tree upon which knowledge grows (cutting that branch off and sticking it in the ground, as it were) has borne sickly intellectual fruits.  “Modern science”, divorced from the humanities, arts, philosophy, religion and theology—all the domains of cenoscopic inquiry—leaves us with an unresolved picture of the world.

But modern science alone does not cause this separation.

2. Scripture and Science

Commenting upon the modern philosophical rejection of the textually-commentarial tradition of Scholasticism, John Deely writes in a lengthy footnote:

Although sometimes I wonder to what extent this objection of the times, apparently directed against the Aristotelian philosophers, a safe target, is not the more intended for the unsafe target of the theologians, who in fact have always been the far more culpable in this area from the earliest Christian times.  I think of such examples as that of Cosmas Indicopleustes with his Christian Topography (Alexandria, i.535–47ad), “in which he refutes the impious opinion that the earth is a globe”, for “the Christian geography was forcibly extracted from the texts of scripture, and the study of nature was the surest symptom of an unbelieving mind.  The orthodox faith confined the habitable world to one temperate zone, and represented the earth as an oblong surface, four hundred days’ journey in length, two hundred in breadth, encompassed by the ocean, and covered by the solid crystal of the firmament” (Gibbon 1788).  But examples of equal or greater offensiveness can easily be culled from every tradition of sacred, “revealed” texts, both before and outside of the Christian development.  Surely, within the Christian era, one of the more outstanding examples of hermeneutic abuse is the career of the “blessed” Robert Cardinal Bellarmine (1542–1621) who, well in advance of the most famous trials over which he held sway (in 1600 that of Bruno, in 1616 that of Copernicus’ work, laying the ground for the 1633 condemnation of Galileo), had arrived through scriptural study at a detailed cosmology which he regarded as “virtually revealed”.  These astonishing results he recorded between 1570 and 1572 in his unpublished Commentary on Qq. 65-74 of Aquinas c.1266 [Summa theologiae, prima pars], autographs which we may hope will one day be brought to full publication (Baldini and Coyne 1984 [“The Louvain Lectures of Bellarmine and the Autograph Copy of his 1616 Declaration to Galileo”] is barely a start) to add to the many object-lessons still resisted that make up the ending of the “Galileo Affair”: see Blackwell 1991 [Galileo, Bellarmine, and the Church] esp. 40–45, 104–06 (on the truth of the Bible even in trivial matters being guaranteed as Bellarmine put it, ex parte dicentis – “because of God being the one who says so”).  Too bad Galileo, writing in 1615 with Bellarmine in mind as well as still alive (see Blackwell 1991: 274), felt constrained to leave unpublished his observation that “those who try to refute and falsify [propositions about the physical world] by using the authority of… passages of Scripture will commit the fallacy called ‘begging the question’.  For since the true sense of the Scripture will already have been put in doubt by the force of the argument, one cannot take it as clear and secure for the purpose of refuting the same proposition.  Rather one needs to take the demonstrations apart and find their fallacies with the aid of other arguments, experiences, and more certain observations.  And when the truth of fact and of nature has been found in this way, then, but not before, can we confirm the true sense of Scripture and securely use it for our purposes.  Thus again the secure path is to begin with demonstrations, confirming the true and refuting the false”.  This lesson applies across the cultures to every group that draws upon texts deemed revealed, not in every case, indeed, but wherever arise questions that can be investigated and resolved by means of natural investigations, scientific or philosophical.

Deely 2002: What Distinguishes Human Understanding, 57-58n13.

While I generally agreed with my mentor on many things, I find his objections (and dismissive attitude) toward Bellarmine problematic.  Yet—I must admit a hesitation here.  There seems to be a valid objection to the hermeneutic used often still today by Biblical literalists; one which attempts to conform an understanding of the physical world to an already-determined interpretation of Scripture’s meaning, rather than to understand Scripture’s revelations about the natural world through an understanding of that world itself.  Study of the natural world responds to our human thirst for knowledge, and, nourished in the proper context of a holistic human learning, enlivens the soul.  To constrain it under the bounds of a Scriptural interpretation itself in question does, indeed, beg the principle.

3. Universal Hermeneutics of Continuity

Can we resolve the diverse sources of knowledge into a coherent whole?  How?  How should we interpret Scripture and science as parts of one continuous whole for human knowledge?  Join us this evening (and perhaps again in the future!) to discuss.

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.

On Worldviews and Ideologies

“Ideology” is a distinctly modern word that helps us to discern a distinctly modern phenomenon.


Whenever we have a world picture, an essential decision occurs concerning beings as a whole.  The being of beings is sought and found in the representedness of beings.  Where, however, beings are not interpreted in this way, the world, too, cannot come into the picture – there can be no world picture.  That beings acquire being in and through representedness makes the age in which this occurs a new age, distinct from its predecessors.


Yeah, well… you know, that’s just like… uh… your opinion, man.

-Mark Shiffman, What is Ideology? | -Martin Heidegger, Die Zeit des Weltbildes | -The Dude, The Big Lebowski

Understanding the World(view)

What do we mean by the common term “worldview”?  Our English word originates from the German Weltanschauung (from Welt, meaning “world”, and Anschauung, “view”, “perception”, or even “perspective”).  Often, the term is used as though it needs no explanation: “That’s your worldview”, “My worldview is…”, “The Roman worldview” or “The Catholic worldview”, etc.  But the German philosophical traditions from which the notion arose, and through which it develops, course in diverse and confusing ways.  Kant, Humboldt, Hegel, Husserl, Jaspers, Heidegger, and many others all spoke meaningfully about the world, about worldviews, and/or about the “world-picture”.

In a similar vein, Karl Marx developed (in a departure from its origins in the late eighteenth-century French thinker, Antoine Destutt de Tracy) a notion of the “ideology” that shapes thinking to this day in a similar fashion.  In Marx’s bending of ideology, it was put forward as a “set of ideas whose purpose is to legitimate a set of social and economic relations, and which originates from those same relations.”[1]  As the twentieth-century Italian Marxist Gramsci furthered this interpretation, ideologies were not only echoes of our economically-shaped consciousness, but themselves a real battleground for social and political struggle.  Thus, ideology is understood as “a set of ideas justifying a power agenda and helping it to obtain cultural sway by dominating the minds of those who can be brought to accept it”.[2]

Thus, the contemporary notion of “ideology” is narrower than that of “worldview”, which comprises a sense of the whole, whereas the ideology concerns itself only with what fits inside the “idea”.

Constraining the World

But are these really different?  If the “world” is encompassed in the “view”, or its meaning restrained to what can be viewed—or given in a picture—do we not thereby restrict the being of the world?  Let us take, for instance, the “American worldview” as experienced in the 1950s.  Fresh off the victory of World War II, and confronted by tensions with the growing power of the USSR, the American worldview was truly a “view of the world”, as a stage upon which conflict with the Soviets was to be won or lost.  The American represented freedom, justice, prosperity, and faith; the Soviet, oppression, abuse, poverty, and godlessness.  One held to the dignity of the individual and the family; the other, to Procrustean conformity to the collective.

How much of the real world was omitted through such myopic lenses?

Or consider the idea of a “Catholic worldview”—a claim today so vague as to be all-but-meaningless.  Why?  Should there not be a common, underlying view through which all Catholics view the world?  Perhaps, yes; but the very notion of a “Catholic worldview” seems more and more to be coopted into one or another ideological claim: that of care for the poor and marginalized, the “open arms”; or one of returning to tradition, beauty, and “rigid” codes of behavior.  What causes this divergence?  The lenses appear to be narrowing—letting in less and less light as each day passes.

Realism and the World

At the center of most claims touting the advance of a “worldview”, “world-picture”, or “ideology”, one finds, I believe, either an inherent skepticism or a deliberate agnosticism about humans’ common possession of the ability to know what truly, really is, independent of the mind.  No wonder the world ends up constrained!

Doubtless there is much more to be said—so come say it!  Join us for our Philosophical Happy Hour this Wednesday (11/8/2023) from 5:45–7:15pm ET.

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.


[1] Shiffman 2023: What is Ideology? 10.

[2] Ibid.  Cf. Zizek 1989: Sublime Object of Ideology, 49-51.

On Modernity, Ultramodernity, and Postmodernity

If you and I are to have a conversation—that is, a kind of living-together through discourse or the concrete articulations of language—we must, indeed, do so in the same “place”.  Put otherwise, we cannot have a conversation unless the objects signified by our words are the same.  I do not mean that each and every word used by each person in the conversation needs to have the exact same denotations and connotations as understood by every other person in the conversation.  But without some common referents—some formal identities between us as to what the words we use mean—we are not and cannot be engaged in a dialogue, but only exchanging equally incomprehensible monologues.

It is to the end of establishing some of these necessary commonalities, particularly concerning the meaning of modernity and thus the elaborated forms of ultramodernity and postmodernity, that this article has been written.

The Common Understanding of Postmodernity

Let us begin by asking: what is postmodernism?  Commonly, the term is used to indicate a movement committed to the radical abdication of belief in ultimately-defensible intelligible meaning.  James Lindsay, for instance—who attained some fame through his work with Peter Boghossian and Helen Pluckrose in exposing the absurdity in a lot of academia—has frequently referred to an “applied postmodernism” identified with social justice activism.  By this phrase is meant: the truth about things is less important than the imposition of moral categories based on emotional responses, many of which have been fostered through radical historicism or selective criticism of the structures common to Western civilization.  James Croft, University Chaplain and Lead Faith Advisor to the University of Sussex—with an EdD in Education and Human Development, who describes himself as a gay rights and Humanist activist—describes postmodernism as comprising positions “anti-foundationalist”, “anti-essentialist”, “anti-teleological”, and “anti-universal”.

Academic Understandings

But is this throwing-around of terms in the public sphere, without careful attention, truly indicative of what postmodernism is or what it is understood to be among its adherents, advocates, and expositors?  In a sense: yes.  The Stanford Encyclopedia of Philosophy begins its entry on “postmodernism” by writing:

That postmodernism is indefinable is a truism.  However, it can be described as a set of critical, strategic and rhetorical practices employing concepts such as difference, repetition, the trace, the simulacrum, and hyperreality to destabilize other concepts such as presence, identity, historical progress, epistemic certainty, and the univocity of meaning.

Further, the first prominent instance of the word in academic discourse seemingly belongs to Jean-François Lyotard’s 1979 book, La condition postmoderne, published in 1984 under the English title, The Postmodern Condition: A Report on Knowledge.  In Lyotard’s “simplification to the extreme”, he defines “postmodern as incredulity toward metanarratives” (xxiv).  As he states, the incredulous turn began around the turn of the 20th century, as the “rules” for science, literature, and the arts were changed through a seemingly radical re-thinking.  One thinks of André Breton’s two manifestos on Surrealism, transgressive films like Federico Fellini’s La Dolce Vita or some of Hitchcock’s work, compositions of Arvo Pärt, György Ligeti, Philip Glass, or Brian Eno, the novels of James Joyce or William Gaddis or Thomas Pynchon or Michel Houellebecq—or any of the expressive turns which attempt to convey a meaning, or the possibilities of interpretation, through methods which defy the previously-established norms and conventions of society.

Nascent and oft-unrecognized turns towards this incredulity toward narratives in science can be found as early as the mid-19th century, particularly in the establishment of psychology as a discipline independent from philosophy and the shift to an apparently post-Aristotelian logic.  Though ostensibly grounded in the modern traditions of science—with their focus upon the experimental and the quantitative—these “new” approaches further untethered the objects of thinking from the mind-independent real.  The development of these and other like sciences led to a further fragmentation of intellectual advance and the present-day irreconcilability of the disciplines[1] as well as the widely-known “replication crisis”—not to mention opening the door for events such as the Sokal Hoax or the “grievance studies affair”.  Some might take these latter as evidence that the social sciences are insufficiently scientific.  Their simultaneity with the replication crisis shows, however, a deeper problem about the condition of knowledge—precisely the point articulated by Lyotard in his 1979 work.

The Public Face of the “Postmodern”

The fragmentation of knowledge and dissolution of narrative foundations has found its way also into the practical, moral, and political dimensions of life.  Without widespread social acceptance of either common principles for knowledge or shared sentiments—say, those stemming from religious belief, patriotism, or adherence to the dictates of an authority—new struggles appear concerning law and power.  Thus arise new movements of “social justice activism” or “applied postmodernism”, frequently witnessed on or stemming from college campuses throughout the late 20th century and into the early 21st—which follows insofar as these movements spring from a “theoretical postmodernism”.  In recent decades, the application of postmodern thinking (though often not going explicitly under this name) has reached a somewhat more national consciousness, infiltrating politics, with not only the once-fringe Bernie Sanders shuffling into the limelight, but also the influx of young, female politicians with contrarian ideas, such as Ilhan Omar, Rashida Tlaib, and Alexandria Ocasio-Cortez.  The lattermost, for instance, gave an interview to Anderson Cooper on the television program 60 Minutes where she stated that “there’s a lot of people more concerned about being precisely, factually, and semantically correct than about being morally right.”  Though she walked it back, that sort of statement is not simply a mistake, a slip of the tongue, but rather exhibits a structure of thinking.

In other words, there is a strong opposition within the contemporary movement to what historically has been called “reason”.  At the theoretical level, this opposition often utilizes the very language of power-struggle arising in the wake of reason’s dismissal to vilify “reason” as an instrument of oppression—a tool employed by the hegemonic forces that are responsible for seemingly all human pain or suffering.  To the contrary, emotions and lived experience are promoted as the new and perhaps the only means to a just world.

Theoretical Roots of the “Postmodern”

At any rate, the theoretical or philosophical roots will often be linked to names like—in no particular order—Karl Marx and Friedrich Engels, Jacques Lacan, Max Horkheimer, Jean-Paul Sartre, Albert Camus, Jacques Derrida, Jean Baudrillard, Gilles Deleuze and Félix Guattari, Herbert Marcuse, Theodor Adorno, Louis Althusser, Michel Foucault, perhaps Slavoj Žižek, and the ethereal movement which may not be precisely identified with Karl Marx’s thought, but which not-illegitimately claims inspiration from him—the movement which receives, even if it repudiates, the name of cultural Marxism, exemplified in the recent “woke” movement represented by public figures such as Ibram X. Kendi; or in the closely-related field of “semantic activism”, that is, the effort to shift the meanings of words to produce desired cultural outcomes, advocated by academics such as Kate Manne.  Broadly speaking, the so-called theoretical postmodern is enmeshed with relativism, a “post-truth” mentality, and the radical resolution of the meaning of any apparent phenomena to naught but the individuated psyche, which rather ironically leaves power—understood not as the necessity of force but the persuasion to willing conformity—as the only means to social change.

Or, to sum the core belief up in a single sentence: “At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life”.

The words are not mine, of course, but those of Supreme Court Justice Anthony Kennedy, written in the decision of a 1992 case, Planned Parenthood of Southeastern Pennsylvania v. Casey, which “reaffirmed the essential holding” of Roe v. Wade.

Now, regardless of your stance on abortion—an issue I believe factually straightforward but psychologically complex and thus will not venture into here—this assertion of the Supreme Court, if you believe that reason has a central role to play in human life and the pursuit of human good, should deeply and profoundly disturb you, and raise questions about the concept of liberty: central not simply to so-called postmodernism, but perhaps even more prominently to the modernism which preceded it.

The Meanings of Modernity and Modernism

Understanding “modernism”, however, is not simple.  For one, the term may be applied in two different but related senses: first, referring broadly to a cultural transition from centralized monarchical political authorities to individualistic democratic or republican governments as well as a secularization of education, both in means and in content; and second, referring narrowly to the philosophical movements which prompted much of this cultural change.  The cultural transition was exemplified intellectually by Galileo, Newton, and Darwin—as the rise of secular science, outside the Catholic university—not to mention the Enlightenment generally, and politically by the American and French revolutions and the ensuing diminishment of monarchy across all the Western world.  Less lauded or even recognized by the proponents of modernity (except as a sign of modernity’s achievements, rather than a cause), but just as centrally necessary to its achievements and philosophy, was the rise of technology and especially industrial technology.

This is not to say that every fruit sprung from the tree of cultural modernism is poison.  For instance, through the shift away from authority—though a shift taken entirely too far—a better image of the means to individual flourishing as intellect-possessing animals (or semiotic animals) emerged.  Yet this fruit is the exception, rather than the norm.  By contrast, the loss of the true good of authority—namely the connection of an individual with a normative truth higher than any individual—and instead the false belief in authority as a kind of power necessary for enforcing a social contract is poisonous indeed; as are the slide into scientism, the fragmentation of knowledge, and the rejection of tradition on the mere basis of its being traditional.

Philosophical Modernity

Most important for addressing modernism, however, is to understand the philosophical roots.  Here, we can quickly get at the essence: for modern philosophy has two founders that stand above the rest, namely René Descartes (1596–1650) and John Locke (1632–1704).[2]  Descartes is best known as the first of the modern rationalists, holding that ideas are not derived from the objects of sense experience, such that sense experience at most gives us the occasion to form or discover an idea, while other ideas are given directly by God or are instilled in us innately (which might as well be the same thing).  Contrariwise, John Locke held the mind as a blank slate, and thought all our ideas were formed from the empirical experience of objects, built up in an almost atomistic fashion, such that having experience of one sensation after another we came to form generalized notions of the things experienced.  The decades and centuries following Descartes saw theorists of both rationalist and empirical thinking—such as Gottfried Wilhelm Leibniz, George Berkeley, Baruch Spinoza, David Hume, and Immanuel Kant, among others too numerous to name—arguing back and forth over the starting points and nature of knowledge… all of whom seemed entirely unaware that both sides partook of a fundamental and egregiously mistaken presupposition: namely, the belief that our ideas are themselves the direct terminal objects of our cognitive actions; in other words, the belief that we know our ideas and from knowing our ideas know the world.  Though often the rationalists have received the name of “idealist”, in truth, the empiricists are just as fundamentally idealist as their opposition.

This presupposition, regardless of one’s theory of ideation—a presupposition we may, following Leibniz and John Deely, call the “Way of Ideas”—drives an incorporeal and imaginary wedge between the individual human being and everything else.  The more attempts are made to build a bridge over this gap, the deeper the wedge is driven.  For the wedge of idealism, once admitted into a theory of knowledge, sets the individual and his or her experience as knowing his or her ideas, over and against the world as something extended and material, not known directly but only through the mediation of one’s subjectively-constrained ideas.  Inevitably, therefore, it drives individuals deeper into individualism as they believe themselves to dwell in their own minds.  Thus the Way of Ideas ends up driving the wedge so deeply that it widens the initial gap into a vast chasm: a chasm between the world as known and the self as knower, between the physical and the cultural, and between the natural and the personal.

For the turn first introduced by Descartes is a turn inwards; a turn which makes thought essentially private—and there is a lot here to be said about the technology of the printing press and the paradigmatic shift away from the scholastic modes of philosophy, obsolesced by privately owned and read books, a lot that I am not in fact going to say here: only that the Cartesian subjective turn gives an intelligible articulation to a spreading psychological habit technologically exacerbated, making that habit both explicit and reinforced.  The result is a diminished belief in the truth of intellectual conception as an essentially public activity.  Instead, truth is seen as something realized privately and subsequently spread societally through convention and agreement.  The promise upon which this extension of private thinking into social convention depended was the supposed universality of the scientific method—and perhaps, a philosophy structured in the same manner.  Such was the proposal of Immanuel Kant in his Critique of Pure Reason.  Such was the spirit of the Enlightenment in general: Diderot’s encyclopedia, Voltaire’s histories and letters, Rousseau’s theories of social contract, and so on.  Everywhere, one saw attempts to guarantee a better future through the blending of empirical observation and the faith in scientific method to regulate those observations into a universal monolith of “objective” truth.

The result of these efforts, however, is not only a habit of thinking as private, but also a habit of denying the reality of our own experiences: for every experience we ever have, of anything whatsoever, in any discernible regard, always exceeds what is grasped in mere empiricism (understood as the discrete reception of sensory phenomena).  Do our experiences and our knowledge begin in sensation?  Absolutely and undoubtedly.  But does the sensory data or even the neurological activity explain either the having of experience or the attainment of knowledge?  No; and it does not even come close.

“Postmodernity” is Ultramodern

And this philosophical error is why modernism leads inevitably towards so-called postmodernism: not because modernism ebbs away, but because its own internal principles, carried towards their logical conclusions, lead inescapably to nonsensical, non-rational positions—to the very repudiation of reason itself.  Superficially this appears most ironic, and will—by all adherents of modernism—be rejected.  For modernism hails “reason”; but the reason it hails is one stripped of its vigor, for it is not a reason which discovers the truth concerning the fullness of reality outside the mind or independent of the psychological self.  Modernity’s “reason” supplants the search for a cognition-independent truth with an amalgamation of facts like so many grains of sand out of which it tries to build the truth; and now the remnants of ideological modernity wail when the so-called postmoderns—who, in truth, are really ultramoderns—come knocking down their granular edifice and to re-shape it as they see fit.

Allow me here a lengthy quote from an article of John Deely:[3]

Relativism and solipsism are not matters that follow upon or come after modernity: in philosophy they have proved to be its very heart and essence, present from the start, a looming presence which took some centuries fully to unveil itself.  Late modern philosophy, phenomenological no less than analytical, finally embraced fully what Descartes and Kant had shown from the start: the human being is cut off from nature, hardly a part of it; the human being becomes a cosmos unto itself, with no way to relate beyond itself, beyond the veil of phenomena forever hiding the other-than-human things which are other than our representations, whimsical or a-priori as the case may be.

Modern philosophy fathered and fostered the pretense that science must confront nature as an “objective observer”, or not at all.  But modern science found that not to be the situation at all.  Instead of confronting nature as an outside observer, science came to see itself rather in Heisenberg’s terms as an actor in an interplay between the human world within nature and the larger world of nature of which the human world forms a part.  It found itself to be “focused on the network of relationships between man and nature, and which we as human beings have simultaneously made the object of our thought and actions” (Heisenberg 1955: 9).

From the point of view of the sciences as co-heirs of modernity with philosophy, this paradigm shift seemed a kind of revolution, a veritable new beginning.  But from the point of view of semiotics this shift is something more than merely a new beginning: this shift is a going beyond the modern heritage.  In effect, the late modern philosophers clinging to their notion of the human world of culture as a whole unto itself, cut off from nature as if autonomous in its unfoldings, are anything but “postmodern”, notwithstanding the many who have tried so to style the embrasure of the relativism implicated in the epistemological view beginning and ending with the subject thinking.  If anything, the muddle of thinkers whose usage of “postmodern” Stjernfelt would like discussed contribute nothing that goes beyond modernity, but only reveal and revel in what modern epistemology logically leads to, what modern epistemology had entailed all along.  Ultramodern rather than postmodern, they are not the beginning of a revolution against modernity but the death throes of the revolution in philosophy that was modernity, Mr. Hyde to the Dr. Jekyll of modern science in its maturity.

What is called postmodernism is not really in any way post modernity.  A true postmodernism has only begun to claw its way through a series of unimaginable philosophical errors, the origins of which I will try to demonstrate over the next several videos.

True Postmodernity

If modernity follows the Way of Ideas, and the idealist epistemological quagmire of the moderns leads to its own demise in nonsensical irrational ultramodernity, a meaningful postmodernity must be one which follows a different path: namely, what Deely has named the Way of Signs.

Thus, if there are two figures I would definitively name as proponents of a genuine postmodernity, they are Charles Sanders Peirce and that same John Deely.  I would add, as a figure responsible for truly breaking out of modernity (even if his break has been badly misunderstood and consequently misappropriated by many), Martin Heidegger.  Based upon a cursory, initial reading of some of his works, I suspect that the little-known Eugen Rosenstock-Huessy ought also to be included.  Neither of the latter two explicitly advocates for the Way of Signs; but both turn language back to things, rather than to ideas.

Such a turn—whether implicit or explicit—allows us to recover truth as normative: precisely what modernity discarded, even if it did not realize it.


[1] Not, mind you, that the disciplines of academe are irreconcilable in principle, but rather, as presently practiced.  The reconciliation could only be effected through discovery of, study in, and resolution to a common basis of knowledge.

[2] We can include also as founders of modernity Francis Bacon and Niccolò Machiavelli, but their contributions—though essential to understanding modernity’s full constitution—are more like necessary adjuncts than central pillars of its nature.

[3] 2006: “Let us not lose sight of the forest for the trees …” in Cybernetics & Human Knowing 13.3-4.

Reclaiming Culture in the Digital Age

The provincial attitude is limited in time but not in space. When the regional man, in his ignorance, often an intensive and creative ignorance, extends his own immediate necessities into the world, and assumes that the present moment is unique, he becomes the provincial man. He cuts himself off from the past, and without benefit of the fund of traditional wisdom approaches the simplest problems of life as if nobody had ever heard of them before. A society without arts, said Plato, lives by chance. The provincial man, locked in the present, lives by chance.

Allen Tate 1945: “The New Provincialism”

Hollow men, T.S. Eliot named the denizens of the 20th century. Are we any less vacuous in the 21st? Or have we been further emptied?

It often proves difficult to describe our situation—our time and place in history and the world—without sounding morose, or, indeed, without falling into that trap of assuming our present moment is unique, or, to take Tate’s criticism farther, that we, as somehow constituting this moment, are ourselves unique. Arguably, our situation is unique. But we remain as human as any and every human ever has or ever will. There are challenges faced in 2023 that were not and perhaps could not quite be imagined in 1945—let alone 1845, or 545. But the uniqueness of these challenges, such as how culture is to be formed or reclaimed in the digital age, leaves us yet with an unchanged nature.

Regional Cultures

Among the unchanging truths of human nature: we are cultural beings. There has never been a time nor a place in which a human being did not carry some mark of culture—even its absence (say, in a child raised by wolves!) being something distinctly human, visible in the resulting deficiency. But the deficiencies sometimes come not from the absence of culture, but from its own noxious constitution. These noxious cultural vapors are hard to discern when living amidst them. It belongs to the insightful critic, therefore, to give us the perspective from which they can be seen.

Allen Tate (1899–1979)—American poet laureate in 1943, essayist, social commentator, brilliant mind and troubled soul—proved himself such an insightful critic time and again. His 1945 essay, “The New Provincialism”, clearly articulates the titular source of a cultural vapor much-thickened in the past 80 years. The term “provincialism” has often been used in criticism of rural thinking. To be “provincial”, it was often said, was to be narrow-minded. The provincial man, in other words, is an unsophisticated bumpkin.

Against the “provincial”, Tate contrasts the “regional”, which he describes as “that consciousness or that habit of men in a given locality which influences them to certain patterns of thought and conduct handed to them by their ancestors. Regionalism is thus limited in space but not in time.” In other words, the regional carries on local tradition. It may carry such traditions on across countless generations. Regionalism focuses not upon the now, but the here. Thereby, it constitutes a cultural place: an innermost boundary within which a culture may be located.

Contemporary Provinces

The “provincial” man, however, as stated above, takes his regional here and extends it into the world, transforming the idiosyncrasy of place into an idiom of time. He becomes “locked in the present”. Do we not hear this all-too-often today? “C’mon, it’s 2023!” Do we not see obtuse historical idiocy trotted out daily?

Our culture today consists little in regional awareness and almost entirely in provincial outlook. We have no place for our cultures. They seem, therefore, to lack solidity, sameness, any transgenerational durability. Buildings across the world look increasingly similar. Dialects disappear. Styles of art—painting, sculpture, music, cinema, one and all—lose their distinctiveness through a flattening refinement of technique and production.

Can we recover any genuine “regionalism” in our modern, hypercommunicative world?

Digital Culture

As the Executive Director of an institution founded within the hypercommunicative digital environment, I think often of how our technological tools of culture can be used without contravening the good of our nature. I do not believe regional dissolution follows of necessity from our global communication. But I do think we need better habits of living today, in order that we not lapse forevermore into the “new provincialism”. Come join us (details below) this Wednesday (11/1/2023) to discuss Tate’s essay and the formation of these habits to discover how we might reclaim culture in the digital age.

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.

On the Meanings of “Object”, “Objective”, and “Objectivity”

The word “language” often suffers a confusion in use because of a partial equivocation in signification.  Sometimes, we use it to signify the species-specifically human capacity to express semantic depth pertaining to a being known as independent of our cognitive activity; in other words, we use the word “language” to indicate our ability for signifying things as they are in themselves and not merely as they are considered by reference to our pragmatic concerns.  To disambiguate the partial equivocation, we can call this the “linguistic capacity”.  Other times, however, when we speak about “language”, we signify a specific system of signs used for carrying out this linguistic capacity.  We can call such systems “languages” or “specific languages”.

Growth of Symbols

Every specific language is composed of words, which are signifiers by convention.  That is, there is no necessary correlation between the sounds I make with my mouth or the letters I write on the page and the objects that these words, constituted through sound or writing (or any other means), are intended to signify.  Thus, two distinct words can signify one and the same thing, as “dog” in English and “Hund” in German both signify the same species of animal.  But the sound “snargglish” might just as well signify that very same species—that of dogs—and by a kind of stipulation, I can say that that is what “snargglish” signifies.  If enough other people start using “snargglish” in this way, the signification changes from being by stipulation (what I have attempted to authoritatively impose) to being by custom (where no one needs to know that I have imposed it).  Customary significations tend to become stuck in the minds of those who use them; thus, if I started using the word “dog” to signify a pile of leaves, there would be both confusion and resistance, for this does not hold as a custom in the minds of others, even if it holds this way in my own.  Nevertheless, the meanings of words—the objects they are used to signify—do change, grow, become clearer, shift, gradually dim, or fall into obscurity, and so on and on, depending on how they are customarily used.

That said (and by saying it we have broached the topic of semiotics), while the signification ascribed to any particular word belongs to it by convention, the specific languages we use are languages at all—that is, they are instances of our linguistic capacity—insofar as the words constituting the language immediately and proximately signify the concepts of our minds.  While the words of the specific language are conventional, the significations belonging to the concepts are not.  A longstanding tendency to conflate words with concepts obscures this truth.  But the simple fact that we have multiple languages whereby words composed of different sounds, letters, or even entirely different writing systems nevertheless convey the same ideas shows that the concept and the word are not one and the same.

It is an important point which we cannot elaborate upon here (but which has been well-discussed many other places) that our concepts themselves, too, function as signs: that all thought is through signs.

Sometimes, therefore, the ways in which we as societies and cultures effect changes in our words as used allow us to better signify and explain the significations of our concepts.  “Symbols”, Charles Peirce said—and words are the preeminent kind of symbol—“grow”.[1]  The conventional words across many languages for “sign”, for instance, have grown considerably as symbols since the early use in ancient Greece (which, in Greek rather than English, was “semeion”, used initially to signify the symptoms of a medical condition).  This will be the topic of another post.  But we can likely think of many other words which have grown over the course of history: “community”, for one, or “truth”; “Catholic” or “Christian”, “American” or “Russian”, “education” or “rhetoric”, and so on and on; that is, a growth which is not necessarily an increase of the term’s comprehension (including more particulars under it, that is), but perhaps a deepening, strengthening, or clarification of its meaning.

“Objective” Meaning

Other times, however, the changes of a word’s usage result in a concept being signified poorly, or perhaps even no longer being signified at all, such that the concept experiences a societal atrophy.  Or other changes, stemming from a lack of careful philosophical reflection on how terms are used or a blending of languages, a mix-up in translation, a mix-up in intellectual traditions, might result in a confusion not only of their verbal signifiers but of their concepts, too.

A little of each kind of confusion has happened with the word “objective”.  Here, we have to note that “objective” has two other forms commonly used today: namely “object” and “objectivity”.  Both “object” and “objective” have an equivocal use as well, for both are used at times to signify a goal or aim, as in describing a “mission objective” or in the sentence, “She has always been the object of his affections.”  This is closely related to the grammatical use, where we talk about direct and indirect objects of verbs.  In contemporary discourse generally, however, the terms object, objectivity, and objective all alike have a common signification of pertaining to reality as cognition-independent.  Thus, the term “object” is commonly used as a synonym for “thing”; “objectivity” is used to signify an absence of vested interest in the outcome of a situation; and “objective” is used to reference things as they are “factually”, “scientifically”, or independent of any “subjective” interpretation or opinion.

Many people can be observed striving to demonstrate their “objectivity” in disputed matters, just as they are seen jockeying to prove their claims as “objectively true”—mostly by some reliance upon a scientific method of experimentation and statistical verification.  When it is said that we are treating another human being as a “mere object”, this indicates a diminution of their status from “person” to a “thing for use”—which (mis)use constitutes another albeit closely-related issue, since there is a depreciated sense of the aforementioned equivocal meaning of “object” as pertaining to a goal or aim in such a use.

However, none of these words in their contemporary usage signifies the same concept that the word “object” originally signified; or, as it was in Latin, the specific language in which the word originated, “obiectum”. This Latin word was composed of two parts: the preposition ob–, meaning “against”, and iactum, a perfect passive participial form of the verb “iacere”, “to throw”. Thus the “obiectum” was “that which was thrown against”. Thrown against what? As understood by the Latin Scholastic philosophers, the obiectum was thrown against some power or faculty belonging to a subject; that is, for philosophers such as Thomas Aquinas, Duns Scotus, John Poinsot—and many others—something was an object precisely and only insofar as it stood in relation to some power, and most especially a cognitive power. Noticeably, a remnant of this understanding survives in the equivocal meaning of the term “object” as pertaining to a goal or an aim. But it is a very weak remnant compared to the full force of the original sense, for being in relation to a power can occur in different ways, which I will not go into here for the sake of brevity, except to say that this potential relativity of objects to powers is far more complex than the simple relation between an agent and its goal or aim.

In the Latin Age, therefore, “subjective” and “objective” were not opposites, meaning respectively “what belongs to the opinion of an individual mind” and “what is true regardless of what any individual person thinks” (as the terms are commonly used today), but a correlative pair: the obiectum was “thrown against” the cognitive faculties or powers of the subiectum, and it was by the faculties of the subject that the thing was an object at all. That is to say, everything having existence is a subject, but only subjects with psyches, with cognitive powers, can have objects, properly speaking. Or, to put it otherwise, ontologically speaking, everything is in itself subjective and becomes objective only by relation to a cognitive agent.

Lost Meaning

The shift of these words’ usage to our contemporary meaning is, frankly, a little funny to think about: they are now used to convey the precise opposite of what they were originally intended to signify. But it is only a little funny, because this opposition constitutes not merely an inversion of the terms but, in fact, a loss of the original meaning. Moreover, the rise of the new meanings has had two profoundly negative consequences.

First, the idea of “objective” knowledge or of “objective truth” badly mangles the meaning of “truth”. Truth—the revelation of what is and the subsequent adequation of thought and thing—unfolds through interpretive means. That is, the “adequation” is not a simple one-to-one matching of something in your mind to something in the world but requires effort; it requires us to inquire into what really is, since very often we are mistaken, taking a mere appearance for an essential reality. Our concepts, which are the means through which the adequation occurs, are not dropped into our heads as perfect, precise means, but must be worked out through various operations—and it is never the case that we get the full, unblemished, absolute truth about the objects those concepts signify. Our concepts are never perfect signs. They may be sufficient, adequate, and accurate; but never perfect. Our intellects are so weak, as Thomas Aquinas says, that we cannot perfectly know the essence of even a single fly.

Second, the original concept signified by obiectum, the intelligible “thing” precisely as it stands in relation to a cognitive power, is not sufficiently signified by any other term or succinct phrase of the English language. Indeed, even the word “thing” misnames what an obiectum is. A certain parallel occurs in the German word Gegenstand, but that language, too, has suffered a similar confusion. And it is difficult to make known just how incredibly important the concept signified by obiectum is once the misuse has become stuck in the minds of the many. That is: the objects of our thoughts are not always the same as the things themselves. Our concepts may present to us objects which differ from the things themselves, either by being more limited than those things (which is almost always the case in at least one regard) or by including in their signification to us certain aspects which lie outside those things (which also occurs almost always). To give a brief example: I saw a picture the other day of the “dumpy tree frog”. Both my concept—signifying the kind of creature, the essence of such frogs—and my percept or mental image—composed from particular experiences of such tree frogs—are extremely thin; I have one picture in mind, and almost no specific knowledge about the frog beyond what I know about all frogs, and even that is not very rich knowledge. Thus the frog as an object of my cognitive action is much less than the frog as a thing.

On the other hand, in seeing any bulbous, dumpy-looking frog, because of the cultural exposure I have had in my life, I immediately think of the frog not just as an animal, but as one that sits on lily pads, hops around small ponds, perhaps retrieves a golden ball, and gets kissed by princesses—the first two being things that follow from its nature, but which are nevertheless relational, and the last two being fictional relations.  Since I know they are fictional, I’m not deceived, but a young child might be.  Regardless, they certainly signify something more than the frog itself.

Something very similar to this relational conceptualization happens, however, in most of our experiences. Certainly, it happens in every experience of culturally-shaped socialization. That is, every object we encounter which has something in it that does not belong to it strictly on account of its own nature is an object expanded beyond the boundaries of what is presented by the thing in itself: for instance, friends, lovers, teachers, judges, police officers, and so on. There might be a basis for their being such objects—as some people make better friends than others because of how they are in themselves—but being such an object entails something more than that basis. The mug on my desk is my mug—on my desk. But neither desk nor mug has any property in it which makes it mine. It receives this designation only by an objective relation: what we call an extrinsic denomination, which may be more or less fitting, but whose fittingness depends upon a myriad of factors irreducible to the mind-independent realities themselves.

Conclusion: The Need for Semiotics

In conclusion: it is important to distinguish between our “linguistic capacity” and our “languages” so as better to grasp the nature of concepts and the means of their signification. Language never exists as a fixed reality—“rigid designators” being, as John Deely once wrote, “no more than an intellectual bureaucrat’s dream”—but always shifts and alters over time, through use. The conventional nature of our languages and their symbols allows us to improve our signification—but also to lose our concepts. Such lacunae can be destructive to understanding: not only in that we misinterpret the works of the historical past, but in that we misunderstand the reality we inhabit. The very real presence and effect of extrinsic denominations, for instance, cannot be coherently understood without a robust distinction between “mind-independent things” and “mind-dependent objectivities”. At the same time, the notion of “objective truth” results in “truth” being misconceived as something entirely impossible to attain.

Deep and consistent reflection upon the function of our signs—not only in general but in the particular languages we use—proves necessary to ensuring our conceptual coherence and clarity.


[1] c.1895: CP 2.302.

On the End of War

“Only the dead have seen the end of war,” wrote George Santayana in 1922. A century later, his observation clearly remains poignant. War has proven commonplace ever since, with seldom a year passing without violent conflict. Though we in the United States have been fortunate enough never to suffer prolonged conflict upon our own soil, war has remained at least marginally present in our cultural consciousness for decades.

But what makes a war just?  This is the question we seek to ask and answer in our Philosophical Happy Hour this week—a topic more sober and sobering than our usual fare.

To help orient our conversation, we may note the three temporal conditions customarily established to frame the justification of war: jus ad bellum (before the war), jus in bello (during the war), and jus post bellum (after the war).

Jus ad Bellum: Before the War

Why and when do we find war a necessity? The ethical considerations prior to engaging in armed conflict ought to weigh heavily on any legitimate and authoritative leader: considerations such as whether one’s cause is just—for instance, self-defense, defense of innocent life, or attainment of conditions necessary to living—whether the war is not clearly susceptible to abuse by bad actors, whether there is a reasonable probability of success, whether no reasonable peaceful option remains, and whether the conditions of victory are clearly stipulated and recognized.

Jus in Bello: During the War

Once the conflict has been joined, it remains necessary that the combatants exercise restraint: proportionality ought to guide every action. Even if the enemy could be wholly eliminated—purged from the face of the earth—this will seldom if ever be called for by just conditions of victory. Similarly, non-combatants should not be targeted, nor should the violence exceed what proves necessary to win the conflict: acts such as rape, torture, or the use of weapons such as chemical gases have no place in even the most acrimonious of wars.

This restraint protects not only the opposed soldiers and civilians, but also one’s own warriors: to de-humanize the enemy is to lose one’s own humanity.

Jus post Bellum: After the War

Most especially is this evident in the aftermath. Having overcome the opposing side, the victor might impose, all too readily, punitive measures on the conquered. Again, the measure here must be a certain proportionality aimed not merely at victory but at peace and at the re-establishment of equitable conditions. Maintaining these principles will be much easier given both a just antecedent cause and restraint in the war’s conduct. Escalation of violence beyond proportion can end only in the utter annihilation even of non-combatants or in a resentment that nourishes hatred.

Join the Conversation

Though we take up here a heavy topic, given current conditions in the world, it is important to note that—so long as the sun shines—we must ourselves strive to live a fulfilling human life. Our conversation will not likely end violence in the world; but it may add some humanity to our understanding of war.

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.

Discussing Certitude and Intuition

A Lyceum Member writes, proposing a Philosophical Happy Hour topic: What is certitude? What role do signs play in achieving certitude? What role do signs play in intuition? Can I be certain about my mother’s love – is it intuited through signs, or through some other means?

The notions of certitude and intuition have played an important role in modern philosophy for centuries. But what are they? While they remain subject to dispute and revision (say, this Wednesday, 10/4!), it should be helpful to offer provisional definitions. We may identify certitude as a firm conviction in the truth of a proposition which admits no doubt under current circumstances. Intuition, on the other hand, may be defined as an immediate and non-discursive grasp of some truth. Intuition, very often, is held to extend primarily if not exclusively to objects beyond the sensible.

Semiotics contra Modernity

René Descartes puts certitude at the center of his noetic revolution: the method of skeptical doubt rejects anything which cannot be situated on indubitable grounds, and thus the justification of any claim to knowledge requires that it be grasped with certitude. Attempting to combat this skepticism, Locke and other self-professed empiricists sought to demonstrate how sense perception gives rise to true knowledge. But because many apparent objects and experiences in even our banal, daily lives defy reduction to the strictly sensible, the notion of intuition outlined above gains greater prominence.

As C.S. Peirce explains this notion of the intuitive:

[intuition] is a cognition not determined by a previous cognition of the same object, and therefore so determined by something out of the consciousness… Intuition here will be nearly the same as “premise not itself a conclusion”; the only difference being that premises and conclusions are judgments whereas an intuition may, as far as its definition states, be any kind of cognition whatever. But just as a conclusion (good or bad) is determined in the mind of the reasoner by its premise, so cognitions not judgments may be determined by previous cognitions; and a cognition not so determined, and therefore determined directly by the transcendental object, is to be termed an intuition.

1868: “Questions Concerning Certain Faculties Claimed for Man”.

But just such a cognition, Peirce goes on to argue, cannot exist: that is, every apparent intuitive grasp of some truth is, in fact, an unrecognized process of semiosis, the use of signs. Does there remain a role for intuition in our noetic theory? What happens to the notion of certitude?

Join us!

We’ll tackle these (and any related topics) this Wednesday (4 October 2023) from 5:45 until 7:15 pm ET. Use the links below!

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.

A Philosophical Inquiry into Facts

What is a fact? The English word, used so commonly throughout the modern world, comes from the Latin factum: an event, an occurrence, a deed, an achievement. But since the mid-17th century, under the auspices of the Enlightenment’s so-called “empiricism”, the word has been taken to name a “reality” known independently of observation. The fact is Absolute. Facts, therefore, are discovered by and studied within “science”. They are “objective”. They are “verifiable”. That water at sea level boils at 212° Fahrenheit; that Columbus arrived in the New World in 1492; that Chicago is west of New York: most people regard these as facts.

Other claims may be disputed, such as that Jesus Christ rose from the dead, or that Domingo de Soto was the first to introduce the distinction between formal and instrumental signs. These disputes hinge upon the evidence: given the right data, it is thought, we could decide definitively one way or the other. Still other claims are not disputed as to their factuality but are regarded as not resolvable into facts: for instance, that socialism is evil, or that capitalism fosters moral flaws; that Aquinas was a better philosopher than Wittgenstein, or that a particular pope has undermined the Catholic faith.

Pseudo-Philosophical Presuppositions

This bifurcation into what is or is not a fact, however, presupposes much. Arguments often appeal to facts (or “evidence”). Arguments structured through or upon factual bases typically appear stronger. Contrariwise, if someone lacks a factual basis for his argument, others will regard that argument as “subjective”, a matter of opinion, and therefore as weak. To give an example, consider the claim that socialism is evil. The commonest way to defend this claim consists in examining facts about the Soviet Union. We advance the argument by pointing to the number of people killed, or the churches destroyed. We look at the facts of the Gulag. The Soviets themselves did all they could to hide these facts from much of the world.

Curiously enough, however, the Soviets (at least those making the decisions), despite their efforts to hide the facts, did not seem overly troubled by them. Indeed, “facts” seem always to be embedded in social contexts of interpretation. Bruno Latour has argued that what we regard as “facts” are not mind-independent truths discovered through science but socially constructed fictions premised upon some observation. That is: circumstances and instruments, as well as often-tacit social agreements, contextualize every purported discovery of a “fact”.

Discussing the Philosophical Reality of “Facts”

Yet the idea of the “fact”, despite such challenges, remains powerful in our contemporary social imaginary. Facts, as oft-repeated by a certain fast-talking pundit, do not care about your feelings.

But, we have to ask—we ought to ask—is there even really such a thing as a “fact”? What makes something to be a fact? How do we discover them, share them, interpret them? Can we gain “factual knowledge” without interpretation?

Join us this evening to discuss facts—and philosophy!

Philosophical Happy Hour


Come join us for drinks (adult or otherwise) and a meaningful conversation. Open to the public! Held every Wednesday from 5:45–7:15pm ET.