On Modernity, Ultramodernity, and Postmodernity

If you and I are to have a conversation—that is, a kind of living-together through discourse or the concrete articulations of language—we must, indeed, do so in the same “place”.  Put otherwise, we cannot have a conversation unless the objects signified by our words are the same.  I do not mean that each and every word used by each person in the conversation needs to have the exact same denotations and connotations as understood by every other person in the conversation.  But without some common referents—some formal identities between us as to what the words we use mean—we are not and cannot be engaged in a dialogue, but only exchanging equally incomprehensible monologues.

It is to the end of establishing some of these necessary commonalities, particularly concerning the meaning of modernity and thus the elaborated forms of ultramodernity and postmodernity, that this article has been written.

The Common Understanding of Postmodernity

Let us begin by asking: what is postmodernism?  Commonly, the term is used to indicate a movement committed to the radical abdication of belief in ultimately-defensible intelligible meaning.  James Lindsay, for instance—who attained some fame through his work with Peter Boghossian and Helen Pluckrose in exposing absurdities in certain corners of academia—has frequently referred to an “applied postmodernism” identified with social justice activism.  By this phrase is meant: the truth about things is less important than the imposition of moral categories based on emotional responses, many of which have been fostered through radical historicism or selective criticism of the structures common to Western civilization.  James Croft, University Chaplain and Lead Faith Advisor to the University of Sussex—who holds an EdD in Education and Human Development and describes himself as a gay rights and Humanist activist—describes postmodernism as comprising positions “anti-foundationalist”, “anti-essentialist”, “anti-teleological”, and “anti-universal”.

Academic Understandings

But is this throwing-around of terms in the public sphere, without careful attention, truly indicative of what postmodernism is or what it is understood to be among its adherents, advocates, and expositors?  In a sense: yes.  The Stanford Encyclopedia of Philosophy begins its entry on “postmodernism” by writing:

That postmodernism is indefinable is a truism.  However, it can be described as a set of critical, strategic and rhetorical practices employing concepts such as difference, repetition, the trace, the simulacrum, and hyperreality to destabilize other concepts such as presence, identity, historical progress, epistemic certainty, and the univocity of meaning.

Further, the first prominent instance of the word in academic discourse seemingly belongs to Jean-François Lyotard’s 1979 book, La condition postmoderne, published in 1984 under the English title, The Postmodern Condition: A Report on Knowledge.  In Lyotard’s “simplification to the extreme”, he defines “postmodern as incredulity toward metanarratives” (xxiv).  As he states, the incredulous turn began around the turn of the 20th century, as the “rules” for science, literature, and the arts were changed through a seeming radical re-thinking.  One thinks of André Breton’s two manifestos on Surrealism, transgressive films like Federico Fellini’s La Dolce Vita or some of Hitchcock’s work, the compositions of Arvo Pärt, György Ligeti, Philip Glass, or Brian Eno, the novels of James Joyce or William Gaddis or Thomas Pynchon or Michel Houellebecq—or any of the expressive turns which attempt to convey a meaning, or the possibilities of interpretation, through methods which defy the previously-established norms and conventions of society.

Nascent and oft-unrecognized turns towards this incredulity of narratives in science can be found as early as the mid-19th century, particularly in the establishment of psychology as a discipline independent from philosophy and the shift to an apparently post-Aristotelian logic.  Though ostensibly grounded in the modern traditions of science—with their focus upon the experimental and the quantitative—these “new” approaches further untethered the objects of thinking from the mind-independent real.  The development of these and other like sciences led to a further fragmentation of intellectual advance and the present-day irreconcilability of the disciplines,[1] as well as the widely-known “replication crisis”—not to mention that it opened the door for events such as the Sokal Hoax or the “grievance studies affair”.  Some might take these latter as evidence that the social sciences are insufficiently scientific.  Their simultaneity with the replication crisis shows, however, a deeper problem about the condition of knowledge—precisely the point articulated by Lyotard in his 1979 work.

The Public Face of the “Postmodern”

The fragmentation of knowledge and dissolution of narrative foundations have found their way also into the practical, moral, and political dimensions of life.  Without widespread social acceptance of either common principles for knowledge or shared sentiments—say, those stemming from religious belief, patriotism, or adherence to the dictates of an authority—new struggles appear concerning law and power.  Thus arise new movements of “social justice activism” or “applied postmodernism”, frequently witnessed on or stemming from college campuses throughout the late 20th century and into the early 21st—which follows insofar as these movements spring from a “theoretical postmodernism”.  In recent decades, the application of postmodern thinking (though seldom explicitly under this name) has reached something of a national consciousness, infiltrating politics, with not only the once-fringe Bernie Sanders shuffling into the limelight, but also the influx of young, female politicians with contrarian ideas, such as Ilhan Omar, Rashida Tlaib, and Alexandria Ocasio-Cortez.  The lattermost, for instance, gave an interview to Anderson Cooper on the television program 60 Minutes where she stated that “there’s a lot of people more concerned about being precisely, factually, and semantically correct than about being morally right.”  Though she walked it back, that sort of statement is not simply a mistake, a slip of the tongue, but rather exhibits a structure of thinking.

In other words, there is a strong opposition within the contemporary movement to what historically has been called “reason”.  At the theoretical level, this opposition often utilizes the very language of power-struggle arising in the wake of reason’s dismissal to vilify “reason” as an instrument of oppression—a tool employed by the hegemonic forces that are responsible for seemingly all human pain or suffering.  To the contrary, emotions and lived experience are promoted as the new and perhaps the only means to a just world.

Theoretical Roots of the “Postmodern”

At any rate, the theoretical or philosophical roots will often be linked to names like—in no particular order—Karl Marx and Friedrich Engels, Jacques Lacan, Max Horkheimer, Jean-Paul Sartre, Albert Camus, Jacques Derrida, Jean Baudrillard, Gilles Deleuze and Félix Guattari, Herbert Marcuse, Theodor Adorno, Louis Althusser, Michel Foucault, and perhaps Slavoj Žižek.  To these we may add the ethereal movement which may not be precisely identified with Marx’s thought, but which not-illegitimately claims inspiration from him: the movement which receives, even if it repudiates, the name of cultural Marxism, exemplified in the recent “woke” movement by public figures such as Ibram X. Kendi, or in the closely-related field of “semantic activism” (that is, the effort to shift the meanings of words to produce desired cultural outcomes) advocated by academics such as Kate Manne.  Broadly speaking, the so-called theoretical postmodern is enmeshed with relativism, a “post-truth” mentality, and the radical resolution of the meaning of any apparent phenomenon to naught but the individuated psyche, which rather ironically leaves power—understood not as the necessity of force but as the persuasion to willing conformity—as the only means to social change.

Or, to sum the core belief up in a single sentence: “At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life”.

The words are not mine, of course, but those of Supreme Court Justice Anthony Kennedy, written in the decision of a 1992 case, Planned Parenthood of Southeastern Pennsylvania v. Casey, which “reaffirmed the essential holding” of Roe v. Wade.

Now, regardless of your stance on abortion—an issue I believe factually straightforward but psychologically complex, and thus will not venture into here—this assertion of the Supreme Court should, if you believe that reason has a central role to play in human life and the pursuit of human good, deeply and profoundly disturb you.  It should also raise questions about the concept of liberty: a concept central not simply to so-called postmodernism, but perhaps even more prominently to the modernism which preceded it.

The Meanings of Modernity and Modernism

Understanding “modernism”, however, is not simple.  For one, the term may be applied in two different but related senses: first, referring broadly to a cultural transition from centralized monarchical political authorities to individualistic democratic or republican governments as well as a secularization of education, both in means and in content; and second, referring narrowly to the philosophical movements which prompted much of this cultural change.  The cultural transition was exemplified intellectually by Galileo, Newton, and Darwin—as the rise of secular science, outside the Catholic university—not to mention the Enlightenment generally, and politically by the American and French revolutions and the ensuing diminishment of monarchy across all the Western world.  Less lauded or even recognized by the proponents of modernity (except as a sign of modernity’s achievements, rather than a cause), but just as centrally necessary to its achievements and philosophy, was the rise of technology and especially industrial technology.

This is not to say that every fruit sprung from the tree of cultural modernism is poison.  For instance, through the shift away from authority—though a shift taken entirely too far—a better image emerged of the means to individual flourishing as intellect-possessing animals (or semiotic animals).  Yet this fruit is the exception, rather than the norm.  By contrast, the loss of the true good of authority—namely the connection of an individual with a normative truth higher than any individual—and its replacement by the false belief in authority as a kind of power necessary for enforcing a social contract is poisonous indeed; as are the slide into scientism, the fragmentation of knowledge, and the rejection of tradition on the mere basis of its being traditional.

Philosophical Modernity

Most important for addressing modernism, however, is to understand the philosophical roots.  Here, we can quickly get at the essence: for modern philosophy has two founders that stand above the rest, namely René Descartes (1596–1650) and John Locke (1632–1704).[2]  Descartes is best known as the first of the modern rationalists, holding that ideas are not derived from the objects of sense experience, such that sense experience at most gives us the occasion to form or discover an idea, while other ideas are given directly by God or are instilled in us innately (which might as well be the same thing).  Contrariwise, John Locke held the mind to be a blank slate, and thought all our ideas were formed from the empirical experience of objects, built up in an almost atomistic fashion, such that, having experienced one sensation after another, we come to form generalized notions of the things experienced.  The decades and centuries following Descartes saw theorists of both rationalist and empiricist thinking—such as Gottfried Wilhelm Leibniz, George Berkeley, Baruch Spinoza, David Hume, and Immanuel Kant, among others too numerous to name—arguing back and forth over the starting points and nature of knowledge… all of whom seemed entirely unaware that both sides partook of a fundamental and egregiously mistaken presupposition: namely, the belief that our ideas are themselves the direct terminal objects of our cognitive actions; in other words, the belief that we know our ideas and from knowing our ideas know the world.  Though often the rationalists have received the name of “idealist”, in truth, the empiricists are just as fundamentally idealist as their opposition.

This presupposition, regardless of one’s theory of ideation—a presupposition we may call, following Leibniz and John Deely, the “Way of Ideas”—drives an incorporeal and imaginary wedge between the individual human being and everything else.  The more attempts are made to build a bridge over this gap, the deeper the wedge is driven.  For the wedge of idealism, once admitted into a theory of knowledge, sets the individual and his or her experience as knowing his or her ideas, over and against the world as something extended and material, not known directly but only through the mediation of one’s subjectively-constrained ideas.  Inevitably, therefore, it drives individuals deeper into individualism as they believe themselves to dwell in their own minds.  Thus the Way of Ideas ends up driving the wedge so deeply that it widens the initial gap into a vast chasm: a chasm between the world as known and the self as knower, between the physical and the cultural, and between the natural and the personal.

For the turn first introduced by Descartes is a turn inwards: a turn which makes thought essentially private.  There is much to be said here about the technology of the printing press and the paradigmatic shift by which privately owned and read books obsolesced the scholastic modes of philosophy—much that I will not say here, in fact, except this: the Cartesian subjective turn gives an intelligible articulation to a spreading psychological habit, technologically exacerbated, making that habit both explicit and reinforced.  The result is a diminished belief in the truth of intellectual conception as an essentially public activity.  Instead, truth is seen as something realized privately and subsequently spread societally through convention and agreement.  The promise upon which this extension of private thinking into social convention depended was the supposed universality of the scientific method—and perhaps, a philosophy structured in the same manner.  Such was the proposal of Immanuel Kant in his Critique of Pure Reason.  Such was the spirit of the Enlightenment in general: Diderot’s encyclopedia, Voltaire’s histories and letters, Rousseau’s theories of social contract, and so on.  Everywhere, one saw attempts to guarantee a better future through the blending of empirical observation and the faith in scientific method to regulate those observations into a universal monolith of “objective” truth.

The result of these efforts, however, is not only a habit of thinking as private, but also a habit of denying the reality of our own experiences: for every experience we ever have, of anything whatsoever, in any discernible regard, always exceeds what is grasped in mere empiricism (understood as the discrete reception of sensory phenomena).  Do our experiences and our knowledge begin in sensation?  Absolutely and undoubtedly.  But does the sensory data or even the neurological activity explain either the having of experience or the attainment of knowledge?  No; and it does not even come close.

“Postmodernity” is Ultramodern

And this philosophical error is why modernism leads inevitably towards so-called postmodernism: not because modernism ebbs away, but because its own internal principles, carried towards their logical conclusions, lead inescapably to nonsensical, non-rational positions—to the very repudiation of reason itself.  Superficially this appears most ironic, and will—by all adherents of modernism—be rejected.  For modernism hails “reason”; but the reason it hails is one stripped of its vigor, for it is not a reason which discovers the truth concerning the fullness of reality outside the mind or independent of the psychological self.  Modernity’s “reason” supplants the search for a cognition-independent truth with an amalgamation of facts, like so many grains of sand out of which it tries to build the truth; and now the remnants of ideological modernity wail when the so-called postmoderns—who, in truth, are really ultramoderns—come to knock down their granular edifice and re-shape it as they see fit.

Allow me here a lengthy quote from an article of John Deely:[3]

Relativism and solipsism are not matters that follow upon or come after modernity: in philosophy they have proved to be its very heart and essence, present from the start, a looming presence which took some centuries fully to unveil itself.  Late modern philosophy, phenomenological no less than analytical, finally embraced fully what Descartes and Kant had shown from the start: the human being is cut off from nature, hardly a part of it; the human being becomes a cosmos unto itself, with no way to relate beyond itself, beyond the veil of phenomena forever hiding the other-than-human things which are other than our representations, whimsical or a-priori as the case may be.

Modern philosophy fathered and fostered the pretense that science must confront nature as an “objective observer”, or not at all.  But modern science found that not to be the situation at all.  Instead of confronting nature as an outside observer, science came to see itself rather in Heisenberg’s terms as an actor in an interplay between the human world within nature and the larger world of nature of which the human world forms a part.  It found itself to be “focused on the network of relationships between man and nature, and which we as human beings have simultaneously made the object of our thought and actions” (Heisenberg 1955: 9).

From the point of view of the sciences as co-heirs of modernity with philosophy, this paradigm shift seemed a kind of revolution, a veritable new beginning.  But from the point of view of semiotics this shift is something more than merely a new beginning: this shift is a going beyond the modern heritage.  In effect, the late modern philosophers clinging to their notion of the human world of culture as a whole unto itself, cut off from nature as if autonomous in its unfoldings, are anything but “postmodern”, notwithstanding the many who have tried so to style the embrasure of the relativism implicated in the epistemological view beginning and ending with the subject thinking.  If anything, the muddle of thinkers whose usage of “postmodern” Stjernfelt would like discussed contribute nothing that goes beyond modernity, but only reveal and revel in what modern epistemology logically leads to, what modern epistemology had entailed all along.  Ultramodern rather than postmodern, they are not the beginning of a revolution against modernity but the death throes of the revolution in philosophy that was modernity, Mr. Hyde to the Dr. Jekyll of modern science in its maturity.

What is called postmodernism is not really in any way post modernity.  A true postmodernism has only begun to claw its way through a series of unimaginable philosophical errors, the origins of which I will try to demonstrate over the next several videos.

True Postmodernity

If modernity follows the Way of Ideas, and the idealist epistemological quagmire of the moderns leads to its own demise in nonsensical, irrational ultramodernity, a meaningful postmodernity must be one which follows a different path: namely, what Deely has named the Way of Signs.

Thus, if there are two figures I would definitively name as proponents of a genuine postmodernity, they are Charles Sanders Peirce and that same John Deely.  I would add, as a figure responsible for truly breaking out of modernity (even if his break has been badly misunderstood and consequently misappropriated by many), Martin Heidegger.  Based upon a cursory, initial reading of some of his works, I suspect that the little-known Eugen Rosenstock-Huessy ought also to be included.  Neither of the latter two explicitly advocates for the Way of Signs; but both turn language back to things, rather than to ideas.

Such a turn—whether implicit or explicit—allows us to recover truth as normative: precisely what modernity discarded, even if it did not realize it.


[1] Not, mind you, that the disciplines of academe are irreconcilable in principle, but rather, as presently practiced.  The reconciliation could only be effected through discovery of, study in, and resolution to a common basis of knowledge.

[2] We can include also as founders of modernity Francis Bacon and Niccolò Machiavelli, but their contributions—though essential to understanding modernity’s full constitution—are more like necessary adjuncts than central pillars of its nature.

[3] 2006: “Let us not lose sight of the forest for the trees …” in Cybernetics & Human Knowing 13.3-4.
