The Deconstructionist Ghost in the Textualist Machine

Ius et Iustitium is happy to present this guest post by Vincent Clarke. Mr. Clarke is a scholar of Lacanian psychoanalysis.


It is not often commented on, but judicial textualism bears many of the same hallmarks as philosophical postmodernism. When this is properly appreciated, many of the results of textualist legal reasoning can be better understood. Implicit in textualism are many of the same principles that animate postmodernism. Understood in this way, textualism operates as a sort of system of gears allowing postmodern culture to infiltrate the legal apparatus. That the founders of textualism may not have intended such a result is a historical irony, but it does not change the ultimate result. Since postmodernism’s aim is ultimately to undermine the legal system, textualism thus becomes an instrument of subversion.

These questions have been brought into much sharper focus by the recent ruling in Bostock v. Clayton County, 140 S. Ct. 1731 (2020), in which the Supreme Court expanded the protections offered by the Civil Rights Act of 1964 to cover discrimination based not just on race, color, religion, sex, or national origin, but also on ‘gender identity’ and ‘sexual orientation.’

Those criticizing the ruling have tended to fall into two camps. In the first camp are those who argue that the Bostock ruling was a perversion of textualism. (How it could be a perversion of textualism when the theory claims to be a positive, not a normative, theory is unclear.) In the second camp are those who point out that textualism is an empty formalism and that this leads to a lack of resistance against nihilistic liberal political culture. Recently this second position has been expanded by Adrian Vermeule, who posits that the key driver of controversial court decisions is what he calls the “hidden constitution.” This is a set of liberal presuppositions about culture that exert what he calls a sort of “gravitational force” on court decision-making (Vermeule 2020).

In this essay, I want to expand on this second view and, in a sense, radicalize it. I want to make the case that the “hidden constitution” is, in fact, an anti-constitution; something resembling a legal computer virus that is geared toward undermining the legal operating system itself by exploding the implicit hierarchy on which it rests. Textualist theory is where the rubber meets the road. Much like utilitarianism in ethics, textualism is an instrumentalist theory that allows corrosive, nihilistic and anarchistic theories to gradually undermine the legal apparatus from within. Lacking a positive telos of its own, textualism automatically adopts the anti-telos that has been embedded in our culture by the postmodernists.

Derrida, Deconstruction, and the Destruction of Metaphysics

Before we turn to textualism and the implicit philosophy behind it, we must first understand a little better the postmodern ideas that it facilitates. In the realm of legal analysis and textual exposition the key figure in this regard is Jacques Derrida. In conservative circles today—most especially in English-speaking conservative circles—it is fashionable to say that Derrida is ‘unreadable’ or writes nonsense. Derrida’s prose is arcane, it is true, but so too is the prose of Anglophone logicians like Alonzo Church and Anglophone linguists like Noam Chomsky. Arcane prose is not, despite what some might have you believe, a sign of weak-mindedness.

Derrida formulated his project as a reaction to the philosophy of Martin Heidegger. One of Heidegger’s projects in his magnum opus Being and Time was to strip back Western philosophy completely. To do this Heidegger advocates demolishing Western metaphysics which he sees, with Nietzsche, as being a repressive and evil development in the history of thought. Heidegger introduces the term Destruktion and advocates a process of chipping away at central aspects of Western metaphysics “until we arrive at those primordial experiences in which we achieved our first ways of determining the nature of Being—the ways that have guided us ever since” (Heidegger 1962, p. 44). Heidegger is not advocating a return to origins so much as he is advocating a destruction of the notion of fixed origins. He is not advocating a meditation on Being, as we might find in the metaphysical tradition. Rather he is advocating an intuitive encounter with the “primordial experiences” that he believes allowed earlier philosophers to begin their meditation on Being. Is this an incoherent and fantastic goal? Probably. But it is at the core of Heidegger’s project.

Derrida does not see himself as challenging this project so much as setting it on a more fruitful course. Derrida believed that Heidegger’s thought itself suffered from too much essentialism. He argued that the project of Destruktion was a noble one (a Nietzschean too, Derrida agreed that the Western metaphysical tradition is oppressive) but that it was incoherent because it did not go far enough. In her preface to Derrida’s magnum opus Of Grammatology, Gayatri Spivak writes:

To be sure, when Heidegger sets Being before all concepts, he is attempting to free language from the fallacy of a fixed origin, which is also a fixed end. But, in a certain way, he also sets up Being as what Derrida calls the ‘transcendental signified’… Being is indeed the final signified to which all signifiers refer. But Heidegger makes it clear that Being cannot be contained by, is always prior to, indeed transcends, signification. It is therefore a situation where the signified commands, and is yet free of, all signifiers – a recognizably theological situation. The end of philosophy, according to Heidegger, is to restore the memory of that free and commanding signified, to discover Urwörter (originary words) in the languages of the world by learning to waylay the limiting logic of signification, a project that Derrida describes as “the other side of nostalgia, which I will call Heideggerian hope. . . .” (Derrida 2013, p. xvi.)

Derrida argued that Heidegger still labored under the illusion that fixed origins could be destroyed when, in fact, any attempt to destroy fixed origins necessarily ended in establishing new fixed origins. The alternative, the true path to the destruction of metaphysics, Derrida argued, was to set thought in motion and allow for the perpetual destruction of meaning – Derrida called this project ‘deconstruction.’ Derrida argues that the key to destroying metaphysical concepts is to render unstable their meaning by highlighting the fact that their meaning relies on their opposition to other terms. Spivak again:

Derrida seems to show no nostalgia for a lost presence. He sees in the traditional concept of the sign a heterogeneity—“the other of the signified is never contemporary, is at best a subtly discrepant inverse or parallel—discrepant by the time of a breath—of the order of the signifier”… Word and thing or thought never in fact become one. We are reminded of, referred to, what the convention of words sets up as thing or thought, by a particular arrangement of words. The structure of reference works and can go on working not because of the identity between these two so-called component parts of the sign, but because of their relationship of difference. The sign marks a place of difference. (Ibid.)

Derrida’s theory of deconstruction is not really a theory at all. It is more a toolbox of concepts that are deployed in the deconstruction of texts. The deconstruction of texts is undertaken to ferret out any concepts that, explicitly or implicitly, rely on metaphysical or foundational principles (recall that these principles are seen as ‘repressive’). These days, deconstruction is not hard to understand because it is taking place everywhere. Daily we are told of activists “undermining gender binaries” or “exposing patriarchal notions of beauty” or what have you. This is the language of deconstruction – vulgarized, yes, but still put to its proper usage – applied to cultural politics in the early 21st century.

The shift from Heideggerian existentialism to Derridean deconstructionism can be seen clearly in the culture. For the baby boomer generation, a vulgar form of existentialism was ascendant. Baby boomers strove to find their ‘authentic Being’ by ‘self-actualizing’[1]. This gave rise to judgments like Roe v. Wade, as the judges caved in to their demands and arbitrarily read rights into the constitution that obviously did not exist. True, the existentialist path was laughable and a surefire road to nowhere, but it kept the baby boomers on the straight and narrow to some extent. It led to a silly society where packs of consumerist zombies allowed themselves to be sold a variety of ‘lifestyles’ while pursuing riches. The generation Xers and millennials, on the other hand, are a nihilistic bunch and, thanks to their radical college professors, have embraced the language of deconstruction. They seek their ‘authentic Being’, to be sure, but their ideas tend to be more twisted and their motivations more hostile and anarchistic. Their language is different too. It seeks not so much to ‘self-actualize’ as to ‘subvert.’ The remnants of sociability that lumbered on in the baby boomers are gone. Antinomianism and anger reign supreme.

The goal of deconstruction is not really to give transgender people rights or allow overweight people in advertisements. It is a revolutionary Nietzschean project that aims to undermine hierarchies of all forms. It aims at dissolving cultural hierarchies altogether to allow for an anarchistic play of ideas and affects, none of which have any stable meaning or reference point. The end point of deconstruction is supposed to be ever-changing anarchy. Deconstructionists, with Rousseau, want to return Man to a state of nature. To achieve this, deconstructionists believe that hierarchies must first be dissolved within language itself; this is what we have seen take place in college campuses in recent years. Once this is achieved, these unstable meanings will start to penetrate the legal system and begin to undermine it.

Textualism as a Gear for Deconstruction

Much of the trajectory of philosophy in the second half of the 20th century can be seen as a reaction to the grim events that marked that century. It has been pointed out that today we labor under the vision of Karl Popper’s ‘open society’ because we are told that any move to a more substantially grounded society might risk tipping into National Socialist or Communist barbarism (Reno 2019). For this reason, we are encouraged to practice a sort of vacuous formalism in every aspect of life – from ethical commitments to philosophical practice to economic planning. Everywhere negative liberty reigns supreme.

According to Cass Sunstein, textualism as a movement arose in much the same way. Horrified by the excesses of the courts in the totalitarian countries, liberal jurists retreated into formalism.

In the Nazi period, German judges rejected formalism. They did not rely on the ordinary or original meaning of legal texts. On the contrary, they thought that statutes should be construed in accordance with the spirit of the age, defined by reference to the Nazi regime. They thought that courts could carry out their task “only if they do not remain glued to the letter of the law, but rather penetrate its inner core in their interpretations and do their part to see that the aims of the lawmaker are realized.” After the war, the Allied forces faced a range of choices about how to reform the German legal system. One of their first steps was to insist on a formalistic, ‘plain meaning’ approach to law. (Sunstein 1999, pp. 1-2.)

In America, textualism/originalism[2] arose later and as a reaction to the excesses of liberal jurists’ constitutional eisegesis. The liberal wing on the Supreme Court had adopted an interpretative framework that looked to “penumbras” and “evolving standards of decency,” leading to decisions that were wholly unmoored from either the text or from traditional substantive morality. Conservatives reacted, not by arguing for substantive morality directly, but by insisting on a strict adherence to the text within the framework of the positivistic divorce of law and morality. Although this was an avowedly positivist legal theory, the neutral insistence on “original meaning,” it was hoped, would anchor decisions in the substantive moral values of the 18th and 19th centuries, instead of those of the 1960s.

Textualism thus became in jurisprudence what neoclassical economic theory was in public policy and what analytic philosophy was in ethical theory. Ethics would be subordinated to the neutralized public space articulated by Rawls; public policy would be judged purely on the formalistic utilitarian criteria of neoclassical economics; and jurisprudence would be increasingly subject to close textual readings of legal documents. These moves allowed the elite in Western liberal societies to avoid the responsibility of articulating a positive moral vision for those over whom they ruled. In the case of textualism and originalism, it allowed enervated and embattled conservative jurists to avoid joining battle directly over substantive moral questions.

To do this, however, these disciplines had to adopt many of the operating principles of postmodernism. This has recently been noted with respect to analytic philosophy by Simon Blackburn (Blackburn 2018). By focusing so much on the internal logic of statements, analytic philosophers have come perilously close to denying any reality outside of those statements. Likewise, in economics, the neoclassical theory applied as public policy appears to have given way to a financialized economy that increasingly looks like the sort of symbol-manipulation that Jean Baudrillard mocked as being a hallmark of the post-truth postmodern era – capitalism is becoming a neutralized simulacrum of itself (Baudrillard 1981).

Whereas ethics and economic theory are concerned with practical decisions, however, jurisprudence is concerned with the interpretation of law. This leaves jurisprudence even more open to toxic and subversive cultural developments than the other branches of governance because while the actual economy may put up resistance to crazed economic theories, texts cannot put up a fight against textualism. By trying to empty jurisprudence of its assumptions, textualism unintentionally opened the legal apparatus to manipulation by anarchistic forces.

Derrida was quite explicit that he saw the penetration of deconstruction into the legal system as key to his project. He wrote:

I think that the developments in ‘critical legal studies’ or in work by people like Stanley Fish, Barbara Herrnstein Smith, Drucilla Cornell, Sam Weber and others, which situates itself in relation to the articulation between literature and philosophy, law and politico-institutional problems, are today, from the point of view of a certain deconstruction, among the most fertile and the most necessary. They respond, it seems to me, to the most radical programs of a deconstruction that would like, in order to be consistent with itself, not to remain enclosed in purely speculative, theoretical, academic discourses but rather (with all due respect to Stanley Fish) to aspire to something more consequential, to change things and to intervene in an efficient and responsible, though always, of course, very mediated way, not only in the profession but in what one calls the cité, the polis and more generally the world. Not, doubtless, to change things in the rather naive sense of calculated, deliberate and strategically controlled intervention, but in the sense of maximum intensification of a transformation in progress, in the name of neither a simple symptom nor a simple cause (other categories are required here). In an industrial and hyper-technologized society, academia is less than ever the monadic or monastic ivory tower that in any case it never was. And this is particularly true of “law schools.” (Derrida 1992, pp. 8-9.)

As we can see, Derrida did not want to offer an articulated program of what should be changed in the legal structure. Rather he favored a “maximum intensification of a transformation in progress.” In keeping with the doctrine of deconstruction itself, Derrida favored shaking the legal apparatus to produce tumult, internal contradiction, and anarchistic movement. Derrida’s champions in the “critical legal studies” movement failed to get much traction beyond the legal academy. It is no small irony that what the critical legal studies movement failed to bring about by direct frontal assault is now coming to pass at the hands of textualists, under the guise of neutral positivistic jurisprudence. The textualists might deny that this is what they are doing—they would, after all, claim merely to be stating “what the law says”—but the results are nothing less than the wholesale achievement of the deconstructionist project.

That textualism should prove such a useful vehicle for deconstruction is not surprising if we examine the precepts on which textualism and deconstruction are based. Derrida’s starting principle was famously “il n’y a pas de hors-texte” – “there is nothing outside the text.” This statement has both a theoretical and an applied component. Theoretically, for deconstruction to function, Derrida relied on the idea that there was no true ‘reference point’, no actual object to which the text referred. If the text referred to an object – if, say, the word ‘apple’ corresponded to an actual apple – then that object could put up resistance to Derrida’s whirlwind of deconstruction. But if texts only referred to themselves and to other texts, as deconstruction teaches, then the whirlwind can wind on, sucking up more and more of the actual world. This is the applied component. Deconstruction rests on a gamble that by changing perceptions through the spread of its doctrine and by convincing people that there is no objective reality, this will lead to the anarchistic outcome that deconstructionists desire.

What is not widely commented on is that the statement “il n’y a pas de hors-texte” could just as easily be lifted from a legal textbook definition of textualism. At the beginning of his opinion in Bostock, Justice Gorsuch lays this out in no uncertain terms.

When the express terms of a statute give us one answer and extratextual considerations suggest another, it’s no contest. Only the written word is the law, and all persons are entitled to its benefit. (Bostock, 140 S. Ct. at 1737.)

There is, for all practical purposes, nothing outside the text. Gorsuch’s language could not be clearer. When the “terms of a statute” come into conflict with “extratextual considerations” then the statute wins out. “Extratextual considerations” could be anything from biological differences to metaphysical truths to a radical change in historical circumstances – they could be anything non-textual and objective – and they will always be treated as second-class citizens in the textualist’s legal society.

It is in this way that textualism provides the groundwork for deconstruction to enter the fray. By cordoning off questions of objective truth in such a profound way—questions that Derrida and Heidegger are correct to say always rely on metaphysical foundations—the textualists open the legal apparatus up to the postmodern whirlwind. The agents of postmodernism – those we see everywhere in our societies after years of deconstructionist agitprop in the universities—can then recruit their favorite textualist to undermine the ‘logocentric’ legal order[3]. The cleverer the textualist devotee, the better able he will be to write chaos into the legal system if given the optimal deconstructionist case.

Gorsuch as the Deconstructionists’ Useful Idiot

It is only by understanding this that we can properly understand the reasoning behind Bostock. Many have commented on how contorted that reasoning was. But the contortion is precisely the point. Alternatively, many have commented on how Bostock unsettles the legal apparatus and will give rise to a proliferation of post-Bostock rulings and hence to legal chaos. Again, chaos is precisely the point—it is the “maximum intensification of a transformation in progress” that Derrida advocated.

Let us carefully examine the two key passages of the ruling. These are:

An employer who discriminates against homosexual or transgender employees necessarily and intentionally applies sex-based rules. An employer that announces it will not employ anyone who is homosexual, for example, intends to penalize male employees for being attracted to men and female employees for being attracted to women. (Bostock, 140 S. Ct. at 1745.)

And:

We agree that homosexuality and transgender status are distinct concepts from sex. But, as we’ve seen, discrimination based on homosexuality or transgender status necessarily entails discrimination based on sex; the first cannot happen without the second. (Ibid., p. 19.)

The ruling hinges on the distinction between the general and the particular; or, the distinction between the group and the individual. Justice Gorsuch is explicit about this when he writes: “the [Civil Rights Act of 1964] focuses on discrimination against individuals, not groups” (ibid., p. 17). The logic affirms particularistic categories as against general categories—but does so by appealing to general categories. If you are, say, a transgender or a homosexual, you are an individual; you do not belong to a group (of transgenders and homosexuals). But certain groups do exist; namely, the ones named in the 1964 act. One of these groups is ‘sex.’ If your individual status happens to overlap in some way with these groups, you get protection under the law[4].

This is the logic of deconstruction. The particular is affirmed in the face of the general to liquidate the general; the individual uses its group status to dissolve the group. To be more concrete, homosexuals and transgenders are deployed as purely individualistic categories to ‘deconstruct’ the groups defined under the term ‘sex.’

How? Basically, this is done by highlighting the grey areas within the categories. Take the example of defining ‘sex’ by reference to procreation. One way to define the sex of a human or an animal is by pointing to its reproductive capacities: a male can inseminate a female; a female can bear children. Now, redefine ‘reproduction’ to mean ‘sexual activity including homosexual activity.’ This scrambles our capacity to define sex based on its reproductive capacities. Enter Gorsuch. He plays on this implicit redefinition—this implicit deconstruction within the culture embodied in Vermeule’s ‘hidden constitution’—and ‘shows’ that it would be discriminatory for an employer to hire a man engaging in ‘reproduction’ (now redefined to mean ‘sexual activity including homosexual activity’) with a woman and not hire a man engaged in ‘reproduction’ with another man.

Or take the transgender component of the ruling. Most people would assume that a biological female turning up for work in a skirt and high heels is a fundamentally different act from a biological male doing the same thing. Not according to Gorsuch. The individual biological male in a skirt and high heels negates the sex difference on which the distinction between the respective acts rests. Affirming this sex difference is now ‘discriminatory’ and against the law. Thus the biological male wearing the skirt and high heels is used, in conjunction with the text of the 1964 law, to deconstruct the ‘phallocentric’ idea of sex difference.

Gorsuch seems to assume that his reasoning is in some sense natural, but in fact most people who have thought about the matter would strongly disagree. Consider the findings of a recent Media Matters analysis of engagement with online content related to transgenderism (January 2020). Around 20% of people went to sources that promoted transgender ideology; around 15% went to sources that were neutral; while a full 65% went to sources that were overtly hostile to or critical of transgender ideology. No matter what vague public opinion polling on issues like ‘transgender rights’ might tell us, most people who are interested enough to think and read about the matter would find the assumptions on which Gorsuch’s reasoning rests to be twisted and bizarre[5].

Gorsuch’s ruling shows how textualism operates as the machine into which the postmodern ghost climbs. Postmodernism injects deconstruction into the culture and, in a wide variety of wily ways, leads us to question basic metaphysical and scientific truths. It does this by telling us that reality is wholly a social construction and that there is “nothing beyond the text.” Textualism then takes the baton and channels these tendencies into the courts, wreaking havoc on the implicit hierarchy that structures our legal systems. Again, it does this by telling us that there is nothing in law beyond the text; no objective reality, moral or scientific.

Here we see, in the form of court rulings, the transition we noted earlier from mid-20th century existentialism to radical deconstructionism. Justice Kennedy perfectly summarized the ethos behind Roe v. Wade after the fact when he wrote:

At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life. (Planned Parenthood v. Casey, 505 U.S. 833, 851 (1992).)

In the interim, the left’s ideologies have soured. In the past they subordinated morality to vacuous notions of ‘self-actualization.’ Today they seek the subversion of all institutions and attempt to tip society into madness. Just as the courts followed the most radical ideas in the culture in the 1970s, so they do today. But now they have a formidable, well-articulated theory to hijack – and on the conservative side of the bench no less. Textualism may not be an anarchist ideology, but it is the Trojan horse through which deconstructionism enters the legal apparatus.

Metaphysics, the Only Counter to Judicial Anarchism

By now it should be clear what has gone wrong. It is not textualism per se. Textualism is a late development. Like many ‘conservative’ ideologies that arose in the 20th century, it is an ideology steeped in ultraliberalism. In spite of the intent of its earliest proponents, it is not in the end a conservative doctrine and cannot be because it has no foundation in positive content. With nothing at bottom to conserve, textualism is simply an empty vessel, a machine that channels radical ideas into the legal system.

The core issue remains the very one that Derrida and Heidegger and their followers realized: whether metaphysical truths exist and persist. These can be anything that stands against the randomized flux of postmodernist ideology. They can be scientific truths, such as the biological differences between men and women; or they can be moral truths, such as the difference between Good and Bad. All these truths, however, require a deeper grounding. Postmodernism attempts to neutralize Truth by undermining the metaphysical foundations on which Truth sits to produce a vacuum in which meaningless flux takes place.

An empty formalistic program like textualism cannot counter this total destruction of meaning. Only doctrines that promote essentialism and meaning can. Meaning requires metaphysics. It requires the capacity to distinguish Good from Evil, Man from Woman; not on some arbitrary biological or evolutionary basis, but based on, to quote Spivak channeling Derrida, “a situation where the signified commands, and is yet free of, all signifiers – a recognizably theological situation.”

There is no easy way out for those institutions that wish to counter the quickening slide into anarchy facing Western societies. We must return to metaphysics, to hard truths. There is no trick, no ‘life hack’ that can cleverly get around the need for metaphysical truth. We either affirm it or we allow the forces of chaos to consume us.

Vincent Clarke

Bibliography

Baudrillard, J. (1981). Simulacra and Simulation. University of Michigan Press.

Blackburn, S. (2018). ‘Past Masters of the Postmodern.’ Inference Review. Vol. 4, No. 2. October 2018.

Bostock v. Clayton County, 140 S. Ct. 1731 (2020).

Derrida, J. (1992). ‘Force of Law: The Mystical Foundation of Authority.’ Deconstruction and the Possibility of Justice, eds. Cornell, Rosenfeld, and Carlson. Routledge.

Derrida, J. (2013). Of Grammatology. Johns Hopkins University Press.

Heidegger, M. (1962). Being and Time. Martino Fine Books.

January, B. (2020). ‘The Right is Dominating Facebook Engagement on Content About Trans Issues.’ Media Matters.

Lasch, C. (1985). The Minimal Self: Psychic Survival in Troubled Times. W.W. Norton & Company.

Reno, R. (2019). Return of the Strong Gods: Nationalism, Populism and the Future of the West. Gateway Editions.

Sunstein, C. (1999). ‘Must Formalism Be Defended Empirically?’ John M. Olin Law & Economics Working Paper No. 70. Chicago Working Papers.

Vermeule, A. (2020). ‘Why Conservative Justices Are More Likely to Defect.’ Washington Post. July 8, 2020.


  1. These attitudes were filtered through to the culture via the psychotherapeutic movement. Figures like Rollo May and Carl Rogers formulated existential psychological theories which became enormously influential and shaped the culture and the language of the baby boomer generation. One might say that while the US constitution is written in the language of Locke and the early liberals, Vermeule’s hidden constitution was, until recently, written in the language of existential psychology. Christopher Lasch’s somewhat overlooked work from the 1980s highlighted this well (Lasch 1985). Today, however, the hidden constitution has taken an even darker turn.

  2. For the most part any distinction between the two terms can be ignored for the purposes of this essay. Originalism is slightly more resistant to postmodernism than is textualism because originalism takes into account the historical situation in which a text is written. This at least gives some external reference point for law to refer to. But history itself is unstable unless an objectivist historiography is applied. Today this is rarely the case due to the impact of postmodernism and deconstructionism in the history departments. One can easily imagine a deconstructionist-infused originalism where postmodernist readings of history—such as the infamous ‘1619 Project’ in The New York Times—are used to undermine the rule of law.

  3. ‘Logocentrism’ is a term adopted by Derrida and others to refer to institutions and theories that attempt to reference an objective external reality. Such institutions and theories are said by deconstructionists to be oppressive hierarchical systems that do not, in fact, describe a true external reality but rather impose an arbitrary hierarchical system on individuals and society. Naturally, being based on pure textual readings with no reference to external reality, textualism implicitly shares many of deconstructionism’s critiques of logocentrism.

    It is interesting to note that ‘logos’ also refers, in our culture, to the figure of Jesus Christ (John 1:1-3, 14). Spivak notes this in her introduction when she writes that logocentrism means “the belief that the first and last things are the Logos, the Word, the Divine Mind, the infinite understanding of God” (Derrida 2013, p. lxviii). Naturally, it should concern Christians and those who defend the Christian underpinnings of our societies that an ideology that denies the logos has gained such purchase in conservative circles.

  4. The ruling does affirm that homosexual and transgender ‘groups’ do exist as we can see in the second quote we have given from the ruling. But these groups do not appear in the 1964 law and so functionally they do not exist. This is because there is “nothing beyond the text.” So, anyone in the transgender or the homosexual group does not exist qua group in the eyes of the law, they exist only as individuals. Their behaviors are then allocated to the groups that exist in the text of the law – in this case, to ‘sex.’

  5. We should be careful when considering public opinion on such matters. Public polling on such questions is a poor proxy for what people are actually thinking. Liberalism breeds a mindset of indifference in the general population, whose attitude on various topics is laissez faire by default. Because of this, an individual who might find postmodern gender theory bizarre if it were explained to them might nevertheless support some vaguely defined notion of ‘transgender rights’ or employment protection for transgender people on the grounds of the ‘live and let live’ meta-ideology that social institutions have bred into them. Postmodern gender theory is an elite discourse only engaged with by a small minority of the population; the intelligentsia, broadly defined. For this reason, the correct way to measure what people think about the ideology is to look at what people interested in the topic are actually reading.