
The question concerning AI

Thinking about inhuman pseudo-intelligence with Martin Heidegger

“We look into the danger and see the growth of the saving power. Through this we are not yet saved. But we are summoned to hope in the growing light of the saving power. How can this happen? Here and now and in little things, we may foster the saving power in its increase. This includes holding always before our eyes the extreme danger.”

Martin Heidegger, The Question Concerning Technology (1954).

Martin Heidegger’s famous lecture The Question Concerning Technology is a resounding echo of a concern we find throughout his massive corpus. For Heidegger, the phenomenologist, existence is threatened by the human tendency to use theory to exile experience. His broader aim, roughly stated, is to recover a deep sense of how things show themselves to us as themselves. My aim here is less ambitious. I simply want to think with Heidegger about AI, now that it seems to be on the rampage.

I need to clarify right here that I do not see so-called artificial intelligence—inhuman pseudo-intelligence—as being at all discontinuous with the broader problem of technology as Heidegger understands it. It is absolutely in keeping with a decline in experiential richness that has accompanied humanity since at least the industrial revolution. As Ted Kaczynski has infamously chimed: “The industrial revolution and its consequences have been a disaster for the human race.” If you think that was a disaster, just wait until you consider the state of mind that invites and celebrates and perpetuates that disaster.

We need to consider this especially since the technological society is the invisible world we were thrown into at birth. It is the only world any of us has ever known. The wildness in being was already thoroughly tamed even before we got here. By now, numbed as we are by and to the devil’s electric nervous system, it is normal to view the world only technologically. We are accustomed to regarding almost everything in terms of use-value; as so much raw material for technical operations. Yes, everything. Everything supernatural or natural can be transformed into the target and material and product of some technique.

The arrival of increasingly impressive AI, therefore, marks not a break from a norm but a deepening of an already pervasive tyranny against being. AI just happens to be new enough, for the moment, that it allows us to properly see, as if for the first time, something of the environment we inhabit. But it is no revolution and it will not magically transform into one even if the technological singularity arrives. Like the electricity it runs on, it enhances instantaneous discarnation, obsolesces linear logic and visual space, retrieves acoustic space and the subliminal, and reverses into ubiquitous tribalism when pushed too far. Its content is the internet and its form is that of a glorified spreadsheet. None of this is new. It is mostly just much, much more of the same; it is the hell of the same, to use Baudrillard’s term.

The reason certain AI developments have caused such shockwaves recently is that the stakes are so horrifyingly low. Only a degenerate mind could mistake what the machine is doing for thinking. Still, the force of the quantitative increase in machine-made communication objects anticipates an impending digital flood. Given this, it is reasonable and natural to consider what sort of ark we need to build to survive it. It is for this very reason, in fact, that what Heidegger has already covered on technology is still relevant.

Heidegger suggests that we get busy questioning. Questioning is a quest and a request, a way to open up our horizon of understanding. In particular, we have to get past thinking of the whatness of technology and instead consider it via the question of how. To think only of the whatness of things is, in fact, to end up replacing a felt sense of their essence with just more disembodied abstractions. It is, in a way, to entrench a disorientating perplexity rather than wake us up to wonder. It is to be trapped by our own “clevverness”—a word used by Russell Hoban in his marvellously telling novel Riddley Walker to suggest a disturbingly efficient, mechanistic, devilish way of thinking. Clevverness, Hoban suggests, does not break the spell but casts it. It pretends to lead us beyond self-destruction even while it ensures self-destruction. And that is often the trouble with any obsession with mere whatness; with mere definitions. A definition of technology is likely only to lead to endarkenment and not to enlightenment.

We might think of AI, for example, as the simulation of human intelligence processes by computers. I know, this is not exactly meaningless. In fact, it is correct. It touches on a pertinent facet of AI as a technology. But it doesn’t help us to feel what it is doing; what it is doing to us; how it is affecting us. There is no negativity in the definition, no sense of the thing itself. It does not allow us to feel the resistance that is there between ourselves and what we’re questioning. It allows no apprehension and insight. “The merely correct,” says Heidegger, very rightly, “is not yet the true.”

When we ask about how technology is—that is, about what sort of path it opens up for us—we stand a chance of seeing more clearly the ravenous monster that we have let into our homes. We’re better primed to notice, therefore, that something in us grates against the deception that it wants to inflict upon us. In the how of technology, we notice that it is, as Heidegger says, an instrumentum. It is something that heaps up or builds up or arranges other things. That Latin word instrumentum implies instrumentality, contrivance, and adjustment. How do we use technology? We use it to control, to save labour, to speed things up, to relieve ourselves of the demands of a job, to bring distant things close, and to distance ourselves from what is near. Technology is the concrete expression of the theoretical state of mind over matter and of thought over things. Technology incarnates alienation from being. This is to say that technology uses us against ourselves.

If being refers, as it does for Heidegger, not to something that can be represented but to the meaningful disclosure of beings as beings, technology can more easily be recognised as a non-identical repetition, albeit a distorted version, of this disclosure. Technology is better considered as a way of revealing than as just a means to an obvious end. In Heideggerian language, “Technology comes to presence in the realm where revealing and unconcealment take place, where alētheia, truth, happens.” But what are the specifics of this happening? How does it reveal? It does so by shrinking revelation. What it shows is not the things themselves but only those aspects of the things revealed by the technological frame. And that gets us to Heidegger’s central contention. “The essence of modern technology is enframing,” he says. “Enframing belongs within the destining of revealing.” In other words, technology places a frame around everything; it artificially limits what sort of world of meaning we can access. It guarantees inaccessibility. The information we have isn’t the information we want; the information we want isn’t the information we need; and the information we need is impossible to get.

Does so-called artificial intelligence put an end to this? Certainly not! If anything, AI is enframing par excellence because of how it places everyone, from programmers to users, under its command as something that can be stockpiled, arranged, and controlled. AI builds on the transposition of the mere representation of being into the key of zeros and ones. Heidegger’s example from physics is instructive as an analogy for this: “Modern science’s way of representing pursues and entraps nature as [only] a calculable coherence of forces … calculable in advance.” It “therefore orders its experiments precisely for the purpose of asking whether and how nature reports itself when set up this way.” AI’s way of representing pursues and entraps communication, forcing it to fit into categories that have been decided in advance. It orders our experience along the lines of total transparency. What is hidden from the gaze of the machine and the machine-mind is regarded as meaningless and so also valueless.

Enframing directly interferes with how we interpret the world. It curtails how things are allowed to exist in our presence. This idea is rather suggestively explored in the 2014 film Ex Machina, which has a machine imprisoning a man through enframing. The machine seems human to him and that is where his trouble begins. Seeming and being are not the same. The man reads desire and love into the machine and it responds by turning him into a manipulable piece of code. The misapprehension of the human within the machine ends in the obliteration of the human.

A different visual trope around AI involves showing people abusing humanoid robots, and this potential inhumanity towards inhumans is perhaps one reason why there is a push happening in some scholarly circles for so-called AI rights. Robot lives matter, etc., etc. However, people will assume AI has consciousness long before it actually gains consciousness, as has already happened. Fascinatingly, this perceptual error emerges precisely because of technological enframing. After all, it is not the unique depth of genuine personhood that programmers are trying to match in their AI but a technical conception of personhood that falls dramatically short of the real thing. If, for example, an AI passes a theory of mind test in such a way as to suggest that the AI can understand people’s mental states, it is more likely that the test is not testing what the test designers think it is testing than that the AI has a theory of mind. An implicit agreement with the technological frame is likely to make this fact more difficult to notice.

Technological framing has a few key characteristics. For one thing, it is demanding. It challenges whatever is subjected to its pressure to yield only what it is asking of it; “the maximum yield at the minimum expense,” says Heidegger. More than that, the demand technology places on being is reductive; it transforms everything into something that can be stockpiled like products in a warehouse waiting to be called upon or ordered. This stockpiling is ultimately more about regulating than revealing. As Friedrich Jünger suggests, the technological goal is always further rationalisation; technological processes become entirely self-serving. They celebrate means without considering genuine ends.

“Everywhere,” says Heidegger, “everything is ordered to stand by, to be immediately at hand, indeed to stand there just so that it may be on call for further ordering.” On the surface, by eradicating the delicate reciprocity that exists between man and his environment, this would appear to elevate us to a position of remarkable power over being. The enframing may even feel like a victory. But on closer examination, it is only oppressive. We have no real control over how things get arranged once modern technology rules; we become subject to the very processes that technology manifests. We ourselves become, in other words, as much a part of the warehouse as everything else. We are cut down to a more manageable, which is to say less human, size.

One thing that Heidegger notices about anything that is allowed to show itself as itself is that it refuses to be consumed; it does not let the appropriative grasp get a grip. Being reveals itself as concealing itself. Its mysteriousness, in other words, is never obliterated by our beholding but is enhanced. Mystery does not mean unknowable but endlessly knowable. There is something of the lover and the poet that awakens in us in the genuine presence of being. A contemplative lingering allows us to re-encounter the already-encountered.

But technological enframing is unerotic. It produces the pornographer and the rapist and does not nurture the romantic. It encourages the bureaucrat, not the artist. Having said this, there is, I realise, a tendency towards hyperbole when we look at technology through Heidegger’s eyes. He suggests, for instance, that mechanising the food industry is just as bad as setting up an extermination camp or building a hydrogen bomb. I don’t accept this equation but it does seem to me that the difference between industry and warfare is not nearly as great as some think it is. Heidegger helps us to notice this, even if he lacks temperance in the way he directs our attention.

In Jünger’s The Perfection of Technology (1946), which I am guessing Heidegger read, we have the reminder that technology, as manifest organisational thinking, distributes not riches but poverty. Technology is an avaricious and devouring force. Technology plunders. AI does not mine only digital resources. Nature is plundered for the sake of the hardware that it relies on to exist. With AI, we often think of the marvels of programming that replicates the efforts of human creators—mines their efforts, undermines their efforts—but forget, as is our custom with the devices that sustain these very words on screens, that much was taken from the earth to make such a deception possible. We are finding more and more ways to stay plugged in, not ways to give the earth a break. It is a terrible irony that so many of us would instead try to come up with more advanced technologies to counter environmental degradation than switch off our damned machines. We want, as they say in Alcoholics Anonymous, more of what isn’t working.

Jünger offers us the “singularly revolting” example of the whaling industry to show what technological enframing is all about and to highlight how technology distributes poverty. He notes the terrible degradation that has happened so that “great sea mammals, embodiments of the might, the abundance, the loveliness of their element” get thought of only as “train-oil and soap.” The counter-argument to this would be that not all technological thinking is as degenerate as the thinking behind that terrible and thankfully mostly obsolete industry. But Jünger’s point, like Heidegger’s, is that degrees in this kind of thinking may differ but the kind of thinking remains the same. Reality becomes something to take from; what is given is not enough. Even where technology gives, it still takes more away than it can return.

Ultimately, the technological frame robs us of the world itself; that is, it robs us of the world as resonant and meaningful. Even if we are not literally destroying the world through our tools, we find ourselves cut off from it. And that is my worry about AI; that by simply compounding the already pervasive patchwork of computer-generated plagiarisms, endlessly repeating the frames offered by technologically-indoctrinated prompters, our concern for the real is further diminished. It is not that we necessarily want this but enframing demands it. Soon enough, in any case, what we really want becomes a side-issue. Technology metastasises because compulsion has replaced desire. Addiction has replaced willfulness. It is not even a question any more of whether AI should be developed. It can be, so it cannot be helped; it can be, so it must be.

Most alarming, though, is that this specific restriction of revelation prevents the revealing of ourselves to ourselves. We become products of technological enframing rather than epiphanies granted by the generosity of being. What Heidegger means by this is perhaps well captured in an example that a student of mine drew my attention to recently. A certain individual, as a way to deal with a terrible trauma that she experienced, decided to use AI to help her to design so-called ‘safe spaces’ for trauma sufferers. On the surface of things, this sounds like a fitting response to a hurt. But the frame, as I’ve said, prevents self-revelation as well as revelation. In this, safetyism goes from being an ideological to a technological frame. Or perhaps the latter preempts the former. Sometimes what a sufferer of trauma needs is not a safe space at all but something more like exposure therapy—a way to learn that she has the resources within her to face monsters instead of cowering in fear of them. And there’s also the fact that people experience traumas so differently that a one-size-fits-all approach is likely to do more harm than good. The frame plugged into and extended by AI—safety is needed when danger has been experienced—is more likely to cause harm than to bring about healing. The frame prevents the deeper needs of a person from being disclosed. The frame renders the invisible irrelevant.

This, by the way, is one reason why I have said before that wokism is rudimentary AI. In fact, generally speaking, ideology has a tendency to replicate algorithmic ways of thinking. It zero-sum-gamifies everything into a make-believe contest between so-called oppressors and the so-called oppressed. It is a scapegoating algorithm that creates a fake competition to help support the narcissism and spiritual corruption of its users. In saying this, I do not deny that there are very hurt people behind that particular frame but the frame itself resists actually dealing with the hurt. It masks personhood itself. What is the pronounification of people but the refusal to let being speak? That people tend to fail to notice this is a sign of just how normal it is these days to use an unphenomenological technological frame to force being into silence.

It is interesting, with all of the above in mind, that Heidegger’s approach does not involve merely fleeing from technology. There will be danger in our confrontation with technology but it is in this danger, he suggests, that salvation shows itself and grows. When we are aware of the way that the coming to presence of technology threatens revealing, we are at that very moment fully attentive to the danger. We can’t directly contradict this danger but we can reflect on it, ponder it, interpret it, and understand it.

As Heidegger suggests, we may find some freedom from technology through a recovery of perception. We may be saved, in a sense, by rediscovering our ability to withhold our theoretical impositions from the world and to let it speak and even sing to us with its own lyricism and its own strange music. The solution to the technological society is not going to be just more technological thinking but less of it. In reality, a very different way of thinking from the kind that welcomes the simulacrum is called for. We need contemplation and imagination. Daydreaming should be encouraged. To answer to the devilry of unworlded clevverness, Hoban suggests that we need the “first knowing.” We need the knowledge that came before we got ensnared in the technological frame.

Heidegger, however, places a frame around the world to block out the possibility of genuine transcendence. In this, I contend, he remains something of a victim of the very enframing he wants to escape. It is for this reason, perhaps, that his conception of being remains bound to violence, as Duane Armitage has argued in Philosophy’s Violent Sacred. This is to say, Heidegger offers a weaker answer than what is available to him.

I think of that fairy tale told by the Grimm brothers about The Girl With No Hands. A desperate and deluded father accidentally makes a deal with a devil that results in him being forced to give his daughter away. In Hoban’s Riddley Walker, remember, the devil is the father of the clevverness. And, well, the devil doesn’t like the girl’s hands when he comes to collect his prize. Those hands reach longingly for the things of heaven and the devil cannot tolerate such purity. So the devil tells the father to cut them off. Later in the story, the girl is offered prosthetic silver hands by a king who falls in love with her. The king means well. He wants to restore her. But the technological substitute doesn’t do the trick. And that, fascinatingly, is what our technologies often are—they are substitutes for a loss of deep faith. They are an answer, not a very good one, to a spiritual need. Well, I think Heidegger offers something of a substitute in his philosophy instead of the hope of the real thing. He points in the right direction, towards poetry and art as returning a sense of the meaningful disclosure of being to us. And, yes, that’s a pretty brilliant start. What we have exiled through our technologies is best wooed back with beauty. But this is still an incomplete answer to our deepest questions and yearnings.

We might settle for simulations sometimes because we feel that they are better than nothing. But are they really? We might settle for artificial words and images and so many products of the technological frame because we feel that they’ll deliver the meaning that our own being so desperately craves. But will they? Later on, in that haunting fairy tale, after a complicated series of events including more deceptions by the devil and a journey back into the realm of intimate love, the girl with no hands has her real hands restored—by God. Only God can give back what the devil has taken. Heidegger does not seem to allow for this; at least, he does not say it out loud in his work. The best he offers is a theology of the godforsaken. And it is ultimately this that points to why I am not a Heideggerian. I don’t believe as he did that God is done with us. But the technological frame makes this difficult to see.

I nevertheless agree with Heidegger that technology force-fits perception into accepting narrow revelations. AI continues the trend. Yes, being remains threatened. But we will not find the answer in Heidegger’s philosophical attunement alone, even though we would do well to follow him in trying to destroy the predecided schemes that block the way. We need more than this. Natural theology can take us only so far; we need revelation to complete the picture. This is to say, we need to relinquish our theological godforsakenness and allow things to draw us into a reality deeper than the things themselves. I mean that we need to discover not just reflection and contemplation and poetry and art, but prayer.