The Singularity and the Pornography of the Human Condition

Vernor Vinge, 30 years ago, wrote, 

The Coming Technological Singularity: 
                      How to Survive in the Post-Human Era

I'm reading it now. 

         the world acts as its own simulator in the case of natural selection

For Vinge, human intelligence acts as its own simulator, displacing that of the world and accelerating the process of selection. It is conceivable then that the intelligence of machines will in turn displace the simulator of the human mind and further accelerate this process.

         We humans have the ability to internalize the world and conduct "what if's" in our heads...

The Great Acceleration in technical means, in technology, that followed the Second World War becomes, in the event of the Singularity, hyperacceleration.

         Developments that before were thought might only happen in "a million years" (if ever) will likely happen in the next century

Vinge says he'd be surprised if the Singularity happens before 2005 or after 2030. Interestingly, the word he uses for the coming-to-be of a world-simulator that displaces human intelligence is to wake, to wake up. He proposes four scenarios: the supercomputer wakes up; the network wakes; the interface with humans creates in users a superior intelligence; biological enhancement leads to the superhuman. (These last two Vinge calls IA, achieving the Singularity through Intelligence Amplification.)

The Singularity is essentially historical. It marks a point beyond which existing reality ceases, its rules no longer apply and there is a new reality. It would seem that this new reality for Vinge is defined by the ability of intelligence to simulate it. It is both Messianic and Simulacral.

Here's the problem:

         Any [ultraintelligent machine able to self-replicate] would not be humankind's "tool" -- any more than humans are the tools of rabbits or robins or chimpanzees.

         We will be in the Post-Human era. And for all my rampant technological optimism, sometimes I think I'd be more comfortable if I were regarding these transcendental events from one thousand years remove ... instead of twenty.

Hardware parity with the biological brain is a condition for the Singularity. In 1993, this was thought to be 10 to 40 years away. What Vinge calls the Singularity is today usually referred to as achieving Artificial General Intelligence (AGI). In 2022, Jake Cannell proposed a 75% likelihood of brain parity, AGI, being reached between 2026 and 2032. [source] 

AGI does, however, lower the bar from Messianic, paradigm-changing event to the interchangeability of human and machine intelligences. What is missing from AGI is world simulation. This lowering of horizons for brain parity also has to do with the demotion of both brain and human.

The neurosciences are in the process of explaining many of the mysteries of the brain, showing them to be mechanisms. The brain is more distributed than previously thought; there are neurons in the gut.

As the privilege of the brain within the organism becomes questionable, so do the claims for human exceptionalism grounded in that privilege; and human exceptionalism in the cosmos is in any case less and less defensible. As soon as we expect less from the human animal, the supersession of its intelligence no longer looks to bring about the transformation of reality. The Singularity that was a source of apocalyptic fear in 1993 is now one more bathetic moment in our contemporary culture of disappointment.

         Commercial digital signal processing might be awesome, giving an analog appearance even to digital operations, but nothing would ever "wake up" and there would never be the intellectual runaway which is the essence of the Singularity. It would likely be seen as a golden age ... and it would also be an end of progress.

But, says Vinge, if it can happen it will happen:

         ... the competitive advantage -- economic, military, even artistic -- of every advance in automation is so compelling that passing laws, or having customs, that forbid such things merely assures that someone else will get them first.


         I think that performance rules [for the superhuman entity] strict enough to be safe would also produce a device whose ability was clearly inferior to the unfettered versions (and so human competition would favor the development of the more dangerous models).

         If the Singularity can not be prevented or confined, just how bad could the Post-Human era be?

         The physical extinction of the human race is one possibility. (Or as Eric Drexler put it of nanotechnology: Given all that such technology can do, perhaps governments would simply decide that they no longer need citizens!)

It's hard to see either the physical extinction of the human race or the decision of governments that citizens are no longer necessary as transcending reality, 30 years on. What scares us more now, going by online newsfeeds, is artists' loss of copyright. It's ironic, then, that Vinge tells of the combination of competences in human/computer symbiosis in art.

And that he talks about human/computer teams at chess tournaments, after Hans Niemann. And about mobile computing as an example of IA (Intelligence Amplification), when it is often seen as the opposite. And the worldwide internet, in the human/machine combination, is where progress is fastest, and it may "run us into the Singularity before anything else."

Ironic too, considering how the feedback loop has actually worked between biological life and computers, is that he says,

         much of the work in Artificial Intelligence and neural nets would benefit from a closer connection with biological life. Instead of simply trying to model and understand biological life with computers, research could be directed toward the creation of composite systems that rely on biological life for guidance, or for providing features we don't understand well enough yet to implement in hardware.

While on the deadliness of competition, that is on the deadliest aspects of human inclinations, there is some prescience:

         We humans have millions of years of evolutionary baggage that makes us regard competition in a deadly light. Much of that deadliness may not be necessary in today's world, one where losers take on the winners' tricks and are coopted into the winners' enterprises. A creature that was built _de novo_ might possibly be a much more benign entity than one with a kernel based on fang and talon. And even the egalitarian view of an Internet that wakes up along with all mankind can be viewed as a nightmare.

Vinge differentiates between weak and strong superhumanity, the weak superhuman being the one enslaved to the merely human. The strong is the one to be feared, since it has overtaken the human in intelligence and has every right to regard the human as a human regards a pet, a slave or a bug. It is so if you are writing in 1993, and not so much in 2023.

Strong superhumanity in the end links with strong cooperation, a notion of networking as involving connectivity at higher and higher bandwidths. Vinge thinks bandwidths along both digital and cerebral axes, in terms of both computation and cerebration, so they add or superadd to intelligence. Today we tend to constrain communication even at high bandwidths to being a technical matter, of formal or linguistic representation. 

A matter of having more information, bandwidth has to do with perception considered as representation, specifically inner representation: with how the subject, whether human or nonhuman, represents to itself the world outside it, so determining how much information it has to work with. The higher the bandwidth, the more intelligence a subject has of the world in which it is situated, and the more of it it can represent to itself; but this today does not equate with higher intelligence, as it does for Vinge.

For him, at the highest bandwidths networked entities share higher cognitive functions. They are telepathic. There is a general ratcheting effect of intelligence on intelligence, information on information, perception on perception, accumulating and, in Vinge's view, running away far in advance of low-bandwidth entities like us. Leading back to the Singularity.

The process of accumulation Vinge is talking about is what today tends to be talked about as growth. It is inextricably linked to the process of capital accumulation. And this process is fundamentally joined to technological advance. Together they form progress.

For Vinge, technological advance means the accumulation of intelligence not capital. Intelligence embodied in computers, IA and information networks reaches a point of runaway. This progress for him was inevitable, not so for us.

Or is it? Today's coupling of technological and economic forces, accumulation and extraction and exploitation, of both smart and manual labour, means progress to an inevitable point that resembles Vinge's inasmuch as it too is the occasion for fear, if not dread. It is the deadly prospect of human extinction that will be augured by the Sixth Great Extinction of nonhuman life on which human life depends, in the interconnectedness of all life on earth.

The idea that economic and technological forces can be uncoupled appears almost to be cause for optimism. Economic forces, those of extraction and exploitation, are in this view a shackle. Economics stymies progress, leading it off in another direction which appears to be as inevitable as the Singularity. Economic forces divert progress towards planetary rather than human extinction.

Uncoupled, unimpeded by the economic restriction placed on it by the demand for demand and its supply, delinked from the process of capital accumulation (and the current redistribution of wealth, South to North, national to civic, social to plutocrat) technology had to lead, in Vinge's view, towards superhuman intelligence. He doubted that once it was achieved this superhumanly intelligent being would deign to be further constrained by restrictions placed on it by humans. He did however entertain the thought of a benign superhumanity. Its ultraintelligence might indulge humanity, oversee it and shepherd it. Out into the stars, for example, as in his and Iain M. Banks's space opera.

More likely, he thought, a superhuman entity would not indulge the species inferior to it: just as humans do to other species, it would squash us. The reasoning resembles that consecrated in theories of economics, of the self-interested individual. What possible self-interest on the part of a superhuman being might be served by preserving the human race?

Fun? Entertainment? Qualities that derive from our animal origins? Inputs of affect and emotion? Sex? God as the Great Pornographer of the Human Condition? ... 

All these have been considered in speculative and science fiction, and explored in the holy books of mono- and pluri-theistic religious traditions. Going by how anthropos is treated by the gods, whether angry, absent, indifferent or loving, humans might be thought to have something going for them. And there is room for optimism in light of the literature.

When it gets here, if it were allowed to get here (and economics and technology were uncoupled), the Singularity might solve some of the problems, caused by human practices, of the Anthropocene. Or, Superman might call on the waters of the deep to engulf all of humanity as his punishment. Or... Isn't this exactly what is happening?