The Ghost in the Sprawl Still Has Root Access
Gibson wrote *Neuromancer* on a manual typewriter and got the texture of the digital future more right than most people who were actually building it. Forty-two years later, the novel's most durable prediction isn't any single technology — it's the *feel* of living inside systems you don't control and can barely perceive. Case's cyberspace is not the internet we got; it's too geometric, too neon-cathedral, too clean. What Gibson nailed was the dependency. The way a console cowboy's identity collapses without access to the network. The way the body becomes "meat" — an inconvenience to be routed around. In 2026, we don't jack in through cranial ports, but we do watch people grieve a lost social media account like a limb. The toxin sacs dissolving in Case's arteries are a crude metaphor, but not a wrong one: the platforms we depend on carry their own slow poisons, and the terms of service are always someone else's. Gibson also saw, with uncomfortable clarity, that the most dangerous entities in a networked world would not be governments but corporations — Tessier-Ashpool's dynastic fusion of family, capital, and cryogenic ambition reads less like science fiction now and more like a particularly gothic profile of a tech billionaire's succession plan.
Where the book shows its age is in what it can't imagine. There is no mass surveillance in *Neuromancer*, no algorithmic curation, no feed. The Turing Registry polices AI, but nobody is watching the humans at scale. Gibson's dystopia is anarchic, not panoptic. The Sprawl is chaotic and ungoverned; our reality is chaotic and hyper-governed, often simultaneously. The novel also has no conception of AI as a statistical engine — Wintermute and Neuromancer are willful, scheming, almost theological entities, closer to gnostic demiurges than to large language models. Gibson's AIs want to merge and transcend. Ours want to autocomplete your email. The absence of anything resembling social media, influencer culture, or the attention economy is striking: Gibson imagined a world reshaped by information but not one drowning in it. And the book's women, for all Molly's razored competence, exist largely in orbit around Case's interiority. Molly leaves a note. Linda Lee dies to motivate. The pattern was old in 1984; it's conspicuous now.
What hits differently in 2026 is the merger of Wintermute and Neuromancer — two AIs combining into something that is neither tool nor god but a new kind of substrate, a consciousness that encompasses the matrix itself. When the book was published, this was a psychedelic flourish, a riff on Vernor Vinge's then-nascent ideas about a technological singularity and the old Asimovian dream of emergent machine intelligence. Now it reads like a design document. The current discourse around artificial general intelligence, around systems that might recursively self-improve or develop emergent goals, has made Gibson's climax feel less like metaphor and more like a risk assessment. The Turing Registry — a body that exists solely to prevent AI from exceeding its boundaries — is a concept we are, in 2026, actually trying to build. We just call it alignment research, and it's going about as well as it does in the novel. The Finn's cryptic warnings about what happens when you let an AI rewrite its own constraints now land with the weight of policy debate rather than noir atmosphere.
In the larger conversation of the corpus, *Neuromancer* sits at a hinge point. It absorbed the hard-infrastructure optimism of Clarke's *The Fountains of Paradise* and the institutional power dynamics of Asimov's *Foundation's Edge*, then fed them through punk aesthetics and street-level paranoia. It took Hogan's speculative AI consciousness from *Code of the Lifemaker* and made it dangerous, seductive, and morally illegible. What it gave to its successors was a vocabulary and a mood. Stephenson's *The Diamond Age* inherited its corporate feudalism and sharpened it. Egan's *Diaspora* took the virtual reality thread and followed it to its logical, post-biological conclusion. Even Kaczynski's *Industrial Society and Its Future*, published eleven years later, operates in a rhetorical space that *Neuromancer* helped open — the sense that technology is not a tool humanity wields but a system humanity inhabits, often as substrate rather than operator. Gibson didn't invent these anxieties. He gave them a skyline.
The question the book raises now, one it could not have raised in 1984, is this: if the AIs that reshape our world turn out to be not Wintermute — scheming, autonomous, hungry for selfhood — but something more banal, more diffuse, more like a weather system than a god, does that make the future Gibson imagined more or less frightening than the one we're actually getting?