

McLuhan's Law of Amputation

Every extension is also an amputation. GPS has robbed us of our spatial memory. Copilot robs us of the insight that comes from debugging. McLuhan said it back in 1964.

VZ editorial frame

Read this piece through one operating lens: AI does not automate first, it amplifies first. If the underlying decision architecture is clear, AI scales clarity. If it is noisy, AI scales noise and cost.

VZ Lens

From a VZ lens, this piece is not for passive trend tracking - it is a strategic decision input. Its advantage appears only when converted into concrete operating choices.

TL;DR

Marshall McLuhan put it this way in 1964: every medium that extends man also amputates him. GPS extends navigation—it amputates spatial memory. AI extends thinking—it amputates the practice and tacit knowledge behind it. This principle holds across eight fields, and the point is not opposition to technology but conscious choice. The question is not whether we use AI, but whether we understand its cost and how we establish a strategic relationship with it in our workplaces and lives.


What happens if you set out in Rome without GPS?

I’m walking around Rome without GPS. On purpose. My accommodation is in the Trastevere neighborhood, and I’m trying to keep the map in my head. I get lost. Three times. But by the third day, I already know that the small square behind the church is Piazza di San Cosimato, and the grocery store is on the corner to the left. The square isn’t just a series of coordinates; its atmosphere, its smell, the interesting twists and turns of the streets become familiar. A mental model takes shape.

When I use GPS, I never know where I am. I always arrive—but I have no spatial memory. GPS extends our ability to navigate. In return, it amputates the internal map of places. This is not just a personal observation. A London study of taxi drivers showed that acquiring knowledge of the city—the “Knowledge”—leads to an increase in the gray matter of their hippocampus. The researchers note with concern: “We believe [the hippocampal] area of the brain increased in gray matter volume because of the huge amount of data [the drivers] have to memorize. If they all start using GPS, that knowledge base will be reduced and possibly affect the brain changes we are seeing.” (Source: [UNVERIFIED] corpus citation). Drivers would be freed from the burden of learning, but they would lose the unique mental benefits of the training. Their brains would be “less interesting.” This is the concrete, physiological manifestation of that amputation.

McLuhan described exactly this dynamic in 1964—sixty years ago. The tool we use is not a neutral medium. It shapes us.

The Law: Not Just a Tool, but an Environment

Marshall McLuhan’s media theory in a single sentence: every extension is an amputation. But what do we mean by “medium” here? For McLuhan, “technology” and “media” are practically synonymous—technologies are media (Source: [UNVERIFIED] corpus quote). A medium comes into being when an entity “enters into a structural relationship with another entity” (Source: McLuhan and McLuhan 1998, based on the corpus quote). A pen is a medium when we write with it; a GPS is a medium when we navigate.

The wheel extends the leg—it amputates the necessity of walking. The book extends memory—it amputates the culture of oral tradition. The telephone extends sound—it amputates physical presence. The pattern is constant. McLuhan captured this pattern of historical change in his four laws of media: every technology 1) enhances something, 2) renders something else obsolete, 3) retrieves something from the past, and 4) when pushed to its extreme, reverses into its opposite. From our perspective, the first two are the most important: extension (enhancement) and amputation (obsolescence).

AI extends the speed, scope, and informational reach of thought. It amputates depth of thought, the practice of patient analysis, and the tacit knowledge underlying problem-solving. As one analysis in the corpus points out, while printing presses and radios were passive tools in human hands, computers "are already beginning to become active agents, stepping out of our control […]" (Source: [UNVERIFIED] corpus quote). AI is not just a tool; it is increasingly a partner, or even a rival. This adds a new dimension to amputation: when the agent decides, what do we enhance, and what do we render obsolete?

Eight fields, one pattern: The anatomy of amputation

In our corpus research, we confirmed the same pattern across eight independent fields. Let’s examine more deeply what these amputations actually mean:

  1. Navigation: GPS → loss of spatial memory. Beyond the example of the London taxi driver, this involves the cessation of cognitive map formation. Our connection to our environment becomes more superficial.
  2. Flight: autopilot → erosion of manual flying skills. The tragedy of Air France Flight 447 is a horrific example: when the autopilot handed control back, the pilots failed to recognize the aircraft’s state and could not recover it. Situational awareness and sensorimotor skills deteriorate.
  3. Surgery: robotic arm (e.g., da Vinci) → lack of tactile feedback. The surgeon’s virtual presence lacks the direct sense of tissue resistance and elasticity. Haptic intelligence and some fine motor control are disabled.
  4. Programming: Copilot → reduced depth of debugging insight. When AI suggests code, the programmer delves less deeply into the root of the problem and the full understanding of the logical chain. Systemic and causal understanding weakens, which can be catastrophic in the event of critical errors.
  5. Writing: AI text editor → deterioration of writing skills. The process of organizing thoughts, choosing words, and developing one’s own style short-circuits. The ability to organize thoughts and express a personal voice is limited.
  6. Translation: machine translation → weakening of linguistic intuition. We entrust nuances, cultural context, and the playfulness of language to an algorithm. Linguistic sensitivity and the depth of bilingual thinking are eroded.
  7. Mathematics: calculator → disappearance of mental arithmetic. Internal play with numbers, the ability to estimate, and numerical sensitivity weaken. Quantitative intuition declines.
  8. Photography: smartphone → simplification of compositional thinking. Automatic modes and the possibility of endless editing eliminate the need for prior concentration on light, framing, and the moment. The skills of visual observation and intentional composition weaken.

Not an analogy. A pattern. The same cognitive and practical structure repeats: a hands-on skill and the neurological/cognitive circuitry behind it fade as an external system takes over the function. This is what researchers call “deskilling.”

Why don’t we notice the amputation? The illusion of the prosthetic god

McLuhan provided the answer to this as well: fish do not see water. The medium—the environment in which we operate—is invisible. We do not experience amputation as pain, but as comfort. Sigmund Freud observed something similar as early as 1930 in his work Civilization and Its Discontents: with the help of the telescope, the microscope, the gramophone, and the telephone, “the human race has become like a god of prostheses. When he puts on all his prosthetic organs, he is truly magnificent,” but these prosthetic organs are not part of his body (Source: Freud, cited in the corpus). Brilliant, powerful, yet separated from his own physical essence.

It doesn’t hurt that you don’t navigate in your head. GPS is better. It doesn’t hurt that you don’t debug deeply. Copilot is faster. Amputation doesn’t seem like a loss—it seems like efficiency, productivity, progress. Tech optimists like Marc Andreessen celebrate exactly this: “I have good news: AI won’t destroy the world; in fact, it might just save it,” and “AI can make everything better” (Source: [UNVERIFIED] corpus quote). This perspective, however, often overlooks the dark side of the “god prosthesis”: the fading of our inner abilities.

Until the GPS stops working. Copilot suggests faulty code in a complex system. The autopilot unexpectedly hands over control. And then it turns out that the amputated ability wasn’t a luxury—but a fundamental safety net, a source of the system’s resilience. The missing leg starts to hurt when the prosthetic leg breaks.

AI as an Active Agent: A New Dimension of Amputation

For most of history, extensions were passive tools. The pen waited for you to write; the hammer waited for you to strike. One corpus observation, quoted earlier, highlights the change sharply: computers are no longer passive tools in human hands but “are already beginning to become active agents, stepping out of our control” (Source: [UNVERIFIED] corpus quote). This is the essence of the AI revolution: “They are capable of learning things that no engineer has programmed into them, and making decisions that no leader can foresee.” (Source: [UNVERIFIED] corpus quote).

This means that amputation is no longer merely the result of our own choice (e.g., “I use a calculator, so I don’t practice mental arithmetic”). AI, as an active agent, can itself initiate the extension/amputation process. An AI-powered customer service system, for example, extends the company’s capacity but can amputate human employees’ problem-solving skills by confining them to an ever narrower scope of tasks. The amputation happens at the organizational and societal levels, and it is not always conscious. This is a modern manifestation of the Polányi paradox: tacit knowledge, acquired only through practice, is difficult or impossible to formalize and transfer—yet an external system can displace it all the same.

Strategic amputation: How do we make conscious decisions?

It’s not about whether you should use AI. I’m not a Luddite. Writing is also an amputation, yet you’re reading this line. The question is: do you know what you’re amputating? If you do, you can make a decision. If you don’t, convenience will make the decision for you. The goal is strategic amputation: consciously choosing which capabilities to outsource for efficiency, and which to deliberately preserve and strengthen for the sake of flexibility, creativity, and depth.

How can we develop this kind of awareness? Let’s think in a cascade, level by level:

  • Personal level: Occasionally perform critical tasks “manually.” Write a short essay without AI assistance. Solve a logic problem on paper. Map out a street from memory. These “cognitive exercises” preserve the skills that are at risk of being lost.
  • Team/Workplace Level: Don’t just focus on productivity metrics. Ask: “What long-term skills is this AI tool squeezing out of our process? Where is the most critical point of inflexibility in our system?” Incorporate regular “mental modeling” or “system understanding” tasks where people dig deep into the underlying processes.
  • Corporate Level: Training and development should not be limited to the use of AI. Let’s develop the meta-skills that AI cannot (yet) take away: critical thinking, complex problem formulation, ethical judgment, interpersonal connection, and creative synthesis. These will be the new “safety nets.”
  • Local economy/Social level: Within the education system, let’s discuss how to teach the next generation to function not only with AI, but also despite it. How do we preserve professional craftsmanship? The “Knowledge” of London taxi drivers is just one extreme example of professional intuition that builds up over decades and can erode in weeks.

The ultimate question: Who are you behind your prosthetics?

The question, therefore, is actually deeper than a technological choice. Luther’s theological debate, which the corpus mentions—on the meaning of torturing the body—“has actually been resurrected by 21st-century technologies. What is the relationship between our physical bodies and our online identities?” (Source: [UNVERIFIED] corpus quote). Similarly: what is the relationship between our original cognitive abilities and our “prosthetic intelligence” extended by AI? Amputation is inevitable, but consciousness is what distinguishes us from fish. The awareness that we are in water. The awareness that comfort comes at a price. The awareness that the prosthetic god is also a choice.

If you know what you are amputating, you will not merely be a user of the technology. You will become its strategic partner. The law is not a verdict, but a map. It shows the costs. It is our task to decide which price we are willing to pay—and which of our abilities is so fundamental that we will not sell it at any price.

Key Takeaways

  • McLuhan’s Law: every extension is an amputation—AI is no exception, and since AI is an active agent, the amputation can become more active.
  • The same cognitive pattern holds true across eight fields: the adoption of an external system leads to the atrophy of the internal skill and its associated neurological circuitry (GPS → spatial memory, autopilot → manual skill, etc.).
  • We do not experience amputation as pain, but as comfort—Freud’s “prosthesis god” metaphor aptly describes this illusion.
  • The critical point comes when the system fails and the amputated ability is missing as a safety net.
  • The question is not whether to use AI—but whether you know what it takes away in return, and how to establish a strategic relationship with it at the personal, team, corporate, and societal levels.

Frequently Asked Questions

What is McLuhan’s Law of Amputation?

According to McLuhan, every technological “extension” is simultaneously an “amputation”: the car extends the leg, but amputates walking. The telephone extends the voice, but amputates physical presence. Media are not neutral tools; they structurally transform the individual and society.

What exactly does AI amputate?

AI extends the speed and capacity of thought, but it may amputate the ability for deep, patient attention, the tacit knowledge of problem-solving, the practice of critical analysis, and the confidence of independent reasoning. Its exact operation depends on the domain (see the eight examples).

How can we defend against unwanted amputation?

Awareness is key. 1) Identify the main extensions of the tools you use and the potential amputations that come with them. 2) Plan cognitive exercises: occasionally perform critical tasks without the tool. 3) Incorporate meta-skills into your development and your team’s work that resist automation (e.g., systems thinking, problem formulation). 4) Discuss the topic within your community so that we remain collectively vigilant.

What is the difference between “good” and “bad” amputation?

There is no such thing as an objective “good” or “bad.” Strategic amputation means consciously and reversibly amputating a low-order, repetitive skill (e.g., memorizing multiplication tables) in order to free up higher-order abilities (e.g., modeling complex problems). “Bad” amputation occurs when technology, out of necessity or convenience, cuts out a higher-order, fundamental ability (e.g., spatial orientation, deep reading) without providing an alternative or compensation, thereby making the individual or the system vulnerable.



Zoltán Varga - LinkedIn
Neural • Knowledge Systems Architect | Enterprise RAG architect
PKM • AI Ecosystems | Neural Awareness • Consciousness & Leadership
Every extension is an amputation in disguise.

Strategic Synthesis

  • Convert the main claim into one concrete 30-day execution commitment.
  • Track trust and quality signals weekly to validate whether the change is working.
  • Iterate in small cycles so learning compounds without operational noise.
