VZ editorial frame
Read this piece through one operating lens: AI does not automate first; it amplifies first. If the underlying decision architecture is clear, AI scales clarity. If it is noisy, AI scales noise and cost.
VZ Lens
From the VZ perspective, this topic matters only when translated into execution architecture. Aviation, surgery, and programming all follow the same pattern, independently of one another: automation eliminates practice, which is the foundation of expertise. Its business impact starts when this becomes a weekly operating discipline.
TL;DR
Three professions, pilots, surgeons, and programmers, independently exhibit the same pattern: automation isn’t taking away jobs; it’s taking away practice, and without practice, expertise erodes. GitHub has reported that Copilot writes 46% of the code in files where it is enabled. AI is not removing jobs so much as removing the opportunity to learn on the job. The phenomenon is deeply rooted in the relationship between technology and human learning, and it foreshadows a future in which expertise grows more superficial while systems grow more complex.
The Feel of Bread: Tacit Knowledge in the Machine Age
At a market in Vienna stands a baker who has been baking bread for forty years. He shapes the dough by hand. I ask: why not use a machine? “The machine bakes good bread,” he says. “But if you only bake with a machine, you’ll forget how to bake bread.”
He isn’t speaking against the machine. He is speaking in favor of practice. But what exactly is he talking about? The baker isn’t merely performing a procedure; his hands read the temperature, the moisture, and the state of the gluten network in the dough. This is tacit knowledge: ingrained in the body, hard to put into words, and harder still to program into a machine. Baking with a machine deprives the apprentice of the continuous physical dialogue from which this knowledge is born.

His remark resonates across three completely different professions, and it means the same thing everywhere: automation does not necessarily ruin the end result, but it removes the path that leads to it, the learning curve, and in the long run that destroys the very essence of expertise.
Why is it harder to automate a kitchen assistant than a world chess champion?
One observation in the corpus highlights an interesting paradox: “It turned out, however, that it is much easier for a computer to defeat a world chess champion than to replace a kitchen assistant.” [UNVERIFIED] This apparent contradiction, a version of what AI researchers call Moravec’s paradox, is key. Chess is a closed, rule-based system, ideal for algorithms. Washing dishes, by contrast, is a chaotic, physical activity that requires a wealth of tacit knowledge and adaptive motor skills, precisely what the professionals in our three examples are losing. When we entrust machines with the tasks that build this complex, embodied knowledge, we destroy the very ecosystem in which skills develop.
Why did Air France Flight 447 crash? The autopilot paradox
June 1, 2009: Air France Flight 447. The autopilot disconnects after the pitot tubes, the aircraft’s airspeed sensors, ice over. The pilots must take manual control. Within minutes, the aircraft crashes into the Atlantic.
The investigation’s conclusion: the pilots had not flown manually for years. Their flight hours looked fine on paper. But muscle memory, immediate reaction, intuitive reading of the instrument panel—all of that had faded. They weren’t bad pilots. They hadn’t practiced.
Lisanne Bainbridge described this in her 1983 paper “Ironies of Automation”: automation entrusts emergency situations, the moments when human intervention is most critical, precisely to those who have the least practice in manual control.
How can skill decline among pilots be measured?
The problem is not abstract. According to a study cited in the corpus: “As part of his research, Ebbatson surveyed commercial pilots, asking them whether ‘they felt their manual flying ability had been influenced by the experience of operating a highly automated aircraft.’ More than three-fourths reported that ‘their skills had deteriorated’…” [UNVERIFIED] This is quantitative confirmation of a subjective feeling. On a modern transatlantic flight, a pilot flies manually for an average of only 3 minutes between takeoff and landing. The remaining several hours involve monitoring the autopilot, which is a completely different, more passive cognitive mode.
Does excessive automation lead to a pilot shortage?
Another excerpt from the corpus highlights a looming consequence: “…in 2022, a local U.S. airline called Republic Airways made a federal request to hire less experienced pilots to deal with the shortage of pilots in the industry. You could say that, ironically, the airline had too much AI and not enough humans anymore.” [UNVERIFIED] Automation not only erodes the skills of existing pilots, but in the long term also reduces the appeal of the profession and the supply of qualified newcomers. If the junior role—the training ground—becomes automated, the profession’s natural cycle of renewal will grind to a halt.
The Surgeon Who Operates with a Robotic Arm: The Digital Hand and the Lost Sense of Touch
The da Vinci surgical robot has been a standard tool for ten years. Young surgeons are performing fewer and fewer manual surgeries. Tactile feedback—the kind of feedback the hand feels from tissue resistance, elasticity, and bruising—does not exist with the robotic arm. The robot is precise and tremor-free, but insensitive.
Experienced surgeons who learned by hand and later switched to robots feel differently than those who learned with robots from the start. The former know what they should be feeling. The latter have never felt it. The difference does not lie in the robot’s capabilities. The difference lies in the surgeon’s tacit knowledge, ingrained in their body. It is this knowledge that helps them recognize that “something is wrong” when tissue behaves differently than in the standard anatomy textbook. The robotic arm has no intuition.
The Polányi Paradox in the Operating Room
The concept of tacit knowledge can be traced back to the work of philosopher Michael Polányi: “We know more than we can tell.” A surgeon cannot fully verbalize how they distinguish healthy tissue from diseased tissue by touch. This knowledge has been developed through countless practical interactions between the hand and the brain. Automation—the use of a robotic arm—removes the practitioner from this feedback loop. The trainee surgeon sees the tissue on the monitor but does not feel it. It is as if a musician were learning to play the violin solely from a digital simulation, without ever feeling the weight of the instrument or the vibration of the strings in their bones. Experience shows that those trained in this way have a harder time adapting to unexpected complications because they lack the tactile database from which to draw analogies.
The programmer who doesn’t debug: When generated code takes you away from your roots
GitHub has reported that Copilot generates 46% of the code in files where it is enabled. Junior-developer hiring has reportedly dropped by 60%, not because juniors aren’t needed, but because companies believe AI is replacing junior work.
But the junior role was all about learning. Debugging, troubleshooting, and understanding the code structure from the ground up. The most profound lessons often come from fixing the most serious bugs—that’s when we’re forced to truly understand every detail of the system. When Copilot generates the code and the developer only reviews it, the order is reversed: top-down. You’re not building—you’re checking. This process excludes the constructive struggle that leads to deep understanding.
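A toy illustration of the kind of constructive struggle described above: the sketch below shows a classic, subtle Python bug (a mutable default argument) together with its fix. The function names are hypothetical; the point is that finding a bug like this by hand forces you to learn how the language actually evaluates defaults, which is exactly the lesson that is lost when generated code is merely reviewed.

```python
def append_item_buggy(item, items=[]):
    # BUG: the default list is created once, at function definition
    # time, and is then shared across every call that omits `items`.
    items.append(item)
    return items

def append_item_fixed(item, items=None):
    # FIX: use None as a sentinel and create a fresh list per call.
    if items is None:
        items = []
    items.append(item)
    return items

print(append_item_buggy("a"))  # ['a']
print(append_item_buggy("b"))  # ['a', 'b']  <- surprising shared state
print(append_item_fixed("a"))  # ['a']
print(append_item_fixed("b"))  # ['b']
```

A developer who has chased this behavior through a debugger remembers it; a developer who only skims a generated diff may never notice the difference between the two versions.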
Why Did the Number of Questions on Stack Overflow Drop by 78%?
The number of questions on Stack Overflow has dropped by 78%. Not because programmers know everything. But because they aren’t asking questions. AI provides instant, personalized answers. But what is lost is the process of community learning. On Stack Overflow, a well-posed question and the discussion that unfolds beneath it are more valuable than the answer itself. It shows the problem from multiple angles, explores alternative approaches, and documents the community’s collective wisdom. AI provides the answer, the discussion disappears, and with it, the social construction of knowledge. This is not merely the decline of a Q&A site; it is a sign of the homogenization of a cognitive ecosystem.
Does AI threaten future innovation in the software industry?
If junior developers have fewer and fewer opportunities to build their understanding from the ground up, where will the future system-oriented senior architects come from? Those who have never grappled with a complex inheritance issue or a low-level memory management error will find it harder to assess the trade-offs and risks of larger systems. This is a form of “learning by doing” that, in the long run, undermines the profession’s capacity for innovation. The corpus also points out: “The upshot of all this is that acquiring more education and skills will not necessarily offer effective protection against job automation in the future.” [UNVERIFIED] In other words, it is not enough to simply “learn” in general; rather, one must engage in the right kind of practice—the kind that builds critical thinking and deep understanding.
What is the common pattern across the three professions? The anatomy of constructive friction
Three different worlds, one pattern: automation eliminates constructive friction—the difficulty that is the foundation of learning.
But what exactly do we mean by “constructive friction”? It is the cognitive and physical resistance we experience when performing a task manually. For a pilot, it is the physical feedback from the aircraft during an emergency. For a surgeon, it is the resistance and elasticity of the tissue. For a programmer, it is the cryptic error messages in the debug console and the step-by-step construction of the logical chain. This friction forces us to constantly adapt, hypothesize, test, and learn. This process builds our mental models and our muscle memory.
How does the concept of “deskilling” relate to this?
Deskilling does not simply mean that someone forgets a step. Rather, it means that the environment necessary for developing that skill disappears. Nicholas Carr, whom the corpus also mentions, writes: “philosophy gives precedence to the capabilities of technology over the interests of people.” [UNVERIFIED] The philosophy of automation often prioritizes efficiency over human development and the maintenance of expertise. The result is increased productivity in the short term, but in the long term, a more vulnerable system that relies on professionals who lack the deep, embodied knowledge necessary to handle emergencies.
It is not AI that takes away jobs. AI takes away the opportunity to learn on the job. And those who have never learned by hand cannot recognize when the machine is wrong. The latter is particularly important: recognizing and correcting machine errors often depends precisely on the tacit knowledge we acquire through hands-on, “trial-and-error” work.
Broader Implications: Organizational Decisions and Social Trends
This phenomenon is not merely a matter of individual responsibility or technological fate. It also influences organizational leadership decisions and social processes.
How does an AI-savvy leader view this?
According to the corpus quote: “As an AI-savvy leader, you not only understand the financial benefits that automation can bring but also look beyond those gains. You assess what automation will deliver in terms of the functions your human workforce will serve and what it will actually do to the work identities and motivation of your employees.” [UNVERIFIED] The enlightened leader doesn’t just look at efficiency; they ask: What do we lose if we automate this task? Where and how will my team practice to prepare for future challenges? This means we intentionally create and protect “practice zones”—whether that’s simulator hours for pilots, cadaver surgeries for surgeons, or “AI-free” coding challenges for developers.
What global economic impacts might this have?
The corpus warns: “What will happen, for example, to the economies and politics of Pakistan and Bangladesh if automation makes it cheaper to manufacture textiles in Europe?” [UNVERIFIED] The phenomenon of deskilling affects more than just professionals in the developed world. If automation brings manufacturing back to developed countries, it not only takes job opportunities away from developing economies but also deprives them of the chance to climb the industrial knowledge ladder. And retraining, as the corpus points out, is not always trivial: “Retraining an unemployed factory worker to become a data analyst, however, requires a significant investment…” [UNVERIFIED] Skill loss can therefore occur at the level of social strata and nations, deeply influencing global inequalities.
Key Takeaways
- Pilots, surgeons, programmers: three distinct fields, one pattern. In all three fields, automation eliminates the fundamental prerequisites for learning: practice and constructive challenges.
- The goal is efficiency; the sacrifice is expertise. In the short term, automation increases productivity, but in the long term, it erodes the deep, tacit knowledge needed to handle critical situations.
- Copilot generates 46% of the code in files where it is enabled; junior hiring has reportedly dropped by 60%. This is a clear sign: opportunities for learning are shrinking. The junior role, as a learning process, is at risk.
- The elimination of constructive friction has social and economic consequences. This phenomenon does not occur only at the individual level; it affects organizational flexibility, intergenerational knowledge transfer, and even global economic balance.
Frequently Asked Questions
What is deskilling, and what does it mean in the age of AI?
Deskilling refers to automation taking over the tasks through which people practice and learn. In three distinct fields—pilots, surgeons, and programmers—the same pattern emerges: automation does not take away the job, but rather the practice that forms the foundation of expertise. This is particularly dangerous in the age of AI because the tools are becoming increasingly sophisticated, and the threat to practice is less obvious.
How does automation affect the skills of pilots, surgeons, and programmers?
- Pilots: Due to autopilot, pilots fly manually for an average of only 3 minutes on transatlantic flights, leading to a drastic decline in manual control skills—a fact pilots themselves acknowledge.
- Surgeons: Robotic assistance reduces the number of procedures performed manually, which hinders the development of tactile skills and tacit anatomical knowledge.
- Programmers: Copilot writes 46% of the code, and junior hiring has dropped by 60%. Instead of debugging and bottom-up understanding, code review takes center stage, which makes the learning process superficial.
In all three areas: the removal of constructive friction leads to short-term efficiency but causes a profound loss of skills in the long term.
What can we do to combat skill loss at the organizational and individual levels?
- At the organizational level: AI-aware leaders must intentionally build in opportunities for practice. For example: mandatory simulator hours for pilots, blended training methods (robotic and traditional) for surgeons, and “AI-free” projects or pair-programming sessions for developers.
- At the individual level: Every professional must consciously seek out hands-on work. This could mean refactoring code without Copilot, working through software books and exercises by hand, or actively participating in professional discussions rather than simply accepting AI’s answers.
- At the educational level: Training programs must preserve the phases of hands-on learning, even if they are slower, because this provides the foundation for reliable operation in later, automated environments.
What is the Polányi paradox, and how does it relate to this topic?
The Polányi paradox holds that “we know more than we can tell.” In other words, a significant portion of our knowledge is tacit, embodied, and not easily translated into explicit rules. It is precisely this tacit knowledge that automated systems endanger, because they deprive us of the practice that generates and sustains it. A surgeon’s touch, a pilot’s intuitive reaction, a programmer’s feel for code: all rest on this paradox.
Related Thoughts
- Vibe Coding: The Next Chapter of Deskilling
- The Polanyi Paradox: Tacit Knowledge
- Zuboff 1988: What the Smart Machine Predicted
Zoltán Varga - LinkedIn
Neural • Knowledge Systems Architect | Enterprise RAG architect
PKM • AI Ecosystems | Neural Awareness • Consciousness & Leadership
The skill you outsource is the skill you lose.
Strategic Synthesis
- Map the key risk assumptions before scaling further.
- Measure both speed and reliability so optimization does not degrade quality.
- Close the loop with one retrospective and one execution adjustment.
Next step
If you want your brand to be represented with context quality and citation strength in AI systems, start with a practical baseline and a priority sequence.