VZ editorial frame
Read this piece through one operating lens: AI does not automate first, it amplifies first. If the underlying decision architecture is clear, AI scales clarity. If it is noisy, AI scales noise and cost.
VZ Lens
From a VZ lens, this piece is not for passive trend tracking; it is a strategic decision input. Fifty-five years ago, Herbert Simon observed that the abundance of information creates a poverty of attention. In 2026, the flood of AI-generated content is the fulfillment of that prediction. The insight pays off only when it is converted into concrete operating choices.
TL;DR
In 1971, Nobel laureate Herbert Simon wrote that an abundance of information inevitably leads to a scarcity of attention. The flood of AI-generated content is the perfect fulfillment of this prediction, fifty-five years later. The reality, however, runs deeper: today it is not only our attention that is dwindling, but also our deep, meaning-forging consciousness. The solution does not lie in filtering even more information, but in the radical willingness to deliberately decide what does not deserve our attention.
The Silence of Baker Street
I am sitting on the platform of the Baker Street Underground station, past midnight. The cool, sewer-scented air is broken by the footsteps of a single late passenger, then thick silence returns. Illuminated advertisements flicker at the edge of my vision, messages flashing and fading, each begging for my attention. An old ceramic tile on the wall points the way: toward the heart of the city. But I stay here by the motionless escalator, listening to the silence. This is exactly the silence that is missing elsewhere. A question circles in my head: what happens when everything speaks, but nothing calls out? The advertisements keep flickering. The echo of my question bounces emptily off the tiles.
An old journal: The Time Machine Message
I find Simon’s 1971 essay in a yellowed journal at an antiquarian bookstore. The sentence that stops me: “In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients.”
He wrote this fifty-five years ago. I’m reading it today, and it is more accurate than any AI analysis from 2026. The picture becomes even sharper when we understand who Herbert Simon was: a giant of the twentieth century who made significant contributions to political science, economics (for which he received the Nobel Prize), the foundations of computer science and artificial intelligence, and cognitive psychology. When he spoke of the scarcity of attention, this was not the lay speculation of a philosopher but the perceptive diagnosis of a systems scientist who deeply understood the limits of decision-making, computation, and the human mind. Discovering this essay is not merely a historical curiosity; it is as if we had received a message from a time machine that precisely describes the root of our digital chaos. Simon did not prophesy; he deduced an inevitable consequence from the economics of information volume and processing.
The Fulfillment of the Prediction: The Curve Leading to Infinity
In Simon’s time, the abundance of information meant the expansion of libraries, the spread of television, and the proliferation of scientific journals. This alone was enough for him to realize: information is not the bottleneck. Attention was. A quote from the corpus perfectly captures this historical shift: “a hundred years ago you could stand on a street corner in a city and start preaching, and people would probably stop and listen. They had the time and attention to spare.” The scarcity of information was the default, so every new piece of information was valuable and received attention. Capacity was abundant.
In 2026, AI generates more text in a single day than Simon could have read in his entire lifetime. This flood of content is not a metaphor; it is a fact. ChatGPT alone produces text across hundreds of millions of interactions every day, and every piece of it demands attention: to be read, evaluated, and judged for its usefulness. But this is not only about raw volume. Imagine a curve where the amount of information rises exponentially, almost vertically, toward infinity. Next to it runs a completely flat, horizontal line: the cognitive capacity of human attention, which biological limits have kept essentially unchanged for thousands of years. Simon’s equation is thus updated: as the amount of information tends toward infinity, the value of attention tends toward infinity, while the capacity of human attention stays constant.
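The curve-versus-flat-line picture above can be written out as a minimal formalization. The growth rate \(k\) and the fixed capacity \(C_0\) are illustrative assumptions, not measured quantities:

```latex
% Information supply grows roughly exponentially over time:
I(t) = I_0 \, e^{kt}, \qquad k > 0
% Human attention capacity is biologically bounded and essentially flat:
C(t) \approx C_0
% So the supply-to-capacity ratio, and with it the scarcity value
% of a unit of attention, diverges:
\lim_{t \to \infty} \frac{I(t)}{C_0} = \infty
```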
This gap is the fundamental tension of our daily lives today. Another quote from the corpus states: “information exposure kills attention.” Exposure to information does not increase attention; it kills it. AI produces this exposure on an industrial, automated scale, so Simon’s prediction has not merely come true; it has been realized in a new dimension that Simon himself might not have foreseen.
The Poverty of Awareness: When the Surface Swallows the Depth
Simon spoke of attention. But what is happening in 2026 runs deeper. It is not just attention that is becoming impoverished—awareness is as well.
Attention means: I notice. Consciousness means: I understand what I notice. Attention is the gatekeeper that selects what enters the space of consciousness. Consciousness is the inner workshop where we process information, connect it to our existing knowledge, question it, and imbue it with meaning and intention. When I read fifty AI-generated summaries a day, my attention is working—I’m processing the information. But my consciousness isn’t working—there’s no time to question it, put it into context, or make it my own.
Let’s think of this with an analogy: attention is the speed of a conveyor belt, and awareness is the quality inspector who examines the products moving along it. When the belt’s speed (the flow of information) exceeds human capacity, the inspector cannot check anything; the items simply rush by. In this state, even though we “read” or “listen,” we are not actually building knowledge or forming opinions; we remain merely passive consumers of information. Simon predicted a scarcity of attention. What has come to pass is a poverty of awareness. This explains why we often feel emptiness and meaninglessness after being bombarded with vast amounts of information, and why it is so difficult to deeply understand anything amid the deluge of superficial encounters.
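The conveyor-belt analogy can be reduced to a toy formula. The hourly figures below are illustrative assumptions, not measurements; the point is only that the share of the stream that gets deep processing collapses once inflow outruns capacity.

```python
# Toy model of the conveyor-belt analogy: "items_per_hour" is the speed of
# the belt (information inflow), "capacity_per_hour" is how much the quality
# inspector (awareness) can actually examine. All numbers are illustrative.

def fraction_deeply_processed(items_per_hour: float, capacity_per_hour: float) -> float:
    """Share of the incoming stream that receives deep processing;
    everything above capacity rushes past unexamined."""
    if items_per_hour <= 0:
        return 1.0  # nothing arriving: trivially everything is examined
    return min(1.0, capacity_per_hour / items_per_hour)

print(fraction_deeply_processed(10, 20))   # slack capacity: 1.0
print(fraction_deeply_processed(200, 20))  # flood: only 0.1 gets real scrutiny
```

Doubling the inflow cannot be met by doubling the flat line; only the unexamined remainder grows.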
Why Does AI Repeat the Simon Paradox? The Structure of the Vicious Cycle
AI’s response to information overload: more information. Summaries, filters, curated content. But a summary is also information. A filter also demands attention. Curation also requires decisions.
This is a classic systemic paradox that creates a vicious cycle. The structure of the problem:
1. There is too much information.
2. This distracts us and makes decision-making difficult.
3. We use AI to help manage the information (summarizing, filtering, recommending).
4. The operation of AI generates even more information (summaries, reports, notifications).
5. This additional layer of information further strains our attention and raises expectations for AI tools.
6. Back to point 1.
AI does not solve Simon’s paradox. It repeats it—one level higher. Let’s take a corporate example: a manager is overwhelmed by the volume of daily reports. As a solution, they introduce an AI system that automatically generates daily summaries from all the reports. However, it soon becomes clear that interpreting the summaries also takes time, the new questions raised by the AI trigger further analyses, and monitoring the AI system itself (is the output correct?) adds another layer of attention-demanding tasks. The paradox applies here as well: the tool introduced to manage information becomes a source of information in its own right.
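The manager example can be sketched as a toy attention budget. Every number here is an illustrative assumption; only the structure matters: the summarization layer adds its own reading, verification, and follow-up costs, and modestly higher oversight rates erase the saving.

```python
# Toy attention-budget model for the overwhelmed manager. All rates and
# per-item minutes are hypothetical assumptions, chosen only to show the
# structure of the paradox.

def minutes_without_ai(reports: int, minutes_per_report: float = 10) -> float:
    """Attention cost of reading the raw reports directly."""
    return reports * minutes_per_report

def minutes_with_ai(reports: int,
                    minutes_per_summary: float = 2,
                    verification_rate: float = 0.3,
                    minutes_per_verification: float = 10,
                    followup_rate: float = 0.2,
                    minutes_per_followup: float = 15) -> float:
    """Summaries are faster to read, but spot-checking the AI's output and
    chasing the new questions it raises are attention layers of their own."""
    reading = reports * minutes_per_summary
    verifying = reports * verification_rate * minutes_per_verification
    followups = reports * followup_rate * minutes_per_followup
    return reading + verifying + followups

print(minutes_without_ai(50))                                        # 500
print(minutes_with_ai(50))                                           # 400.0
print(minutes_with_ai(50, verification_rate=0.5, followup_rate=0.4)) # 650.0
```

Under generous assumptions the summaries save time (400 vs. 500 minutes), but once verification and follow-up rates rise, the AI layer costs more attention than the raw reports did: the paradox repeated one level higher.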
What would Simon say today about the flood of AI content? The next scarce resource
Perhaps this: the next scarce resource is not attention. Rather, it is the ability to deliberately not pay attention. Knowing what is NOT worth paying attention to. Knowing how to tune it out. Knowing how to decide: this information is not for me.
This statement points to a higher-order navigational skill. In the corpus quote, a bear analogy highlights this: “But just like the bear must be selective in its diet (digging all day for a few measly ants would hardly be worthwhile), so must informavores carefully navigate the glut of information.” We must be information predators (informavores) who hunt selectively, not scavengers that consume everything. Deliberate inattention is the active, conscious practice of this selectivity. It is not merely the allocation of attention, but its radical withdrawal from certain areas.
The cascading effects of this skill manifest at every level:
- Digital/Individual Level: Notification detox, conscious app restriction, and protecting time blocks dedicated to “deep work.”
- Workplace/Corporate level: Rethinking meeting culture (fewer but more effective meetings), streamlining internal communication channels (instead of sharing everything with everyone), critically evaluating the introduction of AI tools: does it truly reduce cognitive load, or does it merely shift it?
- Local economy/Social level: Reforming education to focus not on memorizing information, but on teaching critical filtering, source criticism, and concentration skills. An evaluation of the leisure economy, where non-attention (rest, daydreaming, time spent in nature) is not a luxury but a fundamental human and economic resource.
He said it fifty-five years ago. We just didn’t listen—because there was too much other information. Today, the question is whether we will learn from it before our consciousness is completely dissolved in the superficial flow of data.
Key Takeaways
- In 1971, Simon predicted that the abundance of information would lead to a scarcity of attention. This is not speculation, but a fundamental principle derived from the economics of decision-making and information.
- In 2026, the flood of AI-generated content is the perfect fulfillment of this prediction, taken one step further. AI does not solve the problem; it amplifies it exponentially, adding new layers.
- It is not just attention that is becoming impoverished—awareness is too: there is no time to understand what we perceive. The multitude of superficial interactions crowds out deep processing and meaning-making.
- The next scarce resource is the ability to intentionally not pay attention. The challenge of the future is not how to consume more information, but how to regulate what enters our conscious space and what we consciously keep out.
Frequently Asked Questions
Who said that “information abundance = attention scarcity”?
Nobel Prize-winning economist Herbert Simon, in 1971: “In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention.” Fifty-five years later, this statement is more relevant than ever, because AI is driving precisely this dynamic to an unprecedented scale.
How is this relevant in the age of AI?
AI exponentially increases the amount of available information, while human attention remains biologically unchanged. The gap is widening faster than at any time in history. Moreover, AI designed to alleviate information overload often merely adds another layer of (generated) information to what already exists, thereby perpetuating or exacerbating Simon’s paradox rather than resolving it.
How can we practically protect our attention and awareness?
The key is developing the skill of intentional non-attention. This can be achieved in several ways:
- Technical limits: Use apps to limit the time spent on social media and news sources, and turn off non-essential notifications.
- Time blocks: Designate and protect uninterrupted periods of the day dedicated to “deep work.”
- Information diet: Consciously curate your sources. Prioritize quality and relevance over quantity. Ask yourself: “Does this truly contribute to my goals or my meaningful understanding?”
- Organizational culture: Radically reduce the number and length of meetings at work, prioritize asynchronous communication, and test the introduction of any new information channel or AI tool with the question: “Does it truly reduce cognitive load?”
What is the connection between the economics of attention and Simon’s ideas?
Simon laid the foundation for the concept of the economics of attention. He pointed out that when information is abundant, what is truly limited is attention, and thus it becomes the most valuable currency. The entire business model of today’s platforms (social media, streaming services) is built on this: they lure our limited attention resources with free information (content), and then sell that attention to advertisers. Simon’s observation is therefore not merely a cognitive observation, but a foreshadowing of one of the fundamental drivers of modern digital capitalism.
Related thoughts
- AI Slop: Attention Grabbing on an Industrial Scale
- The Algorithmic Self: Digital Identity
- The Consciousness Economy: What Comes After the Attention Economy
Zoltán Varga - LinkedIn
Neural • Knowledge Systems Architect | Enterprise RAG architect
PKM • AI Ecosystems | Neural Awareness • Consciousness & Leadership
Attention is the only scarce resource.
Strategic Synthesis
- Map the key risk assumptions before scaling further.
- Measure both speed and reliability so optimization does not degrade quality.
- Use a two-week cadence to update priorities from real outcomes.
Next step
If you want your brand to be represented with context quality and citation strength in AI systems, start with a practical baseline and a priority sequence.