How Do We Quantify Novelty?
The Quest to Quantify Novelty
The last truly paradigm-shifting discoveries in physics occurred nearly a century ago with quantum mechanics and relativity. In biology, the structure of DNA was elucidated over 70 years ago. Even the much-vaunted "information revolution" peaked decades ago with the invention of the internet.
This observation isn't merely anecdotal. Researchers have documented exponentially increasing costs per breakthrough discovery, longer intervals between major innovations, and a shift toward optimization rather than fundamental innovation across multiple scientific domains.
A Quick Primer: What is Varney's Law?
In any finite system of exploration or observation, the rate of novel information gain asymptotically approaches zero as the observer’s interaction with the system becomes saturated by prior knowledge and systemic constraints.
Varney's Law, which I propose here, provides a mathematical framework for understanding how discovery rates evolve in bounded systems. The law states:
dI/dt = α S(t) exp(−β E(t)) (1 − K(t)/K_max)
Where:
dI/dt = instantaneous rate of novel information acquisition
α = scaling constant for discovery potential
S(t) = search capacity (population, technology, effort)
β = exposure suppression constant
E(t) = cumulative exposure/knowledge
K(t) = current knowledge at time t
K_max = maximum possible knowledge in the domain
The elegance of Varney's Law lies in its three key components:
Search Capacity S(t): More people with better tools can discover more
Exposure Suppression exp(−β E(t)): Each discovery makes subsequent discoveries harder
Approaching Limits (1 − K(t)/K_max): Discovery slows as we approach maximum knowledge
The law predicts that discovery rates will eventually approach zero through two mechanisms: either we exhaust discoverable knowledge (K → K_max) or accumulated exposure makes further discovery prohibitively difficult (E → ∞).
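To make the law concrete, here is a minimal numerical sketch of how the equation behaves. It assumes, purely for illustration, that exposure E(t) and knowledge K(t) track the same accumulated quantity and that search capacity S(t) grows slowly and linearly; all parameter values are placeholders rather than calibrated constants.

```python
import numpy as np

# Illustrative placeholders -- not calibrated to any real data.
alpha = 0.05    # discovery-potential scaling constant
beta = 0.002    # exposure suppression constant
K_max = 1000.0  # maximum discoverable knowledge in the domain

years = np.arange(0, 5000)   # arbitrary simulation horizon
S = 1.0 + 0.001 * years      # assumed slowly growing search capacity S(t)

K = 0.0                      # knowledge K(t); E(t) is taken equal to K(t) here
rates = []
for t in range(len(years)):
    # Varney's Law: dI/dt = alpha * S(t) * exp(-beta * E(t)) * (1 - K(t)/K_max)
    rate = alpha * S[t] * np.exp(-beta * K) * (1.0 - K / K_max)
    rates.append(rate)
    K += rate                # forward-Euler accumulation of knowledge

print(f"peak rate {max(rates):.4f} at year {int(np.argmax(rates))}")
print(f"knowledge reached: {100 * K / K_max:.1f}% of K_max")
```

With these made-up numbers the rate rises while growing search capacity dominates, then declines as exposure suppression and the approach to K_max take over, which is the qualitative behavior the law is meant to capture.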
Testing Varney's Law Using Crude Novelty Quantifiers: A Multi-Scale Simulation
To test these predictions, I developed a comprehensive simulation incorporating multiple factors that affect knowledge discovery:
Historical Population and Resource Constraints
This simulation modeled human population growth from 50,000 BCE to 2300 CE (a rough code sketch follows this list), including:
Logistic growth to current levels (~8 billion people)
Population collapse starting around 2075, declining to 20% of peak by 2200
Resource constraints based on Earth's ecological overshoot (currently consuming ~1.7 Earths annually)
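A rough sketch of that population-and-resources input might look like the following. The logistic midpoint, the shape of the decline, and the treatment of the 1.7-Earth overshoot as a simple divisor are all my own illustrative assumptions, not the exact functions used in the simulation.

```python
import numpy as np

PEAK_POP = 8.0e9  # approximate current/peak population

def population(year):
    """Stylized trajectory: logistic growth toward ~8 billion, then a
    decline after 2075 to ~20% of peak by 2200 (held flat afterwards)."""
    if year <= 2075:
        return PEAK_POP / (1.0 + np.exp(-(year - 1975) / 40.0))
    p2075 = PEAK_POP / (1.0 + np.exp(-(2075 - 1975) / 40.0))
    frac = min((year - 2075) / (2200 - 2075), 1.0)
    return p2075 + (0.2 * PEAK_POP - p2075) * frac

def resource_factor(year, overshoot=1.7):
    """Crude resource constraint: effort is divided by the overshoot ratio
    once consumption exceeds one Earth (assumed here to begin ~1970)."""
    return 1.0 / overshoot if year >= 1970 else 1.0

def search_capacity(year):
    """S(t) proxy: resource-constrained population, normalized to peak."""
    return population(year) * resource_factor(year) / PEAK_POP

for y in (1900, 2024, 2100, 2200):
    print(y, round(search_capacity(y), 4))
```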
Multi-Tier Novelty Structure
Rather than treating all discoveries as equal, the model incorporates six tiers of novelty magnitude:
Micro-scale (magnitude 1): Personal learning, small optimizations
Small-scale (magnitude 5): Incremental improvements, new applications
Medium-scale (magnitude 25): Methodological advances, significant optimizations
Large-scale (magnitude 125): Platform technologies, field-shifting developments
Kuhnian (magnitude 625): Paradigm shifts that alter our understanding of reality
Mega-scale (magnitude 3,125): Civilization-transforming discoveries
Each tier becomes increasingly rare as the novelty pool shrinks, with higher-magnitude discoveries disappearing first.
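One way to encode this tier structure is sketched below. The magnitudes are the ones listed above; the initial frequencies and rarity exponents are placeholder values chosen to illustrate the idea that higher tiers both start rarer and are suppressed more sharply as the novelty pool N(t) depletes.

```python
from dataclasses import dataclass

@dataclass
class NoveltyTier:
    name: str
    magnitude: float        # impact per discovery (from the list above)
    base_frequency: float   # initial events per unit effort (placeholder)
    rarity_exponent: float  # alpha_j: how sharply frequency falls as N(t) depletes

# Magnitudes follow the 5^k ladder above; frequencies and exponents are assumptions.
TIERS = [
    NoveltyTier("micro",   1,    1e-1, 0.5),
    NoveltyTier("small",   5,    1e-2, 1.0),
    NoveltyTier("medium",  25,   1e-3, 2.0),
    NoveltyTier("large",   125,  1e-4, 4.0),
    NoveltyTier("kuhnian", 625,  1e-5, 8.0),
    NoveltyTier("mega",    3125, 1e-6, 16.0),
]

def tier_frequency(tier: NoveltyTier, pool_fraction: float) -> float:
    """Event frequency at a tier, suppressed as N(t)/N_0 shrinks.
    Larger rarity exponents make high-magnitude tiers vanish first."""
    return tier.base_frequency * pool_fraction ** tier.rarity_exponent

# Example: with 60% of the novelty pool remaining, mega-scale events are
# suppressed by 0.6**16 (~0.03% of their base rate), micro-scale only by
# 0.6**0.5 (~77% of their base rate).
print({t.name: round(tier_frequency(t, 0.6), 10) for t in TIERS})
```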
The simulation results reveal several striking patterns:
Peak Discovery and Plateau Timing
Peak discovery rate: Occurred around 47,300 BCE at 2.00×10⁻⁷ units per year
Discovery plateau: Began effectively immediately, with very slow knowledge accumulation throughout human history
Final knowledge achieved: Only 1.1% of the theoretical maximum by 2300 CE.
Here’s why that early peak occurs:
Early in human history, the accessible novelty pool N(t) is essentially untouched and at its absolute maximum N_0. High-magnitude novelty events (mega/Kuhnian tiers), with their large impacts and relatively high initial frequencies, dominate novelty harvesting.
Even though population and epistemic effort E(t) are very low in prehistoric times, the magnitude of novelty events is so large and the novelty pool so fresh that the effective harvest rate—weighted by magnitude and rarity—is surprisingly high.
As population grows from near zero in deep prehistory to substantial numbers later, the total epistemic effort scales up, but the novelty pool N(t) shrinks as discoveries accumulate.
High-magnitude novelty events become increasingly rare due to rarity exponents (α_j) that cause their frequencies to drop sharply with depletion of N(t).
This creates a pattern where the effective discovery rate, driven by magnitude × frequency × effort, reaches its maximal value early on when large discoveries are still abundant and rarity is low.
After that, despite rising population and effort, the depletion of accessible high-impact novelty and increasing difficulty of breakthroughs cause a net decline in discovery rate over millennia, as the system transitions into incremental optimization rather than transformative shifts.
Put simply, the model suggests that the most intense phase of big, impactful discovery happens in a deep prehistoric epoch before large portions of novelty are consumed, even with relatively few people. Later increases in population enable more incremental and small to medium discoveries, but cannot compensate for the loss of the large novelty "reservoir."
This early peak may seem counterintuitive given the accelerating knowledge explosion in recorded history, but the model captures different scales: the peak reflects the initial burst of fundamental, high-magnitude novelty exploration when the discovery "frontier" was widest.
Also, factors like cognitive capacity, tool use, and social organization are abstracted away but would modulate actual historical rates. The peak at 47,300 BCE is therefore a theoretical maximum in the model’s framework given assumptions about novelty scaling, rarity, and normalized population effort, rather than a literal historical prediction.
In essence, it emphasizes that maximal novelty availability and impact, combined with the mechanism of rarity-induced slowdown, mathematically produce an early peak discovery rate well before the recorded scientific revolutions we usually consider.
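The mechanism can be reproduced in a toy calculation: harvest rate = Σ_j magnitude_j × frequency_j(N) × effort(t), with the pool N depleted by whatever is harvested. Everything below (pool size, effort curve, tier constants) is an illustrative assumption tuned so that depletion outruns effort growth; under those assumptions the maximum falls tens of millennia before the present, echoing the qualitative result above, though the exact peak year is an artifact of the chosen numbers rather than a prediction.

```python
import numpy as np

# (magnitude, base_frequency, rarity_exponent) per tier -- all illustrative.
TIERS = [(1, 1e-1, 0.5), (5, 1e-2, 1.0), (25, 1e-3, 2.0),
         (125, 1e-4, 4.0), (625, 1e-5, 8.0), (3125, 1e-6, 16.0)]

N0 = 500.0                       # assumed size of the accessible novelty pool
years = np.arange(-50000, 2301)  # 50,000 BCE to 2300 CE

def logistic(x):
    # numerically safe logistic curve
    return 1.0 / (1.0 + np.exp(-np.clip(x, -60.0, 60.0)))

def effort(year):
    """Normalized epistemic effort: a slow prehistoric ramp (population and
    cognition coming online) plus a much larger modern surge. Illustrative."""
    prehistoric = logistic((year + 45000) / 1500.0)
    modern = 5.0 * logistic((year - 1900) / 100.0)
    return 0.05 + prehistoric + modern

N = N0
rates = []
for year in years:
    pool_fraction = N / N0
    # effective rate = sum over tiers of magnitude * frequency(pool) * effort
    rate = sum(m * f0 * pool_fraction ** a for m, f0, a in TIERS) * effort(year)
    rates.append(rate)
    N = max(N - rate, 0.0)       # harvesting depletes the accessible pool

print("peak effective discovery rate at year:", int(years[int(np.argmax(rates))]))
```

Because the high-exponent tiers collapse as soon as the pool starts shrinking, the later modern surge in effort cannot lift the total rate back to its prehistoric maximum in this toy run, mirroring the pattern described above.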
Validation of Varney's Law Predictions
The simulation confirms three key predictions:
Discovery rate decline: Rates decrease as knowledge approaches limits
Population growth effects: Larger populations initially boost discovery capacity
Exposure suppression dominance: Cumulative knowledge eventually constrains further discovery
Tier-Specific Patterns
The most revealing aspect is how different novelty tiers behave:
Mega-scale discoveries: Effectively extinct before recorded history
Kuhnian paradigm shifts: Last significant occurrences in early prehistory, then vanishing
Large-scale breakthroughs: Declining through the current era, disappearing during the projected population collapse
Small/medium discoveries: Dominate late-phase innovation, but also declining
Fact-Checking and Academic Context
The Discovery Plateau Hypothesis isn't merely speculative—it builds on established research in multiple fields:
Documented Innovation Slowdowns
Academic literature supports several key observations:
Patent analysis shows decreasing "breakthrough" innovations relative to incremental improvements
Research cost scaling demonstrates exponential increases in costs per fundamental discovery
Citation networks reveal patterns consistent with approaching knowledge limits in specific domains
Information-Theoretic Foundations
Varney's Law draws from solid theoretical ground:
Shannon information theory provides the mathematical framework for quantifying novelty
Landauer's principle establishes the thermodynamic costs of information processing
Kolmogorov complexity theory suggests a finite informational structure underlying discoverable patterns
Historical Precedent
The concept of discovery limits has precedent in various fields:
S-curve patterns are well-documented in technological development
Plateau effects have been observed in medicine, psychology, and economics
Diminishing returns to research effort are empirically documented across scientific domains
Critical Implications for Civilization
If Varney's Law accurately describes knowledge acquisition, the implications are profound:
The End of Science as We Know It
Fundamental breakthroughs may become increasingly rare, not due to lack of effort, but due to mathematical necessity
The "optimization treadmill" awaits: endless refinement of existing knowledge with diminishing returns
AI-assisted discovery may represent civilization's attempt to extract remaining accessible novelty more efficiently
Resource and Population Constraints
The simulation's inclusion of ecological limits and population collapse adds urgency:
Current resource overshoot (1.7 Earths annually) constrains sustainable knowledge production infrastructure
Population decline reduces collective epistemic effort, potentially extending the timeline but reducing total achievable knowledge
The window may be closing for achieving maximum knowledge before resource constraints force civilization into a lower-energy state
Limitations and Criticisms
The model has important limitations:
Parameter Sensitivity
The simulation results are highly sensitive to the chosen parameters. Different values for the scaling constant α or exposure suppression β could yield dramatically different timelines and knowledge accumulation rates.
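A quick way to see this sensitivity is to sweep α and β in the single-equation form of the law and note how much the time to reach, say, half of K_max moves. The grid values below are arbitrary, and S(t) is held at 1 for simplicity.

```python
import numpy as np

def years_to_half_max(alpha, beta, K_max=1000.0, horizon=20000):
    """Integrate dK/dt = alpha * exp(-beta*K) * (1 - K/K_max) with S(t) = 1
    and return the first year K crosses K_max/2 (None if never within horizon)."""
    K = 0.0
    for year in range(horizon):
        K += alpha * np.exp(-beta * K) * (1.0 - K / K_max)
        if K >= 0.5 * K_max:
            return year
    return None

for alpha in (0.05, 0.5, 5.0):
    for beta in (0.0005, 0.002, 0.01):
        print(f"alpha={alpha:<4} beta={beta:<6} -> year K reaches K_max/2: "
              f"{years_to_half_max(alpha, beta)}")
```

Even in this stripped-down version, some parameter combinations reach the halfway point within centuries while others never do within the horizon, which is the sensitivity concern in miniature.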
Simplification of Complexity
Real knowledge discovery involves:
Interdisciplinary synthesis creating new possibility spaces
Technological breakthroughs expanding what's discoverable
Social and cultural factors affecting how knowledge is organized and built upon
Future Research Directions
Validating Varney's Law requires:
Empirical Calibration
Historical discovery mapping: Quantifying major breakthroughs across different time periods
Cross-domain validation: Testing the model against specific scientific fields
Parameter optimization: Using real data to calibrate α, β, and tier structures (a fitting sketch follows this list)
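As a sketch of what that calibration step could look like mechanically, the snippet below fits α and β to synthetic discovery-rate data generated from the law itself plus noise; real historical counts would replace the synthetic series, and the Euler integration and S(t) = 1 simplification are my own assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def model_rate(t, alpha, beta, K_max=100.0):
    """Discovery rate dI/dt under Varney's Law with S(t) = 1, obtained by
    simple forward-Euler integration and sampled at integer times t."""
    t = np.asarray(t)
    K, rates = 0.0, []
    for _ in range(int(t.max()) + 1):
        r = alpha * np.exp(-beta * K) * (1.0 - K / K_max)
        rates.append(r)
        K += r
    return np.array(rates)[t.astype(int)]

# Synthetic "observations": generated from the model itself plus noise,
# standing in for real historical discovery counts per period.
t_obs = np.arange(0, 200, 10)
true_alpha, true_beta = 0.8, 0.05
rng = np.random.default_rng(0)
y_obs = model_rate(t_obs, true_alpha, true_beta) * rng.normal(1.0, 0.05, t_obs.size)

(alpha_fit, beta_fit), _ = curve_fit(
    model_rate, t_obs, y_obs, p0=(0.5, 0.01), bounds=([0.0, 0.0], [10.0, 1.0])
)
print(f"recovered alpha ~ {alpha_fit:.2f}, beta ~ {beta_fit:.3f}")
```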
Expanded Modeling
Network effects: How discoveries enable other discoveries
Quality vs. quantity: Distinguishing between information and understanding
Emergence and synthesis: How combining existing knowledge creates new possibilities
Predictive Testing
The model makes specific predictions about:
Timing of next major paradigm shifts
Effectiveness of AI-assisted discovery
Relationship between population decline and innovation rates
Knowledge, Mortality, and Cosmic Perspective
Varney's Law offers a sobering mathematical perspective on human knowledge, even given the crude way novelty is quantified here.
This simulation suggests we've barely scratched the surface, achieving only 1.1% of possible knowledge, yet we are already experiencing the early stages of discovery plateau dynamics.
This isn't necessarily pessimistic; it's a step toward a more realistic view. If true, it suggests that reaching a knowledge plateau represents profound success—evidence that a civilization has become so effective at understanding reality that it's exhausting discoverable structure.
Perhaps most provocatively, my framework suggests that the apparent fine-tuning of our universe for complexity generation isn't coincidental but purposeful: that we exist within a simulated multiverse designed to harvest the very novelty our reality seems destined to exhaust.
Whether or not you accept this philosophical extension, Varney's Law forces us to confront fundamental questions about the nature of knowledge, the trajectory of civilization, and our place in the cosmos.
If you want to read the whole paper for free, click here.
If you’re impressed by what you’ve read, consider buying the Amazon ebook as a way of leaving a tip or simply writing a review.
About the Analysis: This investigation used a multi-scale mathematical model incorporating historical population data, resource constraints, and hierarchical novelty structures to test the predictions of Varney's Law. While the specific parameter values are provisional, the general framework provides a first step toward quantifying novelty.