The more we learn, the less we know

Fong Yi Hao
3 min read · Dec 9, 2023


Photo by Sachin Khadka on Unsplash

The more we learn, the more we know. This seems so obviously true, but is it really?

Every time we encounter a new piece of knowledge, we put it through the following tests before accepting it into our system of beliefs:

1. The sense-check test: Does it make sense?

We apply our best judgement to determine whether the new information seems reasonable. This is by no means an objective test. As long as it intuitively feels right, it passes.

2. The credibility test: Should I believe it?

Then we try to determine whether the source is worthy of belief.

  • Would they know better than me? Why?
  • What biases might they hold?
  • Do they have ulterior motivations for presenting this information?

It passes if we manage to find enough reasons to believe the new piece of information.

3. The cross-check test: Does this contradict anything I’ve accepted as true?

Our system of beliefs is a bank of accepted ideas that we use as the benchmark for evaluating new information. Each new idea is held up against all existing beliefs to check for disagreements.

If no disagreement arises, the new knowledge is accepted, provided it also passes tests 1 and 2.

If a disagreement arises, we assess how one measures up against the other and keep the stronger (and more likely) idea. Two contradictory ideas cannot coexist within a coherent belief system. Because of this, every accepted idea becomes a potential barrier to every new idea.

This operates inversely to Metcalfe’s Law.

Metcalfe’s law states that the value of a telecommunications network is proportional to the square of the number of connected users (nodes) of the system.

2 nodes → 1 connection
5 nodes → 10 connections
12 nodes → 66 connections

The more connections you can make, the more you get out of the network. The utility of a network grows quadratically with each additional user.
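The node counts above follow from the standard pairwise-connection formula, n(n−1)/2: every pair of users is one possible connection. A minimal sketch (the function name is mine, not from the article):

```python
def connections(nodes: int) -> int:
    """Number of unique pairwise connections among `nodes` users.

    Metcalfe's law says network value scales roughly with this
    quantity, i.e. on the order of n squared.
    """
    return nodes * (nodes - 1) // 2

for n in (2, 5, 12):
    print(f"{n} nodes → {connections(n)} connections")
# 2 nodes → 1 connection, 5 → 10, 12 → 66, matching the list above
```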

Our system of beliefs operates the same way. Each accepted piece of knowledge is a node, and each new node must agree with every existing node within the system to be accepted. With each additional node, the system becomes stronger and increasingly difficult to penetrate.

In the system of ideas, the ease of accepting new ideas is inversely proportional to the square of the number of nodes in the system.

A second-order effect is that each newly accepted idea is more likely to be in line with what we already know. This confirms our biases further instead of leaving room for opposition. Echo chambers are especially deadly, since they feed us agreeable ideas regularly and frequently.

The more you learn → the more you know → the more closed you become to new ideas → the less you know in the long run.

This is the knowledge paradox.

This phenomenon is a necessary evil. It can only be prevented if we don’t hold on to knowledge with certainty. But that’s impractical.

Knowledge is only useful if it’s dependable — we need to be able to base our decisions on it. It is dependable only when we believe it’s true.

The sun may not rise tomorrow. But living as though it might not is madness. With a >99% chance that the sun will rise, skipping the cognitive load of preparing for the alternative is a risk we gladly take.

We know that no knowledge is 100% certain. But to not go mad we have to accept some of it as fact. What’s the point of learning things if we can’t rely on them to be true?

There is a special class of knowledge that transcends this framework — principles. Taking these as fact helps to keep you open-minded. Here’s a non-exhaustive list:

  • Question everything
  • Remain open to being wrong
  • Nothing is certain
  • Paradoxes are a naturally occurring phenomenon (two contradicting ideas can both be true)
