From the nature of the brain, through the nature of the mind, we now move on to the last of this particular triumvirate: the nature of intelligence.
A good definition of intelligence follows relatively cleanly from my previous two posts. Since the brain is a modelling subsystem of reality, it follows that some brains simply have more information-theoretic power than others. However, I believe that this is not the whole story. Certainly a strictly bigger brain will be able to store more complex abstractions (just as a computer with more memory can run larger computations), but the actual physical size of human brains is not strongly correlated with our individual intelligence (however you measure it).
Instead I posit the following: intelligence, roughly speaking, is related to the brain's ability to match new patterns and derive new abstractions. This is information-theoretic compression in a sense. The more abstract and compact the ideas that one is able to reason with, the more powerful the models one is able to use.
The actual root of this ability almost certainly lies somewhere in the structure of the brain, but the exact mechanics are irrelevant here. It is more important to note that the resulting stronger abstractions are not the cause of raw intelligence so much as an effect: the cause is the ability to take disparate data and factor out all of its patterns, reducing it down to as close to raw Shannon entropy as possible.
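To make the compression analogy a bit more concrete, here is a small illustrative sketch in Python (a toy example of my own, not a claim about how the brain does it). It uses zlib, an ordinary general-purpose compressor, as a crude stand-in for pattern extraction: data with repeated structure compresses dramatically, while random noise barely compresses at all, because the noise is already near its Shannon entropy.

```python
import os
import zlib

def compressed_ratio(data: bytes) -> float:
    """Ratio of compressed size to original size.

    A low ratio means the compressor found lots of pattern to factor out;
    a ratio at or just above 1.0 means the data is near-incompressible noise.
    """
    return len(zlib.compress(data, level=9)) / len(data)

# Highly patterned data: a short motif repeated many times.
patterned = b"the cat sat on the mat. " * 400

# Unpatterned data of the same length: random bytes, already close to
# maximum Shannon entropy for their size.
noise = os.urandom(len(patterned))

print(f"patterned: {compressed_ratio(patterned):.3f}")  # very small
print(f"noise:     {compressed_ratio(noise):.3f}")      # roughly 1.0
```

The gap between the two ratios is, loosely, the amount of structure available to be abstracted away; in the framing above, a more intelligent system is one that is better at closing that gap on the data it encounters.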