Consider: We’ve evolved to solve environments using as little information as possible. This means we’ve evolved to solve environments ignoring as much information as possible. This means we’ve evolved to take as much of our environments for granted as possible. This means evolution has encoded an extraordinary amount of implicit knowledge into our cognitive systems. You could say that each and every one of us constitutes a kind of solution to an ‘evolutionary frame problem.’
Thus the ‘Knowledge of Wisdom Paradox.’ The more explicit knowledge we accumulate, the more we can intervene in our environments. The more we intervene in our environments, the more we change the taken-for-granted backgrounds. And the more we change those backgrounds, the less reliable our implicit knowledge becomes.
In other words, the more robust/reliable our explicit knowledge tends to become, the less robust/reliable our implicit knowledge tends to become. Has anyone come across a version of this paradox anywhere?…