I’ve got to hand it to Evan Williams. It’s not an accident that he’s been at the start of these large Internet developments.
“OM: Do you think that the future of the Internet will involve machines thinking on our behalf?
“Ev: Yes, they’ll have to. But it’s a combination of machines and the crowd. Data collected from the crowd that is analyzed by machines. For us, at least, that’s the future. Facebook is already like that. YouTube is like that. Anything that has a lot of information has to be like that. People are obsessed with social but it’s not really “social.” It’s making better decisions because of decisions of other people. It’s algorithms based on other people to help direct your attention another way.”
This is exactly where I think we are in the evolution of the extension of the mind by global computer networks. We are individual nodes or modules taking in information and producing behaviors. We’re individually limited in bandwidth both coming in and going out. Those are human constraints that have some fixed limit based on biology and time.
There’s an aggregated metalevel that’s too big for us to see individually, but can be looked at computationally by our machines and fed back to us as processed information. By analogy, the visual part of the brain has only indirect access to somatosensory input from the fingers, but gets access to the synthesized information to refine visual analysis.
This isn’t limited to the Internet. Another good example is the genome: endless strings of four bases that can’t be interpreted by inspection. It’s simply too much information. But we very quickly developed computational methods to show us the patterns for human interpretation.
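To make that concrete, here is a toy sketch (my own illustration, not any particular genomics tool): a raw base string is opaque to the eye, but even a simple sliding-window GC-content scan turns it into a pattern a human can read at a glance.

```python
def gc_content_windows(seq, window=10):
    """Return the G+C fraction for each non-overlapping window of seq."""
    seq = seq.upper()
    return [
        (seq[i:i + window].count("G") + seq[i:i + window].count("C")) / window
        for i in range(0, len(seq) - window + 1, window)
    ]

# A synthetic sequence: an AT-rich stretch followed by a GC-rich one.
seq = "ATATATATTA" * 3 + "GCGGCCGCGC" * 3
print(gc_content_windows(seq))  # → [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
```

Sixty characters of sequence tell the eye nothing; six numbers reveal the structure immediately. Real genome analysis is vastly more sophisticated, but the principle is the same: computation compresses the data into something a human can interpret.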
These are examples of complex, networked systems that require computational analysis for human understanding.