Some now speculate that the first real “thinking” machine will be the meta-computer, the Internet. This suggestion bears a particular resemblance to the one made by Douglas Adams in The Hitchhiker’s Guide to the Galaxy: that the Earth is really a supercomputer designed to calculate the ultimate question to the ultimate answer of life, the universe and everything. In Adams’s plot we are also part of the machine, an integral part of what makes this mega-computer function.

If we consider the Earth to be a massive organism, comprising its various biological, ecological and industrial spheres, the analogous conclusion is that the Internet is a massive supercomputer, comprising the various personal computers and servers connected through it. But the real intelligence of this supercomputer comes not from the PCs themselves, but from the users of those PCs and how we use them to communicate via the Internet.

Consider the flow of information over the Internet when a natural disaster occurs, such as the 2004 tsunami. If we perform a search for information about the tsunami, we can see a general impression of how we collectively think about the event, and how it has affected the entire world.

What we are really seeing every time we hit Google is a primitive interface to the collective human intelligence evolving on the Internet. In its current form we must still manually decode this intelligence (i.e. sort the wheat from the chaff), but we can expect future interfaces that don’t just return references to words and phrases, but actually understand what we are asking.

There have already been a number of attempts to provide a more user-friendly search interface (e.g. Ask Jeeves), but these approaches ultimately performed the same search as everyone else. What is required is not an improved way of specifying what we are asking, but an improved analysis of the results that are returned. Current search techniques provide very little analysis of a collection of results as a whole, and that is where the real intelligence of the Internet lies.
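To make the idea of analysing a result collection as a whole concrete, here is a minimal sketch in Python. The snippets and stopword list are hypothetical stand-ins for what a search engine might return; the point is simply that aggregating term counts across every result gives a crude picture of what the collection is collectively “about”, which no single result reveals on its own.

```python
from collections import Counter

# Hypothetical snippets returned for a search on the 2004 tsunami.
snippets = [
    "tsunami relief efforts continue in Indonesia and Thailand",
    "scientists study the 2004 tsunami and its seismic origins",
    "donations pour in for tsunami relief charities worldwide",
]

# A tiny illustrative stopword list; a real system would use a fuller one.
STOPWORDS = {"the", "and", "in", "for", "its", "a", "of", "to"}

def collection_terms(docs):
    """Count how often each term appears across the whole result set."""
    counts = Counter()
    for doc in docs:
        counts.update(w for w in doc.lower().split() if w not in STOPWORDS)
    return counts

print(collection_terms(snippets).most_common(3))
```

Here “tsunami” and “relief” dominate the aggregate counts, hinting at the collective focus of the results rather than the content of any one page.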

Whilst current search engines may order results using considerably advanced ranking techniques, these results are generally presented as a flat listing, or possibly grouped together by host name. Future interfaces will find improved ways to group together “clusters” of results, providing a more hierarchical relationship between web pages based on this emerging intelligence.
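One simple way such clustering could work is to group results whose titles share enough vocabulary. The sketch below uses Jaccard similarity over words and a greedy grouping rule; the titles and the 0.2 threshold are illustrative assumptions, not any particular engine’s method.

```python
def jaccard(a, b):
    """Word-overlap similarity between two strings (0.0 to 1.0)."""
    a, b = set(a.lower().split()), set(b.lower().split())
    return len(a & b) / len(a | b)

def cluster_results(titles, threshold=0.2):
    """Greedily group titles: each title joins the first cluster whose
    representative (its first member) it sufficiently resembles."""
    clusters = []
    for title in titles:
        for cluster in clusters:
            if jaccard(title, cluster[0]) >= threshold:
                cluster.append(title)
                break
        else:
            clusters.append([title])
    return clusters

# Hypothetical result titles for a disaster-related query.
results = [
    "tsunami relief donations surge",
    "tsunami relief efforts expand",
    "earthquake triggers seismic alert",
    "seismic alert systems reviewed",
]
for group in cluster_results(results):
    print(group)
```

This yields two clusters, one about relief efforts and one about seismic alerts, turning a flat listing into the beginnings of a hierarchy. A real system would cluster on full page content and richer features, but the grouping principle is the same.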

Ultimately we are (inadvertently) building an Intelligent Internet; all we need to do is recognise that intelligence, and everything else will happen automatically.