The Web Bot Project, developed in the late 1990s, was created to assist in making stock market predictions.
The technology uses a system of spiders to crawl the Internet and search for keywords, much like a search engine does. When a keyword is located, the bot program takes a snapshot of the text preceding and following the keyword. This snapshot of text is sent to a central location where it is then filtered to define meaning.
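The snapshot step described above can be sketched in a few lines. The function below is a hypothetical illustration, not the project's actual code: it scans a piece of crawled text for a keyword and captures a window of characters before and after each occurrence, which is the kind of context a bot might forward to a central filter.

```python
import re

def snapshot_keyword(text, keyword, window=30):
    """Capture the text surrounding each occurrence of a keyword.

    Illustrative sketch only: returns (before, match, after) tuples,
    with up to `window` characters of context on each side.
    """
    snapshots = []
    for match in re.finditer(re.escape(keyword), text, re.IGNORECASE):
        start, end = match.start(), match.end()
        before = text[max(0, start - window):start]  # context preceding the keyword
        after = text[end:end + window]               # context following the keyword
        snapshots.append((before, match.group(0), after))
    return snapshots

text = "Analysts expect the market to rally before the market closes."
for before, hit, after in snapshot_keyword(text, "market", window=12):
    print(repr(before), repr(hit), repr(after))
```

In a real crawl, these snapshots would be collected across many pages and shipped to a central location for the filtering step the article describes.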
The project's concept is aimed at tapping into the "collective unconscious" of the universe and its inhabitants. There is also an interesting time concept involved, along with an unusual notion of a "tipping point" regarding past, present, and future. It goes a bit deeper than simply viewing what those of us on the Internet are saying.
But in 2001, bot operators began to notice that stock market predictions were not the only matters being accurately predicted by the program. They took notice of coincidences between its output and real-world occurrences and explored the connection further.