Monday, January 11, 2010

WEB BOTS

Since its inception in the late 1990s, the Web Bot Project has made a number of very accurate and insightful predictions about coming events. Originally designed to track stock market trends, the Web Bot uses a system of spiders that crawl the internet looking for patterns of behavior, trends and chatter pertaining to coming events. This tool is believed to be able to forecast the future by tapping into the collective unconscious of society. (A toy sketch of this crawl-and-count idea appears below.)

One of the first accurate predictions from the Web Bot program took place in June of 2001. The program predicted that a life-altering event would take place within the next 60 to 90 days, an occurrence of such proportion that its effects would be felt worldwide. The program based its prediction on "web chatter," and regrettably, the prediction proved accurate when the Twin Towers fell on 9/11/2001.

Throughout its short existence, the Web Bot has accurately predicted many other natural and man-made disasters, such as the 2001 anthrax attack on Washington DC, the East Coast power outage in 2003, and the earthquake that led to the December 26, 2004 tsunami. It is even credited with predicting Hurricane Katrina and the devastating events that followed. In addition, the Web Bot has foretold a globally devastating event expected to take place in late December 2012.

Even more remarkable is a series of predictions made back in late 2006 about the coming events of late 2008 through mid-2009. Although these predictions may or may not pan out to be completely accurate, it is interesting to evaluate each one on its own merit and in relation to the times in which we live.
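The project's own description of its method is sparse, but the basic crawl-and-count idea is easy to illustrate. Here is a minimal Python sketch, assuming a handful of hypothetical seed URLs and "chatter" phrases (the real Web Bot's word lists, sources, and weighting are not public): it downloads each page once and tallies how often each phrase appears.

import re
import urllib.request
from collections import Counter

# Hypothetical seed pages and "chatter" phrases -- placeholders only,
# not the actual inputs the Web Bot project uses.
SEED_URLS = [
    "https://example.com/forum",
    "https://example.com/news",
]
PHRASES = ["earthquake", "power outage", "market crash"]

def fetch(url):
    # Download a page and return its text, or an empty string on failure.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="ignore")
    except OSError:
        return ""

def count_phrases(text, phrases):
    # Count case-insensitive occurrences of each phrase in the text.
    counts = Counter()
    lowered = text.lower()
    for phrase in phrases:
        counts[phrase] += len(re.findall(re.escape(phrase.lower()), lowered))
    return counts

def crawl(urls, phrases):
    # Visit each seed URL once and aggregate phrase frequencies.
    total = Counter()
    for url in urls:
        total += count_phrases(fetch(url), phrases)
    return total

if __name__ == "__main__":
    for phrase, n in crawl(SEED_URLS, PHRASES).most_common():
        print(f"{phrase}: {n}")

Run repeatedly, you would track how these counts shift over time; the Web Bot's creators presumably go much further, weighting emotional language and changes in word usage rather than raw phrase counts.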

Here's a technical explanation of them:
http://urbansurvival.com/simplebots.htm

Here's the site itself:
http://halfpasthuman.com/

Here's a radio show with the maker of the web bots:


