A spider is a program that reads web page content (usually to add it to a search engine's database) and analyses the links to other pages and sites; it then visits those pages in turn, reads them, analyses their links, and so on.
It follows page links from site to site, a process sometimes known as crawling the web (as in the World Wide Web).
Thing that crawls through or over a web = Spider...
A search engine spider collects information from web pages in order to cache, or save copies of, those pages on the search engine's servers. Each search engine has its own spider, designed and coded by its developers, that continually crawls the web, updating the cached versions of websites and looking for new pages added to them. Once a page has been seen by a search engine spider, it is said to have been "spidered" or "indexed."
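To make this concrete, here is a minimal sketch of a spider's crawl loop, using only Python's standard library. It is not any real search engine's code; the seed URL, the page limit, and the helper names are illustrative assumptions. It fetches a page, stores a snapshot of it (the "cached copy"), extracts the links, and queues them for the next visits:

```python
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, cache it, queue its links."""
    seen, queue, cache = set(), deque([seed_url]), {}
    while queue and len(cache) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that cannot be fetched
        cache[url] = html  # the stored snapshot a search engine would index
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if urlparse(absolute).scheme in ("http", "https"):
                queue.append(absolute)
    return cache

if __name__ == "__main__":
    snapshots = crawl("https://example.com", max_pages=5)
    print(list(snapshots))  # the URLs this tiny spider has "spidered"
```

A real spider adds politeness delays, robots.txt checks, content deduplication, and a parser that feeds page text into the index; the loop above only shows the follow-the-links core.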
A search engine spider gathers information from websites and stores it on the search engine's servers. It collects all the information about a website, stores snapshots of its pages, and indexes them in the search engine's database.
In SEO terms, a spider is a kind of program that analyses a site's robots.txt file before crawling it.
A spider is a program/algorithm that visits websites to obtain information about their content.
Spiders are small programs that analyse your website's robots.txt file to see what they are allowed to crawl; see the sketch below.
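Since a few of the answers mention robots.txt: Python's standard library includes a parser for it, so a minimal sketch of the check a well-behaved spider performs might look like this (the bot name "MyBot" and the URLs are made-up examples, not real values):

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the site's robots.txt file.
# "MyBot" and the URLs below are hypothetical, for illustration only.
robots = RobotFileParser("https://example.com/robots.txt")
robots.read()

# A polite spider asks this before downloading any page.
allowed = robots.can_fetch("MyBot", "https://example.com/private/page.html")
print(allowed)  # True or False, depending on the site's rules
```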
I want to know what a spider in a search engine is, and what types of programs there are.