Discovery is the process by which search engines find new pages. Search engines use robot web crawlers that periodically revisit every page the engine already knows about (frequency based on importance!) and collect the latest information on each page, since pages change over time. Among the items they collect are the links on the page. These links are compared against the list of known pages, and any unknown pages are placed into another queue (prioritized by the importance of the referring page!) to be crawled by the robot.
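As a rough illustration, here is a minimal sketch of that discovery loop in Python. The link graph, the `fetch_links()` helper, and the importance scores are all hypothetical stand-ins; a real crawler fetches pages over HTTP, parses HTML, and respects robots.txt.

```python
import heapq

# Hypothetical link graph standing in for the live web.
PAGES = {
    "https://a.example": ["https://b.example", "https://c.example"],
    "https://b.example": ["https://c.example", "https://d.example"],
    "https://c.example": [],
    "https://d.example": [],
}

def fetch_links(url):
    """Stand-in for downloading a page and extracting its links."""
    return PAGES.get(url, [])

def discover(seed_urls, importance):
    known = set(seed_urls)
    # Max-heap via negated importance: pages crawl sooner when they
    # (or their referrers) are more important.
    frontier = [(-importance.get(u, 0.0), u) for u in seed_urls]
    heapq.heapify(frontier)
    crawl_order = []
    while frontier:
        neg_score, url = heapq.heappop(frontier)
        crawl_order.append(url)
        for link in fetch_links(url):
            if link not in known:  # compare against the list of known pages
                known.add(link)
                # Unknown pages inherit priority from the referring page.
                heapq.heappush(frontier, (neg_score, link))
    return crawl_order

print(discover(["https://a.example"], {"https://a.example": 1.0}))
```

Running this crawls the seed first, then the newly discovered pages in priority order, which mirrors the two queues described above: one for revisiting known pages and one for newly found links.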
How relevant is any given page to the requested search? Relevance really ties together all the strands of search engine optimization work. A page is relevant if the anchor text of inbound links indicates other sites label it as relevant, if it is important (i.e. trusted), and if the page title and body text indicate the page is specifically about those keywords.
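To make that concrete, here is a toy relevance score combining those three signals. The weights, the keyword-overlap measure, and the field names are illustrative assumptions, not any real engine's formula.

```python
def keyword_overlap(text, keywords):
    """Fraction of the query keywords that appear in the given text."""
    words = set(text.lower().split())
    return sum(kw in words for kw in keywords) / len(keywords)

def relevance(page, keywords):
    # Signal 1: anchor text of inbound links ("other sites label it as relevant").
    anchor_signal = keyword_overlap(" ".join(page["anchor_texts"]), keywords)
    # Signal 3: page title and body text about those keywords.
    onpage_signal = keyword_overlap(page["title"] + " " + page["body"], keywords)
    # Signal 2: importance (trust) scales the combined score.
    # The 0.6/0.4 weights are purely hypothetical.
    return page["importance"] * (0.6 * anchor_signal + 0.4 * onpage_signal)

page = {
    "title": "Chocolate cake recipes",
    "body": "Easy chocolate cake recipes for beginners.",
    "anchor_texts": ["best chocolate cake", "cake recipes"],
    "importance": 0.8,
}
print(relevance(page, ["chocolate", "cake"]))  # 0.8 for this toy page
```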