ECCouncil Ethical Hacking and Countermeasures V8 EC0-350 Question # 137 Topic 14 Discussion

Question #: 137
Topic #: 14

WWW wanderers, or spiders, are programs that traverse many pages on the World Wide Web by recursively retrieving linked pages. Search engines like Google frequently spider web pages for indexing. How will you stop web spiders from crawling certain directories on your website?


A. Place a robots.txt file in the root of your website with a listing of the directories that you don't want crawled

B. Place authentication on the root directories, which will prevent crawling by these spiders

C. Enable SSL on the restricted directories, which will block these spiders from crawling

D. Place "HTTP:NO CRAWL" on the HTML pages that you don't want the crawlers to index


Chosen Answer: A

A robots.txt file placed in the web root is the standard, widely supported way to tell crawlers which directories to skip. Authentication (B) and SSL (C) control access to content rather than crawling behavior, and "HTTP:NO CRAWL" (D) is not a real directive; the closest real per-page mechanisms are the robots meta tag and the X-Robots-Tag response header.
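For reference, a minimal robots.txt sketch along the lines of answer A; the directory names /admin/ and /private/ are hypothetical examples, not taken from the question:

    User-agent: *
    Disallow: /admin/
    Disallow: /private/

Compliant spiders such as Googlebot request /robots.txt from the site root before crawling and skip the disallowed paths. Note that the file is purely advisory: it hides nothing from a crawler that chooses to ignore it.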
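To check how a compliant crawler would interpret such a file, Python's standard urllib.robotparser module can be used. This is a sketch assuming the hypothetical rules above:

    import urllib.robotparser

    # Hypothetical robots.txt rules mirroring the sketch above.
    lines = [
        "User-agent: *",
        "Disallow: /admin/",
        "Disallow: /private/",
    ]

    # Parse the rules and test URLs the way a compliant crawler would.
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(lines)

    print(rp.can_fetch("Googlebot", "https://example.com/admin/index.html"))   # False
    print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True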