Deep Linking - Opt Out

Website owners wishing to prevent search engines from deep linking to their pages can use the existing Robots Exclusion Standard (a robots.txt file placed at the site's root) to state whether or not they want their content to be crawled and indexed. Proponents of deep linking often argue that content owners who do not provide a robots.txt file implicitly accept deep linking, whether by search engines or by others who link to their content. Opponents counter that content owners may be unaware of the Robots Exclusion Standard or may decline to use robots.txt for other reasons. Deep linking is also practiced outside the search engine context, so some participants in this debate question whether the Robots Exclusion Standard is relevant to deep linking controversies at all. Finally, the standard is purely advisory: it does not programmatically enforce its directives, so it cannot prevent search engines or other parties that ignore the convention from deep linking.
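
As a concrete illustration, a site owner who wanted to keep compliant crawlers away from deep pages might publish a robots.txt file like the following at the site root (the /articles/ path is a hypothetical example, not part of any real site):

    User-agent: *
    Disallow: /articles/

Because compliance is voluntary, the most these rules do is inform a crawler that checks them before fetching. A minimal sketch in Python, using the standard library's urllib.robotparser module and hypothetical URLs, shows how a polite crawler would consult the rules above; nothing in the protocol stops an impolite one from ignoring them:

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt content matching the example above.
    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /articles/
    """

    rp = RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())

    # A polite crawler checks before following a deep link.
    print(rp.can_fetch("ExampleBot", "https://example.com/articles/story.html"))  # False
    print(rp.can_fetch("ExampleBot", "https://example.com/index.html"))           # True

Note that can_fetch only reports what the file requests; honoring the answer is entirely a matter of crawler etiquette.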
