path: root/public/robots.txt
Commit log (message, author, date, files changed, lines +/-), newest first:
* Update robots.txt to exclude group_members and project_members (otheus, 2018-11-29; 1 file changed, +2/-0)
    These pages can expose sensitive user information to the web. Please see
    https://developers.google.com/search/reference/robots_txt for the correct wildcard format.
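    The commit body points to Google's wildcard syntax; a plausible sketch of the two added
    rules, assuming GitLab's usual /group and /namespace/project path layout (the exact paths
    are an assumption, not taken from the diff):

        Disallow: /groups/*/group_members
        Disallow: /*/*/project_members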
* Update robots.txt (#51167) (Moritz Schlarb, 2018-09-07; 1 file changed, +2/-0)
* Add the /help page in robots.txt [branch: robots-help] (Achilleas Pipinellis, 2018-03-22; 1 file changed, +1/-0)
    The /help page hosts docs that we don't want crawled, since we prefer the docs website
    instead. Related: https://gitlab.com/gitlab-org/gitlab-ce/issues/44433
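    With exactly one line added, the change is presumably just the new rule (hedged, though
    there is little room for variation here):

        Disallow: /help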
* correct User_agent placement in robots.txt (eric sabelhaus, 2017-01-18; 1 file changed, +3/-2)
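    In robots.txt, Disallow and Allow rules only take effect inside a group opened by a
    User-agent line; rules that appear before any User-agent line are ignored by crawlers.
    A minimal sketch of the kind of fix this subject implies (the specific path is a
    hypothetical placeholder, not read from the diff):

        # Ignored by crawlers: no User-agent group has been opened yet
        # Disallow: /autocomplete/users

        # Correct: the rule sits inside a User-agent group
        User-agent: *
        Disallow: /autocomplete/users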
* update robots.txt disallow (Matt Harrison, 2016-09-23; 1 file changed, +1/-1)
    Allows projects in groups starting with "s" while still disallowing the snippets short URLs.
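    Disallow rules are prefix matches, so a bare "Disallow: /s" also blocks any top-level path
    beginning with "s", such as a group named "support". A one-line before/after sketch
    consistent with the message, assuming snippet short URLs live under /s/ (the exact rule is
    an assumption, not taken from the diff):

        # Before (assumed): blocked /s/... but also any group starting with "s"
        Disallow: /s
        # After (assumed): blocks only the snippets short-url namespace
        Disallow: /s/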
* Disallow search engines from indexing uploads from a GitLab project. (Connor Shea, 2016-05-16; 1 file changed, +1/-0)
    Uploads can sometimes include sensitive information from private projects and confidential
    issues. They shouldn't be indexed. Resolves #15551.
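    Project uploads are served under the project path; a plausible form of the single added
    rule, assuming the /namespace/project/uploads layout (an assumption, not read from the
    diff):

        Disallow: /*/*/uploads/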
* allow crawling of commit page but not patch/diffs (Ben Bodenmiller, 2015-10-04; 1 file changed, +2/-1)
    The commit page has valuable information that search engines should be allowed to crawl;
    the .patch and .diff pages, however, contain no information that is not already on the
    commit page.
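    A sketch of rules matching that description, using Google-style wildcards (the exact paths
    are assumed, not taken from the diff):

        # Keep /namespace/project/commit/<sha> crawlable, but block its raw exports
        Disallow: /*/*/commit/*.patch
        Disallow: /*/*/commit/*.diff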
* disallow irrelevant pages by default in robots (Ben Bodenmiller, 2015-08-17; 1 file changed, +62/-1)
    Update default robots.txt rules to disallow irrelevant pages that search engines should not
    care about. This still allows important pages like the files, commit details, merge
    requests, issues, and comments to be crawled.
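    Sixty-odd added lines suggest rules grouped by scope; a short, representative excerpt of
    what such defaults typically look like (the paths are illustrative assumptions, not the
    actual diff):

        User-Agent: *
        # Global pages with no value for search
        Disallow: /autocomplete/users
        Disallow: /search
        Disallow: /api
        # Project pages that only duplicate content or mutate state
        Disallow: /*/*/new
        Disallow: /*/*/edit
        Disallow: /*/*/raw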
* init commit (gitlabhq, 2011-10-09; 1 file changed, +5/-0)
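    gitlabhq is a Rails application, and the stock Rails public/robots.txt of that era was
    exactly five comment lines, so the initial file was presumably the framework default (an
    assumption based on the +5 line count, not the diff itself):

        # See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
        #
        # To ban all spiders from the entire site uncomment the next two lines:
        # User-Agent: *
        # Disallow: /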