# As a condition of accessing this website, you agree to abide by the following
# content signals:
# (a) If a content-signal = yes, you may collect content for the corresponding
# use.
# (b) If a content-signal = no, you may not collect content for the
# corresponding use.
# (c) If the website operator does not include a content signal for a
# corresponding use, the website operator neither grants nor restricts
# permission via content signal with respect to the corresponding use.
# The content signals and their meanings are:
# search: building a search index and providing search results (e.g., returning
# hyperlinks and short excerpts from your website's contents). Search does not
# include providing AI-generated search summaries.
# ai-input: inputting content into one or more AI models (e.g., retrieval
# augmented generation, grounding, or other real-time taking of content for
# generative AI search answers).
# ai-train: training or fine-tuning AI models.
# ANY RESTRICTIONS EXPRESSED VIA CONTENT SIGNALS ARE EXPRESS RESERVATIONS OF
# RIGHTS UNDER ARTICLE 4 OF THE EUROPEAN UNION DIRECTIVE 2019/790 ON COPYRIGHT
# AND RELATED RIGHTS IN THE DIGITAL SINGLE MARKET.

# BEGIN Cloudflare Managed content
User-Agent: *
Content-signal: search=yes,ai-train=no
Allow: /

User-agent: Amazonbot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: meta-externalagent
Disallow: /
# END Cloudflare Managed Content

#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/robotstxt.html

User-agent: *
# CSS, JS, Images
Allow: /core/*.css$
Allow: /core/*.css?
Allow: /core/*.js$
Allow: /core/*.js?
Allow: /core/*.gif
Allow: /core/*.jpg
Allow: /core/*.jpeg
Allow: /core/*.png
Allow: /core/*.svg
Allow: /profiles/*.css$
Allow: /profiles/*.css?
Allow: /profiles/*.js$
Allow: /profiles/*.js?
Allow: /profiles/*.gif
Allow: /profiles/*.jpg
Allow: /profiles/*.jpeg
Allow: /profiles/*.png
Allow: /profiles/*.svg
# Directories
Disallow: /core/
Disallow: /profiles/
# Files
Disallow: /README.txt
Disallow: /web.config
# Paths (clean URLs)
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /user/logout/
# Paths (no clean URLs)
Disallow: /index.php/admin/
Disallow: /index.php/comment/reply/
Disallow: /index.php/filter/tips/
Disallow: /index.php/node/add/
Disallow: /index.php/search/
Disallow: /index.php/user/password/
Disallow: /index.php/user/register/
Disallow: /index.php/user/login/
Disallow: /index.php/user/logout/

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Claude-Web
Disallow: /

User-agent: Claude-User
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Perplexity-User
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: Facebookbot
Disallow: /

User-agent: Meta-ExternalAgent
Disallow: /

User-agent: Meta-ExternalFetcher
Disallow: /

User-agent: Omgili
Disallow: /

User-agent: Omgilibot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: TikTokSpider
Disallow: /

User-agent: diffbot
Disallow: /

User-agent: ImagesiftBot
Disallow: /

User-agent: YouBot
Disallow: /

User-agent: TurnitinBot
Disallow: /

User-agent: Timpibot
Disallow: /

User-agent: AI2Bot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: AwarioBot
Disallow: /

User-agent: AwarioSmartBot
Disallow: /

User-agent: AwarioRssBot
Disallow: /

User-agent: Google-CloudVertexBot
Disallow: /

User-agent: PanguBot
Disallow: /

User-agent: Kangaroo Bot
Disallow: /

User-agent: Sentibot
Disallow: /

User-agent: img2dataset
Disallow: /

User-agent: Meltwater
Disallow: /

User-agent: Seekr
Disallow: /

User-agent: peer39_crawler
Disallow: /

User-agent: cohere-ai
Disallow: /

User-agent: cohere-training-data-crawler
Disallow: /

User-agent: DuckAssistBot
Disallow: /

User-agent: Scrapy
Disallow: /

User-agent: Cotoyogi
Disallow: /

User-agent: aiHitBot
Disallow: /

User-agent: Factset_spyderbot
Disallow: /

User-agent: FirecrawlAgent
Disallow: /