# This robots.txt file controls crawling of URLs under https://igel-muc.de/video/.
# All crawlers are disallowed from crawling files in the "/video/sub-bavaria" directory,
# such as .css and .js files. Googlebot normally needs those files for rendering, but here
# Googlebot, along with every other user agent, is explicitly disallowed from crawling them.
# The rule is stated twice so that it still applies even if Googlebot ever changes its
# user-agent name.

User-agent: *
Disallow: /video/sub-bavaria/

User-agent: Googlebot
Disallow: /video/sub-bavaria/

# But we do want the W3C link checker to be allowed to crawl.
User-agent: W3C-checklink
Disallow:

# In the future there may be a sitemap in XML; until then, keep this line commented out.
# Sitemap: https://example.com/sitemap.xml
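
# How crawlers resolve the groups above, per the Robots Exclusion Protocol
# (RFC 9309): a crawler obeys only the single group that most specifically
# matches its user-agent name. A sketch of the expected behaviour:
#   - Googlebot matches the "User-agent: Googlebot" group exactly, so it obeys
#     only that group and skips /video/sub-bavaria/.
#   - W3C-checklink matches its own group; the empty Disallow permits everything.
#   - Every other crawler falls back to the "User-agent: *" group.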