# This robots.txt file controls crawling of URLs under https://soramichi.sakura.ne.jp/

# All crawlers are disallowed to crawl files in the "includes" directory, such
# as .css and .js files, but Googlebot needs them for rendering, so Googlebot
# is allowed to crawl them.
User-agent: *
Disallow: /includes/

User-agent: Googlebot
Allow: /includes/

Sitemap: https://soramichi.sakura.ne.jp/sitemap.xml