# ── 1) Allow all Google bots ───────────────────────────────────────────────────
# (Google ignores Crawl-delay; leaving these fully open is recommended.)
User-agent: Googlebot
Allow: /

User-agent: Googlebot-Image
Allow: /

User-agent: Googlebot-Video
Allow: /

User-agent: Google-InspectionTool
Allow: /

User-agent: AdsBot-Google
Allow: /

User-agent: AdsBot-Google-Mobile
Allow: /

User-agent: Mediapartners-Google
Allow: /

# ── 2) Block commonly bad/abusive crawlers ─────────────────────────────────────
# (These are known heavy scrapers/SEO bots; adjust as needed.)
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: Barkrowler
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: spbot
Disallow: /

User-agent: ZoominfoBot
Disallow: /

# Block generic scraping libraries that identify themselves plainly
User-agent: Scrapy
Disallow: /

User-agent: Python-requests
Disallow: /

User-agent: curl
Disallow: /

User-agent: wget
Disallow: /

# ── 3) Default: allow others but slow them down ────────────────────────────────
# (Some crawlers respect Crawl-delay; Google does not. Value is in seconds.)
User-agent: *
Allow: /
Crawl-delay: 10

Sitemap: https://maledelusioncalcu.com/sitemap_index.xml