User-agent: *
Crawl-delay: 20
Disallow: /wp-admin/
Disallow: /pentagrama/
Disallow: /verificacion/
Disallow: /wp-login.php
Disallow: /?s=
Disallow: /search/
Disallow: /comments/
Disallow: /xmlrpc.php
Disallow: /?attachment_id*
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /wp-includes/
Disallow: /*/attachment/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.correodelcaroni.com/sitemap_index.xml
Sitemap: https://www.correodelcaroni.com/news-sitemap.xml

# Block SemrushBot from crawling the site for SEO and technical audits
User-agent: SiteAuditBot
Disallow: /

# Block SemrushBot from crawling the site for the Backlink Audit tool
User-agent: SemrushBot-BA
Disallow: /

# Block SemrushBot from crawling the site for the On Page SEO Checker and similar tools
User-agent: SemrushBot-SI
Disallow: /

# Block SemrushBot from checking URLs on the site for the SWA tool
User-agent: SemrushBot-SWA
Disallow: /

# Block SemrushBot from crawling the site for the Content Analyzer and Post Tracking tools
User-agent: SemrushBot-CT
Disallow: /

# Block SemrushBot from crawling the site for Brand Monitoring
User-agent: SemrushBot-BM
Disallow: /

# Block SplitSignalBot from crawling the site for the SplitSignal tool
User-agent: SplitSignalBot
Disallow: /

# Block SemrushBot-COUB from crawling the site for the Content Outline Builder tool
User-agent: SemrushBot-COUB
Disallow: /

User-agent: MojeekBot
Disallow: /

User-agent: YandexBot
Disallow: /

User-agent: Amazonbot
Disallow: /

# Block trackbacks
User-agent: *
Disallow: /trackback
Disallow: /*trackback
Disallow: /*trackback*
Disallow: /*/trackback

# Block feeds for crawlers
User-agent: *
Allow: /feed/$
Disallow: /feed/
Disallow: /comments/feed/
Disallow: /*/feed/$
Disallow: /*/feed/rss/$
Disallow: /*/trackback/$
Disallow: /*/*/feed/$
Disallow: /*/*/feed/rss/$
Disallow: /*/*/trackback/$
Disallow: /*/*/*/feed/$
Disallow: /*/*/*/feed/rss/$
Disallow: /*/*/*/trackback/$

# Slow down some bots that tend to go overboard
User-agent: noxtrumbot
Crawl-delay: 20

User-agent: msnbot
Crawl-delay: 20

User-agent: Slurp
Crawl-delay: 20

# Block low-value bots and crawlers
User-agent: MSIECrawler
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: libwww
Disallow: /

User-agent: Orthogaffe
Disallow: /

User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /

User-agent: wget
Disallow: /

User-agent: grub-client
Disallow: /

User-agent: k2spider
Disallow: /

User-agent: NPBot
Disallow: /

User-agent: WebReaper
Disallow: /

# Prevent blocked-resource warnings in Google Webmaster Tools
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$