
Better Robots.txt - AI-Ready Crawl Control & Bot Governance

Developer: the-rock, pagup, freemius
Updated: March 31, 2026 21:59
Requires PHP: 7.4 or higher
Tested up to WordPress: 6.9
License: GPLv2 or later

Tags

seo, robots.txt, bot blocker, ai crawlers, llms.txt

Downloads

3.0.1 3.0.0 2.0.4 2.0.2 1.5.2 1.1.1 1.1.0 1.0.2 1.0.1 1.0.0

Description:

Better Robots.txt replaces the default WordPress robots.txt workflow with a smarter, structured version you can configure and preview before publishing. Instead of a blank textarea, you get a guided wizard with presets, plain-language explanations, and a final Review & Save step so you can inspect the generated robots.txt before it goes live.

Built for beginners and advanced users alike, Better Robots.txt helps you control how search engines, AI crawlers, SEO tools, archive bots, bad bots, social preview bots, and other automated agents interact with your site. Trusted by thousands of WordPress sites, it is designed for the AI era without resorting to hype, vague promises, or hidden rules.

Better Robots.txt is available in Free, Pro, and Premium editions. The free plugin covers the guided workflow and essential crawl control features, while Pro and Premium unlock additional governance, protection, and AI-ready modules. Some screenshots on the plugin page show features from all three editions.

A quick overview: https://vimeo.com/1169756981

Why Better Robots.txt is different

Most robots.txt plugins fall into one of three familiar categories. Better Robots.txt goes further: it gives you a complete, guided crawl control workflow.

What you can control

Better Robots.txt helps you manage how search engines, AI crawlers, SEO tools, archive bots, bad bots, social preview bots, and other automated agents interact with your site.

Editions

Better Robots.txt comes in three editions: Free, Pro, and Premium. Options shown in the interface are marked Free, Pro, or Premium so users can immediately see which modules belong to each edition.

Presets

Setup starts with four preset modes. For many sites, one preset plus a quick review is enough.

Built for beginners and experts

Presets and plain-language explanations make the first setup easy for beginners, while advanced users can fine-tune individual modules and directives.

AI-ready, without hype

Better Robots.txt includes features for modern AI-related crawl governance. These features help you express how you want automated systems to use your content. However, Better Robots.txt does not claim to control AI by force. Like robots.txt itself, these signals are most useful with compliant systems and good-faith crawlers.

What Better Robots.txt is, and is not

As a technical reference for advanced users, Better Robots.txt maintains a public GitHub repository with product definition, governance notes, and machine-readable artefacts. The plugin helps you publish a clearer crawl policy; it does not replace infrastructure-level protection.

About the publisher

Better Robots.txt is developed and maintained by Pagup, a digital readability firm based in Quebec, Canada. Pagup helps organizations become correctly understood by search engines, generative AI systems, and autonomous agents. The robots.txt file is the first surface that AI crawlers read when they discover a site. A well-structured robots.txt that references governance files such as llms.txt, ai-manifest.json, and interpretation policies helps AI systems understand your site faster and more accurately. Better Robots.txt is one component of a broader digital readability practice that includes semantic content architecture, AI governance and machine readability, and interpretive SEO.

Part of the Pagup ecosystem.

Installation:

  1. Upload the plugin files to the /wp-content/plugins/better-robots-txt/ directory, or install Better Robots.txt through the WordPress Plugins screen.
  2. Activate the plugin through the Plugins screen in WordPress.
  3. Open the Better Robots.txt settings page from your WordPress dashboard.
  4. Choose a preset or configure each module manually.
  5. Follow the wizard until the final Review & Save step.
  6. Review the generated robots.txt preview.
  7. Save your changes.

Screenshots:

  • Step 1 - Search engine visibility controls.
  • Step 2 - AI and LLM governance settings.
  • Step 4 - Bad bot protection options.
  • Step 8 - WooCommerce cleanup settings.
  • Step 10 - Social media crawler controls.
  • Step 13 - Advanced settings and output options.
  • Step 14 - Review & Save preview screen.

FAQ:

Does this plugin create or manage robots.txt?

Yes. Better Robots.txt generates and manages your WordPress robots.txt through a guided interface, with a preview before you apply changes.

Is this only for advanced users?

No. The plugin is designed for both beginners and advanced users. Presets make the first setup easy, while experts can fine-tune individual modules and directives.

Can I preview the result before saving?

Yes. The final Review & Save step shows you the generated robots.txt before you publish it.

Can I block AI crawlers?

You can configure how AI-related crawlers and tools are treated and publish AI usage preferences. Respect and enforcement still depend on each crawler's behavior, just like with robots.txt.
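As an illustration, a policy that restricts two well-known AI crawler user agents while leaving ordinary crawling open could look like the fragment below. The user-agent tokens (GPTBot for OpenAI, CCBot for Common Crawl) are real, but these rules are a hand-written sketch, not the plugin's actual output.

```text
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
```

Whether a given crawler honors such rules remains up to that crawler, as noted above.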

Does llms.txt guarantee that AI systems will follow my rules?

No. llms.txt and similar policy signals help you express intent more clearly, but they are not a hard technical barrier.
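For context, the llms.txt proposal describes a plain-Markdown file served at /llms.txt: an H1 title, an optional blockquote summary, then sections of annotated links. The fragment below is an illustrative sketch with placeholder URLs, not output from this plugin.

```text
# Example Site

> One-paragraph summary of what this site is and who it is for.

## Documentation

- [Getting started](https://example.com/docs/start): overview for new readers

## Optional

- [Changelog](https://example.com/changelog): release history
```

Like robots.txt, the file only expresses intent; compliant systems may read it, but nothing enforces it.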

Can I keep search engines allowed while restricting other bots?

Yes. Better Robots.txt helps you differentiate between crawler categories instead of using a simple all-or-nothing approach.
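You can check this kind of differentiated policy locally with Python's standard-library robots.txt parser. The rules below are a hypothetical example, not the plugin's generated file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block one AI crawler, allow everyone else.
RULES = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# GPTBot is blocked, while a generic search crawler falls
# through to the wildcard group and is allowed.
print(parser.can_fetch("GPTBot", "https://example.com/post/"))     # False
print(parser.can_fetch("Googlebot", "https://example.com/post/"))  # True
```

The same group-per-user-agent structure is what lets a robots.txt treat crawler categories differently instead of all-or-nothing.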

Can I change preset decisions later?

Yes. Presets are a starting point. You can revisit the settings page, adjust modules, and regenerate your robots.txt at any time.

Are all screenshots from the free version?

No. The screenshots reflect the current product family and may include Free, Pro, and Premium features. Features marked Pro or Premium in the interface require a paid edition.

Is the free version still useful on its own?

Yes. The free edition includes the guided workflow, essential crawl control, and the final preview step. Pro and Premium are for sites that need broader governance and stricter protection.

Does this plugin help WooCommerce sites?

Yes. Better Robots.txt includes WooCommerce-related cleanup options to reduce unnecessary crawling of dynamic, low-value, or duplicate URLs.
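For example, typical WooCommerce cleanup rules disallow cart, checkout, account, and filter URLs. The paths below assume default WooCommerce permalinks and are a sketch of the kind of rules such cleanup produces, not the plugin's exact output.

```text
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /*?add-to-cart=
Disallow: /*?orderby=
```

The wildcard patterns rely on the `*` path matching supported by major search engines.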

Who develops Better Robots.txt?

Better Robots.txt is developed by Pagup, a digital readability firm based in Quebec, Canada. Pagup specializes in helping organizations become correctly readable by search engines, AI systems, and autonomous agents.

Why does robots.txt matter for AI readability?

Your robots.txt is the first file that AI crawlers read when they visit your site. It determines what content they can access and what governance signals they discover. In 2026, AI systems such as ChatGPT, Perplexity, Gemini, and autonomous agents rely on robots.txt to understand how to interact with your site. A robots.txt that references your llms.txt, ai-manifest.json, and governance policies helps these systems interpret your organization more accurately. Learn more about AI governance and machine readability and why digital readability goes beyond traditional SEO.

What is digital readability?

Digital readability is the capacity of a website to be correctly understood by all four reading layers: humans, search engines, generative AI systems, and autonomous agents. Traditional SEO addresses only the search engine layer. Digital readability covers all four. Learn more at pagup.com.

Changelog:

1.0.0 1.0.1 1.0.2 1.1.0 1.1.1 1.1.2 1.1.3 1.1.4 1.1.5 1.1.6 1.1.7 1.1.8 1.1.9 1.1.9.1 1.1.9.2 1.1.9.3 1.1.9.4 1.1.9.5 1.1.9.6 1.2.0 1.2.1 1.2.2 1.2.3 1.2.4 1.2.5 1.2.5.1 1.2.6 1.2.6.1 1.2.6.2 1.2.6.3 1.2.7 1.2.8 1.2.9.2 1.2.9.3 1.3.0 1.3.0.1 1.3.0.2 1.3.0.3 1.3.0.4 1.3.0.5 1.3.0.6 1.3.0.7 1.3.1.0 1.3.2.0 1.3.2.1 1.3.2.2 1.3.2.3 1.3.2.4 1.3.2.5 1.4.0 1.4.0.1 1.4.1 1.4.1.1 1.4.2 1.4.3 1.4.4 1.4.5 1.4.6 1.4.7 1.5.0 1.5.1 1.5.2 2.0.0 2.0.1 2.0.2 2.0.3 2.0.4 3.0.0 3.0.1