§ Docs About llms.txt

What is it?

The llms.txt file is a proposed standard for providing information to help Large Language Models (LLMs) use a website at inference time. It's designed to be easily readable by both humans and machines.
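Under the proposal, the file is plain Markdown served at `/llms.txt`: an H1 with the site name, a blockquote summary, and H2 sections listing key links. A minimal sketch, where all names and URLs are placeholders:

```markdown
# Example Project

> A short summary of what the site offers and how its documentation is organized.

## Docs

- [Quick start](https://example.com/docs/quickstart.md): installation and first run
- [API reference](https://example.com/docs/api.md): full endpoint list

## Optional

- [Changelog](https://example.com/changelog.md): release history
```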

Ethical Crawling

Our generator strictly adheres to robots.txt rules and applies a mandatory 1-second delay between requests to prevent server stress.
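The two rules above can be sketched with Python's standard-library robots.txt parser; the function and bot names here are illustrative, not the generator's actual implementation:

```python
import time
import urllib.robotparser

CRAWL_DELAY = 1.0  # mandatory pause between requests, in seconds

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    # Parse the site's robots.txt and check whether this URL may be fetched.
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

def crawl(urls, robots_txt, user_agent="llmstxt-bot"):
    # Yield only the URLs robots.txt permits, pausing between requests.
    for url in urls:
        if is_allowed(robots_txt, user_agent, url):
            yield url  # a real crawler would fetch the page here
            time.sleep(CRAWL_DELAY)
```

A disallowed path is simply skipped rather than retried, which keeps the crawl well within the site owner's stated limits.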