What does the site scan for?
The site scans for support of several standards and conventions, including robots.txt, Markdown content negotiation, OAuth, and other signals that indicate how ready a website is for AI agents.
How can I improve my site's agent readiness?
Start by publishing a valid robots.txt file and making sure your site serves the discovery headers agents look for. Then follow the recommendations the scan produces for further improvements.
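As a quick way to sanity-check a robots.txt before publishing it, you can parse it with Python's standard-library parser and confirm that the agents you care about are allowed. The file content and user-agent names below are illustrative assumptions, not rules the scan requires; substitute the agents and paths relevant to your site:

```python
import urllib.robotparser

# Illustrative robots.txt: allow one named agent everywhere,
# keep everyone else out of /private/.
ROBOTS_TXT = """\
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The named agent matches its own group and is allowed everywhere.
print(parser.can_fetch("GPTBot", "/docs"))            # True
# Any other agent falls through to the * group and is blocked here.
print(parser.can_fetch("OtherBot", "/private/page"))  # False
```

The same parser can also fetch a live file with `parser.set_url(...)` and `parser.read()`, which is useful for checking the version actually deployed on your server.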
Is there a cost associated with using this tool?
The website does not publish pricing information; check the official site for current costs.
What should I do if I encounter issues with the recommendations?
Because the recommendations are AI-generated, apply professional judgment when implementing them. If problems arise, consult a web development professional.