Kagibot is the web crawler for the Kagi search engine.
Requests from Kagibot set the User-Agent to:
Mozilla/5.0 (compatible; Kagibot/1.0; +https://kagi.com/bot)
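As a rough sketch, a server can recognize Kagibot from this header by checking for the "Kagibot/" product token. The helper name below is illustrative, not part of any Kagi API:

```python
def is_kagibot_ua(user_agent: str) -> bool:
    """Return True if the User-Agent string identifies itself as Kagibot.

    This only checks the self-reported header; spoofed requests can claim
    any User-Agent, so pair this with the IP check described below.
    """
    return "Kagibot/" in user_agent

print(is_kagibot_ua("Mozilla/5.0 (compatible; Kagibot/1.0; +https://kagi.com/bot)"))  # True
print(is_kagibot_ua("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))                     # False
```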
Kagibot requests originate from the following IP addresses (with their reverse DNS hostnames):
216.18.205.234 amd.kagibot.org
35.212.27.76 donna.kagibot.org
104.254.65.50 knet.kagibot.org
209.151.156.194 ogi.kagibot.org
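For illustration, a request's source address can be checked against the list above. The function name is hypothetical, and since published IPs may change, production code might additionally confirm the `*.kagibot.org` reverse DNS names:

```python
# Published Kagibot source IPs (from the list above).
KAGIBOT_IPS = {
    "216.18.205.234",  # amd.kagibot.org
    "35.212.27.76",    # donna.kagibot.org
    "104.254.65.50",   # knet.kagibot.org
    "209.151.156.194", # ogi.kagibot.org
}

def is_kagibot_ip(remote_addr: str) -> bool:
    """Return True if the request's source IP is a known Kagibot address."""
    return remote_addr in KAGIBOT_IPS

print(is_kagibot_ip("216.18.205.234"))  # True
print(is_kagibot_ip("8.8.8.8"))         # False
```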
Standard directives in robots.txt that target Kagibot are respected. For example, the following will allow Kagibot to crawl all pages, except those under /private/:
User-Agent: Kagibot
Allow: /
Disallow: /private/
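This behavior can be reproduced with Python's standard urllib.robotparser. One caveat: that parser applies rules in file order (first match wins) rather than by longest path match, so in this sketch the Disallow line is placed before the Allow line to get the intended result:

```python
from urllib.robotparser import RobotFileParser

# Same rules as the example above, reordered because urllib.robotparser
# uses first-match semantics rather than longest-match.
ROBOTS_TXT = """\
User-Agent: Kagibot
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Kagibot", "/index.html"))     # True
print(rp.can_fetch("Kagibot", "/private/page"))   # False
```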
If there is no rule targeting Kagibot but there is one targeting Googlebot, Kagibot follows the Googlebot directives. For example, with the following robots.txt, Kagibot will fetch all pages except those under /private/:
User-Agent: *
Disallow: /
User-Agent: Googlebot
Allow: /
Disallow: /private/
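The fallback order described above can be sketched as a minimal group picker: prefer a Kagibot group, then a Googlebot group, then the wildcard. The `pick_group` helper is hypothetical, and real robots.txt parsing has more edge cases than this:

```python
def pick_group(robots_txt: str) -> str:
    """Return which User-Agent group Kagibot would obey: its own group
    if present, else Googlebot's, else the wildcard group."""
    agents = set()
    for line in robots_txt.splitlines():
        name, _, value = line.partition(":")
        if name.strip().lower() == "user-agent":
            agents.add(value.strip().lower())
    for candidate in ("kagibot", "googlebot", "*"):
        if candidate in agents:
            return candidate
    return "*"

ROBOTS_TXT = """\
User-Agent: *
Disallow: /

User-Agent: Googlebot
Allow: /
Disallow: /private/
"""
print(pick_group(ROBOTS_TXT))  # googlebot
```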
If you have any questions, or if you think Kagibot is misbehaving on your site, please do not hesitate to contact us at [email protected].