[openstreetmap/openstreetmap-website] Expand robots.txt to stop pages robots are getting stuck in (PR #6875)
Paul Norman
notifications at github.com
Wed Mar 11 09:57:19 UTC 2026
I reviewed requests from some bot user-agents to osm.org and noticed them requesting every single paginated page of some lists. With the pagination changes these requests are relatively fast, but every time a bot comes back to a list, a new item has shifted the pagination points, so the bot may re-scan the entire list.
There are also a couple of endpoints bots shouldn't use, either because they are themselves searches or because they're only used for authentication, which a bot doesn't do.
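The kind of change described above can be sketched in robots.txt. The paths and query parameters below are hypothetical illustrations of the pattern, not the actual entries from the PR's diff:

```
User-agent: *
# Keep bots off endpoints that are themselves searches (hypothetical path)
Disallow: /search
# Authentication flows are useless to crawlers (hypothetical path)
Disallow: /login
# Avoid endless re-crawls of shifting paginated lists
# (cursor parameter names are illustrative only)
Disallow: /*?before=
Disallow: /*?after=
```

Wildcard patterns like `/*?before=` are understood by most major crawlers but are not part of the original robots.txt convention, so behavior can vary between bots.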
You can view, comment on, or merge this pull request online at:
https://github.com/openstreetmap/openstreetmap-website/pull/6875
-- Commit Summary --
* Expand robots.txt to stop pages robots are getting stuck in
-- File Changes --
M public/robots.txt (6)
-- Patch Links --
https://github.com/openstreetmap/openstreetmap-website/pull/6875.patch
https://github.com/openstreetmap/openstreetmap-website/pull/6875.diff
--
Reply to this email directly or view it on GitHub:
https://github.com/openstreetmap/openstreetmap-website/pull/6875
You are receiving this because you are subscribed to this thread.
Message ID: <openstreetmap/openstreetmap-website/pull/6875 at github.com>
More information about the rails-dev mailing list