All of my hosts use an ipWhiteList middleware that only lets a few IPs in, but some search engines keep retrying again and again even though Traefik 2 always responds with Forbidden. I'd like to serve a robots.txt for all hosts (or, if I can't do that, serve a separate robots.txt per host). This is how it's currently set up:
```yaml
robots:
  image: nginx:alpine
  container_name: robots
  volumes:
    - ./robots.txt:/usr/share/nginx/html/robots.txt:ro
  labels:
    - traefik.enable=true
    - traefik.http.routers.robots.rule=HostRegexp(`{host:.*}`) && Path(`/robots.txt`)
    - traefik.http.routers.robots.entrypoints=https
    - traefik.http.routers.robots.priority=99
```
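I also wondered whether I need to pin the router to its own service explicitly (the labels above never name one). Something like the following is what I had in mind — untested, and the `robots` service name and port 80 are my guesses based on the nginx default:

```yaml
labels:
  - traefik.enable=true
  - traefik.http.routers.robots.rule=HostRegexp(`{host:.*}`) && Path(`/robots.txt`)
  - traefik.http.routers.robots.entrypoints=https
  - traefik.http.routers.robots.priority=99
  # hypothetical additions: bind the router to a dedicated service and port
  - traefik.http.routers.robots.service=robots
  - traefik.http.services.robots.loadbalancer.server.port=80
```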
However, that doesn't seem to work: requests are still forwarded to the service of whatever host I call, resulting in Forbidden. What am I missing? I'm on 2.5.1, and the robots service and router do show up (though with no mention of the priority).
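As far as I understand, Traefik 2 defaults a router's priority to the length of its rule string, which is why I picked 99 — it should be safely above the default of any competing per-host router. A quick sanity check on the lengths (the real per-host rule is redacted, so the second one below is just a stand-in of a similar shape):

```python
# Traefik 2's default router priority is the rule string's length,
# so an explicit priority of 99 should outrank both of these.
robots_rule = "HostRegexp(`{host:.*}`) && Path(`/robots.txt`)"
host_rule = "Host(`subdomainredacted.domainredacted.com`)"  # stand-in, real rule redacted

print(len(robots_rule), len(host_rule))  # both well under 99
```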
This is what the log shows when I hit the host on /robots.txt:

```
time="2021-08-27T12:12:54Z" level=debug msg="rejecting request &{Method:GET URL:/robots.txt Proto:HTTP/2.0 ProtoMajor:2 ProtoMinor:0 Header:map[Accept:[*/*] User-Agent:[curl/7.68.0] X-Forwarded-Host:[subdomainredacted.domainredacted.com] X-Forwarded-Port:[443] X-Forwarded-Proto:[https] X-Forwarded-Server:[831fe7e3f88e] X-Real-Ip:[REDACTED]] Body:0xc0013669f0 GetBody:<nil> ContentLength:0 TransferEncoding:[] Close:false Host:subdomainredacted.domainredacted.com Form:map[] PostForm:map[] MultipartForm:<nil> Trailer:map[] RemoteAddr:REDACTED:58602 RequestURI:/robots.txt TLS:0xc000644f20 Cancel:<nil> Response:<nil> ctx:0xc0009e7a70}: \"45.79.181.120\" matched none of the trusted IPs" middlewareName=home-ipwhitelist@file middlewareType=IPWhiteLister
```
and beforehand:

```
time="2021-08-27T12:11:16Z" level=debug msg="Added outgoing tracing middleware robots-base-watcher" entryPointName=https routerName=robots@docker middlewareName=tracing middlewareType=TracingForwarder
time="2021-08-27T12:11:16Z" level=debug msg="Creating server 0 http://172.18.0.3:80" entryPointName=https routerName=robots@docker serviceName=robots-base-watcher serverName=0
time="2021-08-27T12:11:16Z" level=debug msg="Creating load-balancer" entryPointName=https routerName=robots@docker serviceName=robots-base-watcher
time="2021-08-27T12:11:16Z" level=debug msg="Creating middleware" entryPointName=https routerName=robots@docker serviceName=robots-base-watcher middlewareName=metrics-service middlewareType=Metrics
time="2021-08-27T12:11:16Z" level=debug msg="Creating middleware" entryPointName=https routerName=robots@docker serviceName=robots-base-watcher middlewareName=pipelining middlewareType=Pipelining
```