I have a simple condition in my HAProxy config (I tried this in both frontend and backend):

    acl no_index_url path_end .pdf .doc .xls .docx .xlsx
    rspadd X-Robots-Tag:\ noindex if no_index_url
It should add the X-Robots-Tag: noindex header to content that should not be indexed. However, HAProxy gives me this warning when parsing the config:
    acl 'no_index_url' will never match because it only involves keywords
    that are incompatible with 'backend http-response header rule'

and

    acl 'no_index_url' will never match because it only involves keywords
    that are incompatible with 'frontend http-response header rule'
According to the documentation, rspadd can be used in both frontend and backend, and path_end is used in examples within a frontend. Why am I getting this warning, and what does it mean?
Starting in HAProxy 1.6 you can no longer simply ignore the warning. To get this working, use the temporary-variable feature:

    frontend main
        http-request set-var(txn.path) path

    backend local
        http-response set-header X-Robots-Tag noindex if { var(txn.path) -m end .pdf .doc }
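
For completeness, here is a minimal self-contained sketch of this approach, extended to all five extensions from the original acl; the bind address and server line are placeholders:

    frontend main
        bind :80
        # 'path' is only available while the request is being processed, so
        # store it in a transaction-scoped variable that survives into the
        # response phase.
        http-request set-var(txn.path) path
        default_backend local

    backend local
        server app1 127.0.0.1:8080
        # At response time, match the saved path against the extensions.
        http-response set-header X-Robots-Tag noindex if { var(txn.path) -m end .pdf .doc .xls .docx .xlsx }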
Apparently, even with the warning, having the acl within the frontend works perfectly fine: all the resources ending in .pdf, .doc, etc. get the correct X-Robots-Tag header added to them. In other words, the warning is misleading, and in reality the acl does match.
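
For reference, on versions that still ship rspadd (it was removed in HAProxy 2.1), the frontend-only setup from the question would look like the sketch below; the bind address and backend are placeholders:

    frontend main
        bind :80
        acl no_index_url path_end .pdf .doc .xls .docx .xlsx
        # The parser still prints the "will never match" warning, but the
        # header is reportedly added to matching responses anyway.
        rspadd X-Robots-Tag:\ noindex if no_index_url
        default_backend local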