In my lighttpd.conf:
$HTTP["host"] =~ "^(www\.|foo\.|bar\.)?domain\.com$" {
    url.rewrite-once += (
        "^/favicon\.ico$"    => "/images/favicon_cd.ico",
        "^/course/?$"        => "/course/index.php",
        "^/course/([^./]+)$" => "/course/index.php?w=$1"
    )
}
...
# don't allow spiders to crawl subdomains
$HTTP["host"] =~ "^(foo\.|bar\.)?domain\.com$" {
    url.rewrite-once += (
        "^/robots\.txt$" => "/robots_nocrawl.txt"
    )
}
We have www.domain.com as well as foo.domain.com and bar.domain.com for specific clients, and plain domain.com should work too.
The first set of rewrite rules sends .../course/x to .../course/index.php?w=x. Those rules work for domain.com and www.domain.com, but for foo.domain.com and bar.domain.com the rewrite doesn't happen; I can confirm the rewrite is not occurring by turning on lighttpd's debug logging.
If I disable the second rewrite-once block, which only exists to keep robots from crawling foo.domain.com and bar.domain.com, the first set of rewrites works for all subdomains.
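As a sketch of a workaround I'm considering (this assumes the problem is how lighttpd combines url.rewrite-once lists when more than one conditional matches the same host), I could fold the robots rule into a single conditional for the foo/bar subdomains so that only one rewrite-once block ever applies to them; the host regexes and target files below are the ones from my config:

```
# single conditional: all rules for foo/bar live in one rewrite-once list,
# so there is no second matching block to interfere
$HTTP["host"] =~ "^(foo\.|bar\.)domain\.com$" {
    url.rewrite-once += (
        "^/robots\.txt$"     => "/robots_nocrawl.txt",
        "^/favicon\.ico$"    => "/images/favicon_cd.ico",
        "^/course/?$"        => "/course/index.php",
        "^/course/([^./]+)$" => "/course/index.php?w=$1"
    )
}
```

This duplicates the course/favicon rules for the subdomains, which is why I'd rather understand the underlying behavior than ship this.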
I am fairly sure url.rewrite-once += (...) works in general, because an earlier set of global rewrite-once rules is applied correctly.
Any idea why the last set of rewrite-once rules, which should only apply to the foo and bar subdomains, prevents the earlier rules from working?