
If I receive a request from a spider, I kick off a PhantomJS process and render dynamic HTML back to it (using an OnActionExecuting filter and setting the ActionResult).
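
For context, a minimal sketch of roughly what that filter looks like (RenderWithPhantomJs and render.js are placeholders for however you actually invoke PhantomJS):

using System.Diagnostics;
using System.Web.Mvc;

public class PrerenderForCrawlersAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var request = filterContext.HttpContext.Request;
        if (request.Browser.Crawler)
        {
            // Short-circuit the action: the spider gets static HTML
            // rendered by PhantomJS instead of the dynamic page.
            filterContext.Result = new ContentResult
            {
                Content = RenderWithPhantomJs(request.Url.AbsoluteUri),
                ContentType = "text/html"
            };
        }
    }

    private static string RenderWithPhantomJs(string url)
    {
        // Placeholder invocation: assumes phantomjs is on the PATH and
        // render.js is a script that writes the rendered HTML to stdout.
        var psi = new ProcessStartInfo
        {
            FileName = "phantomjs",
            Arguments = "render.js " + url,
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        using (var process = Process.Start(psi))
        {
            string html = process.StandardOutput.ReadToEnd();
            process.WaitForExit();
            return html;
        }
    }
}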

But the OutputCache filter is also in place on this action, and it is getting in the way.

For example:

Step 1: Load the page with a normal user agent (the output cache caches the URL).
Step 2: Load the page with a spider user agent (the previously cached response is sent to the spider, and my PhantomJS filter never runs).


1 Answer


Use VaryByCustom to force a cache miss when the request comes from a search engine crawler.

In your Controller/Action:

[OutputCache(VaryByCustom="Crawler")]
public ActionResult Index()
{
     // ...
     return View();
}

Then in your Global.asax:

public override string GetVaryByCustomString(HttpContext context, string arg)
{
    // A unique value on every crawler request guarantees a cache miss,
    // so the action (and the PhantomJS filter) always runs for spiders.
    if (arg == "Crawler" && context.Request.Browser.Crawler)
        return Guid.NewGuid().ToString();

    return base.GetVaryByCustomString(context, arg);
}
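
Returning a fresh Guid on every crawler request guarantees the action runs each time, but note that each unique vary string also creates its own cache entry, so entries can accumulate until they expire. If the PhantomJS render is expensive, one variation worth considering (assuming the output cache actually captures the short-circuited response; test this before relying on it) is to return a constant key instead, so the crawler-rendered HTML is cached as its own variant:

public override string GetVaryByCustomString(HttpContext context, string arg)
{
    if (arg == "Crawler" && context.Request.Browser.Crawler)
        return "Crawler"; // one shared cache variant for all crawlers

    return base.GetVaryByCustomString(context, arg);
}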
answered 2013-11-14 06:44