I need to cache a web page, and then for future requests check the cache (keyed by URL); if the page is found, return it from the cache instead of making the upstream request.
I'm using Smiley's ProxyServlet, and the method where the servlet writes the response to the OutputStream looked like the perfect place to hook in the caching. I added just two lines of code (marked // 1 and // 2):
/**
 * Copy response body data (the entity) from the proxy to the servlet client.
 * TODO: CACHE entity here for retrieval in filter
 */
protected void copyResponseEntity( HttpResponse proxyResponse, HttpServletResponse servletResponse,
                                   HttpRequest proxyRequest, HttpServletRequest servletRequest ) throws IOException
{
    HttpEntity entity = proxyResponse.getEntity();
    if ( entity != null )
    {
        String key = getCurrentUrlFromRequest( servletRequest ); // 1
        basicCache.getCache().put( key, entity );                // 2
        OutputStream servletOutputStream = servletResponse.getOutputStream();
        entity.writeTo( servletOutputStream );
    }
}
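For context, what I'd eventually like that copy step to do is keep a replayable copy of the page bytes while still streaming them to the browser. Here's a stdlib-only sketch of the idea, stripped of all HttpClient types (copyAndCapture is a hypothetical helper, not part of Smiley's servlet):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class TeeCopySketch
{
    // Copy 'in' to 'servletOut' while capturing a replayable copy of the bytes.
    static byte[] copyAndCapture( InputStream in, OutputStream servletOut ) throws IOException
    {
        ByteArrayOutputStream capture = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ( ( n = in.read( chunk ) ) != -1 )
        {
            servletOut.write( chunk, 0, n ); // the browser still gets the page
            capture.write( chunk, 0, n );    // and we keep the bytes for the cache
        }
        return capture.toByteArray();
    }

    public static void main( String[] args ) throws IOException
    {
        InputStream body = new ByteArrayInputStream( "<html>cached</html>".getBytes( "UTF-8" ) );
        ByteArrayOutputStream toBrowser = new ByteArrayOutputStream();
        byte[] cached = copyAndCapture( body, toBrowser );
        System.out.println( toBrowser.toString( "UTF-8" ).equals( new String( cached, "UTF-8" ) ) );
    }
}
```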
It sort of works: it does store the HttpEntity in the cache. But when I go back to the browser and request the same URL again, and the code comes back into my filter, I fetch the HttpEntity using the URL as the key and write it to the response, and I get a "Stream closed" error:
java.io.IOException: Stream closed
at java.base/java.util.zip.GZIPInputStream.ensureOpen(GZIPInputStream.java:63) ~[na:na]
at java.base/java.util.zip.GZIPInputStream.read(GZIPInputStream.java:114) ~[na:na]
at java.base/java.io.FilterInputStream.read(FilterInputStream.java:107) ~[na:na]
at org.apache.http.client.entity.LazyDecompressingInputStream.read(LazyDecompressingInputStream.java:64) ~[httpclient-4.5.9.jar:4.5.9]
at org.apache.http.client.entity.DecompressingEntity.writeTo(DecompressingEntity.java:93) ~[httpclient-4.5.9.jar:4.5.9]
at com.myapp.test.foo.filters.TestFilter.doFilter(TestFilter.java:37) ~[classes/:na]
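If I'm reading the stack trace right, the cached DecompressingEntity is just a wrapper around the original response stream, which was already drained and closed when the first request finished. A minimal stdlib sketch reproduces the same IOException from GZIPInputStream (this is my reading of the symptom, not Smiley's code):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class StreamClosedDemo
{
    static byte[] gzip( byte[] data ) throws IOException
    {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try ( GZIPOutputStream gz = new GZIPOutputStream( out ) )
        {
            gz.write( data );
        }
        return out.toByteArray();
    }

    public static void main( String[] args ) throws IOException
    {
        byte[] compressed = gzip( "page body".getBytes( "UTF-8" ) );
        GZIPInputStream in = new GZIPInputStream( new ByteArrayInputStream( compressed ) );
        in.close(); // effectively what has happened by the time the second request arrives
        try
        {
            in.read(); // same call that fails in DecompressingEntity.writeTo
        }
        catch ( IOException e )
        {
            System.out.println( e.getMessage() ); // "Stream closed"
        }
    }
}
```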
Here's the filter:
@Component
@WebFilter( urlPatterns = "/proxytest", description = "a filter for test servlet", initParams = {
        @WebInitParam( name = "msg", value = "==> " ) }, filterName = "test filter" )
public class TestFilter implements Filter
{
    private FilterConfig filterConfig;

    @Autowired BasicCache basicCache;

    @Override
    public void doFilter( ServletRequest servletRequest, ServletResponse servletResponse, FilterChain filterChain )
        throws IOException, ServletException
    {
        String url = getCurrentUrlFromRequest( servletRequest );         // 1
        HttpEntity page = (HttpEntity) basicCache.getCache().get( url ); // 2
        if ( null != page )                                              // 3
        {
            OutputStream servletOutputStream = servletResponse.getOutputStream(); // 4
            page.writeTo( servletOutputStream );                         // 5 stream closed :(
        }
        else
        {
            filterChain.doFilter( servletRequest, servletResponse );
        }
    }

    public String getCurrentUrlFromRequest( ServletRequest request )
    {
        if ( !( request instanceof HttpServletRequest ) ) return null;
        return getCurrentUrlFromRequest( (HttpServletRequest) request );
    }

    public String getCurrentUrlFromRequest( HttpServletRequest request )
    {
        StringBuffer requestURL = request.getRequestURL();
        String queryString = request.getQueryString();
        if ( queryString == null ) return requestURL.toString();
        return requestURL.append( '?' ).append( queryString ).toString();
    }

    @Override
    public void destroy()
    {
    }

    @Override
    public void init( FilterConfig filterConfig ) throws ServletException
    {
        this.filterConfig = filterConfig;
    }
}
Oh, and here's the BasicCache class, just in case:
@Component
public class BasicCache
{
    private UserManagedCache<String, HttpEntity> userManagedCache;

    public BasicCache()
    {
        userManagedCache = UserManagedCacheBuilder.newUserManagedCacheBuilder( String.class, HttpEntity.class )
            .build( true );
    }

    public UserManagedCache<String, HttpEntity> getCache()
    {
        return userManagedCache;
    }

    public void destroy()
    {
        if ( null != userManagedCache )
        {
            userManagedCache.close();
        }
    }
}
I'm stuck with this very localized/manual/whatever-you-want-to-call-it style of caching; I can't take the obvious route of "just wire up ehcache/redis/whatever and let it do its thing". So while I know those fancy caches can cache whole web pages, I don't know whether they'll let me work in this admittedly unusual way.
So I'm hoping SO can tell me how to make this work. I first tried wiring up a ConcurrentHashMap as my basic cache, but that didn't work either, so I wanted to see if I could tap into whatever magic the big cache guns have, but so far I can't.
Thanks for any help!
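In case it clarifies what I mean by the ConcurrentHashMap attempt: the shape I was aiming for stores raw byte[] pages rather than live HttpEntity objects, since a byte array can be replayed any number of times. A sketch (ByteArrayPageCache is hypothetical, not code I have working):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.ConcurrentHashMap;

public class ByteArrayPageCache
{
    private final ConcurrentHashMap<String, byte[]> cache = new ConcurrentHashMap<>();

    // Drain the response stream once and keep the raw bytes.
    public byte[] put( String url, InputStream body ) throws IOException
    {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ( ( n = body.read( chunk ) ) != -1 ) buf.write( chunk, 0, n );
        byte[] bytes = buf.toByteArray();
        cache.put( url, bytes );
        return bytes;
    }

    public byte[] get( String url )
    {
        return cache.get( url ); // null on a cache miss
    }

    public static void main( String[] args ) throws IOException
    {
        ByteArrayPageCache c = new ByteArrayPageCache();
        c.put( "http://example.com/", new ByteArrayInputStream( "<html>hi</html>".getBytes( "UTF-8" ) ) );
        // the bytes can be served repeatedly, even after the source stream is gone
        System.out.println( new String( c.get( "http://example.com/" ), "UTF-8" ) );
    }
}
```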