I've been working on a Java class that reads a large list of URLs from a text file, opens an HttpURLConnection to each one, and checks the HTTP response code to see whether the site responds correctly. This works fine; the problem is that I'm dealing with hundreds of thousands of links, and this approach is far too slow to get through them all. I tried using threads to speed things up, but for whatever reason that seemed to slow the process down even further. Any suggestions for a potential solution? Thanks!
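Since this workload is I/O-bound (threads spend most of their time waiting on the network), a fixed-size thread pool well above the CPU core count usually helps; a common reason naive threading makes things *slower* is spawning one thread per URL or funneling all work through shared, contended state. Below is a minimal sketch using `ExecutorService` — the class name `BulkValidator`, the method `validateAll`, and the `Predicate<String>` parameter are all my inventions, and the stub validator in `main` merely stands in for a real `validate(urlStr, proxy)` call so the sketch runs without network access:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.function.Predicate;

public class BulkValidator {
    // Runs the supplied validator over every URL on a bounded pool of
    // worker threads and collects the results in a thread-safe map.
    static Map<String, Boolean> validateAll(List<String> urls,
                                            Predicate<String> validator,
                                            int poolSize) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        Map<String, Boolean> results = new ConcurrentHashMap<>();
        for (String url : urls) {
            pool.submit(() -> results.put(url, validator.test(url)));
        }
        pool.shutdown();                        // accept no new tasks...
        pool.awaitTermination(1, TimeUnit.HOURS); // ...but let queued ones finish
        return results;
    }

    public static void main(String[] args) throws InterruptedException {
        // Stub validator in place of a real HTTP check, so this runs offline.
        List<String> urls = List.of("http://ok.example", "http://bad.example");
        Map<String, Boolean> r = validateAll(urls, u -> u.contains("ok"), 2);
        System.out.println(r.get("http://ok.example") + " " + r.get("http://bad.example"));
    }
}
```

For network I/O you could try a pool size of 50-200 threads and tune from there; the right number depends on bandwidth and how many connections the target hosts tolerate. The key design point is that the pool bounds concurrency while the per-URL tasks stay independent, so there is no shared lock for workers to contend on.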
public static boolean validate(String urlStr, Proxy proxy)
{
    boolean valid = false;
    HttpURLConnection conn = null;
    try
    {
        if (proxy == null)
            conn = (HttpURLConnection) new URL(urlStr).openConnection();
        else
            conn = (HttpURLConnection) new URL(urlStr).openConnection(proxy);
        // setInstanceFollowRedirects applies to this connection only;
        // setFollowRedirects is static and would change the global default.
        conn.setInstanceFollowRedirects(false);
        conn.setConnectTimeout(7 * 1000);
        // Without a read timeout, a server that accepts the connection but
        // never responds can block this thread indefinitely.
        conn.setReadTimeout(7 * 1000);
        conn.setRequestMethod("GET");
        conn.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729)");
        conn.connect();
        final int code = conn.getResponseCode();
        // errorCodes must be sorted, or binarySearch gives undefined results.
        valid = (Arrays.binarySearch(errorCodes, code) == -1);
    }
    catch (IOException e)
    {
        // Ignore for now; eventually this should go through a logger.
    }
    finally
    {
        if (conn != null)
        {
            conn.disconnect();
        }
    }
    return valid;
}