My code below downloads XML content from a URL. Even on a Wi-Fi network the download takes noticeably longer than I would expect, although the XML is only 29.2 KB. I run it from an AsyncTask (a rough sketch of that wiring follows the method below).

InputStream getInputStreamForUrl(String url) {
        BufferedHttpEntity bufferedEntity = null;
        InputStream is = null;
        try {
            bufferedEntity = download(url);
            if (bufferedEntity != null) {
                is = bufferedEntity.getContent();
                if (is != null) {
                    BufferedReader feedReader = new BufferedReader(new InputStreamReader(is, Utility.UTF_ENCODING),
                            16 * 1024);
                    Utility.cacheFeed(feedReader, url);
                }
            }
        } catch (NetworkNotAccessable e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } finally {
            try {
                if (bufferedEntity != null) {
                    bufferedEntity.consumeContent();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        return (url != null) ? Utility.getInputStreamForCache(url) : null;
    }
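
Roughly, I drive it from my AsyncTask like this (a simplified sketch; the task class name, callback bodies and example URL are placeholders, not my exact code):

    // Simplified wrapper; only getInputStreamForUrl(String) comes from the code above.
    private class FeedDownloadTask extends AsyncTask<String, Void, InputStream> {
        @Override
        protected InputStream doInBackground(String... urls) {
            // Runs off the main thread, so the blocking HTTP work belongs here.
            return getInputStreamForUrl(urls[0]);
        }

        @Override
        protected void onPostExecute(InputStream result) {
            // Back on the main thread: hand the (cached) stream to the XML parser / UI.
        }
    }

    // Usage: new FeedDownloadTask().execute("http://example.com/feed.xml");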

The download(url) method, which issues the HttpGet request, looks like this:

public BufferedHttpEntity download(String url)
            throws ClientProtocolException, IOException, 
                    IllegalStateException, NetworkNotAccessable {
        HttpGet get = new HttpGet(url);
        HttpResponse response = mDefaultHttpClient.execute(get);
        int status = response.getStatusLine().getStatusCode();
        if (status != 200) {
            throw new NetworkNotAccessable(url + "error code:" + status);
        }   
        HttpEntity entity = response.getEntity();           
        BufferedHttpEntity bufHttpEntity = new BufferedHttpEntity(entity);      

        while (bufHttpEntity.isStreaming()) {
            try {
                bufHttpEntity.wait(500);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        return bufHttpEntity;
    }

Please let me know if there is a better approach, for example compressing the content at the URL before downloading it.


1 Answer


If you are saying the actual download happens in the download(url) method, I'm afraid I don't see why it needs to: download() is only called from your getInputStream method, and I can't see any reason for the InputStream you return there to go through all of that...

Also, why are you using bufHttpEntity.wait(500)? That is a blocking call (and can cause a serious delay).
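
If you do stay with HttpClient, that loop can simply be dropped; roughly something like this (a sketch reusing the mDefaultHttpClient field and NetworkNotAccessable exception from your question). Note that calling wait() without holding the object's monitor would throw an IllegalMonitorStateException at runtime anyway.

    public BufferedHttpEntity download(String url)
            throws IOException, NetworkNotAccessable {
        HttpGet get = new HttpGet(url);
        HttpResponse response = mDefaultHttpClient.execute(get);
        int status = response.getStatusLine().getStatusCode();
        if (status != HttpStatus.SC_OK) {
            throw new NetworkNotAccessable(url + " error code: " + status);
        }
        // BufferedHttpEntity copies the response body into memory in its
        // constructor (for a normal, non-repeatable response entity), so
        // there is nothing left to wait for once it has been constructed.
        return new BufferedHttpEntity(response.getEntity());
    }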

Alternatively, use the following code in your download method to retrieve the XML:

    URL feedUrl = new URL(url);
    HttpURLConnection urlConnection = (HttpURLConnection) feedUrl.openConnection();
    try {
        InputStream in = new BufferedInputStream(urlConnection.getInputStream());
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int count;
        // Read the response in 4 KB chunks until end of stream.
        while ((count = in.read(buffer)) != -1) {
            bytes.write(buffer, 0, count);
        }
        // Decode once at the end so multi-byte UTF-8 characters are not split
        // across chunk boundaries.
        String xmlData = new String(bytes.toByteArray(), "UTF-8");
        Log.d(TAG, "Data: " + xmlData);
    } finally {
        urlConnection.disconnect();
    }
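
You also asked about compressing the download. As a rough sketch (variable names are placeholders): HttpURLConnection can request a gzip-encoded response, and you then decompress it yourself with GZIPInputStream:

    URL feedUrl = new URL(url);
    HttpURLConnection conn = (HttpURLConnection) feedUrl.openConnection();
    // Ask the server for a gzip body; once we set this header ourselves,
    // we are also responsible for decompressing the stream ourselves.
    conn.setRequestProperty("Accept-Encoding", "gzip");
    try {
        InputStream in = conn.getInputStream();
        if ("gzip".equalsIgnoreCase(conn.getContentEncoding())) {
            in = new GZIPInputStream(in);
        }
        BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
        StringBuilder xml = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            xml.append(line).append('\n');
        }
        Log.d(TAG, "Data: " + xml);
    } finally {
        conn.disconnect();
    }

For a 29.2 KB payload this mostly saves transfer time on a slow link; it does not reduce the time spent setting up the connection, so it is worth profiling first to see where the delay really is.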
answered 2013-03-22T13:59:10.470