
I have a JSON array that I want to write to a file. Writing the file works perfectly, but when I try to read it back I get a lot of garbled text such as ……. The text is in Arabic.

This is my code to read the file:

        FileInputStream fis = openFileInput("File");
        BufferedInputStream bis = new BufferedInputStream(fis);
        StringBuffer sb = new StringBuffer();
        while(bis.available() != 0){
            char c = (char) bis.read();
            sb.append(c);
        }
        JSONArray read = new JSONArray(sb.toString());          
        for(int x = 0; x < read.length(); x++){
            JSONObject readOb = read.getJSONObject(x);
            String id = readOb.getString("id");
            String name = readOb.getString("name");
            Toast.makeText(Main.this, "Id: " + id + "Name: " + name , Toast.LENGTH_LONG).show();
        }
        bis.close();
        fis.close();

It would be great if anybody could suggest a solution to make the text appear correctly.

Edit:

This is what I tried for appending it to the StringBuffer:

        fis = openFileInput("Test");
        InputStreamReader isr = new InputStreamReader(fis, "UTF-8");
        StringBuffer sb = new StringBuffer();
        while(isr.read() != 0){
            sb.append(isr.read());
        }

2 Answers


This:

char c = (char) bis.read();

is wrong. JSON is (most of the time) encoded in UTF-8, meaning one byte is not one char. You need to use a Reader, which decodes a byte stream using a character encoding.

For example:

InputStreamReader isr = new InputStreamReader(fis, "UTF-8");

Then you can read from it; each read returns a complete character, even though that character may be encoded as 1 to 4 bytes in UTF-8.
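Applied to the question's code, a minimal per-character sketch could look like this (the "File" name, openFileInput and StringBuffer are taken from the question; note that read() signals end of stream with -1, not 0):

FileInputStream fis = openFileInput("File");
Reader isr = new InputStreamReader(fis, "UTF-8"); // decodes UTF-8 bytes into chars
StringBuffer sb = new StringBuffer();
int c;
while ((c = isr.read()) != -1) {   // -1 means end of stream
    sb.append((char) c);           // append the decoded character, not a raw byte
}
isr.close();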

Here is how to completely and efficiently read the file:

FileInputStream fis = openFileInput("My Books");
Reader reader = new InputStreamReader(fis, "UTF-8");
char[] buf = new char[1024];
int red = -1;
StringBuffer sb = new StringBuffer();
while ((red = reader.read(buf)) != -1) {
    sb.append(buf, 0, red);
}
fis.close();

(which is quite close to the Apache EntityUtils implementation of reading an InputStream to a String)
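With sb filled this way, the JSON parsing from the question should work unchanged. For completeness, a sketch reusing the question's "id"/"name" keys and Toast call (the org.json classes and the Android Main.this context are assumed from the question):

JSONArray read = new JSONArray(sb.toString());   // may throw JSONException, as in the question's code
for (int x = 0; x < read.length(); x++) {
    JSONObject readOb = read.getJSONObject(x);
    String id = readOb.getString("id");
    String name = readOb.getString("name");
    Toast.makeText(Main.this, "Id: " + id + " Name: " + name, Toast.LENGTH_LONG).show();
}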

I strongly discourage using BufferedReader.readLine, as seen in many places, for three reasons:

  1. You don't know that there is even a line break in your file. It could be a single zillion-character-long line. You'd have to wait for it to be fully buffered, then read it, then append it: three copies in memory at once when you really only need one.
  2. You'll irreversibly lose some characters, namely \n and \r. They will both be treated as a line break, with no way for you to know which one it was. In some cases you'll want to know.
  3. A server that is worried about its bandwidth will strip all extraneous characters from a JSON file: tabs, spaces, line breaks. See case 1.

BufferedReader is interesting when reading from the network, and readLine when reading lines of bounded size, because it limits the number of read calls when the link is slow and allows for line-by-line processing.

answered 2013-09-25T13:54:15.803

While reading the data from the file, try to encode it:

Charset.forName("UTF-8").encode(sb.toString())
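Note that encode() turns a String into a ByteBuffer of UTF-8 bytes, which fits the writing side; on the reading side the matching call is decode(). A minimal sketch of the round trip, assuming java.nio.charset and an Arabic sample string chosen for illustration:

Charset utf8 = Charset.forName("UTF-8");
ByteBuffer bytes = utf8.encode("سلام");      // String -> UTF-8 bytes (writing)
String text = utf8.decode(bytes).toString(); // UTF-8 bytes -> String (reading)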
answered 2013-09-25T12:21:48.450