
I have a huge JSON file (1 GB) which is basically an array of objects in the following format:

[{"x":"y", "p":"q"}, {"x1":"y1", "p1":"q1"},....]

I want to parse this file such that not all of the data is loaded into memory at once.
Basically, I want to read, for example, the first 1000 objects of the array into memory, process them, then read the next 1000 objects and process them, and so on until all the data has been read.
Is there any JSON library that supports this use case? I currently use Gson, but it loads all the data into memory when I call gson.fromJson().

Thanks in advance for the help.


2 Answers


It looks like Gson has a streaming API, which is what you want: https://sites.google.com/site/gson/streaming
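
For example, here is a minimal sketch of how that streaming API could be applied to this use case, assuming the same flat string-to-string objects as in the question; the file path, the batch size of 1000, and the processBatch method are placeholders, not part of Gson:

import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import com.google.gson.stream.JsonReader;

public class GsonStreamingExample {

    public static void main(String[] args) throws Exception {
        Gson gson = new Gson();
        List<Map<String, String>> batch = new ArrayList<Map<String, String>>();

        JsonReader reader = new JsonReader(
                new InputStreamReader(new FileInputStream("/path/to/my/jsonFile"), "UTF-8"));
        reader.beginArray();                          // consume the opening '['
        while (reader.hasNext()) {
            // Deserialize one array element at a time; only the current
            // object (plus the current batch) is held in memory
            Map<String, String> object = gson.fromJson(reader,
                    new TypeToken<Map<String, String>>() {}.getType());
            batch.add(object);
            if (batch.size() == 1000) {
                processBatch(batch);
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            processBatch(batch);                      // remaining objects
        }
        reader.endArray();                            // consume the closing ']'
        reader.close();
    }

    // Hypothetical placeholder for whatever per-batch processing you need
    private static void processBatch(List<Map<String, String>> batch) {
        System.out.println("processed " + batch.size() + " objects");
    }
}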

answered 2013-08-14T11:46:16.300

With Jackson, you can use a SAX-like (streaming) approach with the JsonParser object; in your case, it would look something like this:

import java.io.File;
import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

JsonFactory jsonFactory = new JsonFactory();
JsonParser parser = jsonFactory.createParser(new File("/path/to/my/jsonFile"));

// Map where to store your field-value pairs per object
Map<String, String> fields = new HashMap<String, String>();

JsonToken token;
while ((token = parser.nextToken()) != JsonToken.END_ARRAY) {
    switch (token) {

        // Starts a new object, clear the map
        case START_OBJECT:
            fields.clear();
            break;

        // For each field-value pair, store it in the map 'fields'
        case FIELD_NAME:
            String field = parser.getCurrentName();
            token = parser.nextToken();
            String value = parser.getValueAsString();
            fields.put(field, value);
            break;

        // Do something with the field-value pairs
        case END_OBJECT:
            doSomethingWithTheObject(fields);
            break;

        // Ignore other tokens (e.g. the opening START_ARRAY)
        default:
            break;
    }
}
parser.close();
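
As a follow-up, if you want the batch-of-1000 behavior from the question, one option is to combine the streaming parser with data binding via jackson-databind's ObjectMapper and MappingIterator, which reads one array element at a time. A sketch under those assumptions (the file path and the processBatch method are placeholders):

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonBatchExample {

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonParser parser = mapper.getFactory().createParser(new File("/path/to/my/jsonFile"));

        // Advance to the opening '[' so the iterator walks the array's elements
        if (parser.nextToken() != JsonToken.START_ARRAY) {
            throw new IllegalStateException("Expected the file to contain a JSON array");
        }

        MappingIterator<Map<String, String>> it = mapper.readValues(
                parser, new TypeReference<Map<String, String>>() {});

        List<Map<String, String>> batch = new ArrayList<Map<String, String>>();
        while (it.hasNext()) {
            batch.add(it.next());                 // one object bound at a time
            if (batch.size() == 1000) {
                processBatch(batch);
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            processBatch(batch);                  // remaining objects
        }
        parser.close();
    }

    // Hypothetical placeholder for whatever per-batch processing you need
    private static void processBatch(List<Map<String, String>> batch) {
        System.out.println("processed " + batch.size() + " objects");
    }
}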
answered 2013-08-14T12:07:14.473