I want to load the raw StackExchange data dumps into BigQuery. The data is distributed as 7z archives, so I decompressed it in order to repackage it as gz, but the files inside are XML. So I need to convert the files from XML to JSON. Any ideas? I decompressed with p7zip and tried to convert the XML files with xml2json, but it doesn't work. Here is a sample of the XML:
    <?xml version="1.0" encoding="utf-8"?>
    <comments>
      <row Id="1" PostId="1" Score="3" Text="We need to all post more questions. Last time, we kinda &quot;rushed&quot; to get a whole bunch of people to sign up at the last minute (and pulled some funny stuff" CreationDate="2014-02-12T01:01:14.257" UserId="52" />..
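For what it's worth, each <row> in these dumps stores all of its data in XML attributes, so a row maps naturally to one flat JSON object, and BigQuery accepts newline-delimited JSON (one object per line). Below is a minimal sketch of a streaming converter, assuming only Python's standard library; the xml_rows_to_ndjson name and the Comments.xml / Comments.json paths are placeholders of mine, not part of the dump tooling:

    import json
    import sys
    import xml.etree.ElementTree as ET

    def xml_rows_to_ndjson(xml_path, json_path):
        """Stream <row> elements and write one JSON object per line (NDJSON)."""
        with open(json_path, "w", encoding="utf-8") as out:
            context = ET.iterparse(xml_path, events=("start", "end"))
            _, root = next(context)  # grab the root element (<comments>, <users>, ...)
            for event, elem in context:
                if event == "end" and elem.tag == "row":
                    # every value lives in the attributes, so attrib maps straight to JSON
                    out.write(json.dumps(elem.attrib, ensure_ascii=False) + "\n")
                    root.clear()  # drop already-processed rows so memory stays flat

    if __name__ == "__main__":
        xml_rows_to_ndjson(sys.argv[1], sys.argv[2])  # e.g. Comments.xml Comments.json

The same script should work for Users.xml, Posts.xml, etc., since iterparse never keeps more than the current rows in memory; the output can then be gzipped and loaded with bq load --source_format=NEWLINE_DELIMITED_JSON, or via the API sketch further down.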
I used xml2json: xml2json -t json2xml -o xxx.xml yyy.json (judging by the -t flag, that invocation runs the JSON-to-XML direction rather than XML-to-JSON).
Another test, with xml-json (recommended by David):
Using the file Users.xml (895M in size) from stackoverflow.com-Users.7z with the following command: xml-json Users.xml row > Users.json
xml-json Users.xml row > Users.json

/usr/local/lib/node_modules/xml-json/node_modules/xml-nodes/index.js:19
    this.soFar += String(chunk)
RangeError: Invalid string length
at XmlNodes._transform (/usr/local/lib/node_modules/xml-json/node_modules/xml-nodes/index.js:19:15)
at XmlNodes.Transform._read (_stream_transform.js:183:22)
at XmlNodes.Transform._write (_stream_transform.js:167:12)
at doWrite (_stream_writable.js:265:12)
at writeOrBuffer (_stream_writable.js:252:5)
at XmlNodes.Writable.write (_stream_writable.js:197:11)
at Duplexify._write (/usr/local/lib/node_modules/xml-json/node_modules/pumpify/node_modules/duplexify/index.js:197:22)
at doWrite (/usr/local/lib/node_modules/xml-json/node_modules/pumpify/node_modules/duplexify/node_modules/readable-stream/lib/_stream_writable.js:237:10)
at writeOrBuffer (/usr/local/lib/node_modules/xml-json/node_modules/pumpify/node_modules/duplexify/node_modules/readable-stream/lib/_stream_writable.js:227:5)
at Writable.write (/usr/local/lib/node_modules/xml-json/node_modules/pumpify/node_modules/duplexify/node_modules/readable-stream/lib/_stream_writable.js:194:11)
at ReadStream.ondata (_stream_readable.js:539:20)
at ReadStream.emit (events.js:107:17)
at readableAddChunk (_stream_readable.js:162:16)
at ReadStream.Readable.push (_stream_readable.js:125:10)
at onread (fs.js:1581:12)
at Object.wrapper [as oncomplete] (fs.js:482:17)
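The RangeError: Invalid string length at this.soFar += String(chunk) suggests that xml-json (through xml-nodes) keeps appending the whole input to one JavaScript string, and an 895M file goes past V8's maximum string length; a converter that streams, like the iterparse sketch above, avoids that buffering entirely. Once you have newline-delimited JSON, the load into BigQuery could look roughly like the sketch below, assuming the google-cloud-bigquery Python client; the my_dataset.users table name and Users.json path are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()  # picks up application default credentials
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # let BigQuery infer the schema from the JSON keys
    )
    with open("Users.json", "rb") as f:
        job = client.load_table_from_file(f, "my_dataset.users", job_config=job_config)
    job.result()  # block until the load job finishes

For a file this size it is usually more practical to gzip the NDJSON, stage it in Google Cloud Storage, and load it with load_table_from_uri (or bq load) rather than uploading from local disk.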