When I load data into my BigSQL table, any data longer than 32762 characters is truncated. My table is defined as follows:

CREATE HADOOP TABLE schema_name.table_name (
  column1  VARCHAR(50),
  column2  INTEGER,
  column3  STRING,
  loaddate TIMESTAMP
) STORED AS PARQUET;

The data for column3 is being truncated. Is there a way to store the complete data?
Maybe CLOB is the answer you are looking for.
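For example, a minimal sketch of the same table with column3 declared as CLOB instead of STRING (this assumes your Big SQL version supports CLOB for this kind of Hadoop/Parquet table; the 2M length is only an illustration, so check the documentation and adjust to your data):

CREATE HADOOP TABLE schema_name.table_name (
  column1  VARCHAR(50),
  column2  INTEGER,
  -- CLOB can hold values larger than the ~32K character limit you are hitting;
  -- the 2M length here is just an example, not a requirement
  column3  CLOB(2M),
  loaddate TIMESTAMP
) STORED AS PARQUET;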